- 1 General Description

- 2 Early Applications
- 3 Different Versions
- 1 Fortran Library, Portable
- 2 Turbo C, Interactive
- 3 Unix C++, Interactive
- 4 MS-Windows C#, Interactive
- 5 Java JDK1.0, Interactive
- 6 Java JDK1.1, Interactive
- 7 Java JDK1.1, Interactive
- 8 Java JDK1.2, Interactive

- 4 Web-Sites

3 Introduction to Software

- four versions of the Bayesian search,
- a version of clustering,
- a version of the uniform deterministic grid,
- a version of the pure Monte Carlo search.

There are three local optimization methods:

- a method of sequential quadratic programming for constrained optimization of smooth functions [Schittkowski, 1985],
- a simplex type method of Nelder and Mead with penalty functions for constrained optimization of non-differentiable functions [Himmelblau, 1972],
- a method of stochastic approximation type for "noisy" functions [Mockus, 1989a].

1 Global Methods

**Bayes1** is the Bayesian method [Mockus et al., 1997].
**Region**: rectangular.
**Objective**: continuous (possibly with "noise").
**Convergence**: to global minimum (in probability, if noisy).
**Comments**: for "expensive" objectives using not too many observations.

**Mig1** is the Monte Carlo search.
**Region**: rectangular.
**Objective**: general.
**Convergence**: in probability.
**Comments**: for inexpensive and irregular objectives using a great number of observations.
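A pure Monte Carlo search of this kind can be sketched in a few lines (a minimal illustration, not the library's Mig1 code; the test objective is an assumption):

```python
import math
import random

def monte_carlo_search(f, lower, upper, n_obs, seed=0):
    """Pure Monte Carlo search: sample points uniformly in the rectangular
    region [lower, upper] and keep the best observation."""
    rng = random.Random(seed)
    best_x, best_y = None, float("inf")
    for _ in range(n_obs):
        x = [rng.uniform(lo, hi) for lo, hi in zip(lower, upper)]
        y = f(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

def f(x):
    # Hypothetical irregular test objective on [-2, 2] x [-2, 2].
    return x[0] ** 2 + x[1] ** 2 + math.sin(5 * x[0]) * math.sin(5 * x[1])

best_x, best_y = monte_carlo_search(f, [-2.0, -2.0], [2.0, 2.0], n_obs=10000)
```

Each observation is cheap and requires no auxiliary computation, which is exactly why this method pays off only when a great number of observations is affordable.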

**Unt** is the extrapolation type method [Torn and Zilinskas, 1989].
**Region**: rectangular.
**Objective**: continuous.
**Convergence**: to global minimum.
**Comments**: for expensive objectives using not too many observations.

**Exkor** is the Bayesian coordinate line search method [Torn and Zilinskas, 1989].
**Region**: rectangular.
**Objective**: continuous.
**Convergence**: to global minimum on the search line.
**Comments**: for approximately "separable" objectives and for preliminary exploration using the projection windows.

**Glopt** is the clustering method [Torn and Zilinskas, 1989].
**Region**: rectangular.
**Objective**: continuous.
**Convergence**: not provided.
**Comments**: works well in many practical problems with a moderate number of local minima.

**Lpmin** is the uniform deterministic search [Sobolj, 1967, Dzemyda et al., 1984].
**Region**: rectangular.
**Objective**: general.
**Convergence**: to global minimum.
**Comments**: for inexpensive objectives using many observations.
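A uniform deterministic search can be sketched with a low-discrepancy sequence. The sketch below uses a Halton sequence purely as a stand-in for the Sobol-type sequence the library uses; the test objective is an assumption:

```python
import math

def halton(index, base):
    """Radical-inverse (van der Corput) value of `index` in `base`, in (0, 1)."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def uniform_grid_search(f, lower, upper, n_obs):
    """Deterministic uniform search: evaluate f on a low-discrepancy point set
    covering the rectangular region and keep the best observation."""
    primes = [2, 3, 5, 7, 11, 13][: len(lower)]  # one base per dimension
    best_x, best_y = None, float("inf")
    for i in range(1, n_obs + 1):
        x = [lo + halton(i, p) * (hi - lo)
             for lo, hi, p in zip(lower, upper, primes)]
        y = f(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

def f(x):
    # Hypothetical test objective on [-2, 2] x [-2, 2].
    return x[0] ** 2 + x[1] ** 2 + math.sin(5 * x[0]) * math.sin(5 * x[1])

best_x, best_y = uniform_grid_search(f, [-2.0, -2.0], [2.0, 2.0], n_obs=4000)
```

Unlike Monte Carlo sampling, the point set is deterministic and fills the region evenly, which yields guaranteed (not merely probabilistic) convergence as the number of observations grows.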

2 Local Methods

**Nlp** is the non-linear programming method [Schittkowski, 1985].
**Region**: defined by linear and non-linear constraints.
**Objective**: differentiable.
**Convergence**: to local minimum.

**Flexi** is the simplex method [Himmelblau, 1972].
**Region**: defined by non-linear constraints.
**Objective**: non-differentiable.
**Convergence**: not provided.

**Lbayes** is the stochastic approximation method with the Bayesian step size control [Mockus, 1989a].
**Region**: rectangular.
**Objective**: continuous with noise.
**Convergence**: to local minimum, in probability.
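The Bayesian step size control of Lbayes is not reproduced here. As a rough sketch of stochastic approximation for "noisy" objectives, the following implements a Kiefer-Wolfowitz scheme with conventional deterministic gains; the noisy quadratic test function and all gain constants are assumptions:

```python
import random

def stochastic_approximation(f_noisy, x0, n_iter=2000, a=0.5, c=0.5):
    """Kiefer-Wolfowitz stochastic approximation: estimate the gradient from
    noisy evaluations by central finite differences with shrinking width c_k,
    and take steps with shrinking gain a_k."""
    x = list(x0)
    for k in range(1, n_iter + 1):
        ak = a / k            # step gain a_k -> 0, sum a_k = infinity
        ck = c / k ** 0.25    # finite-difference width c_k -> 0 more slowly
        grad = []
        for i in range(len(x)):
            xp, xm = list(x), list(x)
            xp[i] += ck
            xm[i] -= ck
            grad.append((f_noisy(xp) - f_noisy(xm)) / (2.0 * ck))
        x = [xi - ak * gi for xi, gi in zip(x, grad)]
    return x

rng = random.Random(1)

def f_noisy(x):
    # Hypothetical noisy quadratic with true minimum at (1, -1).
    return (x[0] - 1.0) ** 2 + (x[1] + 1.0) ** 2 + rng.gauss(0.0, 0.1)

x_min = stochastic_approximation(f_noisy, [0.0, 0.0])
```

The decreasing gains average out the observation noise, which is why convergence to a local minimum holds only in probability.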

- The Bayesian methods are recommended for "expensive" functions. These methods need auxiliary calculations to make each observation more efficient.
- For "cheap" functions, simple grid methods, such as the Monte Carlo or the uniform deterministic grid [Sobolj, 1967], can be better. Here the observations are less efficient, but the auxiliary calculations are negligible. This explains the relative efficiency of simple methods when optimizing simple functions.
- The clustering techniques [Torn and Zilinskas, 1989] may be the best choice, if we expect the number of local minima to be small.
- A simple Bayesian technique is available [Torn and Zilinskas, 1989] for global optimization of one-dimensional functions.
- There are optimization problems where the objective function can be roughly represented as a sum of components depending on different variables. Here the Bayesian method doing a line search along each coordinate usually shows good results. This method globally optimizes one variable at a time by the one-dimensional Bayesian search. Therefore, the results depend on the starting point. This distinguishes the method from the other global optimization methods. However, the deviation from the global minimum can be made as small as desired by applying a multi-start search from different uniformly distributed starting points. An important advantage is the good projections: they show how the objective function depends on the different variables (see Figures 9.4 and 9.5). The other methods produce messy projections (see Figures 2.3 (bottom) and 2.4), because all the variables change together.
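The multi-start coordinate-wise scheme described above can be sketched as follows. A dense grid line search stands in for the one-dimensional Bayesian search; the separable test objective and the starting points are assumptions:

```python
import math

def coordinate_global_search(f, lower, upper, starts, n_line=81, sweeps=2):
    """Coordinate-wise global search with multi-start: for each starting
    point, globally optimize one variable at a time over its whole range,
    and keep the best result over all starts."""
    best_x, best_y = None, float("inf")
    for x0 in starts:
        x = list(x0)
        for _ in range(sweeps):
            for i in range(len(x)):
                # Global line search along coordinate i over [lower[i], upper[i]].
                best_c, best_v = x[i], f(x)
                for j in range(n_line):
                    c = lower[i] + j * (upper[i] - lower[i]) / (n_line - 1)
                    trial = list(x)
                    trial[i] = c
                    v = f(trial)
                    if v < best_v:
                        best_c, best_v = c, v
                x[i] = best_c
        y = f(x)
        if y < best_y:
            best_x, best_y = list(x), y
    return best_x, best_y

def f(x):
    # Hypothetical separable objective; per-coordinate minimum near t = -0.29.
    return (x[0] ** 2 + math.sin(5 * x[0])) + (x[1] ** 2 + math.sin(5 * x[1]))

starts = [[-1.5, 1.5], [0.0, 0.0], [1.0, -1.0]]
best_x, best_y = coordinate_global_search(f, [-2.0, -2.0], [2.0, 2.0], starts)
```

For an exactly separable objective, a single sweep from any start already reaches the global basin in every coordinate; the multi-start only matters when the separability is approximate.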

5 Computing Environment

- a portable Fortran library,
- interactive software for Turbo C compilers and DOS operating systems,
- interactive software for C++ compilers and Unix operating systems,
- three versions of interactive software using Java (for JDK1.0, for JDK1.1, and for JDK1.2).

- Maximization of general yield of differential amplifiers [Mockus, 1989a].
- Optimization of a shock-absorber mechanical system [Mockus, 1989a].
- Evaluation of parameters of non-linear regression of an immunological model [Mockus, 1989a].

- Engineering design is a large source of difficult global optimization problems. One optimizes the parameters of mathematical models that are, as usual, non-linear. The optimization of composite laminates [Mockus et al., 1997] serves as an example.
- Many laws of nature could be defined in terms of global optimization. An example is the "Disk" problem: minimization of the potential energy of organic molecules [Mockus et al., 1997].
- Often one cannot describe the behavior of new materials and technologies by mathematical models, because the information and knowledge are not complete. Here one optimizes by changing control variables and observing the results of direct experimentation. An example is the planning of extremal experiments of thermostable polymeric compositions [Mockus et al., 1997].

1 Fortran Library, Portable

The portable Fortran version can run on any computer with a
standard Fortran compiler. Users represent objective
functions as

FUNCTION FI(X,N). Here X is an array of variables and N is its dimension. The lower bound array and the upper bound array define the rectangular constraints. When using the local methods of the simplex and sequential quadratic programming types, one represents the constraints by the subroutine

CONSTR(X,N,G,MC). Here G is an array of length MC that contains the values of the constraints at the point X, and MC is the number of constraints.
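The calling convention above can be mimicked in Python as a hypothetical analog for illustration only; the actual library expects Fortran, and the sign convention of the constraint values in G (non-negative means feasible) is an assumption:

```python
def FI(X, N):
    """Analog of FUNCTION FI(X,N): objective value at point X of dimension N.
    This particular quadratic objective is a made-up example."""
    return sum((X[i] - 1.0) ** 2 for i in range(N))

def CONSTR(X, N, G, MC):
    """Analog of SUBROUTINE CONSTR(X,N,G,MC): fills the array G of length MC
    with constraint values at the point X (assumed convention: G[j] >= 0
    means feasible)."""
    G[0] = 2.0 - sum(X[:N])   # x1 + ... + xN <= 2
    G[1] = X[0]               # x1 >= 0

# Usage: the optimizer would supply X and read back G.
X = [0.5, 0.5]
G = [0.0, 0.0]
CONSTR(X, 2, G, 2)
```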

The advantage of the Fortran library is its portability. A disadvantage is its limited interactive facilities. Therefore, this version is convenient for well-defined optimization problems. For preliminary investigation of new problems, some interaction is essential.

2 Turbo C, Interactive

**GRAPH**: shows how the best value of an objective function depends on observation numbers,
**PROJECTION**: shows how current objective function values depend on a variable.

Good interactive facilities are an advantage of the Turbo C version. A disadvantage is that this version runs only in DOS-compatible environments. One often prefers Unix systems when solving large-scale optimization problems.

3 Unix C++, Interactive

4 MS-Windows C#, Interactive

This is global optimization software in C# designed for Windows 98 and later. The graphical interface is similar to that of the Unix C++ version. It can be used as a standalone application, much like the C++ version.

5 Java JDK1.0, Interactive

6 Java JDK1.1, Interactive

7 Java JDK1.1, Interactive

8 Java JDK1.2, Interactive

starts the applet in the directory of the server

. This way one bypasses a browser but must write the complete path.

The main new features:

- functions can be invoked by all optimization methods and by other user-defined functions,
- each objective function is accompanied by its own set of visualization algorithms and graphical interfaces, which appear only when that objective function is chosen.

4 Web-Sites

Convenient reading anywhere is an obvious advantage of a hard copy. The theoretical considerations, including descriptions of mathematical models and algorithms, are clear and understandable on a printed page. The static presentation is a disadvantage: a hard copy cannot be updated, so new results must wait for the next edition.

The aim of the web-sites is to supplement the models and algorithms with interactive software systems. These systems can be tested directly by users in different computing environments, using different languages and operating systems (see Figures 3.1, 3.2, 3.3).

The dynamic presentation is an important advantage of web-sites. Results can easily be updated and made accessible to many readers, free of charge. Therefore, the author regards this book as a static introduction to the dynamic web-sites. The mirror-sites:

The second web-site is experimental: it is updated frequently and has a larger volume. The accompanying text, mostly in English, is in , , , or files. The site is used by graduate students of Lithuanian universities, so part of the text is in both English and Lithuanian. Sometimes, during holidays, this site is closed.

In the future, the web-sites will be regularly updated by new developments. However, the software will remain compatible with the present description.

Contact e-mail addresses: