# 1 General Description

## 1 Background

The software system Global Minimizer (GM) was initiated in the early eighties. An initial set of algorithms was selected on the basis of the international "competition" of different global optimization methods [Dixon and Szego, 1978]. Experience with real-life optimization problems was used to update the set. The set of global optimization algorithms includes
• four versions of the Bayesian search,
• a version of clustering,
• a version of the uniform deterministic grid,
• a version of the pure Monte Carlo search.
As usual, the global optimization is concluded by some local method. There are two exceptions: the Torn version of clustering [Torn and Zilinskas, 1989] and the Zilinskas version of the Bayesian technique [Torn and Zilinskas, 1989]. Both of these algorithms contain simple local search algorithms of their own. An additional local search is therefore not necessary for these two methods, but it may be useful.

There are three local optimization methods:

• a method of sequential quadratic programming for constrained optimization of smooth functions [Schittkowski, 1985],
• a simplex type method of Nelder and Mead with penalty functions for constrained optimization of non-differentiable functions [Himmelblau, 1972],
• a method of stochastic approximation type for "noisy" functions [Mockus, 1989a].

## 2 Description of Methods

### 1 Global Methods

Bayes1
is the Bayesian method [Mockus et al., 1997].
Region
: rectangular.
Objective
: continuous (possibly with "noise").
Convergence
: to global minimum (in probability, if noisy).
Recommended for "expensive" objectives using not too many observations.
Mig1
is the Monte Carlo search.
Region
: rectangular.
Objective
: general.
Convergence
: in probability.
Recommended for inexpensive and irregular objectives using a great number of observations.
Unt
is the extrapolation type method [Torn and Zilinskas, 1989].
Region
: rectangular.
Objective
: continuous.
Convergence
: to global minimum.
Recommended for expensive objectives using not too many observations.
Exkor
is the Bayesian coordinate line search method [Torn and Zilinskas, 1989].
Region
: rectangular.
Objective
: continuous.
Convergence
: to global minimum on the search line.
Recommended for approximately "separable" objectives and for preliminary exploration using the projection windows.
Glopt
is the clustering method [Torn and Zilinskas, 1989].
Region
: rectangular.
Objective
: continuous.
Convergence
: not provided.
Works well in many practical problems with a moderate number of local minima.
Lpmin
is the uniform deterministic search [Sobolj, 1967, Dzemyda et al., 1984].
Region
: rectangular.
Objective
: general.
Convergence
: to global minimum.
Recommended for inexpensive objectives using many observations.

### 2 Local Methods

Nlp
is the non-linear programming method [Schittkowski, 1985].
Region
: defined by linear and non-linear constraints.
Objective
: differentiable.
Convergence
: to local minimum.
Flexi
is the simplex method [Himmelblau, 1972].
Region
: defined by non-linear constraints.
Objective
: non-differentiable.
Convergence
: not provided.
Lbayes
is the stochastic approximation method with the Bayesian step size control [Mockus, 1989a].
Region
: rectangular.
Objective
: continuous with noise.
Convergence
: to local minimum, in probability.

## 3 Application Areas

Each subroutine represents a global or a local method. The choice of method follows the comparability idea: the computational complexity of the method should roughly correspond to that of the objective function.
• The Bayesian methods are recommended for "expensive" functions. Those methods need auxiliary calculations to make each observation more efficient.
• For "cheap" functions, simple grid methods, like the Monte Carlo or the uniform deterministic grid [Sobolj, 1967], can be better. Here observations are not so efficient, but auxiliary calculations are negligible. This explains the relative efficiency of simple methods when optimizing simple functions.
• The clustering techniques [Torn and Zilinskas, 1989] may be the best choice, if we expect the number of local minima to be small.
• A simple Bayesian technique is available [Torn and Zilinskas, 1989] for global optimization of one-dimensional functions.
• There are optimization problems where the objective function can be roughly represented as a sum of components, each depending on different variables. Here the Bayesian method that performs a line search along each coordinate usually shows good results. This method globally optimizes one variable at a time by the one-dimensional Bayesian search. Therefore, the results depend on the starting point; this distinguishes it from the other global optimization methods.

However, the deviation from the global minimum can be made as small as desired by applying a multi-start search from different, uniformly distributed starting points. An important advantage of this method is its good projections. They show how an objective function depends on the different variables (see Figures 9.4 and 9.5). The other methods produce messy projections (see Figures 2.3 (bottom) and 2.4), because all the variables change together.

## 4 Constraints

All the global methods optimize over rectangular regions. Therefore, linear and non-linear inequality constraints are represented as penalty functions. The same applies to the local method of stochastic approximation type. In the local methods of simplex and sequential quadratic programming type, the linear and non-linear constraints are defined directly, by constraint subroutines that users supply in addition to the objective function.

## 5 Computing Environment

There are several versions of the global optimization software:
• a portable Fortran library,
• interactive software for Turbo C compilers and DOS operating systems,
• interactive software for C++ compilers and Unix operating systems,
• three versions of interactive software using Java (for JDK1.0, for JDK1.1, and for JDK1.2).
One may notice a cycle of portability in this sequence of software versions. The sequence starts with the portable Fortran library and concludes with the Java systems. The systems between those two are more difficult to port. The Fortran, Turbo C, and C++ versions are described in [Mockus et al., 1997]. All the software is available on the web-sites (see Section 4 and Figure 3.1).

# 2 Early Applications

Besides the examples of this book, we briefly mention the examples considered in early publications [Mockus, 1989a, Mockus et al., 1997]. Many of them are related to the optimization of parameters of mathematical models represented as systems of non-linear differential equations. The objective function depends on a solution of these equations. The variables represent the controlled parameters of the system. The following examples illustrate this family of problems.
• Maximization of general yield of differential amplifiers [Mockus, 1989a].
• Optimization of a mechanical shock-absorber system [Mockus, 1989a].
• Evaluation of parameters of non-linear regression of an immunological model [Mockus, 1989a].
The last example suggests a broad area of applications for global optimization. It is well known that in non-linear regression both the squared deviation and the likelihood function can be multi-modal, for some data. The number of local minima may be very large, even in simple cases. An example is the evaluation of unknown parameters of ARMA models [Mockus et al., 1997]. There are other important families of global optimization problems.
• Engineering design is a large source of difficult global optimization problems. One optimizes the parameters of mathematical models, which are usually non-linear. The optimization of composite laminates [Mockus et al., 1997] serves as an example.
• Many laws of nature could be defined in terms of global optimization. An example is the "Disk" problem: minimization of the potential energy of organic molecules [Mockus et al., 1997].
• Often one cannot describe the behavior of new materials and technologies by mathematical models, because the information and knowledge are incomplete. Here one optimizes by changing control variables and observing the results of direct experimentation. An example is the planning of extremal experiments on thermostable polymeric compositions [Mockus et al., 1997].
The Bayesian algorithms of continuous global optimization are an essential part of BHA. One applies them to optimize the parameters of randomized heuristics while solving various discrete optimization problems [Mockus et al., 1997].

# 3 Different Versions

## 1 Fortran Library, Portable

The description of the global optimization software starts with the Fortran library (see Chapter 4). The other versions mostly use the same algorithms, and their parameters have a similar meaning. Therefore, the Turbo C, C++, and Java users may obtain some useful information by reading the Fortran library description.

The portable Fortran version can run on any computer with a standard Fortran compiler. Users represent objective functions as
FUNCTION FI(X,N). Here X is an array of variables and N is its dimension. The lower bound array and the upper bound array define the rectangular constraints. When using local methods of simplex and sequential quadratic programming type, one represents the constraints by the subroutine
CONSTR(X,N,G,MC). Here, G is an array of length MC that contains the values of the constraints at a point X, and MC is the number of constraints.

The advantage of the Fortran library is its portability. A disadvantage is its limited interactive possibilities. Therefore, this version is convenient for well-defined optimization problems. For preliminary investigations of new problems, some interaction is essential.

## 2 Turbo C, Interactive

This version is for objective functions represented in C, and it needs a Turbo C compiler. Users represent the objective function as a C subroutine. Users can select a global or a local method through a menu system. Current results can be observed in both tabular and graphical forms. There are two graphical forms.
GRAPH
: shows how the best value of the objective function depends on the number of observations,
PROJECTION
: shows how the current objective function values depend on a variable.
The set of methods in the Fortran and the Turbo C versions remains the same, except for the non-linear programming method. The Biggs method [Biggs, 1975] is implemented in Fortran to solve non-linear programming problems. In all the other languages, sequential quadratic programming [Schittkowski, 1985] is used for these problems.

Good interactive facilities are an advantage of the Turbo C version. A disadvantage is that this version can be used only in a DOS-compatible environment. One often prefers Unix systems when solving large-scale optimization problems.

## 3 Unix C++, Interactive

This is global optimization software designed for Unix and X-Window systems. It includes the usual set of interactive facilities. The main feature is the possibility to port it to any environment that supports Unix and X-Window, including supercomputers and distributed computing systems.

## 4 MS-Windows C#, Interactive

This is global optimization software in C#, designed for Windows 98 and later. The graphical interface is similar to that of the Unix C++ version. It can be used as a standalone application, in a way similar to the C++ version.

## 5 Java JDK1.0, Interactive

This is global optimization software in Java, designed for the JDK1.0 tool kit. The graphical interface is similar to that of the Unix C++ version. It can be used as a standalone application, in a way similar to the C++ version. The software can also be run over the Internet as an applet. Any Internet browser works if it supports JDK1.0; that includes older versions of Netscape and Internet Explorer. This is an advantage, because these versions are still used.

## 6 Java JDK1.1, Interactive

This is global optimization software in Java, designed for JDK1.1. The graphical interface is open and can be extended by including new methods, additional tasks, and updated result analysis facilities. The software can be used as a standalone application, or as an applet in Internet browsers supporting JDK1.1; that excludes early browsers such as Netscape-3. Rectangular constraints are defined through the graphical interface. Other constraints are introduced by adding penalty functions that are defined by users.

## 7 Java JDK1.2, Interactive

This is a new version of the global optimization software. It exploits the new possibilities of JDK1.2. This Java version works in browsers that support JDK1.2. Otherwise, the JDK1.2 kit should be installed; then the command starts the applet in the directory of the server. In this way one bypasses the browser, but one must write the complete path.

The main new features:

• user-defined functions can be invoked by all optimization methods and by other user-defined functions,
• each objective function is accompanied by its own set of visualization algorithms and graphical interfaces, which appear only when this objective function is chosen.
The detailed description of the different software versions follows. Most versions use the same algorithms, and their parameters are similar. The software is available on the web-sites.

# 4 Web-Sites

Convenient reading in any place is an obvious advantage of hard copy. Theoretical considerations, including descriptions of mathematical models and algorithms, are clear and understandable on a printed page. The static presentation is a disadvantage: a hard copy cannot be updated, so new results must wait for the next edition.

The aim of the web-sites is to supplement the models and algorithms with interactive software systems. These systems can be tested directly by users in different computing environments, using different languages and operating systems (see Figures 3.1, 3.2, 3.3). The dynamic presentation is the important advantage of web-sites: results can be easily updated and made accessible to many readers, free of charge. Therefore, the author considers this book a static introduction to the dynamic web-sites. There are several mirror-sites.

The first web-site is designed for the international audience. Only selected items are included in this site. Updates are made by replacing the existing items, and only when obviously better results appear; therefore, updating is less frequent. The site is always available.

The second web-site is experimental. It is updated frequently and has a larger volume. The accompanying text, mostly in English, is provided in several file formats. The site is used by graduate students of Lithuanian universities; therefore, a part of the text is in both English and Lithuanian. Sometimes, during holidays, this site is closed.

In the future, the web-sites will be regularly updated with new developments. However, the software will remain compatible with the present description.