
3 Introduction to Software

1 General Description

1 Background

The software system Global Minimizer (GM) was initiated in the early eighties. An initial set of algorithms was selected considering the results of the international "competition" of different global optimization methods [Dixon and Szego, 1978]. The experience gained in real-life optimization problems was used to update the set. The current set of global optimization algorithms is described below. As usual, the global optimization is concluded by some local method. Exceptions are two global algorithms: the Torn version of clustering [Torn and Zilinskas, 1989] and the Zilinskas version of the Bayesian technique [Torn and Zilinskas, 1989]. Both these algorithms contain some simple local search algorithms. The additional local search is not necessary for those two methods, but it may be useful.

There are three local optimization methods; they are described in the next section.

2 Description of Methods


1 Global Methods

Bayes1 is the Bayesian method [Mockus et al., 1997].
Region: rectangular.
Objective: continuous (possibly with "noise").
Convergence: to global minimum (in probability, if noisy).
Comments: for "expensive" objectives using not too many observations.

Mig1 is the Monte Carlo search.
Region: rectangular.
Objective: general.
Convergence: in probability.
Comments: for inexpensive and irregular objectives using a great number of observations.

Unt is the extrapolation type method [Torn and Zilinskas, 1989].
Region: rectangular.
Objective: continuous.
Convergence: to global minimum.
Comments: for expensive objectives using not too many observations.

Exkor is the Bayesian coordinate line search method [Torn and Zilinskas, 1989].
Region: rectangular.
Objective: continuous.
Convergence: to global minimum on the search line.
Comments: for approximately "separable" objectives and for preliminary exploration using the projection windows.

Glopt is the clustering method [Torn and Zilinskas, 1989].
Region: rectangular.
Objective: continuous.
Convergence: not provided.
Comments: works well in many practical problems with a moderate number of local minima.

Lpmin is the uniform deterministic search [Sobolj, 1967, Dzemyda et al., 1984].
Region: rectangular.
Objective: general.
Convergence: to global minimum.
Comments: for inexpensive objectives using many observations.


2 Local Methods

Nlp is the non-linear programming method [Schittkowski, 1985].
Region: defined by linear and non-linear constraints.
Objective: differentiable.
Convergence: to local minimum.

Flexi is the simplex method [Himmelblau, 1972].
Region: defined by non-linear constraints.
Objective: non-differentiable.
Convergence: not provided.

Lbayes is the stochastic approximation method with the Bayesian step size control [Mockus, 1989a].
Region: rectangular.
Objective: continuous with noise.
Convergence: to local minimum, in probability.

3 Application Areas

Each subroutine represents a global or a local method. The choice of methods follows the comparability idea. That means that the computational complexity of the method should roughly correspond to that of the objective function.

4 Constraints

All the global methods optimize in rectangular regions. Therefore, one represents linear and non-linear inequality constraints as penalty functions. The same applies to the local method of stochastic approximation type. In the local methods of simplex and sequential quadratic programming type, the linear and non-linear constraints are defined directly. This is done by constraint subroutines, supplied by users in addition to the objective function.
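
As an illustration of this penalty approach, a minimal sketch (written in the Fortran form of the objective, FUNCTION FI(X,N), described in the Fortran version below) adds a quadratic penalty to the objective. The particular objective, the constraint $x_1+x_2 \le 1$, and the penalty weight below are assumptions chosen only to show the idea, not a fixed convention of the library.
\begin{codesamp}
C     A sketch of a penalty-augmented objective (assumptions: the
C     original objective, the constraint X(1)+X(2) <= 1, and the
C     penalty weight PEN are all hypothetical; N >= 2 is assumed).
      FUNCTION FI(X, N)
      INTEGER N
      REAL FI, X(N), F0, G, PEN
      PARAMETER (PEN = 1.0E3)
C     The "true" objective, here simply a quadratic bowl.
      F0 = (X(1) - 0.3)**2 + (X(2) - 0.7)**2
C     Violation of the inequality constraint X(1) + X(2) <= 1.
      G = X(1) + X(2) - 1.0
C     Add a quadratic penalty only when the constraint is violated.
      IF (G .GT. 0.0) THEN
         FI = F0 + PEN*G**2
      ELSE
         FI = F0
      ENDIF
      RETURN
      END
\end{codesamp}
Any of the global methods can then be applied to this objective over the rectangular region; the quality of the approximation depends on the user-chosen penalty weight.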


5 Computing Environment

There are several versions of global optimization software. One may notice a cycle of portability in this sequence of software versions. The sequence starts with the portable Fortran library and concludes with the Java systems. The systems between those two are more difficult to port. The Fortran, Turbo C, and C++ versions are described in [Mockus et al., 1997]. All the software is available on the web-sites (see Section 4 and Figure 3.1).
Figure 3.1: The Software Systems page.
\begin{figure}\centerline{ \epsfig{file=open1.eps,width=12.0cm}
}\end{figure}

2 Early Applications

Besides the examples of this book, we briefly mention the examples considered in early publications [Mockus, 1989a, Mockus et al., 1997]. Many of them are related to the optimization of parameters of mathematical models represented as systems of non-linear differential equations. The objective function $f(x)$ depends on a solution of these equations. The variables $x$ represent the controlled parameters of the system. The following examples illustrate this family of problems. The last example suggests a broad area for applications of global optimization. It is well known that in non-linear regression the squared deviation and the likelihood function can be multi-modal for some data. The number of local minima may be very large, even in simple cases. An example is the evaluation of the unknown parameters of ARMA models [Mockus et al., 1997]. There are other important families of global optimization problems. The Bayesian algorithms of continuous global optimization are an essential part of BHA. One applies them to optimize the parameters of randomized heuristics while solving various discrete optimization problems [Mockus et al., 1997].

3 Different Versions


1 Fortran Library, Portable

The description of the global optimization software starts from the Fortran library (see Chapter 4). In other versions, mostly the same algorithms are used. Their parameters have a similar meaning. Therefore, the Turbo C, C++ and Java users may obtain some useful information by reading the Fortran library description.

The portable Fortran version can run on any computer with a standard Fortran compiler. Users represent objective functions as
FUNCTION FI(X,N). Here X is an array of variables and N is its dimension. The lower bound array $A$ and the upper bound array $B$ define rectangular constraints. Using local methods of simplex and sequential quadratic programming type, one represents constraints by the subroutine
CONSTR(X,N,G,MC). Here, $G$ is an array of length $MC$ that contains values of constraints at a point $X$. $MC$ is the number of constraints.
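
For illustration only, a minimal sketch of such a user-supplied pair might look as follows. The quadratic objective, the single linear constraint, and the REAL precision are assumptions made here to show the calling conventions, not the exact requirements of the library (those are given in Chapter 4).
\begin{codesamp}
C     A hypothetical user objective: sum of squared deviations from 1.
      FUNCTION FI(X, N)
      INTEGER N, I
      REAL FI, X(N)
      FI = 0.0
      DO 10 I = 1, N
         FI = FI + (X(I) - 1.0)**2
   10 CONTINUE
      RETURN
      END

C     A hypothetical constraint routine with MC = 1 constraint:
C     G(1) = 1 - X(1) - X(2), representing X(1) + X(2) <= 1
C     (the sign convention of G is an assumption here).
      SUBROUTINE CONSTR(X, N, G, MC)
      INTEGER N, MC
      REAL X(N), G(MC)
      G(1) = 1.0 - X(1) - X(2)
      RETURN
      END
\end{codesamp}
The local methods of simplex and sequential quadratic programming type call both routines; the global methods use only FI over the rectangular region defined by the bound arrays $A$ and $B$.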

The advantage of the Fortran library is its portability. A disadvantage is the limited interactive possibilities. Therefore, this version is convenient for well-defined optimization problems. For preliminary investigations of new problems, some interaction is essential.


2 Turbo C, Interactive

This version is for objective functions represented in C. It needs a Turbo C compiler. Users represent the objective function $f(x)$ as the C subroutine $fi.c$. Users can select a global or a local method by a menu system. Current results can be observed in both tabular and graphical forms. There are two graphical forms.
GRAPH: shows how the best value of the objective function depends on the number of observations,
PROJECTION: shows how the current objective function values depend on a variable.
The set of methods, in both the Fortran and the Turbo C versions, remains the same, except for the non-linear programming method. The Biggs method [Biggs, 1975] is implemented in Fortran to solve non-linear programming problems. In all other languages, the sequential quadratic programming [Schittkowski, 1985] is used for these problems.

Good interactive facilities are an advantage of the Turbo C version. A disadvantage is that this version can be used only in a DOS-compatible environment. One often prefers Unix systems while solving large-scale optimization problems.


3 Unix C++, Interactive

This is global optimization software designed for UNIX and X-Window systems. It includes the usual set of interactive facilities. The main feature is the possibility to port it to any environment that supports Unix and X-Window, including supercomputers and distributed computing systems.


4 MS-Windows C#, Interactive

This is global optimization software in C# designed for Windows 98 and later. The graphical interface is similar to that of the Unix C++ version. It can be used as a standalone application, in a way similar to that of the C++ version.


5 Java JDK1.0, Interactive

This is global optimization software in Java, designed for the JDK1.0 toolkit. The graphical interface is similar to that of the Unix C++ version. It can be used as a standalone application, in a way similar to that of the C++ version. The software can also be run directly over the Internet as an applet. Any Internet browser that supports JDK1.0 works. That includes older versions of Netscape and Internet Explorer. This is an advantage, because these versions are still used.


6 Java JDK1.1, Interactive

This is global optimization software in Java, designed for JDK1.1. The graphical interface is open and can be extended by including new methods, additional tasks, and updated result analysis facilities. The software can be used as a standalone application or as an applet by the Internet browsers supporting JDK1.1. That excludes early browsers such as Netscape-3. Rectangular constraints are defined by the graphical interface. Other constraints are handled by adding penalty functions defined by the users.


7 Java JDK1.2, Interactive

This is a new version of global optimization software. It exploits new possibilities of JDK1.2. This Java version works on browsers that support JDK1.2. Otherwise, the JDK1.2 kit should be installed. Then the command
\begin{codesamp}
appletviewer http://mockus.org/optimum/gmjg2j/gmj.html
\end{codesamp}
starts the applet $gmj.html$ from the directory $optimum/gmjg2j$ of the server
$mockus.org$. This way one bypasses the browser, but the complete path must be written.

The main new features:

The detailed description of the different software versions follows. Most versions use the same algorithms, and their parameters are similar. The software is available on the web-sites.


4 Web-Sites

Convenient reading at any place is an obvious advantage of a hard copy. The theoretical considerations, including descriptions of mathematical models and algorithms, are clear and understandable on a printed page. The static presentation is a disadvantage. A hard copy cannot be updated. Therefore, new results wait for the next edition.

The aim of the web-sites is to supplement the models and algorithms by interactive software systems. These systems can be tested directly by users in different computing environments, using different languages and operating systems (see Figures 3.1, 3.2, 3.3).

Figure 3.2: Examples of Global Optimization.
\begin{figure}\centerline{\epsfig{file=open2.eps,width=12.0cm}
}\end{figure}
Figure 3.3: Examples of Discrete Optimization, Linear and Dynamic Programming.
\begin{figure}\centerline{ \epsfig{file=open3.eps,width=12.0cm}
}\end{figure}
The dynamic presentation is the important advantage of web-sites. Results can be easily updated and made accessible for many readers, free of charge. Therefore, the author considers this book as a static introduction to dynamic web-sites. The mirror-sites:
  1. $http://mockus.org/optimum$
  2. $http://optimum2.mii.lt/\sim jonas2$
  3. $http://soften.ktu.lt/\sim mockus$
  4. $http://www.vtu.lt/mockus$

The first web-site is designed for the international audience. Only selected items are included in this site. The updates are done by replacing the existing items, and only when obviously better results appear. Therefore, updating is less frequent. The site is always available.

The second web-site is experimental. It is updated frequently and has a larger volume. The accompanying text, mostly in English, is in $txt$, $html$, $pdf$, or $ps$ files. The site is used by graduate students of Lithuanian universities. Therefore, a part of the text is both in English and in Lithuanian. Sometimes, during holidays, this site is closed.

In the future, the web-sites will be regularly updated by new developments. However, the software will remain compatible with the present description.

Contact e-mail addresses:
$jonas@optimum.mii.lt$
$jonas2@optimum2.mii.lt$
$audris@mockus.org$
