The early Global Minimizer was written in Fortran (GMF). We therefore describe the Fortran version as an introduction to later versions. The GMF system minimizes a continuous function of N variables subject to rectangular constraints.
A feature of all versions of the Global Minimum software, including GMF, is the Bayesian approach.
The Bayesian methods are optimal in the sense of average deviation: the next observation is defined by minimizing the risk function, i.e. the expected deviation from the global minimum. The minimization of the risk function is justified if the objective function needs considerable computing. Otherwise, it could be better to do more observations, which need not be planned optimally.
In such a case, a good idea is to use a uniform search, for example of the LP type [Sobolj, 1967]. Using the LP-search we lose some efficiency of observations but still retain convergence. However, the LP-search can be too expensive if observations are very cheap. Then we apply the uniform random (Monte Carlo) search, which is very simple. Using Monte Carlo search we obtain convergence only in probability. Both the LP-search (LPMIN) and the uniform random search (MIG1 and MIG2) algorithms are included in GMF.
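The idea of the uniform random search can be sketched as follows. This is a minimal Python illustration of the Monte Carlo approach, not the actual MIG1/MIG2 code; the function and bounds are made up for the example.

```python
import random

def monte_carlo_search(f, bounds, n_obs, seed=0):
    """Uniform random (Monte Carlo) global search: sample points
    uniformly in the rectangle and keep the best observation."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_obs):
        x = [rng.uniform(a, b) for a, b in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: minimize a simple two-dimensional test function on [-2, 2]^2.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2
x_best, f_best = monte_carlo_search(f, [(-2, 2), (-2, 2)], 2000)
```

As the text notes, such a search converges to the global minimum only in probability: the guarantee improves with the number of observations but is never deterministic.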
The method of clustering [Torn, 1990] is included as a good heuristic technique under the title GLOPT. It usually works well if the number of local minima is small and known in advance, and if the "attraction area" of the global minimum is large. GLOPT is comparable to LPMIN in computing time.
The global line search EXKOR is a "semi-global" method designed for approximately "additive" functions. EXKOR is convenient for visualization: one sees clearly how the values of the objective function depend on the variables. That is important for preliminary investigations.
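The coordinate-wise idea behind such a search can be illustrated with a simplified Python sketch. Note that this is not the actual EXKOR algorithm (which extends the Bayesian one-dimensional method EXTR); here each one-dimensional search is a plain grid scan, and the test function is made up.

```python
def coordinate_line_search(f, bounds, n_grid=200):
    """Semi-global line search for (approximately) additive functions:
    minimize along each coordinate in turn, holding the others fixed
    at the midpoints, then combine the per-coordinate minimizers.
    The per-coordinate profiles can also be plotted to visualize
    how f depends on each variable."""
    mid = [(a + b) / 2.0 for a, b in bounds]
    x_best = list(mid)
    for i, (a, b) in enumerate(bounds):
        best_t, best_v = mid[i], float("inf")
        for k in range(n_grid + 1):
            t = a + (b - a) * k / n_grid
            x = list(mid)
            x[i] = t
            v = f(x)
            if v < best_v:
                best_t, best_v = t, v
        x_best[i] = best_t
    return x_best

# An additive test function: f(x) = (x0 - 1)^2 + |x1 + 0.5|.
f = lambda x: (x[0] - 1.0) ** 2 + abs(x[1] + 0.5)
x_star = coordinate_line_search(f, [(-2.0, 2.0), (-2.0, 2.0)])
```

For an exactly additive function the per-coordinate minimizers combine into the global minimizer; for approximately additive functions the result is only a good starting approximation.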
Global methods usually search the whole area; otherwise, one can miss the global minimum. Obviously, the global search is not the most efficient way to perform local optimization. Therefore, in GM the global search is usually followed by a local one. By default, the best result of the global search is taken as the initial point of the local search.
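This two-phase scheme can be sketched in Python. The global phase here is a uniform random search and the local phase is a simple compass (pattern) search; both are stand-ins for illustration, not the GM subroutines themselves.

```python
import random

def compass_local_search(f, x0, step=0.25, tol=1e-6):
    """Simple local refinement (compass search): try +-step along
    each coordinate; halve the step when no move improves."""
    x, fx = list(x0), f(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5
    return x, fx

def global_then_local(f, bounds, n_obs=500, seed=1):
    """Two-phase scheme: a global (here uniform random) search whose
    best point becomes the initial point of a local search."""
    rng = random.Random(seed)
    best = min(([rng.uniform(a, b) for a, b in bounds] for _ in range(n_obs)),
               key=f)
    return compass_local_search(f, best)

f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2
x_star, f_star = global_then_local(f, [(-2, 2), (-2, 2)])
```

The division of labor is the point: the global phase guards against missing the region of the global minimum, while the local phase refines the answer far more efficiently than continued global sampling would.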
Methods of variable metric type are widely used for local optimization of continuously differentiable functions without noise, subject to non-linear constraints. The local method called Recursive Quadratic Programming [Biggs, 1975] is included under the name REQP.
A specific variable metric version for rectangular constraints [Tiesis, 1975] is called MIVAR4. For local optimization of non-differentiable functions with non-linear constraints, the simplex type method [Himmelblau, 1972] is used under the name FLEXI.
Sometimes one expects that the influence of some variables, or groups of variables, is considerably greater than that of others. Then the method LPMIN should be used to analyze the structure before the regular optimization is started. LPMIN orders the variables by their "importance" using a variance analysis technique [Saltenis, 1989].
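A crude version of such an importance ordering can be sketched as follows. This is only an illustration of the variance-analysis idea; the actual technique of [Saltenis, 1989] is more elaborate, and the test function is made up.

```python
import random

def variable_importance(f, bounds, n=400, seed=2):
    """Order variables by 'importance': for each variable, estimate
    the variance of f when only that variable varies uniformly and
    the others are fixed at the midpoints of their ranges."""
    rng = random.Random(seed)
    mid = [(a + b) / 2.0 for a, b in bounds]
    scores = []
    for i, (a, b) in enumerate(bounds):
        vals = []
        for _ in range(n):
            x = list(mid)
            x[i] = rng.uniform(a, b)
            vals.append(f(x))
        m = sum(vals) / n
        scores.append(sum((v - m) ** 2 for v in vals) / n)
    # Indices of variables, most important first.
    return sorted(range(len(bounds)), key=lambda i: -scores[i])

# x0 dominates x1 here, so it should be ranked first.
f = lambda x: 100.0 * x[0] ** 2 + x[1] ** 2
order = variable_importance(f, [(-1, 1), (-1, 1)])
```

Such a preliminary analysis lets one concentrate the expensive global search on the few variables that actually drive the objective function.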
LBAYES is a stochastic approximation type algorithm with "Bayesian" step length control. It is designed for local optimization of uni-modal functions with noise.
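The stochastic approximation framework itself can be sketched in Python. The sketch below uses a fixed Kiefer-Wolfowitz-type step schedule, which is an assumption of this illustration; the actual LBAYES step length control is Bayesian, not this schedule, and the noisy test function is made up.

```python
import random

def stochastic_approximation(f_noisy, x0, steps=2000, a=0.5, c=0.1, seed=3):
    """Stochastic approximation for a uni-modal function observed
    with noise: central finite-difference gradient estimates with
    decreasing step length a_k and difference width c_k."""
    rng = random.Random(seed)
    x = list(x0)
    for k in range(1, steps + 1):
        ak = a / k           # step length, decreasing in k
        ck = c / k ** 0.25   # finite-difference width, decreasing in k
        for i in range(len(x)):
            xp = list(x)
            xp[i] += ck
            xm = list(x)
            xm[i] -= ck
            g = (f_noisy(xp) - f_noisy(xm)) / (2 * ck)
            x[i] -= ak * g
    return x

# Noisy observations of a simple quadratic with minimum at x = 1.
def f_noisy(x, rng=random.Random(4)):
    return (x[0] - 1.0) ** 2 + rng.gauss(0.0, 0.001)

x_hat = stochastic_approximation(f_noisy, [0.0])
```

The decreasing schedules trade off noise averaging against convergence speed; the appeal of a Bayesian step length control, as in LBAYES, is that it adapts this trade-off to the observed data instead of fixing it in advance.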
In GMF, no machine-dependent routines are used. Therefore, the software can be adapted to any computer with a standard FORTRAN compiler by adjusting standard machine-dependent constants.
Table 4.1

| No. | Name | Method | No. of integer parameters (IPAR) | No. of real parameters (PAR) |
|-----|------|--------|----------------------------------|------------------------------|
| | | *Global optimization with rectangular constraints* | | |
| 1 | BAYES1 | Bayesian | 3 | 0 |
| 2 | UNT | Extrapolation | 4 | 0 |
| 3 | LPMIN | Uniform deterministic | N+3 | 0 |
| 4 | GLOPT | Clustering | 3 | 0 |
| 5 | MIG1, MIG2 | Uniform random | 2 | 0 |
| 6 | EXTR | Bayesian one-dimensional | 3 | 2 |
| 7 | EXKOR | Extension of EXTR to multi-dimensional case | 5 | N+1 |
| | | *Local optimization* | | |
| 8 | MIVAR4 | Variable metric with rectangular constraints | 4 | 4 |
| 9 | REQP | Variable metric with non-linear constraints | 4 | 4 |
| 10 | FLEXI | Simplex with non-linear constraints | 4 | 2 |
| | | *Local optimization with noise* | | |
| 11 | LBAYES | Stochastic approximation with Bayesian step length and rectangular constraints | 3 | 2 |
It is well known that local methods of optimization are sensitive to the scales of the variables. The parameters of the local methods in GMF are adjusted to the case when 0 <= x_i <= 1, i = 1, ..., N. Therefore, it is convenient to reduce the rectangular constraints a_i <= x_i <= b_i to the N-dimensional unit hypercube 0 <= y_i <= 1. This can be done by the reduction formula y_i = (x_i - a_i) / (b_i - a_i), i = 1, ..., N.
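As a minimal sketch (in Python rather than Fortran, with function names of our own choosing), the reduction of rectangular constraints a_i <= x_i <= b_i to the unit hypercube and its inverse can be written as:

```python
def to_unit_cube(x, bounds):
    """Map x with a_i <= x_i <= b_i to the unit hypercube [0, 1]^N
    by the reduction formula y_i = (x_i - a_i) / (b_i - a_i)."""
    return [(xi - a) / (b - a) for xi, (a, b) in zip(x, bounds)]

def from_unit_cube(y, bounds):
    """Inverse map back to the original scale: x_i = a_i + y_i * (b_i - a_i)."""
    return [a + yi * (b - a) for yi, (a, b) in zip(y, bounds)]

bounds = [(-5.0, 5.0), (0.0, 10.0)]
y = to_unit_cube([0.0, 2.5], bounds)
x = from_unit_cube(y, bounds)
```

The optimizer then works entirely in the scaled variables y, and only the final result is mapped back to the original scale.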
If needed, the subroutine CONSTR(X, N, G, MC) is included; it evaluates the constraints at the point X. The lengths of most arrays depend on the subroutines. The exception is the arrays IPAR and PAR, whose length is always thirty. The parameters of the methods are placed in the arrays IPAR and PAR in the given sequence of methods. The formal parameters IPA and IPAA are fixed using the rules given in the previous
section. In the case when only one method is used, IPA = IPAA = 1.
It follows from Table 4.1 that the subroutine BAYES1 has three integer parameters, and that the subroutine MIVAR4 uses four integer parameters; in MIVAR4, four real parameters should be defined, too. This means that seven elements of IPAR and four elements of PAR should be fixed.
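The packing convention can be sketched as follows. The helper name and the concrete parameter values are illustrative only (the real parameter meanings are defined by the individual subroutines); the counts per method are those of Table 4.1.

```python
# Sketch of the IPAR/PAR packing convention: both arrays always have
# length thirty, and each method's parameters are appended in the
# sequence in which the methods are used.
IPAR_LEN = PAR_LEN = 30

def pack_parameters(methods):
    """methods: list of (int_params, real_params) tuples, given in
    the sequence of methods.  Returns IPAR and PAR arrays of fixed
    length thirty, padded with zeros."""
    ipar, par = [], []
    for ints, reals in methods:
        ipar.extend(ints)
        par.extend(reals)
    assert len(ipar) <= IPAR_LEN and len(par) <= PAR_LEN
    ipar += [0] * (IPAR_LEN - len(ipar))
    par += [0.0] * (PAR_LEN - len(par))
    return ipar, par

# BAYES1 takes 3 integer parameters; MIVAR4 takes 4 integer and
# 4 real parameters: 7 elements of IPAR and 4 of PAR are fixed.
# The numeric values below are purely hypothetical placeholders.
ipar, par = pack_parameters([([100, 5, 1], []),
                             ([50, 2, 1, 0], [1e-4, 1e-4, 0.5, 1.0])])
```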
The main program is shown in Figure 4.2.