Extensive research has been conducted on finding the minimum of a given function of n variables, and many authors have developed algorithms that provide good results for unimodal functions. Many of these algorithms attempt to improve minimization efficiency by using derivative evaluations of the cost function. These local search methods, however, do not work satisfactorily when the cost function is multimodal within the domain of interest, because they tend to stop at the first minimum encountered. Several newer approaches have emerged for handling global optimization problems, including simulated annealing, genetic algorithms, neural networks, and tabu search.
Much research on minimizing analytical and numerical functions of several variables has been published in recent years, and several authors have proposed various schemes for controlling the discretization step. In the present study, the authors propose a new global optimization algorithm for high-dimensional continuous problems, derived from the basic simulated annealing method. The study also demonstrates an effective step-control strategy that balances large and small steps for each of the function's variables, and it shows how this discretization control scheme is related to the temperature-decreasing strategy.
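To make the coupling between step control and cooling concrete, the following is a minimal sketch of basic simulated annealing with a per-variable step vector that adapts as the temperature falls. The specific acceptance-rate thresholds, growth/shrink factors, step cap, and geometric cooling rate are illustrative assumptions, not the paper's exact scheme:

```python
import math
import random

def simulated_annealing(f, x0, t0=10.0, alpha=0.9, n_cycles=100, moves_per_cycle=50):
    """Minimize f by basic simulated annealing with per-variable step sizes
    that are re-tuned every cycle from the observed acceptance rate
    (an illustrative step-control rule; the paper's exact rule may differ)."""
    x = list(x0)
    fx = f(x)
    best, fbest = list(x), fx
    step = [1.0] * len(x)                 # per-variable step sizes (assumed start)
    t = t0
    for _ in range(n_cycles):
        accepted = [0] * len(x)
        for _ in range(moves_per_cycle):
            i = random.randrange(len(x))  # perturb one randomly chosen variable
            y = list(x)
            y[i] += random.uniform(-step[i], step[i])
            fy = f(y)
            # Metropolis acceptance rule
            if fy <= fx or random.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
                accepted[i] += 1
                if fx < fbest:
                    best, fbest = list(x), fx
        # enlarge steps for variables accepted too often, shrink them otherwise,
        # so large and small moves stay balanced as the temperature decreases
        for i in range(len(step)):
            rate = accepted[i] * len(x) / moves_per_cycle
            if rate > 0.6:
                step[i] = min(step[i] * 1.5, 10.0)  # cap keeps moves bounded (assumption)
            elif rate < 0.4:
                step[i] *= 0.7
        t *= alpha                        # geometric cooling schedule
    return best, fbest
```

Because the step sizes shrink along with the temperature, the search naturally shifts from coarse global exploration to fine local refinement, which is the balance the step-control strategy aims at.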
The study addresses higher-dimensional optimization problems. Let n be the problem dimension. The study shows how successive subsets of p variables are selected from the n variables of the original set; the selection rules generate the largest possible number of different p-variable subsets, each with uniform probability. Several complementary stopping criteria contribute to efficiency by reducing the number of objective function evaluations. The authors further demonstrate the validity of their approach by minimizing some classical, highly multimodal functions, and they present numerical examples together with the corresponding numbers of objective function evaluations. They propose the Nelder and Mead simplex algorithm as a natural continuation of the enhanced simulated annealing algorithm: this additional optimization stage yields more accurate results while keeping the number of objective function evaluations reasonable.
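The subset-selection idea above can be sketched as a move that perturbs only p of the n variables at a time; `random.sample` draws each of the C(n, p) index subsets with equal probability, which matches the stated goal of covering the distinct p-variable subsets uniformly. The per-variable step vector is an assumption carried over from the step-control discussion:

```python
import random

def perturb_subset(x, p, step, rng=random):
    """Perturb p randomly chosen variables of x (out of n = len(x)).
    rng.sample draws each p-element index subset with uniform probability,
    so every distinct p-variable subset is equally likely to be tried.
    Step sizes are given per variable (illustrative sketch, not the
    paper's exact selection rules)."""
    subset = rng.sample(range(len(x)), p)      # uniform p-subset of indices
    y = list(x)
    for i in subset:
        y[i] += rng.uniform(-step[i], step[i])  # move only the chosen variables
    return y, subset
```

For the final refinement stage described above, the annealing result could, for instance, be handed to an off-the-shelf simplex routine such as `scipy.optimize.minimize(f, x, method="Nelder-Mead")`, which performs the derivative-free local polish at a modest additional cost in function evaluations.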