Abstract
This work continues the investigation begun in Ref. [1] into designing minimization algorithms in the framework of supervisor and searcher cooperation (SSC). It explores a wider range of possible supervisors and search engines for constructing SSC algorithms. Global convergence is established for algorithms with general supervisors and search engines in the absence of noise, and the rate of convergence is studied. Both theoretical analysis and numerical results illustrate the appealing attributes of the proposed algorithms.
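The abstract describes cooperation between a supervisor, which monitors progress, and a searcher, which generates trial points. The sketch below is an illustrative toy version of that division of labour, not the algorithm analyzed in the paper; all function names and the step-size update rules are hypothetical.

```python
def ssc_minimize(f, grad, x0, max_iter=200, tol=1e-8):
    """Toy supervisor-searcher cooperation loop (illustrative sketch only).

    Searcher: proposes a trial point along the negative gradient.
    Supervisor: monitors the objective, accepts or rejects the trial,
    and adapts the step size accordingly.
    """
    x = list(x0)
    step = 1.0
    for _ in range(max_iter):
        g = grad(x)
        # Supervisor: stopping test on the gradient norm.
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        # Searcher: generate a trial point with the current step size.
        trial = [xi - step * gi for xi, gi in zip(x, g)]
        if f(trial) < f(x):
            # Supervisor: progress made, accept and be bolder.
            x, step = trial, step * 1.5
        else:
            # Supervisor: no progress, reject and be more cautious.
            step *= 0.5
    return x

# Usage: minimize the quadratic f(x) = ||x - c||^2 with c = (1, -2).
c = [1.0, -2.0]
f = lambda x: sum((xi - ci) ** 2 for xi, ci in zip(x, c))
grad = lambda x: [2.0 * (xi - ci) for xi, ci in zip(x, c)]
x_star = ssc_minimize(f, grad, [0.0, 0.0])
```

The point of the sketch is the cooperation pattern: the searcher only proposes moves, while the supervisor alone decides acceptance and step-size adaptation.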
| Original language | English |
| --- | --- |
| Pages (from-to) | 247-264 |
| Number of pages | 17 |
| Journal | Optimization Methods and Software |
| Volume | 18 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Jun 2003 |
Keywords
- supervisor searcher cooperation
- global convergence
- stochastic optimization
- stochastic approximation