Novel Algorithms for Noisy Minimization Problems with Applications to Neural Networks Training

K. Sirlantzis, John Douglas Lamb, W. B. Liu

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

The supervisor and searcher cooperation framework (SSC), introduced in Refs. 1 and 2, provides an effective way to design efficient optimization algorithms that combine the desirable features of the two existing ones. This work aims to develop efficient algorithms for a wide range of noisy optimization problems, including those posed by feedforward neural networks training. It introduces two basic SSC algorithms: the first seems suited to generic problems, while the second is motivated by neural networks training problems. It also introduces inexact variants of the two algorithms, which seem to possess desirable properties. It establishes general theoretical results about the convergence and speed of SSC algorithms and illustrates their appealing attributes through numerical tests on deterministic, stochastic, and neural networks training problems.
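
The SSC algorithms themselves are specified in the paper and are not reproduced here. As a rough illustration of the problem class only, the sketch below sets up a noisy minimization problem (a toy quadratic whose gradient is observed with additive noise, much like mini-batch gradients in feedforward network training) and runs a plain diminishing-step stochastic gradient search as a baseline. All names, constants, and the step-size schedule are illustrative assumptions, not the paper's method.

import numpy as np

def noisy_grad(x, rng, sigma=0.1):
    # Noisy gradient of the toy objective f(x) = 0.5 * ||x||^2;
    # only noisy evaluations are assumed available, as in noisy optimization.
    return x + sigma * rng.standard_normal(x.shape)

def diminishing_step_search(x0, n_iters=2000, a=1.0, seed=0):
    # Plain stochastic gradient search with diminishing steps a/(k+1).
    # This is a generic baseline for noisy problems, NOT the SSC algorithms:
    # in SSC a "supervisor" process guides the steps taken by a "searcher",
    # whereas here the step-size schedule is fixed in advance.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        g = noisy_grad(x, rng)
        x = x - a / (k + 1) * g
    return x

if __name__ == "__main__":
    x_final = diminishing_step_search([2.0, -1.5])
    print("approximate minimizer:", x_final)  # should be near the origin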

Original language: English
Pages (from-to): 325-340
Number of pages: 15
Journal: Journal of Optimization Theory and Applications
Volume: 129
Issue number: 2
DOIs: 10.1007/s10957-006-9066-z
Publication status: Published - May 2006

Keywords

  • nonlinear programming
  • stochastic programming
  • noisy optimization
  • neural networks training
  • GRADIENT-METHOD
  • BARZILAI

Cite this

Novel Algorithms for Noisy Minimization Problems with Applications to Neural Networks Training. / Sirlantzis, K.; Lamb, John Douglas; Liu, W. B.

In: Journal of Optimization Theory and Applications, Vol. 129, No. 2, 05.2006, p. 325-340.

Research output: Contribution to journal › Article

@article{3edda65029db430ba8c63c985503d5a9,
title = "Novel Algorithms for Noisy Minimization Problems with Applications to Neural Networks Training",
abstract = "The supervisor and searcher cooperation framework (SSC), introduced in Refs. 1 and 2, provides an effective way to design efficient optimization algorithms combining the desirable features of the two existing ones. This work aims to develop efficient algorithms for a wide range of noisy optimization problems including those posed by feedforward neural networks training. It introduces two basic SSC algorithms. The first seems suited for generic problems. The second is motivated by neural networks training problems. It introduces also inexact variants of the two algorithms, which seem to possess desirable properties. It establishes general theoretical results about the convergence and speed of SSC algorithms and illustrates their appealing attributes through numerical tests on deterministic, stochastic, and neural networks training problems.",
keywords = "nonlinear programming, stochastic programming, noisy optimization, neural networks training, GRADIENT-METHOD, BARZILAI",
author = "K. Sirlantzis and Lamb, {John Douglas} and Liu, {W. B.}",
year = "2006",
month = "5",
doi = "10.1007/s10957-006-9066-z",
language = "English",
volume = "129",
pages = "325--340",
journal = "Journal of Optimization Theory and Applications",
issn = "0022-3239",
publisher = "Springer New York",
number = "2",

}

TY - JOUR

T1 - Novel Algorithms for Noisy Minimization Problems with Applications to Neural Networks Training

AU - Sirlantzis, K.

AU - Lamb, John Douglas

AU - Liu, W. B.

PY - 2006/5

Y1 - 2006/5

N2 - The supervisor and searcher cooperation framework (SSC), introduced in Refs. 1 and 2, provides an effective way to design efficient optimization algorithms combining the desirable features of the two existing ones. This work aims to develop efficient algorithms for a wide range of noisy optimization problems including those posed by feedforward neural networks training. It introduces two basic SSC algorithms. The first seems suited for generic problems. The second is motivated by neural networks training problems. It introduces also inexact variants of the two algorithms, which seem to possess desirable properties. It establishes general theoretical results about the convergence and speed of SSC algorithms and illustrates their appealing attributes through numerical tests on deterministic, stochastic, and neural networks training problems.

AB - The supervisor and searcher cooperation framework (SSC), introduced in Refs. 1 and 2, provides an effective way to design efficient optimization algorithms combining the desirable features of the two existing ones. This work aims to develop efficient algorithms for a wide range of noisy optimization problems including those posed by feedforward neural networks training. It introduces two basic SSC algorithms. The first seems suited for generic problems. The second is motivated by neural networks training problems. It introduces also inexact variants of the two algorithms, which seem to possess desirable properties. It establishes general theoretical results about the convergence and speed of SSC algorithms and illustrates their appealing attributes through numerical tests on deterministic, stochastic, and neural networks training problems.

KW - nonlinear programming

KW - stochastic programming

KW - noisy optimization

KW - neural networks training

KW - GRADIENT-METHOD

KW - BARZILAI

U2 - 10.1007/s10957-006-9066-z

DO - 10.1007/s10957-006-9066-z

M3 - Article

VL - 129

SP - 325

EP - 340

JO - Journal of Optimization Theory and Applications

JF - Journal of Optimization Theory and Applications

SN - 0022-3239

IS - 2

ER -