Novel First Order Optimization Classification Framework

Peter GECZY
Shiro USUI

Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences, Vol. E83-A, No. 11, pp. 2312-2319
Publication Date: 2000/11/25
Print ISSN: 0916-8508
Type of Manuscript: PAPER
Category: Numerical Analysis and Optimization
Keyword: 
first order optimization, convergence rates, superlinear convergence, problem descriptor, algorithm descriptor




Summary: 
Numerous scientific and engineering fields extensively utilize optimization techniques for finding appropriate parameter values of models, and various optimization methods are available for practical use. Optimization algorithms are classified primarily according to their rates of convergence. Unfortunately, in practice a particular optimization method with specified convergence rates often performs substantially differently on diverse optimization tasks. The theoretical classification by convergence rates then loses its relevance in the context of practical optimization. It is therefore desirable to formulate a novel classification framework that is relevant both to the theoretical concept of convergence rates and to practical optimization. This article introduces such a classification framework. The proposed classification framework enables specification of optimization techniques and optimization tasks, and it also underlines their inherent relationship to convergence rates. The novel classification framework is applied to categorizing the tasks of optimizing polynomials and the problem of training multilayer perceptron neural networks.
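
Illustrative sketch (not from the paper): the abstract refers to first order optimization methods and to the task of optimizing polynomials. As a generic example of that setting only, the following Python sketch applies plain gradient descent to a simple polynomial objective; the polynomial, step size, and stopping rule are assumptions chosen for illustration and do not reflect the framework or algorithms proposed in the article.

    # Generic first-order (gradient-based) optimization of a polynomial.
    # All concrete choices below are illustrative assumptions.

    def f(x):
        # Example polynomial objective: f(x) = x^4 - 3x^2 + 2
        return x**4 - 3 * x**2 + 2

    def grad_f(x):
        # First-order information: the derivative of f
        return 4 * x**3 - 6 * x

    def gradient_descent(x0, lr=0.01, tol=1e-8, max_iter=10000):
        x = x0
        for _ in range(max_iter):
            g = grad_f(x)
            if abs(g) < tol:      # stop when the gradient is nearly zero
                break
            x = x - lr * g        # first-order update step
        return x

    if __name__ == "__main__":
        x_star = gradient_descent(x0=2.0)
        print(f"approximate minimizer: {x_star:.6f}, f(x*) = {f(x_star):.6f}")

The observed number of iterations such a method needs depends strongly on the particular objective, which is the practical gap between nominal convergence rates and actual behavior that motivates the classification framework described in the summary.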