Novel Superlinear First Order Algorithms

Peter GECZY
Shiro USUI

Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences, Vol.E87-A, No.6, pp.1620-1631
Publication Date: 2004/06/01
Print ISSN: 0916-8508
Type of Manuscript: PAPER
Category: Neural Networks and Bioengineering
Keyword: first order optimization, steepest descent, conjugate gradient, line search subproblem, classification framework, neural networks

Summary: 
Applying the previously proposed classification framework for first order line search optimization techniques, we introduce novel superlinear first order line search methods. The novelty of the methods lies in the line search subproblem. The presented line search subproblem features automatic step length and momentum adjustment at every iteration of the algorithm, realizable in a single-step calculation. This keeps the computational complexity of the algorithms linear and does not harm their stability or convergence. The algorithms require no memory, or at most linear memory, and are shown to be convergent and capable of reaching superlinear convergence rates. They were applied to artificial neural network training and compared to relevant training methods within the same class. The simulation results show satisfactory performance of the introduced algorithms relative to the standard and previously proposed methods.
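
The abstract does not give the paper's actual single-step adjustment formulas, so the sketch below only illustrates the general shape of such an update: gradient descent with momentum in which the step length alpha and the momentum coefficient mu are recomputed from current gradient information at every iteration, at O(dim) cost per iteration and with O(dim) memory for the previous update. The specific rules for alpha and mu here are assumed placeholders for illustration, not the authors' method.

```python
# A minimal sketch of a first order update with per-iteration step length
# and momentum adjustment. The adjustment rules below are illustrative
# placeholders, NOT the formulas from Geczy and Usui's paper.
import numpy as np

def train(grad, w, iters=100, eps=1e-8):
    """Momentum update w <- w + dw, dw = -alpha*g + mu*dw_prev, where alpha
    and mu are recomputed each iteration from gradient information only."""
    dw = np.zeros_like(w)        # previous update: the only stored vector (O(dim) memory)
    g_prev = grad(w)
    for _ in range(iters):
        g = grad(w)
        # Hypothetical single-step adjustments (assumptions, for illustration):
        # shrink the step when successive gradients disagree strongly, and set
        # the momentum from how well g opposes the previous update direction.
        alpha = 1.0 / (1.0 + np.linalg.norm(g - g_prev) / (np.linalg.norm(g) + eps))
        mu = max(0.0, -np.dot(g, dw) / (np.linalg.norm(g) * np.linalg.norm(dw) + eps))
        dw = -alpha * g + mu * dw    # one O(dim) vector update per iteration
        w = w + dw
        g_prev = g
    return w

# Usage on a toy quadratic f(w) = 0.5 * ||w||^2, with grad f(w) = w:
w_star = train(lambda w: w, np.array([3.0, -2.0]))
print(w_star)  # approaches the minimizer [0, 0]
```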