A Tighter Upper Bound on Storage Capacity of Multilayer Networks

Haruhisa TAKAHASHI  

IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences   Vol.E81-A   No.2   pp.333-339
Publication Date: 1998/02/25
Print ISSN: 0916-8508
Type of Manuscript: PAPER
Category: Neural Networks
Keywords: neural networks, storage capacity, multilayer network


Two standard measures of the memorization capability of multilayer neural networks are the statistical capacity and the Vapnik-Chervonenkis (VC) dimension. They are defined differently according to their intended applications. Although several tighter upper bounds on the VC dimension have been proposed in the literature, even for networks restricted to linear threshold elements, upper bounds on the statistical capacity have been available only up to order of magnitude. We first argue that the upper bound on the statistical capacity depends strongly on, and can therefore be expressed in terms of, the number of units in the first hidden layer. We then derive a more refined upper bound on the memorizing capacity of multilayer neural networks with linear threshold elements, improving on previous results. Finally, we discuss implications for achieving good generalization.
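The networks studied here are built from linear threshold elements, i.e., units computing sign(w·x − θ). As a minimal illustrative sketch (not taken from the paper), the following Python snippet trains a single such unit with the classical perceptron rule to memorize a set of labeled patterns; the labels are generated by a hypothetical random "teacher" unit so that the patterns are guaranteed to be storable. The paper's bounds concern multilayer arrangements of units of exactly this type.

```python
import random

def threshold_unit(w, theta, x):
    # linear threshold element: output +1 if w.x >= theta, else -1
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s >= theta else -1

random.seed(0)
n, N = 10, 20  # input dimension, number of patterns to store

# random patterns, labeled by a random teacher unit (so they are storable)
X = [[random.gauss(0, 1) for _ in range(n)] for _ in range(N)]
teacher = [random.gauss(0, 1) for _ in range(n)]
y = [1 if sum(t * xi for t, xi in zip(teacher, x)) >= 0 else -1 for x in X]

# perceptron rule: converges in finitely many updates on separable data
w, theta = [0.0] * n, 0.0
for _ in range(1000):
    errors = 0
    for x, t in zip(X, y):
        if threshold_unit(w, theta, x) != t:
            w = [wi + t * xi for wi, xi in zip(w, x)]
            theta -= t
            errors += 1
    if errors == 0:
        break

acc = sum(threshold_unit(w, theta, x) == t for x, t in zip(X, y)) / N
print(acc)  # fraction of patterns stored correctly
```

A classical counting argument (Cover, 1965) shows a single threshold unit with n inputs can store about 2n random patterns on average; the statistical capacity discussed in this paper extends that notion to multilayer networks.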