

On the Sample Complexity of Consistent Learning with One-Sided Error
Eiji TAKIMOTO, Akira MARUOKA
Publication
IEICE TRANSACTIONS on Information and Systems
Vol.E78-D
No.5
pp.518-525 Publication Date: 1995/05/25
Print ISSN: 0916-8532 Type of Manuscript: Special Section PAPER (Special Issue on Algorithmic Learning Theory) Category: Computational Learning Theory Keyword: computational learning theory, PAC learning, learning with one-sided error, axis-parallel rectangles
Summary:
Although consistent learning is sufficient for PAC-learning, it is not known which strategy makes learning more efficient, especially in terms of the sample complexity, i.e., the number of examples required. As a first step toward this problem, classes that have consistent learning algorithms with one-sided error are considered. A combinatorial quantity called the maximal particle set is introduced, and an upper bound on the sample complexity of consistent learning with one-sided error is obtained in terms of maximal particle sets. For the class of d-dimensional axis-parallel rectangles, one of the classes that are consistently learnable with one-sided error, the cardinality of the maximal particle set is estimated and an O(d/ε + 1/ε log 1/δ) upper bound on the sample complexity of the learning algorithm for the class is obtained. This bound improves the bounds due to Blumer et al. and meets the lower bound within a constant factor.
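To illustrate the setting, the following is a minimal sketch (not the paper's construction or analysis) of the classic consistent learner with one-sided error for axis-parallel rectangles: output the smallest rectangle enclosing all positive examples. Since the hypothesis is always contained in the target rectangle, it never misclassifies a true negative; errors can only be false negatives, which is the one-sided-error property. The function names here are illustrative, not from the paper.

```python
def learn_rectangle(examples):
    """Tightest-fit learner for axis-parallel rectangles.

    examples: list of (point, label) pairs, where point is a tuple of
    floats and label is True for positive examples.
    Returns (lows, highs), the componentwise min/max over the positive
    points, or None if no positive example was seen.
    """
    positives = [p for p, label in examples if label]
    if not positives:
        return None  # empty hypothesis: rejects every point
    dims = len(positives[0])
    lows = tuple(min(p[i] for p in positives) for i in range(dims))
    highs = tuple(max(p[i] for p in positives) for i in range(dims))
    return lows, highs


def classify(hypothesis, point):
    """True iff point lies inside the hypothesis rectangle."""
    if hypothesis is None:
        return False
    lows, highs = hypothesis
    return all(lo <= x <= hi for lo, x, hi in zip(lows, point, highs))
```

Because the hypothesis is a subset of the target concept, every example the learner has seen is classified consistently, and any error on a fresh example must be a rejected positive; bounding the probability of that event is exactly where sample-complexity arguments such as the one in this paper apply.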

