Binary Second-Order Recurrent Neural Networks for Inferring Regular Grammars
Soon-Ho JUNG, Hyunsoo YOON
IEICE TRANSACTIONS on Information and Systems
Publication Date: 2000/11/25
Print ISSN: 0916-8532
Type of Manuscript: PAPER
Category: Biocybernetics, Neurocomputing
Keywords: binary second-order recurrent neural network, stable second-order recurrent neural network, regular grammar, finite-state machine, k-tail equivalence
This paper proposes the binary second-order recurrent neural network (BSRNN), which is equivalent to a modified finite automaton (MFA), and presents a learning algorithm for constructing a stable BSRNN that infers regular grammars. The network combines two approaches: one trains a recurrent neural network on strings of a regular grammar, with no restriction on the number of neurons, the number of strings, or the length of each string; the other transforms the trained network directly into a finite automaton. Since the neurons in the BSRNN employ a hard-limiter activation function, the proposed BSRNN is a good candidate for hardware implementation of regular grammars and finite automata, as well as for grammatical inference.
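The abstract describes a second-order recurrent network whose neurons use a hard-limiter (binary threshold) activation, so that the network state can mirror the state of a finite automaton. The sketch below is not the paper's BSRNN construction; it is a minimal illustration, under assumed conventions, of how a second-order update s_i(t+1) = H(Σ_jk W[i,j,k]·s_j(t)·x_k(t)) with one-hot states and symbols can encode an automaton's transition table (here, a hypothetical two-state parity acceptor). All names (`bsrnn_step`, `accepts`) are illustrative, not from the paper.

```python
import numpy as np

def hard_limiter(x):
    # Binary threshold activation: outputs 1 where x > 0, else 0.
    return (x > 0).astype(int)

def bsrnn_step(W, state, symbol):
    # Second-order update: s_i(t+1) = H(sum_jk W[i,j,k] * s_j(t) * x_k(t)).
    return hard_limiter(np.einsum('ijk,j,k->i', W, state, symbol))

def accepts(W, init_state, accept_idx, string, n_symbols):
    # Run the network over a symbol sequence; accept if the accepting
    # state neuron is active at the end.
    state = init_state.copy()
    for c in string:
        x = np.zeros(n_symbols)
        x[c] = 1.0
        state = bsrnn_step(W, state, x)
    return bool(state[accept_idx])

# Toy example: an automaton accepting binary strings with an odd number
# of 1s. W[i,j,k] = 1 iff reading symbol k in state j moves to state i.
W = np.zeros((2, 2, 2))
W[0, 0, 0] = 1  # even, read 0 -> even
W[1, 0, 1] = 1  # even, read 1 -> odd
W[1, 1, 0] = 1  # odd,  read 0 -> odd
W[0, 1, 1] = 1  # odd,  read 1 -> even

init = np.array([1, 0])  # start in the "even" state
```

With one-hot state and symbol vectors, the einsum picks out exactly one entry of the transition tensor, so the hard limiter reproduces the automaton's next state; this is the sense in which such a network can be read off directly as a finite automaton.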