THESIS
1994
xiii, 106 leaves : ill. ; 30 cm
Abstract
Many feedforward neural network models have been extensively studied by both theoreticians and practitioners. However, by their nature, feedforward networks are not well suited to problems that involve temporally related events, such as speech and language processing and time series prediction. Such problems are common in many real-world applications.
In recent years, some neural network models containing feedback (or recurrent) connections, called recurrent neural networks, have been proposed for temporal sequence processing problems. One of the most popular models is Elman's simple recurrent network (SRN), proposed for learning formal grammars. In this thesis, some limitations of the SRN model and its variants are discussed and demonstrated in detail. Based on these observations, an enhanced network model, called ASCOC, is proposed. This model clearly outperforms the SRN in several important respects, including its ability to learn grammars with embedded structures, which the SRN fails to learn satisfactorily. Besides studying the proposed model on formal language learning tasks, as in previous studies by other researchers, another task beyond the learning of regular grammars is also proposed. This task is formulated as the recognition of radicals (sub-patterns) in on-line Chinese handwriting, although the objective of this research is not to build a complete system for such an application. The potential of the new model for handling a larger class of sequence processing problems is also discussed.
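To make the baseline architecture concrete: an Elman SRN feeds the previous hidden state back in as a "context" input at each time step. The sketch below is a minimal forward pass under assumed layer sizes and randomly initialized weights (the thesis does not specify these, and ASCOC's details are not reproduced here); it is an illustration of the standard SRN idea, not the thesis's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes, chosen only for illustration.
n_in, n_hidden, n_out = 4, 8, 4

# Weights: input->hidden, context (previous hidden)->hidden, hidden->output.
W_xh = rng.standard_normal((n_hidden, n_in)) * 0.1
W_hh = rng.standard_normal((n_hidden, n_hidden)) * 0.1
W_hy = rng.standard_normal((n_out, n_hidden)) * 0.1
b_h = np.zeros(n_hidden)
b_y = np.zeros(n_out)

def srn_step(x, h_prev):
    """One time step: the context layer holds the previous hidden state."""
    h = np.tanh(W_xh @ x + W_hh @ h_prev + b_h)   # new hidden state
    z = W_hy @ h + b_y
    e = np.exp(z - z.max())                        # softmax over output symbols
    return h, e / e.sum()

# Process a short one-hot symbol sequence, e.g. symbols 0, 2, 1.
h = np.zeros(n_hidden)                             # initial (empty) context
for t in [0, 2, 1]:
    x = np.eye(n_in)[t]
    h, y = srn_step(x, h)                          # y: next-symbol distribution
```

In grammar-learning experiments of this kind, the network is typically trained to predict the next symbol of a string, so the softmax output is read as a distribution over the grammar's alphabet.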