Convergence Analysis of Recurrent Neural Networks
Title | Convergence Analysis of Recurrent Neural Networks PDF eBook |
Author | Zhang Yi |
Publisher | Springer Science & Business Media |
Pages | 244 |
Release | 2013-11-11 |
Genre | Computers |
ISBN | 1475738196 |
Since the outstanding and pioneering research work of Hopfield on recurrent neural networks (RNNs) in the early 1980s, neural networks have rekindled strong interest among scientists and researchers. Recent years have seen remarkable advances in research and development on RNNs, in both theory and applications. The field of RNNs is now maturing into a complete and independent subject. From theory to application, from software to hardware, new and exciting results emerge day after day, reflecting the keen interest RNNs have instilled in everyone, from researchers to practitioners. RNNs contain feedback connections among their neurons, which has led rather naturally to RNNs being regarded as dynamical systems. RNNs can be described by continuous-time differential systems, discrete-time systems, or functional differential systems, and more generally in terms of nonlinear systems. Thus, RNNs have at their disposal a huge set of mathematical tools from dynamical systems theory, which has turned out to be very useful in enabling a rigorous analysis of RNNs.
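The dynamical-systems view described above can be illustrated with a minimal sketch (an illustration under stated assumptions, not the book's own analysis): a discrete-time recurrent network x[t+1] = tanh(W x[t] + b) whose weight matrix is scaled so the update is a contraction, making the state converge to an equilibrium.

```python
import numpy as np

# Minimal illustration (assumptions: tanh units, spectral norm of W below 1,
# so the update map is a contraction and a unique fixed point exists):
#   x[t+1] = tanh(W x[t] + b)
rng = np.random.default_rng(0)
n = 4
W = rng.normal(size=(n, n))
W *= 0.5 / np.linalg.norm(W, 2)   # scale spectral norm to 0.5 (contraction)
b = rng.normal(scale=0.1, size=n)

x = rng.normal(size=n)            # arbitrary initial state
for _ in range(200):
    x_next = np.tanh(W @ x + b)   # one step of the recurrent dynamics
    if np.linalg.norm(x_next - x) < 1e-12:
        break
    x = x_next

# At convergence, x is (numerically) an equilibrium: x = tanh(W x + b).
residual = np.linalg.norm(x - np.tanh(W @ x + b))
```

Because the map shrinks distances by at least the spectral norm of W per step, convergence here is geometric regardless of the initial state.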
A Convergence Result for Learning in Recurrent Neural Networks
Title | A Convergence Result for Learning in Recurrent Neural Networks PDF eBook |
Author | Chung-Ming Kuan |
Publisher | |
Pages | |
Release | 1993 |
Genre | |
ISBN |
Convergence Analysis of Neural Networks
Title | Convergence Analysis of Neural Networks PDF eBook |
Author | David Holzmüller |
Publisher | |
Pages | |
Release | 2019 |
Genre | |
ISBN |
Using Fourier convergence analysis for effective learning in max-min neural networks
Title | Using Fourier convergence analysis for effective learning in max-min neural networks PDF eBook |
Author | Kia Fock Loe |
Publisher | |
Pages | 25 |
Release | 1996 |
Genre | Neural networks (Computer science) |
ISBN |
Abstract: "Max and min operations have interesting properties that facilitate the exchange of information between the symbolic and real-valued domains. As such, neural networks that employ max-min activation functions have been a subject of interest in recent years. Since max-min functions are not strictly differentiable, many ad hoc learning methods for such max-min neural networks have been proposed in the literature. In this technical report, we propose a mathematically sound learning method based on using Fourier convergence analysis to derive a gradient descent technique for max-min error functions. This method is then applied to two models: a feedforward fuzzy-neural network and a recurrent max-min neural network. We show how a 'typical' fuzzy-neural network model employing max-min activation functions can be trained to perform function approximation; its performance was found to be better than that of a conventional feedforward neural network. We also propose a novel recurrent max-min neural network model which is trained to perform grammatical inference as an application example. Comparisons are made between this model and recurrent neural networks that use conventional sigmoidal activation functions; such recurrent sigmoidal networks are known to be difficult to train and to generalize poorly on long strings. The comparisons show that our model not only performs better in terms of learning speed and generalization, but its final weight configuration also allows a DFA to be extracted in a straightforward manner. However, it has a potential drawback: the minimal network size required for successful convergence grows with increasing language depth and complexity. Nevertheless, we are able to demonstrate that our proposed gradient descent technique does allow max-min neural networks to learn effectively. Our learning method should be extensible to other neural networks that have non-differentiable activation functions."
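The non-differentiability issue raised in the abstract can be made concrete with a small sketch (a hypothetical illustration, not the report's actual Fourier-based method): a single max-min unit y = max_j min(w_j, x_j), trained with a subgradient that flows only through the winning index, and there only when the weight (rather than the input) attains the min.

```python
import numpy as np

# Hypothetical max-min "neuron": y = max_j min(w_j, x_j).
# The composition is piecewise linear, so we use a subgradient:
# nonzero only at the argmax index, and only if w_j attained the min there.

def maxmin_forward(w, x):
    m = np.minimum(w, x)          # element-wise min (fuzzy AND)
    j = int(np.argmax(m))         # winning index (fuzzy OR)
    return m[j], j

def maxmin_grad_w(w, x, j):
    """Subgradient of y w.r.t. w for the winning index j."""
    g = np.zeros_like(w)
    if w[j] <= x[j]:              # derivative passes through w only if it is the min
        g[j] = 1.0
    return g

# Subgradient descent on a squared error toward a target output
# (all numbers below are made-up illustration values):
w = np.array([0.2, 0.9, 0.5])
x = np.array([0.7, 0.4, 0.6])
target = 0.55
for _ in range(100):
    y, j = maxmin_forward(w, x)
    w -= 0.1 * (y - target) * maxmin_grad_w(w, x, j)
y, _ = maxmin_forward(w, x)
```

Only one weight moves per step, which is why, as the abstract notes, larger networks may be needed for harder languages: each update touches a single active path.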
Neural Networks: Computational Models and Applications
Title | Neural Networks: Computational Models and Applications PDF eBook |
Author | Huajin Tang |
Publisher | Springer Science & Business Media |
Pages | 310 |
Release | 2007-03-12 |
Genre | Computers |
ISBN | 3540692258 |
Neural Networks: Computational Models and Applications presents important theoretical and practical issues in neural networks, including learning algorithms for feed-forward neural networks, various dynamical properties of recurrent neural networks, and winner-take-all networks, together with their applications across a broad range of computational intelligence problems: pattern recognition, uniform approximation, constrained optimization, NP-hard problems, and image segmentation. The book offers a compact, insightful understanding of the broad and rapidly growing neural networks domain.
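Winner-take-all behavior, one of the topics listed above, can be sketched with a simple recurrent lateral-inhibition update (an illustrative scheme with assumed parameters, not the book's specific model): each unit is suppressed in proportion to the total activity of the other units, so repeated iteration leaves only the unit with the largest initial input active.

```python
import numpy as np

# Illustrative winner-take-all via lateral inhibition (hypothetical update rule
# and parameters): each unit loses activity proportional to the summed activity
# of the other units; negative activities are clipped to zero.
def wta(u, eps=0.2, steps=50):
    x = np.array(u, dtype=float)
    for _ in range(steps):
        x = np.maximum(0.0, x - eps * (x.sum() - x))  # x.sum() - x = others' total
    return x

out = wta([0.3, 0.9, 0.5])   # only the largest input remains active
```

Once the losers reach zero, the winner no longer receives inhibition and its activity is preserved, which is the fixed point the recursion settles into.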
Some Convergence Results for Learning in Recurrent Neural Networks
Title | Some Convergence Results for Learning in Recurrent Neural Networks PDF eBook |
Author | Chung-Ming Kuan |
Publisher | |
Pages | |
Release | 1990 |
Genre | |
ISBN |
Structure and Convergence Properties of a Recurrent Neural Network
Title | Structure and Convergence Properties of a Recurrent Neural Network PDF eBook |
Author | |
Publisher | |
Pages | 66 |
Release | 1996 |
Genre | |
ISBN |
The work reported focuses on the conditions necessary for well-defined recurrent neural networks (RNNs) to operate in the stable regime, and on the network parameters that affect the rate of convergence to stability. After an introduction, chapter 2 details the structure of a general form of RNN that may be applied to problems in the two essential areas of robotics and machine intelligence: pattern recognition and motor control. Details of the C programs that configure, run, and analyze the RNN are also provided. Convergence properties of the RNN as a function of its structural and learning parameters are examined and key conditions for stable, periodic, aperiodic, and chaotic operation are established. The use of the RNN in pattern recognition and object classification, and the potential for self-directed motor control in mobile autonomous machines are discussed.
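The dependence of the convergence rate on network parameters, as studied in the report, can be illustrated with a small sketch (assumed dynamics and parameters, not the report's C programs): for x[t+1] = tanh(g·W x[t] + b) with the spectral norm of g·W below 1, the map is a contraction, so successive step sizes shrink by at least that factor per iteration.

```python
import numpy as np

# Sketch under assumptions (tanh units; W normalized so the gain g equals the
# spectral norm of g*W): with g < 1 the update is a contraction, and g directly
# bounds the per-step convergence rate toward the fixed point.
rng = np.random.default_rng(1)
n = 8
W = rng.normal(size=(n, n))
W /= np.linalg.norm(W, 2)            # normalize spectral norm to 1
g = 0.5                              # gain parameter = contraction factor
b = rng.normal(scale=0.5, size=n)

x = rng.normal(size=n)
dists = []
for _ in range(60):
    x_next = np.tanh(g * (W @ x) + b)
    dists.append(np.linalg.norm(x_next - x))
    x = x_next

# Successive step sizes obey ||x[t+1]-x[t]|| <= g * ||x[t]-x[t-1]||.
ratios = [dists[i + 1] / dists[i] for i in range(len(dists) - 1) if dists[i] > 1e-8]
```

Raising g toward and beyond 1 removes the contraction guarantee, which is where the periodic, aperiodic, and chaotic regimes examined in the report can appear.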