Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability
ISBN: 978-0-471-49517-8
Hardcover
304 pages
September 2001
This is a print-on-demand title; it will be printed specifically to fill your order. Please allow an additional 10-15 days for delivery. The book is not returnable.
Preface.
Introduction.
Fundamentals.
Network Architectures for Prediction.
Activation Functions Used in Neural Networks.
Recurrent Neural Networks Architectures.
Neural Networks as Nonlinear Adaptive Filters.
Stability Issues in RNN Architectures.
Data-Reusing Adaptive Learning Algorithms.
A Class of Normalised Algorithms for Online Training of Recurrent Neural Networks.
Convergence of Online Learning Algorithms in Neural Networks.
Some Practical Considerations of Predictability and Learning Algorithms for Various Signals.
Exploiting Inherent Relationships Between Parameters in Recurrent Neural Networks.
Appendix A: The O Notation and Vector and Matrix Differentiation.
Appendix B: Concepts from the Approximation Theory.
Appendix C: Complex Sigmoid Activation Functions, Holomorphic Mappings and Modular Groups.
Appendix D: Learning Algorithms for RNNs.
Appendix E: Terminology Used in the Field of Neural Networks.
Appendix F: On the A Posteriori Approach in Science and Engineering.
Appendix G: Contraction Mapping Theorems.
Appendix H: Linear GAS Relaxation.
Appendix I: The Main Notions in Stability Theory.
Appendix J: Deseasonalising Time Series.
References.
Index.