Tuesday, June 19, 2012



E-Book Details:
Title:
Neural Networks and Learning Machines
Publisher:
Prentice Hall, 2009
Author:
Simon S. Haykin
Edition:
3, illustrated (2009)
Format:
PDF
ISBN:
0131471392
EAN:
9780131471399
No. of Pages:
906



Book Description:
Fluid and authoritative, this well-organized book represents the first comprehensive treatment of neural networks from an engineering perspective, providing extensive, state-of-the-art coverage that will expose readers to the myriad facets of neural networks and help them appreciate the technology's origins, capabilities, and potential applications. It examines all the important aspects of this emerging technology, covering the learning process, back-propagation, radial-basis functions, recurrent networks, self-organizing systems, modular networks, temporal processing, neurodynamics, and VLSI implementation. Computer experiments are integrated throughout to demonstrate how neural networks are designed and how they perform in practice. Chapter objectives, problems, worked examples, a bibliography, photographs, illustrations, and a thorough glossary reinforce concepts throughout. New chapters delve into such areas as support vector machines and reinforcement learning/neurodynamic programming, and readers will find an entire chapter of case studies illustrating the real-life, practical applications of neural networks. A highly detailed bibliography is included for easy reference. For professional engineers and research scientists.
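Back-propagation, named above, is the workhorse training algorithm for the multilayer perceptrons the book covers. As a rough illustration of the idea (a minimal sketch, not code from the book; the XOR task, layer sizes, learning rate, and iteration count are illustrative assumptions), here is a one-hidden-layer network trained by gradient descent in Python/NumPy:

```python
import numpy as np

# Minimal one-hidden-layer MLP trained with back-propagation.
# Task (XOR), layer sizes, and learning rate are illustrative choices,
# not taken from the book.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta = 0.5  # learning rate
for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: propagate the error signal layer by layer.
    delta_out = (out - y) * out * (1 - out)       # output-layer local gradient
    delta_h = (delta_out @ W2.T) * h * (1 - h)    # hidden-layer local gradient

    # Gradient-descent weight updates.
    W2 -= eta * h.T @ delta_out
    b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_h
    b1 -= eta * delta_h.sum(axis=0)

print(out.round(3))  # should approach [0, 1, 1, 0]
```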
FEATURES:
•   Computer-oriented experiments distributed throughout the text.
•   Extensive, state-of-the-art coverage exposes students to the many facets of neural networks and helps them appreciate the technology's capabilities and potential applications.
•   Reinforces key concepts with chapter objectives, problems, worked examples, a bibliography, photographs, illustrations, and a thorough glossary. 
•   Detailed analysis of back-propagation learning and multi-layer perceptrons.
•   Explores the intricacies of the learning process—an essential component for understanding neural networks.
•   Considers recurrent networks, such as Hopfield networks, Boltzmann machines, and mean-field theory machines, as well as modular networks, temporal processing, and neurodynamics.
•   Integrates computer experiments throughout, giving students the opportunity to see how neural networks are designed and perform in practice. 
•   Includes a detailed and extensive bibliography for easy reference. 
•   On-line learning algorithms rooted in stochastic gradient descent; small-scale and large-scale learning problems (a minimal sketch follows this list).
•   Kernel methods, including support vector machines, and the representer theorem.
•   Information-theoretic learning models, including copulas, independent components analysis (ICA), coherent ICA, and information bottleneck.
•   Stochastic dynamic programming, including approximate and neurodynamic procedures.
•   Sequential state-estimation algorithms, including Kalman and particle filters.
•   Recurrent neural networks trained using sequential-state estimation algorithms.
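To make the stochastic-gradient-descent bullet above concrete, here is a minimal sketch of the least-mean-square (LMS) algorithm, the classic on-line learner treated in the book. The synthetic regression data, dimensions, and step size are illustrative assumptions:

```python
import numpy as np

# On-line (sample-by-sample) LMS learning: a canonical instance of
# stochastic gradient descent on squared error. The synthetic data,
# step size, and dimensions are illustrative assumptions.

rng = np.random.default_rng(1)
n_samples, n_features = 1000, 3
X = rng.normal(size=(n_samples, n_features))
w_true = np.array([2.0, -1.0, 0.5])
d = X @ w_true + 0.05 * rng.normal(size=n_samples)  # noisy desired response

w = np.zeros(n_features)  # adaptive weight vector
eta = 0.05                # step-size parameter

for x_n, d_n in zip(X, d):
    e_n = d_n - w @ x_n   # instantaneous error e(n) = d(n) - w^T x(n)
    w += eta * e_n * x_n  # LMS update: w <- w + eta * e(n) * x(n)

print(w)  # converges toward w_true
```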
New To This Edition:
Revised to provide an up-to-date treatment of both neural networks and learning machines, this book remains the most comprehensive on the market in its breadth of coverage and technical detail.
  • Renewed Focus: The perceptron, the multilayer perceptron, self-organizing maps, and neurodynamics are considered from the perspectives of both learning machines and neural networks.
  • New Discussions: Treatments of supervised learning and semisupervised learning applied to large-scale problems.
  • Broadened Scope: Detailed treatments of dynamic programming and sequential state estimation have been expanded to supplement the study of reinforcement and supervised learning.
  • Refocused Chapter Topics: Rosenblatt’s Perceptron, the Least-Mean-Square Algorithm, Regularization Theory, Kernel Methods and Radial-Basis Function (RBF) Networks, and Bayesian Filtering for State Estimation of Dynamic Systems.
  • Expanded Glossary: Includes notes on the methodology used with matrix analysis and probability theory.
  • Real-life Data: Case studies include US Postal Service data for semisupervised learning using the Laplacian RLS algorithm, how PCA is applied to handwritten-digit data, the analysis of natural images using sparse-sensory coding and ICA, dynamic reconstruction of the Lorenz attractor using a regularized RBF network, and the model-reference adaptive control system.
  • Computer Experiments: The double-moon configuration for generating binary classification data is used as a running example throughout the first seven chapters and Chapter 10 (a sketch of the generator follows this list).
  • New Problems: Nearly half of the problems in the book are new to this edition.
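For the double-moon computer experiments mentioned above, here is a sketch of a data generator in the spirit of the book's running example: two interleaved half-moon point clouds for binary classification. The default radius, width, and separation values are illustrative assumptions rather than the book's exact settings:

```python
import numpy as np

# Sketch of a double-moon generator: two interleaved half-moon point
# clouds for binary classification. Radius, width, and vertical
# separation d are illustrative values, not necessarily those used
# in the book.

def double_moon(n_per_class=500, radius=10.0, width=6.0, d=1.0, seed=0):
    rng = np.random.default_rng(seed)
    # Upper moon: points in an annulus over angles [0, pi].
    r = radius + width * (rng.random(n_per_class) - 0.5)
    theta = np.pi * rng.random(n_per_class)
    upper = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
    # Lower moon: mirrored, shifted right by the radius and down by d.
    r = radius + width * (rng.random(n_per_class) - 0.5)
    theta = np.pi * rng.random(n_per_class)
    lower = np.column_stack([r * np.cos(theta) + radius,
                             -r * np.sin(theta) - d])
    X = np.vstack([upper, lower])
    y = np.concatenate([np.zeros(n_per_class), np.ones(n_per_class)])
    return X, y

X, y = double_moon(d=-4.0)  # a negative d makes the two moons overlap
```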
