
Book, English, 296 pages, format (W × H): 155 mm × 235 mm, weight: 482 g

Huang

Statistical Mechanics of Neural Networks


1st edition, 2021
ISBN: 978-981-16-7572-0
Publisher: Springer Nature Singapore



This book provides a comprehensive introduction to the fundamental statistical mechanics underlying the inner workings of neural networks. It discusses in detail important concepts and techniques, including the cavity method, mean-field theory, replica techniques, the Nishimori condition, variational methods, dynamical mean-field theory, unsupervised learning, associative memory models, perceptron models, the chaos theory of recurrent neural networks, and the eigenspectra of neural networks, walking new learners through the theories and essential skills needed to understand and use neural networks. The book focuses on quantitative frameworks of neural network models in which the underlying mechanisms can be precisely isolated through mathematically elegant physics and theoretical predictions. It is a good reference for students, researchers, and practitioners in the area of neural networks.
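To give a flavour of the mean-field analysis mentioned above, here is a minimal illustrative sketch (a standard textbook result, not an excerpt from this book): for an Ising-type network with couplings $J_{ij}$ and external fields $h_i$, the naive mean-field approximation replaces each unit's fluctuating neighbours by their average activities, yielding self-consistency equations for the magnetizations $m_i$ at inverse temperature $\beta$:

$$
m_i = \tanh\!\Big(\beta\Big(\sum_{j \neq i} J_{ij}\, m_j + h_i\Big)\Big), \qquad i = 1, \dots, N,
$$

which are typically solved by fixed-point iteration.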


Target audience


Research


Authors/Editors


Further information & material


Chapter 1: Introduction

Chapter 2: Spin Glass Models and Cavity Method

Chapter 3: Variational Mean-Field Theory and Belief Propagation

Chapter 4: Monte-Carlo Simulation Methods

Chapter 5: High-Temperature Expansion Techniques

Chapter 6: Nishimori Model

Chapter 7: Random Energy Model

Chapter 8: Statistical Mechanics of Hopfield Model

Chapter 9: Replica Symmetry and Symmetry Breaking

Chapter 10: Statistical Mechanics of Restricted Boltzmann Machine

Chapter 11: Simplest Model of Unsupervised Learning with Binary Synapses

Chapter 12: Inherent-Symmetry Breaking in Unsupervised Learning

Chapter 13: Mean-Field Theory of Ising Perceptron

Chapter 14: Mean-Field Model of Multi-Layered Perceptron

Chapter 15: Mean-Field Theory of Dimension Reduction in Neural Networks

Chapter 16: Chaos Theory of Random Recurrent Networks

Chapter 17: Statistical Mechanics of Random Matrices

Chapter 18: Perspectives


Haiping Huang

Dr. Haiping Huang received his Ph.D. in theoretical physics from the Institute of Theoretical Physics, Chinese Academy of Sciences. He is an associate professor at the School of Physics, Sun Yat-sen University, China. His research interests include the origin of the computational hardness of the binary perceptron model, the theory of dimension reduction in deep neural networks, and inherent symmetry breaking in unsupervised learning. In 2021, he was awarded the Excellent Young Scientists Fund of the National Natural Science Foundation of China.


