Steinwendner / Schwaiger | Programming Neural Networks with Python | Book | 978-1-4932-2696-2 | sack.de

Book, English, 457 pages, format (W × H): 175 mm × 252 mm, weight: 728 g

Steinwendner / Schwaiger

Programming Neural Networks with Python


1st edition, 2025
ISBN: 978-1-4932-2696-2
Publisher: Rheinwerk Verlag GmbH



Neural networks are at the heart of AI, so make sure you're on the cutting edge with this guide! True beginners get a crash course in Python and the mathematical concepts needed to understand and create neural networks. Or jump right into programming your first neural network, from using the scikit-learn library to implementing the perceptron learning algorithm. Learn how to train your neural network, measure errors, apply transfer learning, implement the CRISP-DM model, and more. Whether you're interested in machine learning, generative AI, LLMs, deep learning, or all of the above, this is the AI book you need!

Highlights include:

1) Network creation
2) Network training
3) Supervised and unsupervised learning
4) Reinforcement learning
5) Algorithms
6) Multi-layer networks
7) Deep neural networks
8) Back propagation
9) Transformers
10) Python
11) Mathematical concepts
12) TensorFlow



Further Information & Material


... Preface ... 15

1 ... Introduction ... 17

1.1 ... Why Neural Networks? ... 17

1.2 ... About This Book ... 18

1.3 ... The Contents in Brief ... 19

1.4 ... Is This Bee a Queen Bee? ... 22

1.5 ... An Artificial Neural Network for the Bee Colony ... 23

1.6 ... From Biology to the Artificial Neuron ... 28

1.7 ... Classification and the Rest ... 32

1.8 ... Summary ... 39

1.9 ... Further Reading ... 39

PART I ... Up and Running ... 41

2 ... Starter Kit for Developing Neural Networks with Python ... 43

2.1 ... The Technical Development Environment ... 43

2.2 ... Summary ... 63

3 ... A Simple Neural Network ... 65

3.1 ... Background ... 65

3.2 ... Bring on the Neural Network! ... 65

3.3 ... Neuron Zoom-In ... 68

3.4 ... Step Function ... 73

3.5 ... Perceptron ... 75

3.6 ... Points in Space: Vector Representation ... 76

3.7 ... Horizontal and Vertical: Column and Line Notation ... 82

3.8 ... The Weighted Sum ... 84

3.9 ... Step-by-Step: Step Functions ... 85

3.10 ... The Weighted Sum Reloaded ... 85

3.11 ... All Together ... 86

3.12 ... Task: Robot Protection ... 89

3.13 ... Summary ... 91

3.14 ... Further Reading ... 91

4 ... Learning in a Simple Network ... 93

4.1 ... Background: Plans Are Being Made ... 93

4.2 ... Learning in Python Code ... 94

4.3 ... Perceptron Learning ... 94

4.4 ... Separating Line for a Learning Step ... 98

4.5 ... Perceptron Learning Algorithm ... 99

4.6 ... The Separating Lines or Hyperplanes for the Example ... 103

4.7 ... scikit-learn Compatible Estimator ... 106

4.8 ... scikit-learn Perceptron Estimator ... 113

4.9 ... Adaline ... 115

4.10 ... Summary ... 125

4.11 ... Further Reading ... 126

5 ... Multilayer Neural Networks ... 127

5.1 ... A Real Problem ... 127

5.2 ... Solving XOR ... 129

5.3 ... Preparations for the Launch ... 134

5.4 ... The Plan for Implementation ... 135

5.5 ... The Setup ("class") ... 136

5.6 ... The Initialization ("__init__") ... 138

5.7 ... Something for In-Between ("print") ... 141

5.8 ... The Analysis ("predict") ... 141

5.9 ... The Usage ... 143

5.10 ... Summary ... 145

6 ... Learning in a Multilayer Network ... 147

6.1 ... How Do You Measure an Error? ... 147

6.2 ... Gradient Descent: An Example ... 149

6.3 ... A Network of Sigmoid Neurons ... 157

6.4 ... The Cool Algorithm with Forward Delta and Backpropagation ... 158

6.5 ... A “fit” Run ... 170

6.6 ... Summary ... 178

6.7 ... Further Reading ... 178

7 ... Examples of Deep Neural Networks ... 179

7.1 ... Convolutional Neural Networks ... 179

7.2 ... Transformer Neural Networks ... 194

7.3 ... The Optimization Method ... 204

7.4 ... Preventing Overfitting ... 205

7.5 ... Summary ... 207

7.6 ... Further Reading ... 207

8 ... Programming Deep Neural Networks Using TensorFlow 2 ... 209

8.1 ... Convolutional Networks for Handwriting Recognition ... 209

8.2 ... Transfer Learning with Convolutional Neural Networks ... 223

8.3 ... Transfer Learning with Transformer Neural Networks ... 231

8.4 ... Summary ... 236

8.5 ... Further Reading ... 236

PART II ... Deep Dive ... 239

9 ... From Brain to Network ... 241

9.1 ... Your Brain in Action ... 241

9.2 ... The Nervous System ... 242

9.3 ... The Brain ... 243

9.4 ... Neurons and Glial Cells ... 245

9.5 ... A Transfer in Detail ... 247

9.6 ... Representation of Cells and Networks ... 249

9.7 ... Summary ... 251

9.8 ... Further Reading ... 251

10 ... The Evolution of Artificial Neural Networks ... 253

10.1 ... The 1940s ... 254

10.2 ... The 1950s ... 255

10.3 ... The 1960s ... 257

10.4 ... The 1970s ... 257

10.5 ... The 1980s ... 258

10.6 ... The 1990s ... 270

10.7 ... The 2000s ... 271

10.8 ... The 2010s ... 272

10.9 ... Summary ... 274

10.10 ... Further Reading ... 274

11 ... The Machine Learning Process ... 277

11.1 ... The CRISP-DM Model ... 277

11.2 ... Ethical and Legal Aspects ... 281

11.3 ... Feature Engineering ... 290

11.4 ... Summary ... 317

11.5 ... Further Reading ... 318

12 ... Learning Methods ... 319

12.1 ... Learning Strategies ... 319

12.2 ... Tools ... 345

12.3 ... Summary ... 350

12.4 ... Further Reading ... 350

13 ... Areas of Application and Real-Life Examples ... 351

13.1 ... Warm-Up ... 351

13.2 ... Image Classification ... 354

13.3 ... Dreamed Images ... 373

13.4 ... Deployment with Pretrained Networks ... 382

13.5 ... Summary ... 386

13.6 ... Further Reading ... 386

... Appendices ... 387

A ... Python in Brief ... 389

B ... Mathematics in Brief ... 417

C ... TensorFlow 2 and Keras ... 435

... The Authors ... 445

... Index ... 447


Schwaiger, Roland
Dr. Roland Schwaiger studied Computer Science at Bowling Green State University, Ohio, USA, and Applied Computer Science and Mathematics at the University of Salzburg, Austria, where he completed his doctorate in Mathematics. After several years of working as an assistant professor at the University of Salzburg, he joined SAP AG in 1996. There, he worked as a Human Resources software developer for three years, which gave him the opportunity to develop his skills in an exciting and inspirational working environment. In 1999, Roland became a freelance trainer, editor, consultant, and developer.


