
Book, English, 155 pages, BC, format (W × H): 196 mm × 266 mm, weight: 411 g

Series: UTB

Ernst / Schweikard

Fundamentals of Machine Learning

Support Vector Machines Made Easy
1st edition, 2020
ISBN: 978-3-8252-5251-9
Publisher: UTB


Artificial intelligence will change our lives forever, both at work and in our private lives. But how exactly does machine learning work? Two professors from Lübeck explore this question. In their English-language textbook they teach the basics needed to work with Support Vector Machines, explaining, for example, linear programming, Lagrange multipliers, kernel functions and the SMO algorithm. They also cover neural networks, evolutionary algorithms and Bayesian networks.
Definitions are highlighted throughout the book, and exercises invite readers to participate actively. The textbook is aimed at students of computer science, engineering and the natural sciences, especially in the fields of robotics, artificial intelligence and mathematics.
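The description only names the techniques the book covers. As a rough illustration of the kind of problem treated there, the following minimal sketch trains a kernel Support Vector Machine in Python; the dataset, parameter values and use of scikit-learn are assumptions for this example and are not taken from the book (whose own worked example in the table of contents uses MATLAB).

```python
# Minimal sketch (not from the book): train a kernel SVM and inspect its
# support vectors. Assumes NumPy and scikit-learn are installed.
import numpy as np
from sklearn.svm import SVC

# Two small Gaussian point clouds as toy training data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, (20, 2)),
               rng.normal(2.0, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

# C penalises slack variables, the RBF kernel replaces an explicit feature
# space; the underlying solver is an SMO-type algorithm.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)

print("support vectors per class:", clf.n_support_)
print("prediction for a new point:", clf.predict([[1.0, 1.0]]))
```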


Further Information & Material


Contents
Preface
1 Symbolic Classification and Nearest Neighbour Classification
1.1 Symbolic Classification
1.2 Nearest Neighbour Classification
2 Separating Planes and Linear Programming
2.1 Finding a Separating Hyperplane
2.2 Testing for feasibility of linear constraints
2.3 Linear Programming
MATLAB example
2.4 Conclusion
3 Separating Margins and Quadratic Programming
3.1 Quadratic Programming
3.2 Maximum Margin Separator Planes
3.3 Slack Variables
4 Dualization and Support Vectors
4.1 Duals of Linear Programs
4.2 Duals of Quadratic Programs
4.3 Support Vectors
5 Lagrange Multipliers and Duality
5.1 Multidimensional functions
5.2 Support Vector Expansion
5.3 Support Vector Expansion with Slack Variables
6 Kernel Functions
6.1 Feature Spaces
6.2 Feature Spaces and Quadratic Programming
6.3 Kernel Matrix and Mercer’s Theorem
6.4 Proof of Mercer’s Theorem
Step 1 – Definitions and Prerequisites
Step 2 – Designing the right Hilbert Space
Step 3 – The reproducing property
7 The SMO Algorithm
7.1 Overview and Principles
7.2 Optimisation Step
7.3 Simplified SMO
8 Regression
8.1 Slack Variables
8.2 Duality, Kernels and Regression
8.3 Deriving the Dual form of the QP for Regression
9 Perceptrons, Neural Networks and Genetic Algorithms
9.1 Perceptrons
Perceptron-Algorithm
Perceptron-Lemma and Convergence
Perceptrons and Linear Feasibility Testing
9.2 Neural Networks
Forward Propagation
Training and Error Backpropagation
9.3 Genetic Algorithms
9.4 Conclusion
10 Bayesian Regression
10.1 Bayesian Learning
10.2 Probabilistic Linear Regression
10.3 Gaussian Process Models
10.4 GP model with measurement noise
Optimization of hyperparameters
Covariance functions
10.5 Multi-Task Gaussian Process (MTGP) Models
11 Bayesian Networks
Propagation of probabilities in causal networks
Appendix – Linear Programming
A.1 Solving LP0 problems
A.2 Schematic representation of the iteration steps
A.3 Transition from LP0 to LP
A.4 Computing time and complexity issues
References
Index


Ernst, Floris
Prof. Dr. Floris Ernst teaches AI (artificial intelligence) and robotics at the University of Lübeck.

Schweikard, Achim
Prof. Dr. Achim Schweikard teaches AI (artificial intelligence) and robotics at the University of Lübeck.


