E-Book, English, 0 pages
Rubino / Sericola: Markov Chains and Dependability Theory
Year of publication: 2014
ISBN: 978-1-139-98953-4
Publisher: Cambridge University Press
Format: PDF
Copy protection: Adobe DRM
Dependability metrics are omnipresent in every engineering field, ranging from simple measures to more complex ones combining performance and dependability aspects of systems. This book presents the mathematical basis for the analysis of these metrics in the most widely used framework, Markov models, describing both basic results and specialised techniques. The authors first present both discrete and continuous time Markov chains before focusing on dependability measures, which necessitate the study of Markov chains on subsets of states representing different levels of user satisfaction with the modelled system. Topics covered include Markovian state lumping, analysis of sojourn times in subsets of states of Markov chains, analysis of most dependability metrics, the fundamentals of performability analysis, and bounding and simulation techniques designed to evaluate dependability measures. The book is of interest to graduate students and researchers in all areas of engineering where the concepts of lifetime, repair duration, availability, reliability and risk are important.
Authors/Editors
Subject areas
- Engineering: Energy Technology | Electrical Engineering: Electrical Engineering
- Mathematics | Computer Science: Mathematics: Numerical Analysis and Scientific Computing: Applied Mathematics, Mathematical Models
- Mathematics | Computer Science: IT | Computer Science: Applied Computer Science: Business Informatics
- Engineering: Mechanical Engineering | Materials Science: Mechanical Engineering
- Economics: Business Administration: Business Mathematics and Statistics
- Mathematics | Computer Science: IT | Computer Science: Computer Science
Further information & material
1. Introduction
2. Discrete time Markov chains
3. Continuous time Markov chains
4. State aggregation of Markov chains
5. Sojourn times in subsets of states
6. Occupation times
7. Performability
8. Stationarity detection
9. Simulation of dependability models
10. Bounding techniques