E-book, English, 800 pages
Ross, Introduction to Probability Models
10th edition, 2010
ISBN: 978-0-12-375687-9
Publisher: Elsevier Science & Techn.
Format: EPUB
Copy protection: ePub watermark
Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables them to think probabilistically. The other attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text.
The book begins by introducing basic concepts of probability theory, such as the random variable, conditional probability, and conditional expectation. This is followed by discussions of stochastic processes, including Markov chains and Poisson processes. The remaining chapters cover queueing, reliability theory, Brownian motion, and simulation. Many examples are worked out throughout the text, along with exercises to be solved by students.
This book will be particularly useful to those interested in learning how probability theory can be applied to the study of phenomena in fields such as engineering, computer science, management science, the physical and social sciences, and operations research. Ideally, this text would be used in a one-year course in probability models, a one-semester course in introductory probability theory, or a course in elementary stochastic processes.
New to this Edition:
- 65% new chapter material, including coverage of finite capacity queues, insurance risk models, and Markov chains
- Contains compulsory material for the new Exam 3 of the Society of Actuaries, with several sections appearing in the new exams
- Updated data, and a list of commonly used notations and equations
- A robust ancillary package, including an ISM, SSM, test bank, and companion website
- Includes SPSS PASW Modeler and SAS JMP software packages, which are widely used in the field

Hallmark features:
- Superior writing style
- Excellent exercises and examples covering the wide breadth of probability topics
- Real-world applications in engineering, science, business, and economics
Sheldon M. Ross is a professor in the Department of Industrial Engineering and Operations Research at the University of Southern California. He received his Ph.D. in statistics at Stanford University in 1968. He has published many technical articles and textbooks in the areas of statistics and applied probability. Among his texts are A First Course in Probability, Introduction to Probability Models, Stochastic Processes, and Introductory Statistics. Professor Ross is the founding and continuing editor of the journal Probability in the Engineering and Informational Sciences. He is a Fellow of the Institute of Mathematical Statistics, and a recipient of the Humboldt US Senior Scientist Award.
Authors/Editors
Further Information & Material
1;Front Cover;1
2;Title Page;4
3;Copyright Page;5
4;Table of Contents;6
5;Preface;12
6;Chapter 1. Introduction to Probability Theory;18
6.1;1.1 Introduction;18
6.2;1.2 Sample Space and Events;18
6.3;1.3 Probabilities Defined on Events;21
6.4;1.4 Conditional Probabilities;24
6.5;1.5 Independent Events;27
6.6;1.6 Bayes’ Formula;29
6.7;Exercises;32
6.8;References;37
7;Chapter 2. Random Variables;38
7.1;2.1 Random Variables;38
7.2;2.2 Discrete Random Variables;42
7.2.1;2.2.1 The Bernoulli Random Variable;43
7.2.2;2.2.2 The Binomial Random Variable;44
7.2.3;2.2.3 The Geometric Random Variable;46
7.2.4;2.2.4 The Poisson Random Variable;47
7.3;2.3 Continuous Random Variables;48
7.3.1;2.3.1 The Uniform Random Variable;49
7.3.2;2.3.2 Exponential Random Variables;51
7.3.3;2.3.3 Gamma Random Variables;51
7.3.4;2.3.4 Normal Random Variables;51
7.4;2.4 Expectation of a Random Variable;53
7.4.1;2.4.1 The Discrete Case;53
7.4.2;2.4.2 The Continuous Case;55
7.4.3;2.4.3 Expectation of a Function of a Random Variable;57
7.5;2.5 Jointly Distributed Random Variables;61
7.5.1;2.5.1 Joint Distribution Functions;61
7.5.2;2.5.2 Independent Random Variables;65
7.5.3;2.5.3 Covariance and Variance of Sums of Random Variables;67
7.5.4;2.5.4 Joint Probability Distribution of Functions of Random Variables;76
7.6;2.6 Moment Generating Functions;79
7.6.1;2.6.1 The Joint Distribution of the Sample Mean and Sample Variance from a Normal Population;88
7.7;2.7 The Distribution of the Number of Events that Occur;91
7.8;2.8 Limit Theorems;94
7.9;2.9 Stochastic Processes;101
7.10;Exercises;103
7.11;References;112
8;Chapter 3. Conditional Probability and Conditional Expectation;114
8.1;3.1 Introduction;114
8.2;3.2 The Discrete Case;114
8.3;3.3 The Continuous Case;119
8.4;3.4 Computing Expectations by Conditioning;123
8.4.1;3.4.1 Computing Variances by Conditioning;134
8.5;3.5 Computing Probabilities by Conditioning;139
8.6;3.6 Some Applications;157
8.6.1;3.6.1 A List Model;157
8.6.2;3.6.2 A Random Graph;158
8.6.3;3.6.3 Uniform Priors, Polya’s Urn Model, and Bose–Einstein Statistics;166
8.6.4;3.6.4 Mean Time for Patterns;170
8.6.5;3.6.5 The k-Record Values of Discrete Random Variables;174
8.6.6;3.6.6 Left Skip Free Random Walks;177
8.7;3.7 An Identity for Compound Random Variables;183
8.7.1;3.7.1 Poisson Compounding Distribution;186
8.7.2;3.7.2 Binomial Compounding Distribution;188
8.7.3;3.7.3 A Compounding Distribution Related to the Negative Binomial;189
8.8;Exercises;190
9;Chapter 4. Markov Chains;208
9.1;4.1 Introduction;208
9.2;4.2 Chapman–Kolmogorov Equations;212
9.3;4.3 Classification of States;221
9.4;4.4 Limiting Probabilities;231
9.5;4.5 Some Applications;247
9.5.1;4.5.1 The Gambler’s Ruin Problem;247
9.5.2;4.5.2 A Model for Algorithmic Efficiency;251
9.5.3;4.5.3 Using a Random Walk to Analyze a Probabilistic Algorithm for the Satisfiability Problem;254
9.6;4.6 Mean Time Spent in Transient States;260
9.7;4.7 Branching Processes;262
9.8;4.8 Time Reversible Markov Chains;266
9.9;4.9 Markov Chain Monte Carlo Methods;277
9.10;4.10 Markov Decision Processes;282
9.11;4.11 Hidden Markov Chains;286
9.11.1;4.11.1 Predicting the States;290
9.12;Exercises;292
9.13;References;307
10;Chapter 5. The Exponential Distribution and the Poisson Process;308
10.1;5.1 Introduction;308
10.2;5.2 The Exponential Distribution;309
10.2.1;5.2.1 Definition;309
10.2.2;5.2.2 Properties of the Exponential Distribution;311
10.2.3;5.2.3 Further Properties of the Exponential Distribution;318
10.2.4;5.2.4 Convolutions of Exponential Random Variables;325
10.3;5.3 The Poisson Process;329
10.3.1;5.3.1 Counting Processes;329
10.3.2;5.3.2 Definition of the Poisson Process;330
10.3.3;5.3.3 Interarrival and Waiting Time Distributions;333
10.3.4;5.3.4 Further Properties of Poisson Processes;336
10.3.5;5.3.5 Conditional Distribution of the Arrival Times;342
10.3.6;5.3.6 Estimating Software Reliability;353
10.4;5.4 Generalizations of the Poisson Process;356
10.4.1;5.4.1 Nonhomogeneous Poisson Process;356
10.4.2;5.4.2 Compound Poisson Process;363
10.4.3;5.4.3 Conditional or Mixed Poisson Processes;368
10.5;Exercises;371
10.6;References;387
11;Chapter 6. Continuous-Time Markov Chains;388
11.1;6.1 Introduction;388
11.2;6.2 Continuous-Time Markov Chains;389
11.3;6.3 Birth and Death Processes;391
11.4;6.4 The Transition Probability Function Pij(t);398
11.5;6.5 Limiting Probabilities;407
11.6;6.6 Time Reversibility;414
11.7;6.7 Uniformization;423
11.8;6.8 Computing the Transition Probabilities;426
11.9;Exercises;429
11.10;References;436
12;Chapter 7. Renewal Theory and Its Applications;438
12.1;7.1 Introduction;438
12.2;7.2 Distribution of N(t);440
12.3;7.3 Limit Theorems and Their Applications;444
12.4;7.4 Renewal Reward Processes;456
12.5;7.5 Regenerative Processes;464
12.5.1;7.5.1 Alternating Renewal Processes;467
12.6;7.6 Semi-Markov Processes;474
12.7;7.7 The Inspection Paradox;477
12.8;7.8 Computing the Renewal Function;480
12.9;7.9 Applications to Patterns;483
12.9.1;7.9.1 Patterns of Discrete Random Variables;484
12.9.2;7.9.2 The Expected Time to a Maximal Run of Distinct Values;491
12.9.3;7.9.3 Increasing Runs of Continuous Random Variables;493
12.10;7.10 The Insurance Ruin Problem;495
12.11;Exercises;501
12.12;References;512
13;Chapter 8. Queueing Theory;514
13.1;8.1 Introduction;514
13.2;8.2 Preliminaries;515
13.2.1;8.2.1 Cost Equations;516
13.2.2;8.2.2 Steady-State Probabilities;517
13.3;8.3 Exponential Models;519
13.3.1;8.3.1 A Single-Server Exponential Queueing System;519
13.3.2;8.3.2 A Single-Server Exponential Queueing System Having Finite Capacity;528
13.3.3;8.3.3 Birth and Death Queueing Models;534
13.3.4;8.3.4 A Shoe Shine Shop;539
13.3.5;8.3.5 A Queueing System with Bulk Service;541
13.4;8.4 Network of Queues;544
13.4.1;8.4.1 Open Systems;544
13.4.2;8.4.2 Closed Systems;549
13.5;8.5 The System M/G/1;555
13.5.1;8.5.1 Preliminaries: Work and Another Cost Identity;555
13.5.2;8.5.2 Application of Work to M/G/1;556
13.5.3;8.5.3 Busy Periods;557
13.6;8.6 Variations on the M/G/1;558
13.6.1;8.6.1 The M/G/1 with Random-Sized Batch Arrivals;558
13.6.2;8.6.2 Priority Queues;560
13.6.3;8.6.3 An M/G/1 Optimization Example;563
13.6.4;8.6.4 The M/G/1 Queue with Server Breakdown;567
13.7;8.7 The Model G/M/1;570
13.7.1;8.7.1 The G/M/1 Busy and Idle Periods;575
13.8;8.8 A Finite Source Model;576
13.9;8.9 Multiserver Queues;579
13.9.1;8.9.1 Erlang’s Loss System;580
13.9.2;8.9.2 The M/M/k Queue;581
13.9.3;8.9.3 The G/M/k Queue;582
13.9.4;8.9.4 The M/G/k Queue;584
13.10;Exercises;585
13.11;References;595
14;Chapter 9. Reliability Theory;596
14.1;9.1 Introduction;596
14.2;9.2 Structure Functions;597
14.2.1;9.2.1 Minimal Path and Minimal Cut Sets;599
14.3;9.3 Reliability of Systems of Independent Components;603
14.4;9.4 Bounds on the Reliability Function;607
14.4.1;9.4.1 Method of Inclusion and Exclusion;608
14.4.2;9.4.2 Second Method for Obtaining Bounds on r(p);617
14.5;9.5 System Life as a Function of Component Lives;619
14.6;9.6 Expected System Lifetime;627
14.6.1;9.6.1 An Upper Bound on the Expected Life of a Parallel System;631
14.7;9.7 Systems with Repair;633
14.7.1;9.7.1 A Series Model with Suspended Animation;637
14.8;Exercises;640
14.9;References;646
15;Chapter 10. Brownian Motion and Stationary Processes;648
15.1;10.1 Brownian Motion;648
15.2;10.2 Hitting Times, Maximum Variable, and the Gambler’s Ruin Problem;652
15.3;10.3 Variations on Brownian Motion;653
15.3.1;10.3.1 Brownian Motion with Drift;653
15.3.2;10.3.2 Geometric Brownian Motion;653
15.4;10.4 Pricing Stock Options;655
15.4.1;10.4.1 An Example in Options Pricing;655
15.4.2;10.4.2 The Arbitrage Theorem;657
15.4.3;10.4.3 The Black-Scholes Option Pricing Formula;661
15.5;10.5 White Noise;666
15.6;10.6 Gaussian Processes;668
15.7;10.7 Stationary and Weakly Stationary Processes;671
15.8;10.8 Harmonic Analysis of Weakly Stationary Processes;676
15.9;Exercises;678
15.10;References;682
16;Chapter 11. Simulation;684
16.1;11.1 Introduction;684
16.2;11.2 General Techniques for Simulating Continuous Random Variables;689
16.2.1;11.2.1 The Inverse Transformation Method;689
16.2.2;11.2.2 The Rejection Method;690
16.2.3;11.2.3 The Hazard Rate Method;694
16.3;11.3 Special Techniques for Simulating Continuous Random Variables;697
16.3.1;11.3.1 The Normal Distribution;697
16.3.2;11.3.2 The Gamma Distribution;701
16.3.3;11.3.3 The Chi-Squared Distribution;701
16.3.4;11.3.4 The Beta (n, m) Distribution;702
16.3.5;11.3.5 The Exponential Distribution—The Von Neumann Algorithm;703
16.4;11.4 Simulating from Discrete Distributions;705
16.4.1;11.4.1 The Alias Method;708
16.5;11.5 Stochastic Processes;713
16.5.1;11.5.1 Simulating a Nonhomogeneous Poisson Process;714
16.5.2;11.5.2 Simulating a Two-Dimensional Poisson Process;720
16.6;11.6 Variance Reduction Techniques;723
16.6.1;11.6.1 Use of Antithetic Variables;724
16.6.2;11.6.2 Variance Reduction by Conditioning;727
16.6.3;11.6.3 Control Variates;732
16.6.4;11.6.4 Importance Sampling;734
16.7;11.7 Determining the Number of Runs;739
16.8;11.8 Generating from the Stationary Distribution of a Markov Chain;740
16.8.1;11.8.1 Coupling from the Past;740
16.8.2;11.8.2 Another Approach;742
16.9;Exercises;743
16.10;References;751
17;Appendix: Solutions to Starred Exercises;752
18;Index;792
Preface
This text is intended as an introduction to elementary probability theory and stochastic processes. It is particularly well suited for those wanting to see how probability theory can be applied to the study of phenomena in fields such as engineering, computer science, management science, the physical and social sciences, and operations research. It is generally felt that there are two approaches to the study of probability theory. One approach is heuristic and nonrigorous and attempts to develop in the student an intuitive feel for the subject that enables him or her to “think probabilistically.” The other approach attempts a rigorous development of probability by using the tools of measure theory. It is the first approach that is employed in this text. However, because it is extremely important in both understanding and applying probability theory to be able to “think probabilistically,” this text should also be useful to students interested primarily in the second approach.

New to This Edition
The tenth edition includes new text material, examples, and exercises chosen not only for their inherent interest and applicability but also for their usefulness in strengthening the reader’s probabilistic knowledge and intuition. The new text material includes Section 2.7, which builds on the inclusion/exclusion identity to find the distribution of the number of events that occur; and Section 3.6.6 on left skip free random walks, which can be used to model the fortunes of an investor (or gambler) who always invests 1 and then receives a nonnegative integral return. Section 4.2 has additional material on Markov chains that shows how to modify a given chain when trying to determine such things as the probability that the chain ever enters a given class of states by some time, or the conditional distribution of the state at some time given that the class has never been entered. A new remark in Section 7.2 shows that results from the classical insurance ruin model also hold in other important ruin models. There is new material on exponential queueing models, including, in Section 8.3.2, a determination of the mean and variance of the number of lost customers in a busy period of a finite capacity queue, as well as the new Section 8.3.3 on birth and death queueing models. Section 11.8.2 gives a new approach that can be used to simulate the exact stationary distribution of a Markov chain that satisfies a certain property. Among the newly added examples are 1.11, which is concerned with a multiple player gambling problem; 3.20, which finds the variance in the matching rounds problem; 3.30, which deals with the characteristics of a random selection from a population; and 4.25, which deals with the stationary distribution of a Markov chain.

Course
Ideally, this text would be used in a one-year course in probability models. Other possible courses would be a one-semester course in introductory probability theory (involving Chapters 1–3 and parts of others) or a course in elementary stochastic processes. The textbook is designed to be flexible enough to be used in a variety of possible courses. For example, I have used Chapters 5 and 8, with smatterings from Chapters 4 and 6, as the basis of an introductory course in queueing theory.

Examples and Exercises
Many examples are worked out throughout the text, and there are also a large number of exercises to be solved by students. More than 100 of these exercises have been starred and their solutions provided at the end of the text. These starred problems can be used for independent study and test preparation. An Instructor’s Manual, containing solutions to all exercises, is available free to instructors who adopt the book for class.

Organization
Chapters 1 and 2 deal with basic ideas of probability theory. In Chapter 1 an axiomatic framework is presented, while in Chapter 2 the important concept of a random variable is introduced. Subsection 2.6.1 gives a simple derivation of the joint distribution of the sample mean and sample variance of a normal data sample.

Chapter 3 is concerned with the subject matter of conditional probability and conditional expectation. “Conditioning” is one of the key tools of probability theory, and it is stressed throughout the book. When properly used, conditioning often enables us to easily solve problems that at first glance seem quite difficult. The final section of this chapter presents applications to (1) a computer list problem, (2) a random graph, and (3) the Polya urn model and its relation to the Bose-Einstein distribution. Subsection 3.6.5 presents k-record values and the surprising Ignatov’s theorem.

In Chapter 4 we come into contact with our first random, or stochastic, process, known as a Markov chain, which is widely applicable to the study of many real-world phenomena. Applications to genetics and production processes are presented. The concept of time reversibility is introduced and its usefulness illustrated. Subsection 4.5.3 presents an analysis, based on random walk theory, of a probabilistic algorithm for the satisfiability problem. Section 4.6 deals with the mean times spent in transient states by a Markov chain. Section 4.9 introduces Markov chain Monte Carlo methods. In the final section we consider a model for optimally making decisions known as a Markovian decision process.

In Chapter 5 we are concerned with a type of stochastic process known as a counting process. In particular, we study a kind of counting process known as a Poisson process. The intimate relationship between this process and the exponential distribution is discussed. New derivations for the Poisson and nonhomogeneous Poisson processes are discussed.
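The intimate relationship between the Poisson process and the exponential distribution noted above can be made concrete in a short sketch (an illustration of the standard construction, not code from the book): the interarrival times of a rate-λ Poisson process are independent Exponential(λ) random variables, so the arrivals on [0, T] can be generated by summing them. The rate and horizon below are arbitrary illustration values.

```python
import random

def poisson_process_arrivals(rate, horizon, rng=random):
    """Arrival times of a Poisson process on [0, horizon], generated as
    successive sums of independent Exponential(rate) interarrival times."""
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate)   # exponential interarrival time
        if t > horizon:
            return arrivals
        arrivals.append(t)

random.seed(1)
arrivals = poisson_process_arrivals(rate=2.0, horizon=5.0)
```

With rate 2.0 and horizon 5.0, the number of arrivals N(5) is a Poisson random variable with mean rate × horizon = 10, which is easy to confirm empirically by averaging the count over many simulated runs.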
Examples relating to analyzing greedy algorithms, minimizing highway encounters, collecting coupons, and tracking the AIDS virus, as well as material on compound Poisson processes, are included in this chapter. Subsection 5.2.4 gives a simple derivation of the convolution of exponential random variables.

Chapter 6 considers Markov chains in continuous time with an emphasis on birth and death models. Time reversibility is shown to be a useful concept, as it is in the study of discrete-time Markov chains. Section 6.7 presents the computationally important technique of uniformization.

Chapter 7, the renewal theory chapter, is concerned with a type of counting process more general than the Poisson. By making use of renewal reward processes, limiting results are obtained and applied to various fields. Section 7.9 presents new results concerning the distribution of time until a certain pattern occurs when a sequence of independent and identically distributed random variables is observed. In Subsection 7.9.1, we show how renewal theory can be used to derive both the mean and the variance of the length of time until a specified pattern appears, as well as the mean time until one of a finite number of specified patterns appears. In Subsection 7.9.2, we suppose that the random variables are equally likely to take on any of m possible values, and compute an expression for the mean time until a run of m distinct values occurs. In Subsection 7.9.3, we suppose the random variables are continuous and derive an expression for the mean time until a run of m consecutive increasing values occurs.

Chapter 8 deals with queueing, or waiting line, theory. After some preliminaries dealing with basic cost identities and types of limiting probabilities, we consider exponential queueing models and show how such models can be analyzed. Included in the models we study is the important class known as a network of queues.
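The analysis of exponential queueing models mentioned above can be illustrated with the well-known closed-form steady-state quantities for the simplest such model, the single-server M/M/1 queue of Section 8.3.1. This is a minimal sketch using arbitrary illustration rates, not code from the book:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state quantities for the single-server M/M/1 queue.
    Requires arrival_rate < service_rate for a stable queue."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be < service rate")
    rho = arrival_rate / service_rate           # server utilization
    L = rho / (1.0 - rho)                       # mean number in system
    W = 1.0 / (service_rate - arrival_rate)     # mean time in system
    Lq = rho * rho / (1.0 - rho)                # mean number waiting in queue
    return {"utilization": rho, "L": L, "W": W, "Lq": Lq}

m = mm1_metrics(arrival_rate=2.0, service_rate=3.0)
```

With these rates the utilization is 2/3, the mean number in system is 2, and the mean time in system is 1; the cost identity L = λW (Little's law, one of the cost equations of Section 8.2.1) ties the quantities together.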
We then study models in which some of the distributions are allowed to be arbitrary. Included are Subsection 8.6.3, dealing with an optimization problem concerning a single server, general service time queue, and Section 8.8, concerned with a single server, general service time queue in which the arrival source is a finite number of potential users.

Chapter 9 is concerned with reliability theory. This chapter will probably be of greatest interest to the engineer and operations researcher. Subsection 9.6.1 illustrates a method for determining an upper bound for the expected life of a parallel system of not necessarily independent components, and Subsection 9.7.1 analyzes a series structure reliability model in which components enter a state of suspended animation when one of their cohorts fails.

Chapter 10 is concerned with Brownian motion and its applications. The theory of options pricing is discussed. Also, the arbitrage theorem is presented and its relationship to the duality theorem of linear programming is indicated. We show how the arbitrage theorem leads to the Black–Scholes option pricing formula.

Chapter 11 deals with simulation, a powerful tool for analyzing stochastic models that are analytically intractable. Methods for generating the values of arbitrarily distributed random variables are discussed, as are variance reduction methods for increasing the efficiency of the simulation. Subsection 11.6.4 introduces the valuable simulation technique of importance sampling, and indicates the usefulness of tilted distributions when applying this method.

Acknowledgments

We would like to acknowledge with thanks the helpful suggestions made by the many reviewers of the text. These...




