1. Introduction to Probability Theory
1.1. Introduction
1.2. Sample Space and Events
1.3. Probabilities Defined on Events
1.4. Conditional Probabilities
1.5. Independent Events
1.6. Bayes' Formula
2. Random Variables
2.1. Random Variables
2.2. Discrete Random Variables
2.2.1. The Bernoulli Random Variable
2.2.2. The Binomial Random Variable
2.2.3. The Geometric Random Variable
2.2.4. The Poisson Random Variable
2.3. Continuous Random Variables
2.3.1. The Uniform Random Variable
2.3.2. Exponential Random Variables
It is generally felt that there are two approaches to the study of probability theory. One approach is heuristic and nonrigorous and attempts to develop in the student an intuitive feel for the subject which enables him or her to "think probabilistically." The other approach attempts a rigorous development of probability by using the tools of measure theory. It is the first approach that is employed in this text. However, because it is extremely important in both understanding and applying probability theory to be able to "think probabilistically," this text should also be useful to students interested primarily in the second approach.
New to This Edition
The eighth edition contains five new sections.
·Section 3.6.4 presents an elementary approach, using only conditional expectation, for computing the expected time until a sequence of independent and identically distributed random variables produces a specified pattern.
·Section 3.6.5 derives an identity involving compound Poisson random variables and then uses it to obtain an elegant recursive formula for the probabilities of compound Poisson random variables whose incremental increases are nonnegative and integer valued.
·Section 5.4.3 is concerned with a conditional Poisson process, a type of process that is widely applicable in the risk industries.
·Section 7.10 presents a derivation of and a new characterization for the classical insurance ruin probability.
·Section 11.8 presents a simulation procedure known as coupling from the past; its use enables one to exactly generate the value of a random variable whose distribution is that of the stationary distribution of a given Markov chain, even in cases where the stationary distribution cannot itself be explicitly determined.
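The recursion mentioned for Section 3.6.5 can be sketched as follows. For a compound Poisson random variable S = X_1 + ... + X_N, where N is Poisson with mean λ and the X_i are i.i.d., positive, and integer valued with P{X = j} = α_j, the identity yields P{S = n} = (λ/n) Σ_{j=1}^{n} j α_j P{S = n − j}, with P{S = 0} = e^{−λ}. A minimal sketch (the function name and argument layout are illustrative, not from the text):

```python
import math

def compound_poisson_pmf(lam, alpha, n_max):
    """P{S = n} for n = 0..n_max, where S = X_1 + ... + X_N,
    N ~ Poisson(lam), and the i.i.d. X_i have P{X = j} = alpha[j]
    for positive integers j."""
    p = [0.0] * (n_max + 1)
    p[0] = math.exp(-lam)  # S = 0 exactly when N = 0, since each X_i >= 1
    for n in range(1, n_max + 1):
        # Recursion: P{S = n} = (lam/n) * sum_j j * alpha_j * P{S = n - j}
        p[n] = (lam / n) * sum(j * alpha.get(j, 0.0) * p[n - j]
                               for j in range(1, n + 1))
    return p
```

As a sanity check, taking α_1 = 1 makes each X_i equal to 1, so S reduces to N itself and the recursion reproduces the ordinary Poisson probabilities e^{−λ} λ^n / n!.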
There are also new Examples and Exercises in almost all chapters. Among the more significant are
·Examples 3.19, 3.28, 5.4 and 5.19, relating to insurance;
·Example 2.47 on the Poisson paradigm;
·Examples 4.7 and 4.23 on the Bonus-Malus system for setting automobile insurance premiums;
·Example 4.22, which shows how to obtain the expected time until a specified pattern appears in a sequence of Markov chain generated data;
·Example 5.1, which illustrates the connection between the total expected discounted return and the total expected (undiscounted) return earned up to an exponentially distributed random time;
·Examples 11.19 and 11.20, which further indicate the use of variance reduction in obtaining efficient simulation estimators.
Ideally, this text would be used in a one-year course in probability models. Other possible courses would be a one-semester course in introductory probability theory (involving Chapters 1-3 and parts of others) or a course in elementary stochastic processes. The textbook is designed to be flexible enough to be used in a variety of possible courses. For example, I have used Chapters 5 and 8, with smatterings from Chapters 4 and 6, as the basis of an introductory course in queueing theory.
Examples and Exercises
Many examples are worked out throughout the text, and there are also a large number of exercises to be solved by students. More than 100 of these exercises have been starred and their solutions provided at the end of the text. These starred problems can be used for independent study and test preparation. An Instructor's Manual, containing solutions to all exercises, is available free to instructors who adopt the book for class.