AMS 507: Introduction to Probability
The topics include sample spaces, the axioms of probability, conditional probability and independence, discrete and continuous random variables, jointly distributed random variables, characteristics of random variables, the law of large numbers, the central limit theorem, and Markov chains.
Required Textbook: A First Course in Probability by Sheldon M. Ross, 10th edition, 2020, Pearson
Supplementary Textbook: Introduction to Probability by Joseph K. Blitzstein and Jessica Hwang, 2nd edition, 2019, Chapman & Hall/CRC
Learning Outcomes
- Demonstrate an understanding of core concepts of probability theory and their use in applications:
- experiments, outcomes, sample spaces, events, and the role of set theory in probability;
- the axioms of probability, the theorems that follow from them, and their consequences;
- using counting principles to calculate probabilities of events in sample spaces of equally likely outcomes;
- independence and disjointness;
- conditional probability;
- the law of total probability and Bayes' law (see the first sketch following this list);
- the method of conditioning to solve problems;
- Markov chains and associated conditioning arguments.
- Demonstrate an understanding of the theory of random variables and their applications:
- the difference between discrete random variables, continuous random variables, and random variables with hybrid distributions;
- cumulative distribution functions and their properties;
- probability mass functions for discrete random variables and computations to evaluate probabilities;
- properties of commonly used discrete distributions, such as binomial, geometric, Poisson, and hypergeometric distributions;
- probability density functions, computing them from cumulative distribution functions, and vice versa;
- properties of commonly used density functions, such as uniform, exponential, gamma, beta, and normal densities;
- means, variances, and higher moments of random variables, and their properties;
- connections and differences among distributions, e.g., the normal and Poisson approximations to the binomial, and the distinction between the binomial and hypergeometric distributions;
- the Markov and Chebyshev inequalities and their use in bounding and estimating probabilities.
- Demonstrate an understanding of the theory of jointly distributed random variables and their applications:
- computations with joint distributions for both discrete and continuous random variables;
- computations with joint density functions and conditional density functions;
- conditional expectation and conditioning arguments in computations involving two or more random variables;
- computations with the bivariate normal distribution, the t-distribution, chi-squared distributions, and order statistics;
- applying indicator random variables to compute expectations;
- using moment generating functions in solving problems with sums of independent random variables;
- the weak and strong laws of large numbers;
- applying the central limit theorem to estimate probabilities (see the second sketch following this list).
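To make the conditioning outcomes above concrete, here is a minimal Python sketch applying the law of total probability and Bayes' law to a hypothetical diagnostic-test setting; the prior, sensitivity, and specificity values are made-up numbers chosen only for illustration.

```python
# Hypothetical diagnostic-test example (all rates are invented).
prior = 0.01          # P(disease)
sensitivity = 0.95    # P(positive | disease)
specificity = 0.90    # P(negative | no disease)

# Law of total probability: P(positive).
p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)

# Bayes' law: P(disease | positive).
posterior = sensitivity * prior / p_positive

print(f"P(positive)           = {p_positive:.4f}")   # 0.1085
print(f"P(disease | positive) = {posterior:.4f}")    # about 0.0876
```

With these numbers the posterior is only about 0.09, illustrating how a positive result for a rare condition can still leave the condition unlikely.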
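Similarly, the second sketch gives a rough sense of using the central limit theorem to estimate probabilities: standardized means of i.i.d. exponential(1) draws are simulated and their upper-tail frequency is compared with the standard normal tail. The sample sizes, trial count, and cutoff are arbitrary choices.

```python
# Monte Carlo sketch of the central limit theorem.
# X_1, ..., X_n i.i.d. exponential(1): the sample mean has mean 1 and
# standard deviation 1/sqrt(n), so the standardized mean is approximately
# standard normal for large n.
import math
import random

def standardized_mean(n: int) -> float:
    draws = [random.expovariate(1.0) for _ in range(n)]
    return (sum(draws) / n - 1.0) * math.sqrt(n)

def simulated_upper_tail(n: int, trials: int = 20_000, z: float = 1.0) -> float:
    """Estimate P(standardized mean > z) by simulation."""
    return sum(standardized_mean(n) > z for _ in range(trials)) / trials

# Exact standard normal upper tail P(Z > 1) via the error function.
normal_tail = 0.5 * (1.0 - math.erf(1.0 / math.sqrt(2.0)))

for n in (5, 50, 500):
    print(f"n = {n:3d}: simulated tail = {simulated_upper_tail(n):.4f}, "
          f"normal tail = {normal_tail:.4f}")
```

As n increases, the simulated tail should settle near the normal value of roughly 0.159.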