STAT 511 -- EXAM 3 REVIEW SHEET

I. More Continuous Distributions
   A. Gamma and Related Distributions
      1. Support of the gamma pdf
      2. Definition and interesting properties of the gamma pdf
         a. Role of the shape and scale parameters
         b. Understanding the Gamma Function
      3. Using and recognizing the "kernel" of the gamma pdf
      4. Using the gamma mean and variance
      5. Recognizing a gamma mgf
      6. A special case: The Chi-square Distribution
         a. Which gamma parameters produce a Chi-square Distribution?
         b. Mean and Variance of a Chi-square Distribution
      7. An important special case: The Exponential Distribution
         a. Which gamma parameters produce an Exponential Distribution?
         b. Mean and Variance of an Exponential Distribution
         c. Finding exponential probabilities using the exponential cdf (easiest) or direct integration (still fairly easy)
         d. Memoryless property of the Exponential distribution
      8. Connection between a Poisson process and the exponential (and gamma) distributions
   B. Beta Distribution
      1. Support of the Beta pdf
      2. Beta pdf and how its parameter values affect the pdf's shape and skewness
      3. Using the beta mean and variance
      4. Finding beta probabilities using the formula involving Binomial probabilities

II. Other Distributional Topics
   A. Tchebysheff's Theorem
      1. Markov's Inequality
      2. Using Tchebysheff's inequality
      3. Probability statements based simply on a r.v.'s mean and standard deviation
   B. Piecewise Functions of a r.v.
      1. Finding Expected Values of a Piecewise Function g(Y)

*** Mixed Distributions will not be on the Summer 2021 Exam ***

III. Bivariate Distributions
   A. Understanding what a random vector is
      1. Joint pmf of two jointly discrete r.v.'s
      2. Joint cdf of two jointly discrete r.v.'s
      3. Joint pdf of two jointly continuous r.v.'s
   B. Finding Probabilities with Joint pdf's
      1. Integrating the joint pdf over a particular region
      2. Understanding Double Integrals and the correct limits of integration
      3. Sketching the region of support for two jointly continuous r.v.'s
      4. Determining (and sketching) the region of integration to find a certain probability
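The exponential items in I.A.7 lend themselves to a quick numerical check. The sketch below uses a hypothetical scale parameter beta = 2 (so the mean is beta and the variance is beta squared), finds a probability via the cdf, and verifies the memoryless property:

```python
import math

beta = 2.0  # hypothetical scale parameter: E[Y] = beta, V[Y] = beta^2

def exp_cdf(y, beta):
    """cdf of an exponential r.v.: F(y) = 1 - e^(-y/beta) for y >= 0."""
    return 1.0 - math.exp(-y / beta) if y >= 0 else 0.0

def exp_surv(y, beta):
    """P(Y > y), the survival function 1 - F(y)."""
    return 1.0 - exp_cdf(y, beta)

# P(1 <= Y <= 3) via the cdf -- the "easiest" route noted in I.A.7.c
p = exp_cdf(3.0, beta) - exp_cdf(1.0, beta)

# Memoryless property (I.A.7.d): P(Y > s + t | Y > s) = P(Y > t)
s, t = 1.0, 2.5
lhs = exp_surv(s + t, beta) / exp_surv(s, beta)
rhs = exp_surv(t, beta)
assert abs(lhs - rhs) < 1e-12
```

Direct integration of the pdf (1/beta)e^(-y/beta) over [1, 3] gives the same value, and setting the gamma shape parameter to 1 recovers this distribution as the special case described in I.A.7.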
   C. Marginal and Conditional Distributions
      1. Marginal distributions for jointly discrete r.v.'s
      2. Summing probabilities across rows (or columns) of a two-way table
      3. Marginal distributions for jointly continuous r.v.'s
      4. Integrating the joint pdf across values of the OTHER r.v.
      5. Conditional distributions for jointly discrete r.v.'s
      6. Conditional distributions for jointly continuous r.v.'s
      7. Conditional pdf depends on the (constant) value of the "given" r.v.
      8. Finding conditional probabilities using the basic definition of conditional probability
   D. Independent Random Variables
      1. Intuitive Notion of Independent r.v.'s
      2. Formal Definition of Independent r.v.'s (Discrete Case)
      3. Formal Definition of Independent r.v.'s (Continuous Case)
      4. Conditions for checking Independence or Dependence of two r.v.'s
      5. Definition of a random sample of n measurements

IV. More Results about Random Vectors
   A. Expected Value of a Function of a Random Vector
      1. Finding E[g(Y1, Y2)] using the joint pdf
      2. Helpful Theorems for finding Expected Values of Functions of Y1 and Y2
      3. Expected Value of the Product of INDEPENDENT r.v.'s
   B. Covariance of Two Random Variables
      1. What does Covariance measure?
      2. Formal Definition of Covariance
      3. Computational Formula for Finding Covariance
      4. Correlation Coefficient rho (and its advantages)
         a. Cauchy-Schwarz inequality
         b. Showing that -1 <= rho <= 1
      5. Finding and interpreting covariance and correlation, given a joint pdf
      6. Connection between Independence and "Zero Covariance"
         a. Showing Independence implies "Zero Covariance"
         b. Showing "Zero Covariance" does not imply Independence
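The marginal, covariance, and correlation items above (B.1 through B.6) can be illustrated end to end with a small made-up joint pmf; the numbers below are hypothetical, chosen only so the two r.v.'s are dependent:

```python
import math

# Hypothetical joint pmf for discrete (Y1, Y2), laid out as a two-way table:
# rows are y1 in {0, 1}, columns are y2 in {0, 1}; the entries sum to 1.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

# Marginals: sum the joint pmf over the OTHER variable
p1 = {y1: sum(pr for (a, b), pr in joint.items() if a == y1) for y1 in (0, 1)}
p2 = {y2: sum(pr for (a, b), pr in joint.items() if b == y2) for y2 in (0, 1)}

# Computational formula (B.3): Cov(Y1, Y2) = E[Y1*Y2] - E[Y1]*E[Y2]
e1 = sum(y1 * pr for y1, pr in p1.items())
e2 = sum(y2 * pr for y2, pr in p2.items())
e12 = sum(a * b * pr for (a, b), pr in joint.items())
cov = e12 - e1 * e2

# Correlation coefficient (B.4): rho = Cov / (sigma1 * sigma2), always in [-1, 1]
v1 = sum((y1 - e1) ** 2 * pr for y1, pr in p1.items())
v2 = sum((y2 - e2) ** 2 * pr for y2, pr in p2.items())
rho = cov / math.sqrt(v1 * v2)

# Independence check: the joint pmf must factor into the product of the
# marginals at EVERY support point; one failure is enough to show dependence.
independent = all(abs(joint[a, b] - p1[a] * p2[b]) < 1e-12 for (a, b) in joint)
```

For this table cov is nonzero (about -0.02) and independent is False, consistent with item B.6.a: independence forces zero covariance, so nonzero covariance rules out independence. The converse (B.6.b) fails in general.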
   C. Linear Functions of Random Variables
      1. Expected value of a Linear Combination of Random Variables
      2. Variance of a Linear Combination of Two Random Variables
         a. Variance of a Sum of Two r.v.'s
         b. Variance of a Difference of Two r.v.'s
      3. Covariance between Two Linear Combinations of Two Random Variables
   D. Conditional Expectations
      1. Basic Formulas for Conditional Expected Values
      2. Law of Iterated Expectation
      3. Law of Iterated Variance

V. Special Multivariate Distributions
   A. The Multinomial Probability Distribution
      1. What is a multinomial experiment?
      2. Generalization of the binomial distribution from 2 to k categories
      3. Using the Multinomial joint probability function to find probabilities
      4. Expected Value, Variance, and Covariance for the multinomial distribution
   B. The Bivariate Normal Distribution
      1. If (Y1, Y2) are bivariate normal, what does this imply about:
         a. Marginal distribution of Y1 (or of Y2)?
         b. Conditional distribution of Y1 | y2 (or of Y2 | y1)?
      2. Connection between Independence and "Zero Covariance" in the BIVARIATE NORMAL CASE
   C. Multivariate Hypergeometric Distribution (WILL NOT BE ON SUMMER 2021 EXAM)
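To make the multinomial items concrete, here is a minimal sketch of the multinomial joint probability function and its moments, using hypothetical values n = 10 trials and cell probabilities p = (0.5, 0.3, 0.2):

```python
import math

n = 10               # hypothetical number of trials
p = (0.5, 0.3, 0.2)  # hypothetical cell probabilities (must sum to 1)

def multinomial_pmf(counts, n, p):
    """P(Y1 = n1, ..., Yk = nk) = [n! / (n1! ... nk!)] * p1^n1 * ... * pk^nk."""
    assert sum(counts) == n  # the category counts must exhaust the n trials
    denom = 1
    for c in counts:
        denom *= math.factorial(c)
    prob = math.factorial(n) // denom  # multinomial coefficient (an integer)
    for c, pi in zip(counts, p):
        prob *= pi ** c
    return prob

prob = multinomial_pmf((5, 3, 2), n, p)

# Moments: E[Yi] = n*pi, V[Yi] = n*pi*(1 - pi), Cov(Yi, Yj) = -n*pi*pj (i != j)
mean_1 = n * p[0]
var_1 = n * p[0] * (1 - p[0])
cov_12 = -n * p[0] * p[1]

# With k = 2 categories the multinomial reduces to the familiar binomial
assert abs(multinomial_pmf((4, 6), n, (0.5, 0.5))
           - math.comb(10, 4) * 0.5**10) < 1e-12
```

Note the negative covariance between cell counts: with n fixed, more trials landing in one category leaves fewer for the others.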