STAT 511 -- POST-EXAM 3 REVIEW SHEET

** NOTE: The final exam will include material from throughout the course. **
Please study the material on the Test 1, Test 2, and Test 3 Review Sheets as well.

I. Review of Basics of Bivariate Distributions
   A. Understanding what a random vector is
   B. Joint pmf of two jointly discrete r.v.'s
   C. Joint cdf of two jointly discrete r.v.'s
   D. Joint pdf of two jointly continuous r.v.'s

II. Working with the Joint pdf of Two Jointly Continuous r.v.'s
   A. Finding Probabilities with Joint pdf's
      1. Integrating the joint pdf over a particular region
      2. Understanding double integrals and the correct limits of integration
      3. Sketching the region of support for two jointly continuous r.v.'s
      4. Determining (and sketching) the region of integration to find a certain probability
   B. Marginal and Conditional Distributions (formula sketch below)
      1. Marginal distributions for jointly discrete r.v.'s
      2. Summing probabilities across rows (or columns) of the two-way table
      3. Marginal distributions for jointly continuous r.v.'s
      4. Integrating the joint pdf across values of the OTHER r.v.
      5. Conditional distributions for jointly discrete r.v.'s
      6. Conditional distributions for jointly continuous r.v.'s
      7. Conditional pdf depends on the (constant) value of the "given" r.v.
   C. Independent Random Variables
      1. Intuitive Notion of Independent r.v.'s
      2. Formal Definition of Independent r.v.'s (Discrete Case)
      3. Formal Definition of Independent r.v.'s (Continuous Case)
      4. Conditions for checking Independence or Dependence of two r.v.'s
      5. Definition of a random sample of n measurements

III. More Results about Random Vectors
   A. Expected Value of a Function of a Random Vector
      1. Finding E[g(Y1, Y2)] using the joint pdf (formula sketch below)
      2. Helpful Theorems for finding Expected Values of Functions of Y1 and Y2
      3. Expected Value of the Product of INDEPENDENT r.v.'s
   B. Covariance of Two Random Variables
      1. What does Covariance measure?
      2. Formal Definition of Covariance
      3. Computational Formula for Finding Covariance (formula sketch below)
      4. Correlation Coefficient rho (and its advantages)
         a. Cauchy-Schwarz inequality
         b. Showing that -1 <= rho <= 1
      5. Finding and interpreting covariance and correlation, given a joint pdf
      6. Connection between Independence and "Zero Covariance"
         a. Showing Independence implies "Zero Covariance"
         b. Showing "Zero Covariance" does not imply Independence
   C. Linear Functions of Random Variables (formula sketch below)
      1. Expected Value of a Linear Combination of Random Variables
      2. Variance of a Linear Combination of Two Random Variables
         a. Variance of a Sum of Two r.v.'s
         b. Variance of a Difference of Two r.v.'s
      3. Covariance between Two Linear Combinations of Two Random Variables
   D. Conditional Expectations (formula sketch below)
      1. Basic Formulas for Conditional Expected Values
      2. Law of Iterated Expectation
      3. Law of Iterated Variance

IV. Special Multivariate Distributions
   A. The Multinomial Probability Distribution (formula sketch below)
      1. What is a multinomial experiment?
      2. Generalization of the binomial distribution from 2 to k categories
      3. Using the multinomial joint probability function to find probabilities
      4. Expected Value, Variance, and Covariance for the multinomial distribution
   B. The Bivariate Normal Distribution (formula sketch below)
      1. If (Y1, Y2) are bivariate normal, what does this imply about:
         a. the marginal distribution of Y1 (or of Y2)?
         b. the conditional distribution of Y1 | y2 (or of Y2 | y1)?
      2. Connection between Independence and "Zero Covariance" in the BIVARIATE NORMAL CASE
   C. The Multivariate Hypergeometric Distribution
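
Formula sketch for II.B (marginal and conditional distributions, continuous case). These are the standard definitions; the notation f, f_1, f_2 is generic here and may differ from the text's.
\[
f_1(y_1) = \int_{-\infty}^{\infty} f(y_1, y_2)\, dy_2,
\qquad
f_2(y_2) = \int_{-\infty}^{\infty} f(y_1, y_2)\, dy_1,
\]
\[
f(y_1 \mid y_2) = \frac{f(y_1, y_2)}{f_2(y_2)} \quad \text{for } f_2(y_2) > 0 .
\]
Recall that Y1 and Y2 are independent if and only if f(y_1, y_2) = f_1(y_1) f_2(y_2) for all (y_1, y_2).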
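
Formula sketch for III.A (expected value of a function of a random vector), stated for the continuous case; for the discrete case, replace the integrals with sums over the joint pmf.
\[
E[g(Y_1, Y_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(y_1, y_2)\, f(y_1, y_2)\, dy_1\, dy_2 ,
\]
and if Y1 and Y2 are independent,
\[
E[Y_1 Y_2] = E[Y_1]\, E[Y_2] .
\]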
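
Formula sketch for III.B (definition and computational formula for covariance, and the correlation coefficient), using the generic notation mu_i, sigma_i for the means and standard deviations of Y1 and Y2.
\[
\operatorname{Cov}(Y_1, Y_2) = E\big[(Y_1 - \mu_1)(Y_2 - \mu_2)\big] = E[Y_1 Y_2] - \mu_1 \mu_2 ,
\]
\[
\rho = \frac{\operatorname{Cov}(Y_1, Y_2)}{\sigma_1 \sigma_2}, \qquad -1 \le \rho \le 1 \ \text{(by the Cauchy-Schwarz inequality)} .
\]
Independence implies Cov(Y1, Y2) = 0, but zero covariance does not in general imply independence.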
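
Formula sketch for III.C (linear functions of random variables), written for constants a_1, a_2 and b_1, b_2 (generic notation, not taken from the course notes).
\[
E[a_1 Y_1 + a_2 Y_2] = a_1 E[Y_1] + a_2 E[Y_2] ,
\]
\[
\operatorname{Var}(a_1 Y_1 + a_2 Y_2) = a_1^2 \operatorname{Var}(Y_1) + a_2^2 \operatorname{Var}(Y_2) + 2 a_1 a_2 \operatorname{Cov}(Y_1, Y_2) ,
\]
so in particular Var(Y1 + Y2) = Var(Y1) + Var(Y2) + 2 Cov(Y1, Y2) and Var(Y1 - Y2) = Var(Y1) + Var(Y2) - 2 Cov(Y1, Y2). For two linear combinations,
\[
\operatorname{Cov}(a_1 Y_1 + a_2 Y_2,\; b_1 Y_1 + b_2 Y_2)
= a_1 b_1 \operatorname{Var}(Y_1) + a_2 b_2 \operatorname{Var}(Y_2) + (a_1 b_2 + a_2 b_1) \operatorname{Cov}(Y_1, Y_2) .
\]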
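
Formula sketch for III.D.2-3 (laws of iterated expectation and iterated variance), as usually stated:
\[
E[Y_1] = E\big[\, E[Y_1 \mid Y_2] \,\big] ,
\qquad
\operatorname{Var}(Y_1) = E\big[ \operatorname{Var}(Y_1 \mid Y_2) \big] + \operatorname{Var}\big( E[Y_1 \mid Y_2] \big) .
\]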
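
Formula sketch for IV.A (multinomial distribution) with n trials and k categories having probabilities p_1, ..., p_k that sum to 1 (generic notation):
\[
p(y_1, \ldots, y_k) = \frac{n!}{y_1! \, y_2! \cdots y_k!}\; p_1^{y_1} p_2^{y_2} \cdots p_k^{y_k},
\qquad y_1 + y_2 + \cdots + y_k = n ,
\]
\[
E[Y_i] = n p_i, \qquad \operatorname{Var}(Y_i) = n p_i (1 - p_i), \qquad \operatorname{Cov}(Y_i, Y_j) = -\, n p_i p_j \ \ (i \ne j) .
\]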
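
Formula sketch for IV.B (bivariate normal). With means mu_1, mu_2, standard deviations sigma_1, sigma_2, and correlation rho (generic notation), the standard results are that each marginal distribution is normal and
\[
Y_1 \mid Y_2 = y_2 \;\sim\; N\!\left( \mu_1 + \rho \frac{\sigma_1}{\sigma_2}(y_2 - \mu_2),\;\; \sigma_1^2 (1 - \rho^2) \right) .
\]
In the bivariate normal case, zero covariance (rho = 0) does imply independence, unlike the general case.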