STAT 513 -- EXAM 2 REVIEW SHEET

I. Neyman-Pearson Testing Approach
   A. Simple and Composite Hypotheses
      1. What is a simple hypothesis?
      2. What is a composite hypothesis?
      3. Examples of each type
   B. The Neyman-Pearson Lemma
      1. Guarantee of a "Most Powerful" (MP) test
      2. When is it applicable?
      3. Finding the rejection region (RR) of the N-P test
      4. Simplifying the RR
      5. Determining the appropriate constant for an alpha-level test
   C. Relationship between Most Powerful Tests and Sufficient Statistics
   D. Uniformly Most Powerful (UMP) Tests
      1. For what types of hypotheses are UMP tests possible?
      2. Examples where no UMP test exists

II. Likelihood Ratio (LR) Test
   A. Parameter Space Omega
      1. Nuisance parameters
      2. Restricted parameter space Omega_0
   B. LR Test Statistic
      1. Maximized restricted likelihood and maximized unrestricted likelihood
      2. Form of the RR of an LR test
   C. Specific Examples of LR Tests
   D. Large-Sample Approximate LR Test
      1. Distribution of -2*ln(lambda)
      2. How to find the appropriate degrees of freedom

III. Theory of Simple Linear Regression (SLR)
   A. Probabilistic Model
      1. Deterministic component and random component
      2. Dependent (response) variable Y, independent variable X
      3. Examples of models linear in the parameters
   B. Simple Linear Regression Model
      1. Statement of the model equation
      2. Intercept beta_0, slope beta_1, random error epsilon
      3. Which parts of the model are constant and which are random?
   C. Estimating the Model with Least Squares
      1. Determining whether a straight-line model is appropriate
      2. Scatterplot
      3. Observed Y-values and fitted values
      4. Least-squares philosophy (minimizing SSE)
      5. The least-squares estimates for beta_0 and beta_1
      6. Interpreting the estimated slope and estimated intercept
      7. Using the least-squares line to predict Y values for a given X
   D. Model Assumptions
      1. Mean of the random error component = 0
      2. Variance of the random error component constant for all values of X
      3. Normal distribution of the random error component
      4. Values of the random error component for any two observations are independent
   E. Properties of Least-Squares Estimators
      1. Unbiasedness
      2. Variances and covariances of the estimators
      3. The Gauss-Markov Theorem and BLUEs
   F. Estimating the Error Variance sigma^2
      1. Definition and calculation of MSE
      2. Unbiasedness of MSE
   G. Distributions of Estimators under the Normality Assumption
      1. Distributions of beta_0-hat and beta_1-hat
      2. Distribution of (n-2)*MSE/sigma^2
   H. Inferences about the Regression Parameters
      1. Tests and CIs about a0*beta_0 + a1*beta_1
      2. Tests about the slope beta_1
      3. How does testing whether the slope is 0 test whether X is related to E(Y)?
      4. Test statistic and rejection region for the test of H_0: beta_1 = 0
      5. Confidence interval for the true slope beta_1
      6. CI for E(Y) at a particular X value
      7. Interpreting the CIs
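Worked example for topic I.B (finding the constant for an alpha-level N-P test and its power). The model and all numbers are hypothetical, chosen only to illustrate the calculation: a Normal(mu, sigma^2) sample with sigma known, testing simple H_0: mu = mu0 against simple H_1: mu = mu1 with mu1 > mu0, where the N-P rejection region simplifies to xbar >= k.

```python
import math
from statistics import NormalDist

# Hypothetical numbers (not from the review sheet): N-P test of
# H0: mu = mu0 vs H1: mu = mu1 (mu1 > mu0) for a Normal(mu, sigma^2)
# sample with sigma known; the simplified rejection region is xbar >= k.
mu0, mu1, sigma, n, alpha = 0.0, 1.0, 2.0, 25, 0.05
Z = NormalDist()  # standard normal

# Constant k giving an alpha-level test: P(Xbar >= k | mu = mu0) = alpha,
# so k = mu0 + z_{1-alpha} * sigma / sqrt(n).
k = mu0 + Z.inv_cdf(1 - alpha) * sigma / math.sqrt(n)

# Power of the MP test at the alternative: P(Xbar >= k | mu = mu1).
power = 1 - Z.cdf((k - mu1) / (sigma / math.sqrt(n)))
```

By the Neyman-Pearson Lemma, no other alpha-level test of these two simple hypotheses has higher power at mu1 than this one.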
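Worked example for topic II.D (the large-sample approximate LR test). The binomial setting and data below are hypothetical, used only to show the -2*ln(lambda) computation: testing H_0: p = p0 for Binomial(n, p), where the difference in free parameters between Omega and Omega_0 gives df = 1.

```python
import math
from statistics import NormalDist

# Hypothetical data (not from the review sheet): large-sample LR test of
# H0: p = p0 for y successes in n Bernoulli trials.  Under H0,
# -2 ln(lambda) is approximately chi-square with
# df = (free parameters in Omega) - (free parameters in Omega_0) = 1 - 0 = 1.
n, y, p0 = 100, 62, 0.5
phat = y / n  # unrestricted MLE of p (the restricted "MLE" is just p0)

# -2 ln(lambda) = 2 [ ln L(phat) - ln L(p0) ] for the binomial likelihood
stat = 2 * (y * math.log(phat / p0)
            + (n - y) * math.log((1 - phat) / (1 - p0)))

# chi-square(1) upper-tail probability via the standard normal, using
# the identity P(chi2_1 > x) = P(|Z| > sqrt(x)) = 2 * (1 - Phi(sqrt(x)))
p_value = 2 * (1 - NormalDist().cdf(math.sqrt(stat)))
```

The RR of the approximate alpha-level test is -2*ln(lambda) >= chi-square(1) upper-alpha critical value; equivalently, reject when the p-value above is at most alpha.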
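Worked example for topics III.C and III.F (the least-squares estimates and MSE). The five (x, y) pairs are made up purely for illustration; the formulas are the standard ones the outline refers to: beta1-hat = Sxy/Sxx, beta0-hat = ybar - beta1-hat * xbar, and MSE = SSE/(n - 2) as the unbiased estimate of sigma^2.

```python
import math
from statistics import fmean

# Hypothetical data (not from the review sheet) for the SLR model
# Y = beta0 + beta1 * X + epsilon.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)
xbar, ybar = fmean(x), fmean(y)

# Sums of squares / cross-products around the means
Sxx = sum((xi - xbar) ** 2 for xi in x)
Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

b1 = Sxy / Sxx          # least-squares slope estimate, beta1-hat
b0 = ybar - b1 * xbar   # least-squares intercept estimate, beta0-hat

# Fitted values, SSE (the quantity least squares minimizes), and MSE
fitted = [b0 + b1 * xi for xi in x]
sse = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))
mse = sse / (n - 2)               # unbiased estimate of sigma^2
se_b1 = math.sqrt(mse / Sxx)      # estimated standard error of the slope
```

The statistic (b1 - 0)/se_b1 is the t statistic (with n - 2 df) for the test of H_0: beta_1 = 0 in topic III.H, and b1 plus or minus a t critical value times se_b1 gives the CI for the true slope.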