Simple Linear Regression Model: Yᵢ = β₀ + β₁Xᵢ + uᵢ (the population regression equation)
Estimated Regression Equation: Ŷᵢ = β̂₀ + β̂₁Xᵢ (fitted values for the dependent variable)
Slope Estimate: β̂₁ = Σ(Xᵢ − X̄)(Yᵢ − Ȳ) / Σ(Xᵢ − X̄)² (estimate of the slope of the regression line)
Intercept Estimate: β̂₀ = Ȳ − β̂₁X̄ (estimate of the intercept of the regression line)
Residual (Error): ûᵢ = Yᵢ − Ŷᵢ (difference between actual and fitted values)
Total Sum of Squares: SST = Σ(Yᵢ − Ȳ)² (total variability in the dependent variable)
Explained Sum of Squares: SSR = Σ(Ŷᵢ − Ȳ)² (variability explained by the regression)
Residual Sum of Squares: SSE = Σ(Yᵢ − Ŷᵢ)² (unexplained variability)
Decomposition of Variation: SST = SSR + SSE (breakdown of total variation into explained and residual parts)
Coefficient of Determination: R² = SSR/SST = 1 − SSE/SST (proportion of variance explained by the regression)
Error Variance (Residual Variance): s² = SSE / (n − 2) (variance of the residuals; used for hypothesis testing and confidence intervals)
Standard Error of β̂₁: SE(β̂₁) = s / √Σ(Xᵢ − X̄)² (measures the precision of the estimated slope)
Standard Error of β̂₀: SE(β̂₀) = s √[1/n + X̄² / Σ(Xᵢ − X̄)²] (measures the precision of the estimated intercept)
t-statistic for β̂₁: t = β̂₁ / SE(β̂₁) (used for hypothesis testing on β₁, typically H₀: β₁ = 0)
Confidence Interval for β₁: β̂₁ ± t(α/2, n−2) × SE(β̂₁) (range within which the true β₁ lies at the stated confidence level)
Gauss-Markov Theorem: OLS estimators are BLUE (Best Linear Unbiased Estimators under the classical assumptions)
F-statistic for Overall Fit: F = [R² / k] / [(1 − R²) / (n − k − 1)] (tests the overall significance of the regression; k = number of regressors)
Properties of OLS Estimators: E(β̂₀) = β₀ and E(β̂₁) = β₁ (OLS estimators are unbiased)
Error Term Assumptions: E(uᵢ) = 0, Var(uᵢ) = σ², Cov(uᵢ, uⱼ) = 0 for i ≠ j (classical linear regression assumptions)
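The formulas above can be checked end to end with a short hand-rolled OLS computation. The sketch below uses only the Python standard library and a small made-up dataset (the X and Y values are illustrative assumptions, not from the source); it computes the slope, intercept, sums of squares, R², standard errors, t-statistic, and F-statistic exactly as defined in the sheet.

```python
# Hand-rolled simple linear regression on toy data, following the
# formulas in the sheet. The data values are assumed for illustration.
import math

X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(X)

x_bar = sum(X) / n
y_bar = sum(Y) / n

# Slope: beta1_hat = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)
Sxx = sum((x - x_bar) ** 2 for x in X)
Sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y))
b1 = Sxy / Sxx
# Intercept: beta0_hat = Ybar - beta1_hat * Xbar
b0 = y_bar - b1 * x_bar

# Fitted values and residuals
Y_hat = [b0 + b1 * x for x in X]
resid = [y - yh for y, yh in zip(Y, Y_hat)]

# Sums of squares and the decomposition SST = SSR + SSE
SST = sum((y - y_bar) ** 2 for y in Y)
SSR = sum((yh - y_bar) ** 2 for yh in Y_hat)
SSE = sum(e ** 2 for e in resid)

R2 = SSR / SST                       # coefficient of determination
s2 = SSE / (n - 2)                   # residual variance
se_b1 = math.sqrt(s2 / Sxx)          # SE of the slope
se_b0 = math.sqrt(s2 * (1 / n + x_bar ** 2 / Sxx))  # SE of the intercept
t_b1 = b1 / se_b1                    # t-statistic for H0: beta1 = 0

# Overall F with k = 1 regressor; in simple regression F equals t^2.
k = 1
F = (R2 / k) / ((1 - R2) / (n - k - 1))

# For the confidence interval beta1_hat +/- t(alpha/2, n-2) * SE(beta1_hat),
# the critical value t(alpha/2, n-2) comes from a t table or a stats
# library (e.g. scipy.stats.t.ppf), so it is not computed here.
print(f"b1={b1:.3f}, b0={b0:.3f}, R2={R2:.4f}")
```

Running this recovers the variance decomposition numerically (SST agrees with SSR + SSE up to floating-point error) and confirms the simple-regression identity F = t².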