Inference on Linear Models

Let $(Y, X, U)$ be a random vector where $Y$ and $U$ take values in $\mathbf{R}$ and $X \in \mathbf{R}^{k+1}$.

As before, we assume that

  1. $E[XU] = 0$

  2. $E\left[X X^{\prime}\right] < \infty$

  3. No perfect collinearity in $X$

  4. $\operatorname{Var}[XU] < \infty$

Under these assumptions, we establish the asymptotic normality of the OLS estimator $\hat{\beta}$,

$$\sqrt{n}(\hat{\beta}-\beta) \stackrel{d}{\rightarrow} N(0, \mathbb{V})$$

with

$$\mathbb{V} = \left(E\left[X X^{\prime}\right]\right)^{-1} E\left[X X^{\prime} U^2\right] \left(E\left[X X^{\prime}\right]\right)^{-1}$$

We also described a consistent estimator $\hat{\mathbb{V}}_n$ of the limiting variance $\mathbb{V}$. We now develop methods for inference under the additional assumption that $E\left[X X^{\prime} U^2\right]$ is non-singular.
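As a rough illustration (not from the notes themselves), the sketch below computes $\hat{\beta}$ and the plug-in sandwich estimator $\hat{\mathbb{V}}_n$ on simulated heteroskedastic data: each expectation in $\mathbb{V}$ is replaced by its sample analogue, with residuals $\hat{U}_i$ in place of $U_i$. The data-generating process and all variable names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 1000, 2

# X takes values in R^{k+1}: an intercept plus k regressors (hypothetical DGP)
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta = np.array([1.0, 2.0, -0.5])

# Heteroskedastic errors, so the sandwich form of V matters
U = rng.normal(size=n) * (1.0 + 0.5 * np.abs(X[:, 1]))
Y = X @ beta + U

# OLS estimator beta_hat
XtX = X.T @ X
beta_hat = np.linalg.solve(XtX, X.T @ Y)

# Plug-in sandwich estimator of V:
#   (E[XX'])^{-1}  E[XX' U^2]  (E[XX'])^{-1}, each moment estimated by a sample mean
u_hat = Y - X @ beta_hat
A_inv = np.linalg.inv(XtX / n)              # estimates (E[XX'])^{-1}
B = (X * u_hat[:, None] ** 2).T @ X / n     # estimates E[XX' U^2]
V_hat = A_inv @ B @ A_inv

# Standard errors from the asymptotic distribution: sqrt(diag(V_hat) / n)
se = np.sqrt(np.diag(V_hat) / n)
```

With $\hat{\mathbb{V}}_n$ in hand, an approximate 95% confidence interval for each coordinate is `beta_hat[j] ± 1.96 * se[j]`.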
