Linear Regression

Introduction to Linear Regression

Let $(Y, X, U)$ be a random vector where $Y$ and $U$ take values in $\mathbf{R}$ and $X$ takes values in $\mathbf{R}^{k+1}$. Assume further that the first component of $X$ is a constant equal to one, i.e., $X=\left(X_0, X_1, \ldots, X_k\right)^{\prime}$ with $X_0=1$. Let $\beta=\left(\beta_0, \beta_1, \ldots, \beta_k\right)^{\prime} \in \mathbf{R}^{k+1}$ be such that:

$$Y = X^{\prime} \beta + U$$
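
Since $X_0 = 1$, writing out the inner product componentwise gives the familiar form:

$$Y = \beta_0 + \beta_1 X_1 + \cdots + \beta_k X_k + U$$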

Stacking $n$ observations gives the sample (matrix) form, in which:

  • $X_1, \ldots, X_k$ are column vectors, each storing the $n$ observations of one regressor

  • Shapes of $Y$, $X^{\prime}$, $\beta$, and $U$ (verified in the sketch after this list):

    • $Y$: $n \times 1$

    • $X^{\prime}$: $n \times (k+1)$

    • $\beta$: $(k+1) \times 1$

    • $U$: $n \times 1$

  • $\beta_0$ is the intercept parameter and the remaining $\beta_j$ are slope parameters.
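
To make the shapes concrete, here is a minimal NumPy sketch that builds the stacked model and checks the dimensions above; the sample size $n$, the number of regressors $k$, and the true $\beta$ are illustrative assumptions, not values from the text.

```python
import numpy as np

# Illustrative sizes and coefficients (assumptions, not from the text)
rng = np.random.default_rng(0)
n, k = 100, 2

# Design matrix: first column is the constant X_0 = 1,
# remaining columns X_1, ..., X_k hold the regressor data
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])

beta = np.array([1.0, 2.0, -0.5])   # (beta_0, beta_1, beta_2)'
U = rng.normal(size=n)              # error term

Y = X @ beta + U                    # stacked form of Y = X'beta + U

print(X.shape)     # (100, 3)  -> n x (k+1)
print(beta.shape)  # (3,)      -> (k+1) x 1, stored as a flat vector
print(Y.shape)     # (100,)    -> n x 1, stored as a flat vector
```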

This gives the basic structure of the linear regression model.
