# Linear Regression

## Introduction to Linear Regression

Let $$(Y, X, U)$$ be a random vector where $$Y$$ and $$U$$ take values in $$\mathbf{R}$$ and $$X$$ takes values in $$\mathbf{R}^{k+1}$$. Assume further that the first component of $$X$$ is a constant equal to one, i.e., $$X=\left(X\_0, X\_1, \ldots, X\_k\right)^{\prime}$$ with $$X\_0=1$$. Let $$\beta=\left(\beta\_0, \beta\_1, \ldots, \beta\_k\right)^{\prime} \in \mathbf{R}^{k+1}$$ be such that:

$$
Y=X^{\prime} \beta+U
$$

Given a sample of $$n$$ observations, the model can be stacked in matrix form. In that case:

* Each $$X\_j$$, for $$j = 1, \ldots, k$$, is a column vector storing the $$n$$ observations of regressor $$j$$.
* The shapes of $$Y$$, $$X$$, $$\beta$$ and $$U$$ are:
  * $$Y$$: $$n \times 1$$
  * $$X$$: $$n \times (k+1)$$ (the design matrix, whose $$i$$-th row is the transposed regressor vector of observation $$i$$)
  * $$\beta$$: $$(k+1) \times 1$$
  * $$U$$: $$n \times 1$$
* $$\beta\_0$$ is an intercept parameter and the remaining $$\beta\_j$$ are slope parameters.

This is the basic structure of the linear regression model.
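The stacked form above can be checked numerically. Below is a minimal sketch using NumPy and simulated data (the sample size `n`, the number of regressors `k`, and the coefficient values are illustrative assumptions, not from the text):

```python
import numpy as np

# Hypothetical simulated sample: n observations, k non-constant regressors.
rng = np.random.default_rng(0)
n, k = 100, 3

# Design matrix X: first column is the constant X_0 = 1,
# remaining columns X_1, ..., X_k hold the regressor data.
X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, k))])

beta = np.array([[1.0], [0.5], [-2.0], [0.3]])  # (k+1) x 1: intercept + slopes
U = rng.normal(size=(n, 1))                     # n x 1 error term

# Stacked model: Y = X beta + U
Y = X @ beta + U

print(X.shape, beta.shape, U.shape, Y.shape)
# → (100, 4) (4, 1) (100, 1) (100, 1)
```

The shapes printed at the end match the list above: multiplying an $$n \times (k+1)$$ matrix by a $$(k+1) \times 1$$ vector yields the $$n \times 1$$ vector $$Y$$.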
