# CLRM – Assumption 1: Linearity in Parameters and Correct Model Specification

Assumption 1 requires that the dependent variable $\textbf{y}$ is a linear combination of the explanatory variables $\textbf{X}$ and the error term $\boldsymbol\epsilon$. The specified model must be linear in its parameters, but it does not have to be linear in its variables. Equations 1 and 2 depict a model that is linear in both parameters and variables; they show the same model in different notation.

(1) $\textbf{y} = \textbf{X}\boldsymbol\beta + \boldsymbol\epsilon$

(2) $y_{i} = \beta_{0} + \beta_{1}x_{i1} + \beta_{2}x_{i2} + ... + \beta_{K}x_{iK} + \epsilon_{i}$
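The matrix form in Equation 1 translates directly into code. The following is a minimal sketch (all coefficient values and sample sizes are made up for illustration) that simulates data from Equation 2 with two regressors and recovers the coefficients with the OLS estimator $\hat{\boldsymbol\beta} = (\textbf{X}'\textbf{X})^{-1}\textbf{X}'\textbf{y}$:

```python
import numpy as np

# Hypothetical simulation: generate data from Equation 2 with known
# coefficients, then recover them with OLS.
rng = np.random.default_rng(0)
n = 1_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
eps = rng.normal(scale=0.5, size=n)

beta = np.array([1.0, 2.0, -3.0])          # beta_0, beta_1, beta_2 (assumed)
X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept column
y = X @ beta + eps                         # y = X beta + eps (Equation 1)

# OLS estimate: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to [1.0, 2.0, -3.0]
```

With a well-behaved error term, the estimates land close to the true coefficient vector.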

For OLS to work, the specified model has to be linear in parameters. If a parameter enters the model non-linearly, OLS cannot estimate it in any meaningful way. Equation 3 shows an empirical model in which the parameter $\beta_{1}$ enters quadratically.

(3) $y_{i} = \beta_{0} + (\beta_{1})^{2}x_{i1} + \beta_{2}x_{i2} + ... + \beta_{K}x_{iK} + \epsilon_{i}$

Because Assumption 1 requires linearity in parameters, OLS cannot estimate Equation 3 in any meaningful way. However, Assumption 1 does not require linearity in variables: OLS produces a meaningful estimate of $\beta_{1}$ in Equation 4.

(4) $y_{i} = \beta_{0} + \beta_{1}(x_{i1})^{2} + \beta_{2}x_{i2} + ... + \beta_{K}x_{iK} + \epsilon_{i}$

The method of ordinary least squares (OLS) can therefore estimate models that are linear in parameters even if they are non-linear in variables. Conversely, it cannot estimate models that are non-linear in parameters, even if they are linear in variables.
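The point about Equation 4 can be sketched in code (coefficient values are made up for illustration): the model is non-linear in the variable $x_{1}$, but because it is linear in the parameters, OLS recovers $\beta_{1}$ once the squared variable is placed in the design matrix.

```python
import numpy as np

# Hypothetical sketch of Equation 4: non-linear in the variable x1,
# linear in the parameters, so OLS still works on the transformed data.
rng = np.random.default_rng(1)
n = 2_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
eps = rng.normal(scale=0.5, size=n)

y = 1.0 + 2.0 * x1**2 - 3.0 * x2 + eps  # true model with beta_1 = 2.0

# Put the transformed variable x1^2 into the design matrix.
X = np.column_stack([np.ones(n), x1**2, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # approximately [1.0, 2.0, -3.0]
```

The transformation happens in the data, not in the estimator: from the perspective of OLS, $x_{1}^{2}$ is just another regressor.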

Finally, every model estimated with OLS should contain all relevant explanatory variables, and all included explanatory variables should be relevant. Omitting relevant variables gives rise to omitted variable bias, a serious issue in regression analysis.