Assumption 1 requires that the dependent variable $y$ is a linear combination of the explanatory variables $X$ and the error terms $\epsilon$. Assumption 1 requires the specified model to be linear in parameters, but it does not require the model to be linear in variables. Equations 1 and 2 depict a model which is linear in both parameters and variables. Note that Equations 1 and 2 show the same model in different notation.
(1) $y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \dots + \beta_K x_{iK} + \epsilon_i$

(2) $y = X\beta + \epsilon$
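As a concrete illustration, the following is a minimal sketch (Python with NumPy, using simulated data and made-up coefficient values) of the OLS estimator in the matrix notation of Equation 2, $\hat{\beta} = (X'X)^{-1}X'y$:

```python
import numpy as np

# Simulate data from the linear model of Equations 1 and 2: y = X*beta + eps
rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])     # intercept plus two regressors
beta_true = np.array([1.0, 2.0, -0.5])        # assumed (made-up) coefficients
y = X @ beta_true + rng.normal(size=n)

# OLS estimator in matrix form: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)                               # should be close to [1.0, 2.0, -0.5]
```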
In order for OLS to work, the specified model has to be linear in parameters. Note that if the true relationship between $x$ and $y$ is non-linear in the parameters, it is not possible to estimate the coefficient $\beta_1$ in any meaningful way. Equation 3 shows an empirical model in which the parameter $\beta_1$ enters quadratically.
(3) $y_i = \beta_0 + \beta_1^2 x_i + \epsilon_i$
Assumption 1 of the CLRM requires the model to be linear in parameters, so OLS is not able to estimate Equation 3 in any meaningful way. However, Assumption 1 does not require the model to be linear in variables, and OLS will produce a meaningful estimate of $\beta_1$ in Equation 4.
(4) $y_i = \beta_0 + \beta_1 x_i^2 + \epsilon_i$
Using the method of ordinary least squares (OLS) thus allows us to estimate models which are linear in parameters, even if they are non-linear in variables. Conversely, it is not possible to estimate models which are non-linear in parameters, even if they are linear in variables.
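To see that OLS handles a model like Equation 4 without any modification, the following sketch (again Python/NumPy, with simulated data and assumed coefficient values) simply treats $x^2$ as an ordinary regressor:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)

# Equation 4: quadratic in the variable x, but linear in beta0 and beta1
beta0, beta1 = 1.0, 0.8                       # assumed coefficients
y = beta0 + beta1 * x**2 + rng.normal(size=n)

# Treat x^2 as an ordinary regressor; the model stays linear in the parameters
Z = np.column_stack([np.ones(n), x**2])
beta_hat = np.linalg.solve(Z.T @ Z, Z.T @ y)
print(beta_hat)                               # should be close to [1.0, 0.8]
```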
Finally, every model estimated with OLS should contain all relevant explanatory variables, and all included explanatory variables should be relevant. Not including all relevant variables gives rise to omitted variable bias, a very serious issue in regression analysis.
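A small simulation can make omitted variable bias concrete. The sketch below uses a hypothetical data-generating process with assumed coefficients and compares the estimated coefficient on $x_1$ with and without the relevant regressor $x_2$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)            # x2 is correlated with x1
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)

# Correctly specified model: both relevant regressors included
X_full = np.column_stack([np.ones(n), x1, x2])
print(np.linalg.solve(X_full.T @ X_full, X_full.T @ y))    # roughly [1.0, 2.0, 1.5]

# Omitting the relevant regressor x2: the coefficient on x1 picks up part of
# x2's effect and is biased upwards (roughly 2.0 + 1.5 * 0.6 = 2.9)
X_short = np.column_stack([np.ones(n), x1])
print(np.linalg.solve(X_short.T @ X_short, X_short.T @ y))
```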
I have a question with regard to this. If, for instance, the model is $\ln(y) = a + b x^2 + e$, do we still say that it satisfies the Gauss-Markov assumptions (GMA)?
Generally, the assumptions of the Classical Linear Regression Model require the specified model to be linear in parameters, but they do not require the model to be linear in variables. Since $\ln(y) = a + b x^2 + e$ is linear in the parameters $a$ and $b$, it satisfies Assumption 1. Does this answer your question?
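For what it is worth, a quick simulation (hypothetical coefficients and data) shows that OLS recovers $a$ and $b$ in the model $\ln(y) = a + b x^2 + e$ once $\ln(y)$ and $x^2$ are treated as the dependent variable and the regressor:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
x = rng.normal(size=n)

# ln(y) = a + b*x^2 + e : non-linear in the variables, linear in a and b
a, b = 0.5, 0.3                               # assumed coefficients
log_y = a + b * x**2 + rng.normal(scale=0.1, size=n)
y = np.exp(log_y)

# OLS on ln(y) and x^2 recovers a and b
Z = np.column_stack([np.ones(n), x**2])
print(np.linalg.solve(Z.T @ Z, Z.T @ np.log(y)))   # close to [0.5, 0.3]
```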