# Omitted Variable Bias: An Example

This post is part of the series on omitted variable bias and provides a simulation exercise that illustrates how omitting a relevant variable from your regression model biases the estimated coefficients. The R code is provided at the end.
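
As a preview of what that code does, here is a minimal sketch of such a simulation in R; the variable names and coefficient values below are illustrative, not necessarily those used in the post:

```r
# Minimal omitted variable bias simulation sketch; all numbers are illustrative.
set.seed(123)
n  <- 1000
x2 <- rnorm(n)                    # the variable we will omit
x1 <- 0.7 * x2 + rnorm(n)         # x1 is correlated with x2
y  <- 1 + 2 * x1 + 3 * x2 + rnorm(n)

coef(lm(y ~ x1 + x2))  # full model: coefficient on x1 is close to its true value 2
coef(lm(y ~ x1))       # x2 omitted: coefficient on x1 is biased upward
```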

# The Problem of Multicollinearity

Multicollinearity (or collinearity) refers to a situation in which two or more explanatory variables of a regression model are highly correlated. Because of the high correlation, it is difficult to disentangle the pure effect of a single explanatory variable $x$ on the dependent variable $y$. From a mathematical point of view, multicollinearity only becomes a real problem when we face perfect multicollinearity, that is, when one explanatory variable is an exact linear combination of other variables in the model (identical variables being the simplest case).
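
A short sketch in R (with made-up data) shows the perfect case: when one regressor is an exact linear combination of another, `lm()` cannot separate their effects and reports `NA` for the redundant coefficient:

```r
# Illustrative data in which x2 is an exact linear function of x1.
set.seed(1)
x1 <- rnorm(100)
x2 <- 2 * x1                       # perfect multicollinearity
y  <- 1 + x1 + rnorm(100)

coef(lm(y ~ x1 + x2))              # the coefficient on x2 is NA: it is not identified
```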

# Linear Regression in STATA

In STATA one can estimate a linear regression using the command `regress`. In this post I will show how to use `regress` to run OLS on the following model:

$y = \alpha + \beta_{1} x_{1} + \varepsilon$
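
For example, assuming the dataset in memory contains variables named y and x1 (the names are illustrative), the call is simply:

```stata
* Assumes variables y and x1 exist in the dataset in memory; names are illustrative.
regress y x1
```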

# Omitted Variable Bias

The omitted variable bias is a common and serious problem in regression analysis. The problem arises whenever a relevant explanatory variable is not included in the regression. In that case, one violates the third assumption of the classical linear regression model. The following blog posts explain the omitted variable bias and demonstrate its consequences.
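
To see where the bias comes from, suppose the true model is (the notation here is illustrative)

$y = \alpha + \beta_{1} x_{1} + \beta_{2} x_{2} + \varepsilon$

but $x_{2}$ is omitted and $y$ is regressed on $x_{1}$ alone. The standard result is

$E[\hat{\beta}_{1}] = \beta_{1} + \beta_{2} \delta_{1}$

where $\delta_{1}$ is the slope from a regression of $x_{2}$ on $x_{1}$. The estimate is thus unbiased only if $\beta_{2} = 0$ (the omitted variable is irrelevant) or $x_{1}$ and $x_{2}$ are uncorrelated.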

# Multiple Regression in Julia

Julia offers several ways to estimate a multiple regression. One easy way is to use the `lm()` function of the GLM package. In this post I will show how to use `lm()` to run OLS on the following model:

$y = \alpha + \beta_{1} x_{1} + \beta_{2} x_{2} + \beta_{3} x_{3} + \varepsilon$
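
A minimal sketch, assuming simulated data in a DataFrame (column names and coefficient values are illustrative):

```julia
# Minimal sketch with simulated data; all names and values are illustrative.
using DataFrames, GLM

n  = 100
df = DataFrame(x1 = randn(n), x2 = randn(n), x3 = randn(n))
df.y = 1.0 .+ 0.5 .* df.x1 .- 0.3 .* df.x2 .+ 0.8 .* df.x3 .+ randn(n)

# lm() from the GLM package fits the model by ordinary least squares.
model = lm(@formula(y ~ x1 + x2 + x3), df)
coeftable(model)
```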

# Linear Regression

A linear regression is a special case of the classical linear regression model that describes the relationship between two variables by fitting a linear equation to observed data. One variable is treated as the explanatory (or independent) variable and the other as the dependent variable. For instance, an econometrician might use a linear regression model to relate people's weight to their calorie consumption.
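
Sticking with that example, here is a minimal sketch in R (the data are simulated, since the post names no dataset):

```r
# Simulated example: body weight regressed on daily calorie intake (made-up numbers).
set.seed(42)
calories <- rnorm(50, mean = 2500, sd = 300)
weight   <- 50 + 0.01 * calories + rnorm(50, sd = 3)

fit <- lm(weight ~ calories)   # weight is the dependent, calories the explanatory variable
summary(fit)
```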