# How to compute the Lorenz Curve

In contrast to our previous post, which summarized the Lorenz curve in general terms, this post details how to construct the Lorenz curve and provides a hypothetical example in R.

# The Lorenz Curve

The Lorenz curve is a graphical representation of the income or wealth distribution of an economy or country, introduced by the American economist Max O. Lorenz in 1905. It shows the proportion of total income earned, or wealth held, by any given percentage of the population. If everyone has approximately the same wealth, the society is very equal; if a few own the majority of the wealth, inequality is high. The following figure depicts the Lorenz curve for three economies with varying degrees of inequality.
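The construction behind the curve can be sketched in a few lines: sort incomes from poorest to richest, then plot cumulative population shares against cumulative income shares. The full post uses R, but the computation is language-agnostic; here is a minimal Python sketch (the function name `lorenz_points` is illustrative):

```python
def lorenz_points(incomes):
    """Return cumulative population and income shares for a Lorenz curve."""
    ordered = sorted(incomes)          # poorest to richest
    total = sum(ordered)
    n = len(ordered)
    pop_share, income_share = [0.0], [0.0]
    running = 0.0
    for i, x in enumerate(ordered, start=1):
        running += x
        pop_share.append(i / n)        # share of the population considered so far
        income_share.append(running / total)  # share of total income they hold
    return pop_share, income_share

# Perfect equality: every point lies on the 45-degree line.
pop, inc = lorenz_points([10, 10, 10, 10])
# A more unequal economy bends the curve below the diagonal.
pop2, inc2 = lorenz_points([1, 2, 3, 44])
```

Plotting `income_share` against `pop_share` gives the curve; the gap between it and the 45-degree line is what the Gini coefficient summarizes.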

# Cluster Robust Standard Errors in Stargazer

In a previous post, we discussed how to obtain clustered standard errors in R. While that post described how to calculate cluster robust standard errors, this post shows how to include them in stargazer and produce nice regression tables with clustered standard errors.

# Robust Standard Errors in Stargazer

In a previous post, we discussed how to obtain robust standard errors in R. While that post described how to calculate robust standard errors, this post shows how to include them in stargazer and produce nice regression tables with robust standard errors.

# Omitted Variable Bias

Omitted variable bias is a common and serious problem in regression analysis. Generally, the problem arises when a relevant variable is left out of the regression. In this case, one violates the third assumption of the classical linear regression model. The following series of blog posts explains the omitted variable bias and discusses its consequences.

# Multiple Regression in Julia

Julia offers various ways to carry out a multiple regression. One easy way is to use the lm() function of the GLM package. In this post I will show how to use lm() to run OLS on the following model: $y = \alpha + \beta_{1} x_{1} + \beta_{2} x_{2} + \beta_{3} x_{3}$.
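For context, the OLS fit that lm() performs has a closed-form solution. Writing the model in matrix form with a design matrix $X$ (a column of ones plus the regressors $x_{1}, x_{2}, x_{3}$), the estimator is

$$\hat{\beta} = (X^{\top}X)^{-1} X^{\top} y,$$

which minimizes the sum of squared residuals $(y - X\beta)^{\top}(y - X\beta)$.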
