September 29, 2020
Multiple linear regression is the extension of simple linear regression and is just as common in statistics. To understand how multiple linear regression analysis works, try to solve the following problem by reviewing what you already know and reading through this guide. This guide is meant for those unsure how to approach the problem or for those encountering this concept for the first time.
You’re curious about which factors play into the salary people earn. To find out, you’d like to conduct a multiple linear regression analysis on data containing the salary, education level in years, and work experience of 10 individuals. Conduct a multiple regression analysis by finding the regression model for the following data set.
What is MLR?
Multiple linear regression is the extension of simple linear regression, meaning the basic concepts behind multiple linear regression, or MLR, are the same. The main difference is that multiple linear regression has one response variable with two or more explanatory variables.
The motivation behind MLR is that in many cases, the predictions from regression models get better with more explanatory variables. Intuitively, this makes sense, as most of the phenomena around us - the demand for goods, the growth of plants, etc. - typically have more than one variable related to them. Mathematically, this also makes sense: the more variables you add to the model, the higher the explained variance, or R-squared value, of the model.
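You can see this property directly. The following is a minimal sketch using NumPy (the data here are random and purely illustrative): even predictors that are pure noise never decrease the R-squared of a least-squares fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
y = rng.normal(size=n)  # a response with no real structure at all

def r_squared(X, y):
    """R-squared of an ordinary least-squares fit of y on X (with intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Add purely random predictors one at a time: R-squared can only go up,
# because each larger model nests the smaller one.
X = rng.normal(size=(n, 10))
r2 = [r_squared(X[:, :k], y) for k in range(1, 11)]
print([round(v, 3) for v in r2])
```

Since each value in `r2` is at least as large as the one before it, a rising R-squared alone tells you nothing about whether the new predictors are meaningful.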
However, introducing more variables means you should take extra precautions during your analysis. A high R-squared value doesn’t always mean you’ve found the best regression model; often, an R-squared value that is too high can signal underlying problems with your model. Take a look at some of the common problems you can encounter when building your MLR model.
| Problem | Cause | Consequence |
|---|---|---|
| Overfitting | Adding too many predictors | The model is too closely related, or “fit”, to the sample data set, to the point that it introduces a lot of variability |
| Underfitting | Adding too few predictors | The model does not “fit” the data well enough because it is not complex enough, to the point that it introduces bias |
| Multicollinearity | Pairs of explanatory variables are too highly correlated | Reduces the reliability of the model because it inflates the variance of the coefficient estimates |
The first two concepts are often referred to as the bias-variance trade-off. The more complex your model, the higher the risk of overfitting the data and therefore having higher variance. The less complex the model, the higher the risk of underfitting the data and therefore having higher bias. The best models find the sweet spot between the overfitted and underfitted model, which can be visualized in the graph below.
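The trade-off can be sketched with polynomial fits of different complexity. In this illustrative NumPy example (the data are hypothetical, generated from a quadratic trend plus noise), the training error keeps shrinking as the polynomial degree grows, but the high-degree fit is chasing noise rather than the trend, while the straight line misses the curvature:

```python
import numpy as np

# Hypothetical data for illustration: a quadratic trend plus noise.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
y = 1 + 2 * x - 3 * x**2 + rng.normal(scale=0.1, size=x.size)

def train_error(degree):
    """Mean squared error of a polynomial fit, measured on the training data."""
    coeffs = np.polyfit(x, y, degree)
    return np.mean((np.polyval(coeffs, x) - y) ** 2)

# Degree 1 underfits (misses the curvature), degree 2 matches the true
# trend, and degree 6 overfits (extra terms absorb the noise).
errs = {d: train_error(d) for d in (1, 2, 6)}
print(errs)
```

Training error alone always favors the most complex model; judging fit on held-out data is what exposes the overfitted one.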
In order to explain multiple linear regression, let’s start with the multiple regression model:

$$Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \dots + \beta_k X_k + \varepsilon$$
As you may notice, this is simply an extension of the SLR model, which can be written as:

$$Y = \beta_0 + \beta_1 X + \varepsilon$$
In order to understand this equation, let's break it down by first looking at the linear parameters.
Find a summary of these linear parameters below.
| Parameter | Meaning |
|---|---|
| $\beta_1$ | The regression coefficient of the first independent variable |
| $\beta_2$ | The regression coefficient of the second independent variable |
| $\beta_k$ | The regression coefficient of the $k$th independent variable |
Next, take a look at the error term.
Recall that this error term, $\varepsilon$, is based on the real population parameters. In the MLR equation, this error term is assumed to have an expected value of zero. Because we do not know the true population parameters, we arrive at the estimated multiple regression equation:

$$\hat{y} = b_0 + b_1 x_1 + b_2 x_2 + \dots + b_k x_k$$
Take a look at the table below to understand what this estimated MLR equation means.
| Term | Meaning |
|---|---|
| $b_0, b_1, \dots, b_k$ | The estimates of the population parameters $\beta_0, \beta_1, \dots, \beta_k$ |
| $\hat{y}$ | The estimate of the parameter $y$ |
Remember that the population parameters are measured from the actual population, whereas the estimates of these parameters are based on a sample from the population; these estimates are called statistics.
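In practice these estimates are computed from the sample all at once. Here is a short NumPy sketch (the numbers are hypothetical, purely for illustration) that solves the normal equations $b = (X^\top X)^{-1} X^\top y$ for a two-predictor sample:

```python
import numpy as np

# Hypothetical sample, for illustration only: two explanatory variables.
x1 = np.array([12.0, 16.0, 12.0, 14.0, 16.0, 18.0, 12.0, 20.0])
x2 = np.array([ 2.0,  5.0,  1.0,  4.0,  6.0, 10.0,  3.0, 12.0])
y  = np.array([35.0, 55.0, 32.0, 44.0, 58.0, 72.0, 38.0, 85.0])

# Design matrix with a leading column of ones for the intercept b0.
X = np.column_stack([np.ones(y.size), x1, x2])

# Normal equations: solving (X'X) b = X'y gives the least-squares estimates.
b = np.linalg.solve(X.T @ X, X.T @ y)
b0, b1, b2 = b
print(b0, b1, b2)

# Fitted values from the estimated regression equation y_hat = b0 + b1*x1 + b2*x2.
y_hat = X @ b
```

A defining property of the least-squares solution is that the residuals $y - \hat{y}$ are orthogonal to every column of the design matrix, which is a handy numerical check.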
To calculate the estimators, let’s start with the easiest first, which is the intercept. The equation for the intercept is simply a rearranged version of the MLR equation. To illustrate this, take an estimated MLR equation with only two independent variables:

$$\hat{y} = b_0 + b_1 x_1 + b_2 x_2$$
Solving for $b_0$, we get:

$$b_0 = \bar{y} - b_1 \bar{x}_1 - b_2 \bar{x}_2$$
The formulas for the $b_1$ and $b_2$ estimators are a bit more complicated. Writing the deviation scores as $x_1 = X_1 - \bar{X}_1$, $x_2 = X_2 - \bar{X}_2$, and $y = Y - \bar{Y}$, the formulas you’ll need to calculate these estimators are:

$$b_1 = \frac{\left(\sum x_2^2\right)\left(\sum x_1 y\right) - \left(\sum x_1 x_2\right)\left(\sum x_2 y\right)}{\left(\sum x_1^2\right)\left(\sum x_2^2\right) - \left(\sum x_1 x_2\right)^2}$$

$$b_2 = \frac{\left(\sum x_1^2\right)\left(\sum x_2 y\right) - \left(\sum x_1 x_2\right)\left(\sum x_1 y\right)}{\left(\sum x_1^2\right)\left(\sum x_2^2\right) - \left(\sum x_1 x_2\right)^2}$$
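As a sanity check on these two-variable formulas, here is a NumPy sketch (with small hypothetical numbers, purely illustrative) that builds the deviation-score sums of squares and cross-products, computes the estimators, and verifies them against NumPy’s least-squares solver:

```python
import numpy as np

# Hypothetical two-variable sample, for illustration only.
x1 = np.array([4.0, 7.0, 3.0, 9.0, 5.0, 8.0, 6.0, 2.0])
x2 = np.array([1.0, 3.0, 2.0, 5.0, 2.0, 4.0, 3.0, 1.0])
y  = np.array([10.0, 18.0, 11.0, 25.0, 14.0, 21.0, 17.0, 7.0])

# Deviation scores (the lowercase x and y in the formulas).
d1, d2, dy = x1 - x1.mean(), x2 - x2.mean(), y - y.mean()

# Sums of squares and cross-products.
s11, s22, s12 = d1 @ d1, d2 @ d2, d1 @ d2
s1y, s2y = d1 @ dy, d2 @ dy

# The shared denominator; it is zero only if x1 and x2 are perfectly collinear.
den = s11 * s22 - s12**2
b1 = (s22 * s1y - s12 * s2y) / den
b2 = (s11 * s2y - s12 * s1y) / den
b0 = y.mean() - b1 * x1.mean() - b2 * x2.mean()
print(b0, b1, b2)

# Cross-check against NumPy's least-squares solver.
X = np.column_stack([np.ones(y.size), x1, x2])
ref, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose([b0, b1, b2], ref)
```

The agreement with the matrix-based solver confirms that the hand formulas and the normal equations are two routes to the same estimates.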
Two Variable MLR Step by Step
Next, we plug these values into the formulas above.
Putting these numbers all together, we get a multiple regression model of: