12.4 - Detecting Multicollinearity Using Variance Inflation Factors


Okay, now that we know the effects that multicollinearity can have on our regression analyses and subsequent conclusions, how do we tell when it exists? That is, how can we tell if multicollinearity is present in our data?

Some of the common methods used for detecting multicollinearity include:

  • The analysis exhibits the signs of multicollinearity described earlier, such as estimates of the coefficients that vary considerably from model to model.
  • The t-tests for each of the individual slopes are non-significant (P > 0.05), but the overall F-test for testing that all of the slopes are simultaneously 0 is significant (P < 0.05).
  • The correlations among pairs of predictor variables are large.

Looking only at correlations among pairs of predictors, however, is limiting. It is possible that the pairwise correlations are small, and yet a linear dependence exists among three or even more variables, for example, if X3 = 2X1 + 5X2 + error. That's why many regression analysts often rely on what are called variance inflation factors (VIFs) to help detect multicollinearity.
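To see this concretely, here is a small simulation sketch in Python (illustrative only, with made-up data and variable names of my own choosing): no single pairwise correlation looks alarming, yet one predictor is nearly a linear combination of the others, and its variance inflation factor, defined below, is large.

```python
# A small simulation (illustrative only, not from the course data): x4 is
# nearly a linear combination of x1, x2, x3, yet no single pairwise
# correlation is dramatic. The VIF (defined below) exposes the dependence.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1, x2, x3 = rng.standard_normal((3, n))
x4 = x1 + x2 + x3 + 0.3 * rng.standard_normal(n)   # near-linear dependence

X = np.column_stack([x1, x2, x3, x4])
print(np.round(np.corrcoef(X, rowvar=False), 2))   # largest pairwise r is only ~0.57

# Regress x4 on x1, x2, x3 and compute 1 / (1 - R^2):
A = np.column_stack([np.ones(n), x1, x2, x3])
coef, *_ = np.linalg.lstsq(A, x4, rcond=None)
resid = x4 - A @ coef
r2 = 1 - resid @ resid / ((x4 - x4.mean()) @ (x4 - x4.mean()))
print(round(1 / (1 - r2), 1))                      # VIF is large (on the order of 30)
```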

What is a Variance Inflation Factor?

As the name suggests, a variance inflation factor (VIF) quantifies how much the variance is inflated. But what variance? Recall that we learned previously that the standard errors, and hence the variances, of the estimated coefficients are inflated when multicollinearity exists. So, the variance inflation factor for the estimated coefficient bk, denoted VIFk, is just the factor by which the variance is inflated.

Let's be a little more concrete. For the model in which xk is the only predictor:

\[y_i=\beta_0+\beta_kx_{ik}+\epsilon_i\]

it can be shown that the variance of the estimated coefficient bk is:

\[Var(b_k)_{min}=\frac{\sigma^2}{\sum_{i=1}^{n}(x_{ik}-\bar{x}_k)^2}\]

Note that we add the subscript "min" in order to denote that it is the smallest the variance can be. Don't worry about how this variance is derived — we just need to keep track of this baseline variance, so we can see how much the variance of bk is inflated when we add correlated predictors to our regression model.

Let's consider such a model with correlated predictors:

\[y_i=\beta_0+\beta_1x_{i1}+ \cdots + \beta_kx_{ik}+\cdots +\beta_{p-1}x_{i, p-1} +\epsilon_i\]

Now, again, if some of the predictors are correlated with the predictor xk, then the variance of bk is inflated. It can be shown that the variance of bk is:

\[Var(b_k)=\frac{\sigma^2}{\sum_{i=1}^{n}(x_{ik}-\bar{x}_k)^2}\times \frac{1}{1-R_{k}^{2}}\]

where \(R_{k}^{2}\) is the R2-value obtained by regressing the kth predictor on the remaining predictors. Of course, the greater the linear dependence between the predictor xk and the other predictors, the larger the \(R_{k}^{2}\) value. And, as the above formula suggests, the larger the \(R_{k}^{2}\) value, the larger the variance of bk.

How much larger? To answer this question, all we need to do is take the ratio of the two variances. Doing so, we obtain:

\[\frac{Var(b_k)}{Var(b_k)_{min}}=\frac{\left( \frac{\sigma^2}{\sum(x_{ik}-\bar{x}_k)^2}\times \frac{1}{1-R_{k}^{2}} \right)}{\left( \frac{\sigma^2}{\sum(x_{ik}-\bar{x}_k)^2} \right)}=\frac{1}{1-R_{k}^{2}}\]

The above quantity is what is deemed the variance inflation factor for the kth predictor. That is:

\[VIF_k=\frac{1}{1-R_{k}^{2}}\]

where \(R_{k}^{2}\) is the R2-value obtained by regressing the kth predictor on the remaining predictors. Note that a variance inflation factor exists for each of the p−1 predictors in a multiple regression model.
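The definition translates directly into code: regress each predictor on the remaining ones and form 1/(1 − \(R_{k}^{2}\)). Here is a minimal from-scratch sketch in NumPy (the function name and argument layout are my own, not from the text):

```python
# A from-scratch sketch of VIF_k = 1 / (1 - R_k^2) for every predictor.
# X is an n-by-(p-1) matrix of predictors (no intercept column).
import numpy as np

def variance_inflation_factors(X: np.ndarray) -> np.ndarray:
    n, m = X.shape
    vifs = np.empty(m)
    for k in range(m):
        y = X[:, k]                                    # kth predictor as "response"
        A = np.column_stack([np.ones(n), np.delete(X, k, axis=1)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # regress on the rest
        resid = y - A @ coef
        r2_k = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
        vifs[k] = 1.0 / (1.0 - r2_k)
    return vifs
```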

How do we interpret the variance inflation factors for a regression model? Again, a VIF is a measure of how much the variance of the estimated regression coefficient bk is "inflated" by the existence of correlation among the predictor variables in the model. A VIF of 1 means that there is no correlation between the kth predictor and the remaining predictor variables, and hence the variance of bk is not inflated at all. The general rule of thumb is that VIFs exceeding 4 warrant further investigation, while VIFs exceeding 10 are signs of serious multicollinearity requiring correction.

An Example

Let's return to the blood pressure data (bloodpress.txt) in which researchers observed the following data on 20 individuals with high blood pressure:

  • blood pressure (y = BP, in mm Hg)
  • age (x1 = Age, in years)
  • weight (x2 = Weight, in kg)
  • body surface area (x3 = BSA, in sq m)
  • duration of hypertension (x4 = Dur, in years)
  • basal pulse (x5 = Pulse, in beats per minute)
  • stress index (x6 = Stress)

As you may recall, the matrix plot of BP, Age, Weight, and BSA:

[matrix plot of BP, Age, Weight, and BSA]

the matrix plot of BP, Dur, Pulse, and Stress:

[matrix plot of BP, Dur, Pulse, and Stress]

and the correlation matrix:

[Minitab correlation matrix output]

suggest that some of the predictors are at least moderately marginally correlated. For example, body surface area (BSA) and weight are strongly correlated (r = 0.875), and weight and pulse are fairly strongly correlated (r = 0.659). On the other hand, none of the pairwise correlations among age, weight, duration and stress are particularly strong (r < 0.40 in each case).

Regressing y = BP on all six of the predictors, we obtain:

[Minitab regression output for BP on Age, Weight, BSA, Dur, Pulse, and Stress, with VIFs]

[Minitab v17 reports the variance inflation factors by default; for v16 you have to select this under Options.] As you can see, three of the variance inflation factors (8.42, 5.33, and 4.41) are fairly large. The VIF for the predictor Weight, for example, tells us that the variance of the estimated coefficient of Weight is inflated by a factor of 8.42 because Weight is highly correlated with at least one of the other predictors in the model.

For the sake of understanding, let's verify the calculation of the VIF for the predictor Weight. Regressing the predictor x2 = Weight on the remaining five predictors:

[Minitab regression output for Weight on Age, BSA, Dur, Pulse, and Stress]

Minitab reports that \(R_{Weight}^{2}\) is 88.1% or, in decimal form, 0.881. Therefore, the variance inflation factor for the estimated coefficient of Weight is by definition:

\[VIF_{Weight}=\frac{Var(b_{Weight})}{Var(b_{Weight})_{min}}=\frac{1}{1-R_{Weight}^{2}}=\frac{1}{1-0.881}=8.4\]

Again, this variance inflation factor tells us that the variance of the weight coefficient is inflated by a factor of 8.4 because Weight is highly correlated with at least one of the other predictors in the model.
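For readers working outside Minitab, the same verification takes a few lines of Python with pandas and statsmodels. This is a sketch that assumes bloodpress.txt is whitespace-delimited with columns named BP, Age, Weight, BSA, Dur, Pulse, and Stress:

```python
# Verify VIF_Weight: regress Weight on the other five predictors, then
# compute 1 / (1 - R^2). (Assumes the column names given above.)
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("bloodpress.txt", sep=r"\s+")
others = sm.add_constant(df[["Age", "BSA", "Dur", "Pulse", "Stress"]])
fit = sm.OLS(df["Weight"], others).fit()
print(fit.rsquared)            # about 0.881
print(1 / (1 - fit.rsquared))  # VIF_Weight, about 8.4
```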

So, what to do? One solution to dealing with multicollinearity is to remove some of the violating predictors from the model. If we review the pairwise correlations again:

[Minitab correlation matrix output]

we see that the predictors Weight and BSA are highly correlated (r = 0.875). We can choose to remove either predictor from the model. The decision of which one to remove is often a scientific or practical one. For example, if the researchers here are interested in using their final model to predict the blood pressure of future individuals, their choice should be clear. Which of the two measurements — body surface area or weight — do you think would be easier to obtain?! If indeed weight is an easier measurement to obtain than body surface area, then the researchers would be well-advised to remove BSA from the model and leave Weight in the model.

Reviewing again the above pairwise correlations, we see that the predictor Pulse also appears to exhibit fairly strong marginal correlations with several of the predictors, including Age (r = 0.619), Weight (r = 0.659) and Stress (r = 0.506). Therefore, the researchers could also consider removing the predictor Pulse from the model.

Let's see how the researchers would do. Regressing the response y = BP on the four remaining predictors age, weight, duration and stress, we obtain:

[Minitab regression output for BP on Age, Weight, Dur, and Stress, with VIFs]

Aha — the remaining variance inflation factors are quite satisfactory! That is, it appears as if hardly any variance inflation remains. Incidentally, in terms of the adjusted R2-value, we did not seem to lose much by dropping the two predictors BSA and Pulse from our model. The adjusted R2-value decreased to only 98.97% from the original adjusted R2-value of 99.44%.
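One way to reproduce this check outside Minitab is with statsmodels' built-in variance_inflation_factor helper; again a sketch, under the same column-name assumptions as before:

```python
# Refit BP on the four remaining predictors and report the adjusted R^2
# and the VIFs. (Column names as assumed in the earlier sketch.)
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("bloodpress.txt", sep=r"\s+")
X = sm.add_constant(df[["Age", "Weight", "Dur", "Stress"]])
fit = sm.OLS(df["BP"], X).fit()
print(fit.rsquared_adj)                      # adjusted R^2 of the reduced model
for k, name in enumerate(X.columns[1:], 1):  # skip the intercept column
    print(name, variance_inflation_factor(X.values, k))
```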

PRACTICE PROBLEMS: Variance inflation factors

Detecting multicollinearity using VIFk.

We’ll use the cement.txt data set to explore variance inflation factors. The response y measures the heat evolved in calories during the hardening of cement on a per gram basis. The four predictors are the percentages of four ingredients: tricalcium aluminate (x1), tricalcium silicate (x2), tetracalcium alumino ferrite (x3), and dicalcium silicate (x4). It’s not hard to imagine that such predictors would be correlated in some way.

1. Use the Stat >> Basic Statistics >> Correlation ... command in Minitab to get an idea of the extent to which the predictor variables are (pairwise) correlated. Also, use the Graph >> Matrix Plot ... command in Minitab to get a visual portrayal of the (pairwise) relationships among the response and predictor variables.

(CHECK YOUR ANSWER)

2. Regress the fourth predictor, x4, on the remaining three predictors, x1, x2, and x3. That is, fit the linear regression model treating x4 as the response and x1, x2, and x3 as the predictors. What is the \(R_{4}^{2}\) value? (Note that Minitab rounds the R2 value it reports to three decimal places. For the purposes of the next question, you’ll want a more accurate R2 value. Calculate the R2 value using its definition, \(\frac{SSR}{SSTO}\). Use your calculated value, carried out to 5 decimal places, in answering the next question.)

(CHECK YOUR ANSWER)

3. Using your calculated R2 value carried out to 5 decimal places, determine by what factor the variance of b4 is inflated. That is, what is VIF4?

(CHECK YOUR ANSWER)
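For those working outside Minitab, here is a sketch of problems 2 and 3 in Python. It assumes cement.txt is whitespace-delimited with columns named y, x1, x2, x3, and x4:

```python
# Problem 2: regress x4 on x1, x2, x3 and compute R^2 = SSR / SSTO.
# Problem 3: VIF_4 = 1 / (1 - R_4^2). (Assumes the column names given above.)
import pandas as pd
import statsmodels.api as sm

cem = pd.read_csv("cement.txt", sep=r"\s+")
fit = sm.OLS(cem["x4"], sm.add_constant(cem[["x1", "x2", "x3"]])).fit()
r2_4 = fit.ess / fit.centered_tss   # SSR / SSTO
print(round(r2_4, 5))
print(round(1 / (1 - r2_4), 2))     # VIF_4
```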

4. Minitab will actually calculate the variance inflation factors for you. Fit the multiple linear regression model with y as the response and x1, x2, x3, and x4 as the predictors. The VIFk will be reported as a column of the estimated coefficients table. Is the VIF4 that you calculated consistent with what Minitab reports?

(CHECK YOUR ANSWER)

5. Note that all of the VIFk are larger than 10, suggesting that a high degree of multicollinearity is present. (It should seem logical that multicollinearity is present here, given that the predictors are measuring the percentage of ingredients in the cement.) Do you notice anything odd about the results of the t-tests for testing the individual H0 : βi = 0 and the result of the overall F-test for testing H0 : β1 = β2 = β3 = β4 = 0? Why does this happen?

(CHECK YOUR ANSWER)

6. We learned that one way of reducing data-based multicollinearity is to remove some of the violating predictors from the model. Fit the linear regression model with y as the response and x1 and x2 as the only predictors. Are the variance inflation factors for this model acceptable?

(CHECK YOUR ANSWER)