
Confounding Variables Can Bias Your Results


Omitted variable bias occurs when a regression model leaves out relevant independent variables, which are known as confounding variables. This condition forces the model to attribute the effects of omitted variables to variables that are in the model, which biases the coefficient estimates.


This problem occurs because your linear regression model is specified incorrectly—either because the confounding variables are unknown or because the data do not exist. If this bias affects your model, it is a severe condition because you can’t trust your results.

In this post, you’ll learn about this type of bias, how it occurs, and how to detect and correct it.

Related post: Specifying the Correct Regression Model

What Are the Effects of Omitted Variable Bias?

Omitting confounding variables from your regression model can bias the coefficient estimates. What does that mean exactly? When you’re assessing the effects of the independent variables in the regression output, this bias can produce the following problems:

  • Overestimate the strength of an effect.
  • Underestimate the strength of an effect.
  • Change the sign of an effect.
  • Mask an effect that actually exists.

You don’t want any of these problems to affect your regression results!

To learn more about the properties of biased and unbiased estimates in regression analysis, read my post about the Gauss-Markov theorem .

Synonyms for Confounding Variables and Omitted Variable Bias

In the context of regression analysis, there are various synonyms for omitted variables and the bias they can cause. Analysts often refer to omitted variables that cause bias as confounding variables, confounders, and lurking variables. These are important variables that the statistical model does not include and, therefore, cannot control. Additionally, they call the bias itself omitted variable bias, spurious effects, and spurious relationships. I’ll use these terms interchangeably.

What Conditions Cause Omitted Variable Bias?

How does this bias occur? How can variables you leave out of the model affect the variables that you include in the model? At first glance, this problem might not make sense.

For omitted variable bias to occur, the following two conditions must exist:

  • The omitted variable must correlate with the dependent variable.
  • The omitted variable must correlate with at least one independent variable that is in the regression model.

The diagram below illustrates these two conditions. There must be non-zero correlations (r) on all three sides of the triangle.

[Diagram: a triangle linking two independent variables (X1 and X2) and the dependent variable (Y), with a non-zero correlation (r) on each side.]

This correlation structure causes confounding variables that are not in the model to bias the estimates that appear in your regression results. For example, in the diagram above, removing either X variable would bias the estimate for the other X variable.

The amount of bias depends on the strength of these correlations. Strong correlations produce greater bias. If the relationships are weak, the bias might not be severe. And, if the omitted variable is not correlated with another independent variable at all, excluding it does not produce bias.
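To make these two conditions concrete, here is a minimal simulation sketch in Python (numpy only; the variable names, coefficients, and correlation strengths are invented for illustration, not taken from the post). It regresses the dependent variable on X1 alone, once with an omitted X2 that correlates with X1 and once with an independent X2:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def coef_on_x1(x1, x2):
    """True model: y = 2*x1 + 3*x2 + noise. Regress y on x1 alone."""
    y = 2 * x1 + 3 * x2 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b[1]  # estimated coefficient on x1; the true value is 2

x1 = rng.normal(size=n)

# Both conditions hold: the omitted x2 correlates with x1 (and with y).
x2_correlated = 0.8 * x1 + rng.normal(size=n)
print(coef_on_x1(x1, x2_correlated))   # about 4.4: badly overestimated

# Only one condition holds: x2 affects y but is independent of x1.
x2_independent = rng.normal(size=n)
print(coef_on_x1(x1, x2_independent))  # about 2.0: no bias, just noise
```

The first fit attributes X2's effect to X1 because the two move together; the second fit is unbiased because the omitted variable has no path into X1.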

Finally, if you’re performing a randomized experiment, omitted variable bias is less likely to be a problem. Randomized studies minimize the effects of confounding variables by equally distributing them across the treatment groups. Omitted variable bias tends to occur in observational studies.

I’ll explain how confounding variables can bias the results using two approaches. First, I’ll work through an example and describe how the omitted variable forces the model to attribute the effects of the excluded variable to the one in the model. Then, I’ll go into a more statistical explanation that details the correlation structure, residuals, and an assumption violation. Explaining confounding variables using both approaches will give you a solid grasp of how the bias occurs.

Related post: Understanding Correlations

Practical Example of How Confounding Variables Can Produce Bias


I used to work in a biomechanics lab. One study assessed the effects of physical activity on bone density. We measured many subject characteristics, including activity levels, weights, and bone densities. Theories about how our bodies build bone suggest that there should be a positive correlation between activity level and bone density. In other words, higher activity produces greater bone density.

Early in the study, I wanted to validate our initial data quickly by using simple regression analysis to determine whether there is a relationship between activity and bone density. If our data were valid, there should be a positive relationship. To my great surprise, there was no relationship at all!

What was happening? The theory is well established in the field. Maybe our data was messed up somehow? Long story short, thanks to a confounding variable, the model was exhibiting omitted variable bias.

To perform the quick assessment, I included activity level as the only independent variable, but it turns out there is another variable that correlates with both activity and bone density—the subject’s weight.

After including weight in the regression model, along with activity, the results indicated that both activity and weight are statistically significant and have positive correlations with bone density. The diagram below shows the signs of the correlations between the variables.

[Diagram: the correlation triangle for the example, showing a positive correlation between Activity and Bone Density, a positive correlation between Weight and Bone Density, and a negative correlation between Activity and Weight.]

How the Omitted Confounding Variable Hid the Relationship

Right away we see that these conditions can produce omitted variable bias because all three sides of the triangle have non-zero correlations. Let’s find out how leaving weight out of the model masked the relationship between activity and bone density.

Subjects who are more active tend to have higher bone density. Additionally, subjects who weigh more also tend to have higher bone density. However, there is a negative correlation between activity and weight. More active subjects tend to weigh less.

This correlation structure produces two opposing effects of activity. More active subjects get a bone density boost. However, they also tend to weigh less, which reduces bone density.

When I fit a regression model with only activity, the model had to attribute both opposing effects to activity alone. Hence, the zero correlation. However, when I fit the model with both activity and weight, it could assign the opposing effects to each variable separately.

For this example, when I omitted weight from the model, it produced a negative bias because the model underestimated the effect of activity. The results said there is no correlation when there is, in fact, a positive correlation.
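Here is a hedged sketch of that masking effect in Python, with made-up numbers rather than the study's actual data; the coefficients are chosen so the two opposing effects cancel almost exactly:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

activity = rng.normal(size=n)
weight = -0.7 * activity + rng.normal(size=n)  # more active -> lower weight
bone_density = 0.7 * activity + 1.0 * weight + rng.normal(size=n)

def coefs(*predictors):
    X = np.column_stack([np.ones(n), *predictors])
    b, *_ = np.linalg.lstsq(X, bone_density, rcond=None)
    return b[1:]  # drop the intercept

print(coefs(activity))          # about 0.0: the effect is masked
print(coefs(activity, weight))  # about [0.7, 1.0]: both effects recovered
```

With only activity in the model, the positive direct effect and the negative indirect effect through weight cancel out; adding weight lets the model assign each effect to its own variable.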

Correlations, Residuals, and OLS Assumptions

Residuals = Observed value − Fitted value

Now, let’s look at this from another angle that involves the residuals and an assumption. When you satisfy the ordinary least squares (OLS) assumptions, the Gauss-Markov theorem states that your estimates will be unbiased and have minimum variance.

However, omitted variable bias occurs because omitting the confounder causes the model to violate one of these assumptions. To see how this works, you need to follow a chain of events.

Suppose you have a regression model with two significant independent variables, X1 and X2. These independent variables correlate with each other and the dependent variable—which are the requirements for omitted variable bias.

Now, imagine that we take variable X2 out of the model. It is the confounding variable. Here’s what happens:

  1. The model fits the data less well because we’ve removed a significant explanatory variable. Consequently, the gap between the observed values and the fitted values increases. These gaps are the residuals.
  2. The degree to which each residual increases depends on the relationship between X2 and the dependent variable. Consequently, the residuals correlate with X2.
  3. X1 correlates with X2, and X2 correlates with the residuals. Ergo, variable X1 correlates with the error term that the residuals estimate.
  4. Hence, this condition violates the ordinary least squares assumption that the independent variables in the model do not correlate with the error term. Violations of this assumption produce biased estimates.

This explanation serves a purpose later in this post!

The important takeaway here is that leaving out an important variable not only reduces the goodness-of-fit (larger residuals), but it can also bias the coefficient estimates.
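A minimal numerical check of this chain (again with invented numbers): when X2 is dropped from the data-generating model below, the variation that X2 explains ends up in the error term, which correlates with X1:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)
noise = rng.normal(size=n)
y = 2 * x1 + 3 * x2 + noise

# Misspecified model: regress y on x1 alone.
X = np.column_stack([np.ones(n), x1])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ b

error = 3 * x2 + noise  # everything the misspecified model leaves out
print(np.corrcoef(residuals, x2)[0, 1])  # about 0.8: residuals track x2
print(np.corrcoef(x1, error)[0, 1])      # about 0.5: assumption violated
print(b[1])                              # about 3.8: biased from the true 2
```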

Related posts: 7 Classical OLS Assumptions and Check Your Residual Plots

Predicting the Direction of Omitted Variable Bias

We can use correlation structures, like the one in the example, to predict the direction of bias that occurs when the model omits a confounding variable. The direction depends on both the correlation between the omitted and included independent variables and the correlation between the omitted variable and the dependent variable. The table below summarizes these relationships and the direction of bias.

|                                              | Included and Omitted: Negative Correlation   | Included and Omitted: Positive Correlation   |
|----------------------------------------------|----------------------------------------------|----------------------------------------------|
| Omitted and Dependent: Negative Correlation  | Positive bias: coefficient is overestimated. | Negative bias: coefficient is underestimated. |
| Omitted and Dependent: Positive Correlation  | Negative bias: coefficient is underestimated. | Positive bias: coefficient is overestimated. |

Let’s apply this table to the bone density example. The included variable (Activity) and the omitted confounding variable (Weight) have a negative correlation, so we need to use the middle column. The omitted variable (Weight) and the dependent variable (Bone Density) have a positive relationship, which corresponds to the bottom row. At the intersection of the middle column and bottom row, the table indicates that we can expect a negative bias, which matches our results.

Suppose we hadn’t collected weight and were unable to include it in the model. In that case, we can use this table, along with the hypothesized relationships, to predict the direction of the omitted variable bias.
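The table follows from a textbook result that is easy to verify numerically. In my notation (not the post's), omitting X2 shifts the coefficient on X1 by beta2 * delta, where beta2 is the omitted variable's effect on the dependent variable and delta is the slope from regressing X2 on X1. A short sketch with invented numbers that mirror the example's signs:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
x1 = rng.normal(size=n)
x2 = -0.7 * x1 + rng.normal(size=n)        # included-omitted: negative (middle column)
y = 2 * x1 + 3 * x2 + rng.normal(size=n)   # omitted-dependent: positive (bottom row)

def slope(response, predictor):
    X = np.column_stack([np.ones(n), predictor])
    return np.linalg.lstsq(X, response, rcond=None)[0][1]

delta = slope(x2, x1)    # about -0.7
print(slope(y, x1) - 2)  # observed bias: about -2.1 (negative, as predicted)
print(3 * delta)         # predicted bias: beta2 * delta, also about -2.1
```

Note that the bias here is large enough to flip the sign of the estimate (the true 2 becomes roughly -0.1), which is one of the problems listed earlier.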

How to Detect Omitted Variable Bias and Identify Confounding Variables

You saw one method of detecting omitted variable bias in this post. If you include different combinations of independent variables in the model, and you see the coefficients changing, you’re watching omitted variable bias in action!

In this post, I started with a regression model that has activity as the lone independent variable and bone density as the dependent variable. After adding weight to the model, the estimated relationship between activity and bone density changed from zero to positive.

However, if we don’t have the data, it can be harder to detect omitted variable bias. If my study hadn’t collected the weight data, the answer would not be as clear.

I presented a clue earlier in this post. We know that for omitted variable bias to exist, an independent variable must correlate with the error term that the residuals estimate. Consequently, we can plot the residuals against the variables in our model. If we see a pattern in the plot, rather than random scatter, it both tells us that there is a problem and points us towards the solution. We know which independent variable correlates with the confounding variable.

Another step is to carefully consider theory and other studies. Ask yourself several questions:

  • Do the coefficient estimates match the theoretical signs and magnitudes? If not, you need to investigate. That was my first tip-off!
  • Can you think of confounding variables that you didn’t measure that are likely to correlate with both the dependent variable and at least one independent variable? Reviewing the literature, consulting experts, and brainstorming sessions can shed light on this possibility.

Obstacles to Correcting Omitted Variable Bias

Again, you saw the best correction possible in this post—including the variable in the model! Including confounding variables in a regression model allows the analysis to control for them and prevent the spurious effects that the omitted variables would have caused otherwise. Theoretically, you should include all independent variables that have a relationship with the dependent variable. That’s easier said than done because this approach produces real-world problems.

For starters, you might need to collect data on many more characteristics than is feasible. Additionally, some of these characteristics might be very difficult or even impossible to measure. Suppose you fit a model for salary that includes experience and education. Ability might also be a significant variable, but one that is much harder to measure in some fields.

Furthermore, as you include more variables in the model, the number of observations must increase to avoid overfitting the model, which can also produce unreliable results. Measuring more characteristics and gathering a larger sample size can be an expensive proposition!

Because the bias occurs when the confounding variables correlate with independent variables, including these confounders invariably introduces multicollinearity into your model. Multicollinearity causes its own problems including unstable coefficient estimates, lower statistical power, and less precise estimates.

It’s important to note a tradeoff that might occur between precision and bias. As you include the formerly omitted variables, you lessen the bias, but the multicollinearity can potentially reduce the precision of the estimates.

It’s a balancing act! Let’s get into some practical recommendations.

Related posts: Overfitting Regression Models and Multicollinearity in Regression

Recommendations for Addressing Confounding Variables and Omitted Variable Bias

Before you begin your study, arm yourself with all the possible background information you can gather. Research the study area, review the literature, and consult with experts. This process enables you to identify and measure the crucial variables that you should include in your model. It helps you avoid the problem in the first place. Just imagine if you collect all your data and then realize that you didn’t measure a critical variable. That’s an expensive mistake!

After the analysis, this background information can help you identify potential bias, and, if necessary, track down the solution.

Check those residual plots! Sometimes you might not be sure whether bias exists, but the plots can clearly display the hallmarks of confounding variables.

Recognize that omitted variable bias lessens as the degree of correlation decreases. It might not always be a significant problem. Understanding the relationships between the variables helps you make this determination.

Remember that a tradeoff between bias and the precision of the estimates might occur. As you add confounding variables to reduce the bias, keep an eye on the precision of the estimates. To track the precision, check the confidence intervals of the coefficient estimates. If the intervals become wider, the estimates are less precise. In the end, you might accept a little bias if it significantly improves precision.
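Here is a hedged sketch of that tradeoff (invented data, with the collinearity deliberately strong): adding the correlated confounder removes the bias on x1 but widens its standard error:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.3 * rng.normal(size=n)  # strongly collinear confounder
y = 2 * x1 + x2 + rng.normal(size=n)

def coef_and_se(*predictors):
    """OLS coefficient and standard error for x1."""
    X = np.column_stack([np.ones(n), *predictors])
    b = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ b
    sigma2 = resid @ resid / (n - X.shape[1])
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return b[1], se[1]

print(coef_and_se(x1))      # about (2.9, 0.05): precise but badly biased
print(coef_and_se(x1, x2))  # about (2.0, 0.14): unbiased but less precise
```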

What to Do When Including Confounding Variables is Impossible

If you absolutely cannot include an important variable and it causes omitted variable bias, consider using a proxy variable. Typically, proxy variables are easy to measure, and analysts use them instead of variables that are either impossible or difficult to measure. The proxy variable can be a characteristic that is not of any great importance itself, but has a good correlation with the confounding variable. These variables allow you to include some of the information in your model that would not otherwise be possible, and, thereby, reduce omitted variable bias. For example, if it is crucial to include historical climate data in your model, but those data do not exist, you might include tree ring widths instead.
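A minimal sketch of the proxy idea (all names and numbers are invented): the confounder z is unmeasured, but a noisy proxy p is available and soaks up most of the bias:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
x = rng.normal(size=n)
z = 0.8 * x + rng.normal(size=n)   # unmeasured confounder
p = z + 0.5 * rng.normal(size=n)   # measurable proxy: z plus noise
y = 2 * x + 3 * z + rng.normal(size=n)

def coef_on_x(*predictors):
    X = np.column_stack([np.ones(n), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

print(coef_on_x(x))     # about 4.4: omitting z badly biases the estimate
print(coef_on_x(x, p))  # about 2.5: the noisy proxy removes most of the bias
print(coef_on_x(x, z))  # about 2.0: the unattainable ideal
```

Because the proxy measures the confounder with error, some bias remains; the more closely the proxy tracks the confounder, the more bias it removes.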

Finally, if you can’t correct omitted variable bias using any method, you can at least predict the direction of bias for your estimates. After identifying confounding variable candidates, you can estimate their theoretical correlations with the relevant variables and predict the direction of the bias—as we did with the bone density example.

If you aren’t careful, the hidden hazards of confounding variables and omitted variable bias can completely flip the results of your regression analysis!

