According to the Washington Post, a new study has suggested a link between caffeine and miscarriages. This is the kind of news that gets a lot of attention and potentially affects people’s behavior. Eighty-five percent of the population gets a daily caffeine fix and more than 10% of women aged 15–44 become pregnant per year. Any woman who is pregnant, trying to get pregnant, or potentially pregnant in the future will probably consider this an important story. The story was extra eye-catching because, as the headline noted, “Mom’s and dad’s (!) pre-pregnancy caffeine intake may affect miscarriage risk.”
This is an example of a story where people blur the difference between correlation and causation, and potentially make less-than-optimal behavioral changes as a consequence. Readers can easily be misled. The story is getting a lot of attention, and the reporting strongly nudges the reader toward the conclusion that caffeine consumption causes an increased risk of miscarriage. But the study merely shows that those who consume more than two caffeinated drinks a day have a higher rate of miscarriage, not that the caffeine causes the miscarriage.
The issue is that there could be many other reasons that caffeine consumers have more miscarriages. For example, they may, in general, have less healthy habits, which might not be so surprising when you consider that energy drinks and soda, along with coffee, were counted as caffeinated beverages in the study. Perhaps they exercise less or eat less healthily.
Maybe caffeine is horrible for you. Or maybe, as the Post noted from a government panel last year, “coffee could be part of a ‘healthy lifestyle’.” My point is that as we are inundated with information on important subjects that could be life-altering … or worthless, we need to look a little closer before we choose to make big behavioral changes based on what we read in the media.
University of Michigan Psychology Professor Richard Nisbett recently gave a great example on Edge.org of the mistakes we can make when we confuse correlation and causation. He described a correlation showing that men who take Vitamin E have a lower risk of prostate cancer. Men reading or hearing about this might say, “I should increase my intake of Vitamin E.” But then a controlled experiment was done, and it found that Vitamin E actually increases the chance of prostate cancer. How can both be true? A lot of these correlational studies are subject to a “healthy user bias,” because the person doing (or refraining from) one thing may also be doing a lot of other relevant things. In Nisbett’s description of Vitamin E, you can imagine that men taking Vitamin E are also watching their weight, exercising, and limiting alcohol and tobacco intake. “You pull one thing out of that correlate [Vitamin E] and it’s going to look like Vitamin E is terrific because it’s dragging all these other good things along with it.”
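You can see healthy user bias arise from nothing but arithmetic. The sketch below uses made-up numbers (not data from any real study) and a hypothetical hidden “health-consciousness” factor: it drives both vitamin-taking and lower baseline risk, while the vitamin itself slightly *raises* risk. The raw comparison still makes the vitamin look protective.

```python
import random

random.seed(42)

# Illustrative simulation only -- all probabilities are invented.
# A hidden trait (health consciousness) makes people both more likely
# to take the vitamin AND less likely to get cancer. The vitamin's
# true direct effect here is a small INCREASE in risk.

N = 100_000
takers = taker_cases = 0
nontakers = nontaker_cases = 0

for _ in range(N):
    health_conscious = random.random() < 0.5
    # Health-conscious people are far more likely to take the vitamin.
    takes_vitamin = random.random() < (0.8 if health_conscious else 0.1)
    # Baseline risk is much lower for the health-conscious
    # (exercise, diet, no smoking, ...).
    risk = 0.05 if health_conscious else 0.20
    # The vitamin itself adds a little risk in this toy model.
    if takes_vitamin:
        risk += 0.02
    cancer = random.random() < risk
    if takes_vitamin:
        takers += 1
        taker_cases += cancer
    else:
        nontakers += 1
        nontaker_cases += cancer

print(f"cancer rate among vitamin takers: {taker_cases / takers:.3f}")
print(f"cancer rate among non-takers:     {nontaker_cases / nontakers:.3f}")
```

Despite the vitamin being mildly harmful by construction, the takers come out with roughly half the cancer rate of the non-takers, because the hidden trait is “dragging all these other good things along with it.” Only an experiment that assigns the vitamin independently of the hidden trait can separate the two.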
This is not just nitpicking. Imagine that someone actively trying to reduce their risk of prostate cancer increases their Vitamin E intake based on the reporting of the population study. They are harming themselves rather than helping themselves, all based on the inference that correlation means causation.
Similarly, this kind of widespread publicity about such a correlation can lead readers into the same trap. For instance, the article concluded, “there was one piece of positive and practical news in the study: Women who took a daily multivitamin before and after conception appeared to greatly reduce miscarriage risk.” Yet that doesn’t mean that taking a multivitamin reduces your risk of miscarriage. It could be that the people taking multivitamins do a lot of other things and are generally healthier. We don’t really know; there’s no controlled study. All we have is the correlation being reported, based on journals filled out by 500 couples.
When you see a big correlation like that, it doesn’t necessarily mean you should change your behavior. In fact, the behavior change might be counter-productive, as in the Vitamin E case. That’s not to say that these correlations have no value in the scientific community: these correlations do spur further investigation.
The correlations themselves aren’t dangerous. The way these correlations are reported? Well, that might be.