Why You Shouldn't Care That Yogurt, Mouthwash, Red Meat, Burnt Toast, and Bras Have Been Linked to Cancer

I hate to break it to you, but almost everything in daily life has been linked to cancer: burnt toast, hot dogs, poor tooth brushing, you name it!

You now have two choices: panic or continue on with your day.

I recommend the latter.

Much of the health information you read online or hear on the morning news comes from observational studies -- scientists look at people who eat certain foods, or take certain drugs, or live certain lifestyles and see how their health compares with the health of people who don't do those things. Studies like these have revealed some disconcerting links: Women who eat yogurt at least once a month have twice the risk of ovarian cancer. People who drink coffee twice a day have double the risk of pancreatic cancer. Individuals with a "Type A" personality have more heart attacks.

There is, however, a general trend with observational studies: they have a very high chance of being flat-out wrong.

Twenty-seven years ago, a trio of researchers surveyed the epidemiological literature and found 56 health claims for which the observational studies were in direct conflict: on average, 2.4 studies supported each claim while 2.3 did not. Unsurprisingly, most of the claims were tied to cancer risk.

In 2011, statisticians S. Stanley Young and Alan Karr examined twelve randomized clinical trials that put the claims of 52 observational studies to the test. Most of the observational studies had reported that various vitamin supplements produce positive health outcomes. The superior clinical trials, however, disagreed.

"They all confirmed no claims in the direction of the observational claims," Young and Karr revealed in Significance Magazine. "We repeat that figure: 0 out of 52. To put it another way, 100% of the observational claims failed to replicate. In fact, five claims (9.6%) are statistically significant in the clinical trials in the opposite direction to the observational claim."

What has gone so wrong with observational studies? It wasn't always this way. In the past, epidemiologists used them to conclusively demonstrate the grave public health risks posed by smoking, which led to much-needed regulation and oversight. Observational studies also made plain the benefits of vaccines, drinking water fluoridation, and motor vehicle safety belts. So why do so many of today's observational claims fall apart? Several problems stand out.

First, the design of observational studies can be problematic; many rely on self-reported data about eating or lifestyle behaviors, for example. Second, any large study population comes with a host of potentially confounding variables that can muck up the results. Lastly, observational studies often "fish" for results. Young and Karr cited a study which found that women who eat breakfast cereal have more baby boys, a patently ridiculous result that makes no biological sense.

"The data set consisted of the gender of children of 740 mothers along with the results of a food questionnaire, not of breakfast cereal alone but of 133 different food items... Breakfast cereal... was one of the few foods of the 133 to give a positive."

You see, with the typical threshold of p < .05 used to declare a result "significant," there is roughly a 5% chance that a claim with no real effect behind it will come out significant just by luck. So if researchers test enough outcomes -- 133 foods, in this case -- a few are bound to hit the mark by chance alone.
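
To get a feel for how easily that happens, here is a toy simulation in Python (a minimal sketch with made-up 50/50 data, not the cereal study's actual questionnaire responses): 133 food items that have no real effect on a baby's sex, each tested at the usual p < .05 threshold.

```python
# A minimal sketch with simulated data (not the cereal study's):
# 133 food items that have NO real effect on a child's sex,
# each tested for a "link" at the usual p < 0.05 threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_mothers, n_foods, alpha = 740, 133, 0.05

# Each mother's child: 1 = boy, 0 = girl, 50/50 regardless of diet.
boys = rng.integers(0, 2, size=n_mothers)
# Each food: eaten (1) or not (0), independent of the child's sex.
diet = rng.integers(0, 2, size=(n_mothers, n_foods))

false_positives = 0
for j in range(n_foods):
    eaters = boys[diet[:, j] == 1]
    non_eaters = boys[diet[:, j] == 0]
    _, p_value = stats.ttest_ind(eaters, non_eaters)
    if p_value < alpha:
        false_positives += 1

print(f"{false_positives} of {n_foods} foods look 'significant' by luck alone")
# Expect roughly 133 * 0.05, i.e. six or seven spurious "links" per run.
```

In a typical run, roughly six or seven of the 133 foods come up "significant" purely by chance -- which is how an innocuous breakfast food can end up "linked" to baby boys.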

Also problematic is the nature of epidemiological studies themselves: they simply aren't good at teasing out subtle risks. A 3,000% increase in the risk of lung cancer from smoking is certainly genuine, but a 38% increased risk of breast cancer from occupational exposure to electromagnetic fields may be completely bogus.

"With epidemiology you can tell a little thing from a big thing. What's very hard to do is tell a little thing from nothing at all," Michael Thun, the former director of analytic epidemiology for the American Cancer Society told Science Magazine.

Young and Karr have a plan to fix observational studies. Their seven-step approach introduces peer review throughout the entire process, from data collection to analysis to the writing of the final report. They liken their approach to a product manufacturer maintaining quality control at key steps, rather than only after the product is finished.

If ever adopted, Young and Karr's method will hopefully restore some credibility to epidemiology, and, just maybe, the public will finally get to stop reading headlines that make tooth brushing out to be a life-or-death affair.

