Don’t Believe the Hype: Medical Studies Aren’t Always Based on Science

It’s human nature to want more for less. Who wouldn’t want to work out fewer hours a week while still losing weight, or drink copious amounts of wine while still being healthy? The media knows this. When a study is published that caters to most people’s innate affinity for laziness, it’s often pushed to the forefront. Unfortunately, what’s usually hidden behind the glossy headline is the fact that many studies are conducted on very small and often biased samples that don’t necessarily reflect the general public (like 10 healthy women in their 30s from Sweden or 15 athletic men in their 60s from New York). Researchers know this, of course, but with the tremendous pressure to secure funding and get published, they often feed into the media frenzy.

Worse yet, you might assume that the results of all new drug trials are automatically published, yet most of the time, negative or inconclusive findings go unreported. What does that mean for you — and worse, your doctor, who relies on medical journals for the latest health and pharmaceutical news? It means that if one study shows that drug X will help lower your cholesterol and four studies show that it won’t, odds are that only the study with the positive results will be published, leading doctors to prescribe something that in all likelihood won’t help patients. This phenomenon helps explain why, when researchers in 2003 looked back at 101 studies published in top scientific journals between 1979 and 1983, each claiming that a new therapy or medical technology was very promising, they found that only five of those therapies had made it to market within a decade and only one was still in extensive use in 2003.

For the most accurate picture of a specific topic, it’s important to view multiple studies together, as they can often contradict each other. Think about it. How often have we read that coffee is good for our health? How often have we read that coffee is bad for us? In isolation, most studies are meaningless — and in some cases, even dangerous.

British physician, academic and science writer Ben Goldacre gives an example. “So, ‘Red wine can help prevent breast cancer.’ This is a headline from the Daily Telegraph in the U.K. ‘A glass of red wine a day could help prevent breast cancer.’ So you go and find this paper, and what you find is it is a real piece of science. It is a description of the changes in one enzyme when you drip a chemical extracted from some red grape skin onto some cancer cells in a dish on a bench in a laboratory somewhere. And that’s a really useful thing to describe in a scientific paper, but on the question of your own personal risk of getting breast cancer if you drink red wine, it tells you absolutely bugger all. Actually, it turns out that your risk of breast cancer actually increases slightly with every amount of alcohol that you drink.”

In this age where many of us google instead of thinking for ourselves, it’s more important than ever to make a concerted effort to view information with a critical eye and remember that if it sounds too good to be true, it probably is.

[Vox, TED]
