I recently came across this headline on the Internet: “Study shows that bacon makes you live longer.” After having a good laugh, I tracked the claim down to its origin. Turns out, the study was performed on roundworms. Researchers found that when roundworms were given niacin (vitamin B3), they lived 10 percent longer than worms not given the vitamin.
It was a legitimate study. But some doofus read about it and made this mind-boggling leap: “When roundworms were given niacin, they lived 10 percent longer. Bacon contains niacin. Therefore, if I eat bacon, I’ll live longer.”
The logic would make Aristotle weep.
The author then put his flawed conclusion out there on the web, where it’s duping gullible people everywhere. Incidentally, niacin is found in scores of foods, many of which — including peanuts, tuna, brewer’s yeast, and liver — contain much more niacin than bacon. But the headline “Liver makes you live longer” just doesn’t carry quite the same buzz. And besides, it’s still flawed logic.
This example, though funny, illustrates how a legitimate study can be grossly misrepresented to the public. Even well-meaning journalists can mislead readers by omitting critical information about a study — for instance, that all the participants were overweight, or that the study included only eight people.
So how do you know what to look for, and what to believe, when you read about a new medical finding or health claim?
Start with a healthy dose of skepticism. If it sounds too good to be true (bacon anyone?), it probably is. The claim should be backed by research and published in a peer-reviewed journal, where experts scrutinize the study to ensure it is well designed and executed. Thousands of these journals exist, but a few well-known examples are JAMA (Journal of the American Medical Association), BMJ (British Medical Journal), The Lancet, and Diabetes.
Good studies will usually be randomized, controlled, and double-blind. These involve giving one group of subjects a treatment and comparing it to a group given a placebo. Subjects are assigned to the treatment or placebo group at random. Double-blind means that neither the scientists nor the participants know who is getting which treatment. This helps prevent bias when the data are collected and analyzed.
Placebos are dummy treatments, like sugar pills, given to the control group in place of the real treatment. Many people given the placebo, however, will experience some degree of healing. The placebo effect is so real that treatments or drugs being tested must prove they’re effective on their own merit by outperforming the placebo in trials.
A very different though equally legitimate type of study is the cohort study. These follow large groups of people over time. Scientists use cohort studies, for example, when they’re studying the long-term effects of smoking on various cancers and cannot ethically enlist people to smoke.

In many studies, scientists draw human inferences from rodent research. This may seem far-fetched, but humans are actually quite similar to mice and rats in genetic makeup, biology, and behavior. Also, rodents can be specifically bred to carry genes that mimic human conditions. Mice predisposed to diabetes, for example, are used to study treatments for that disease. Promising results from animal studies are usually followed up with human trials, when possible and ethical.
There are two other kinds of research you might hear about. One is a meta-analysis, which combines and analyzes all the known studies on a particular subject. Pooling many studies this way greatly increases the reliability of the findings. The other kind, clinical evidence, isn’t true research; it relies on the observations of doctors and other practitioners. Strong clinical evidence can often lead to more formal investigation.
Even when a study is well designed and published, its results can still be misinterpreted. Consider these factors when reading the latest health claim:
• Private industry funds a lot of research and can interpret and advertise results to its advantage. This happens frequently in the food industry.
• One study rarely provides the definitive word on anything, which is why meta-analysis is so valuable.
• If you read that a drug or supplement failed in a study, consider that: a) it could be universally true; b) it could be true for that study, but not for a dozen others; c) the dose might have been too low, used for too short a time, or used too late in the disease process; or d) the treatment may not have prevented the illness, but did reduce duration or severity of symptoms.
• Terms like “significantly,” “more,” and “less” are meaningless without actual numbers. Example: “Women eating soup before dinner lost more weight than…” Tell me how much more — half a pound a month? Three pounds a week?
Above all, use common sense and get your information from trusted sites like the National Institutes of Health, Centers for Disease Control and Prevention, Mayo Clinic, and WebMD. Avoid sites run by individuals, unless you’ve vetted their credentials. And then repeat this mantra: “Show me the research.”