The October, November and December issues of the newsletter included columns on interpreting scientific study results. Here are some take-home messages from that series to consider when reading a news report about a scientific finding.
Was there a control group?
Control groups are the best way to keep extraneous factors from influencing a finding. If you want to know whether ventilation prolongs life in ALS, you have to be sure all study participants have the same general characteristics with respect to stage of disease, medications and other factors. The only variable should be the ventilation they receive.
Were study participants randomly assigned to a group?
When all participants in a study have the same disease, they must each have an equal chance of being assigned to the placebo (sham medication) group or a treatment group (taking the medication being tested). If any factors influenced the group assignment, results could be biased. For instance, if healthier patients got the experimental drug and sicker ones the placebo, it could make the drug look more effective than it really is.
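For readers curious how "equal chance" assignment works in practice, it can be sketched in a few lines of Python. This is a simplified illustration, not the procedure any particular trial used, and the participant labels are invented:

```python
import random

def randomize(participants, seed=None):
    """Split participants into placebo and treatment groups so that
    each person has an equal chance of either assignment."""
    rng = random.Random(seed)       # seed only to make the example repeatable
    shuffled = list(participants)
    rng.shuffle(shuffled)           # chance, not health status, decides order
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]   # (placebo group, treatment group)

placebo, treatment = randomize(["P1", "P2", "P3", "P4", "P5", "P6"], seed=1)
```

Because the shuffle ignores everything about the participants, healthier and sicker patients end up spread evenly across both groups on average, which is exactly what guards against the bias described above.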
If you see an apparent disease cluster, is there a proposed causative factor that can be tested?
Disease clusters attract attention precisely because they seem too striking to be coincidence. But they aren’t necessarily meaningful.
People who live or work near each other may develop the same disease by chance. To test whether the cluster is a chance occurrence or not, researchers must pose a hypothesis about the cause of the disease that can be tested in multiple environments.
Are the results significant?
Whether a result is meaningful rests on whether it’s "statistically significant." If the probability of obtaining the observed result by chance alone is greater than 5 percent, the finding is considered "not significant."
Results that fail to reach significance can sometimes prompt researchers to change the way their studies are conducted. They may need a larger or longer study, or they may need to address a previously overlooked variable that influenced the results.
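To make the 5 percent rule concrete, here is a simplified, hypothetical example in Python. Suppose a treatment is tested on patients whose condition improves or doesn’t, and chance alone would produce improvement half the time; the calculation below asks how likely the observed number of improvements would be by chance. This is only a sketch of the idea, not the statistical test any particular study used:

```python
from math import comb

def p_value(successes, n, p_chance=0.5):
    """Probability of seeing at least `successes` improvements out of n
    patients if chance alone (probability p_chance each) were operating."""
    return sum(comb(n, k) * p_chance**k * (1 - p_chance)**(n - k)
               for k in range(successes, n + 1))

p = p_value(8, 10)          # 8 of 10 hypothetical patients improved
significant = p <= 0.05     # the 5 percent threshold described above
```

Here even 8 improvements out of 10 falls just short of significance (p is about 0.055), which illustrates why a small study that "fails" may simply need more participants rather than a different treatment.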
Does it make sense?
Common sense should prevail when reading about a study.
If three or four people say they benefited from a particular herbal supplement or electrical stimulation device, the finding probably doesn’t mean much. If hundreds of people say they benefited, it might mean something.
Of course, it’s always a good idea to consider the source of any information. A company that sells supplements or muscle stimulators is rarely objective about them. And when satisfied customers are recruited to spread the word about a product, you can rest assured that they were selected from a larger pool of customers, some of whom may have been far less enthusiastic.
MDA thanks biomedical statistician David Schoenfeld of Massachusetts General Hospital in Boston for co-authoring the series. A feature on media coverage of science and medical news is scheduled for the March-April issue of Quest, MDA’s bimonthly magazine.