Most researchers do not deliberately claim causal results in an observational study, but do we unintentionally lead readers to draw causal conclusions by explaining why significant correlations and relationships may exist? Here we report a randomized controlled experiment, conducted within a 2013 massive open online course teaching data analysis concepts, to test the hypothesis that explaining an analysis leads readers to interpret an inferential analysis as causal. We test this hypothesis with a single example of an observational study on the relationship between smoking and cancer. We show that adding an explanation to the description of an inferential analysis leads to a 15.2% increase in readers interpreting the analysis as causal (95% confidence interval for the difference in two proportions: 12.8%–17.5%). We then replicate this finding in a second large-scale massive open online course. Nearly every scientific study, regardless of design, includes an explanation for observed effects. Our results suggest that these explanations may mislead the audience of such data analyses, and that qualifying explanations could be a useful avenue for future research to counteract the problem. Our results also invite further research to broaden the scope of these findings beyond the single smoking–cancer example examined here.