Research that glisters isn’t always gold

It’s tempting to take a study with a good headline at face value, says Alex Quigley, but it’s important to dig deeper to assess the research’s real worth
18th September 2020, 12:01am

Do you remember the research that showed chocolate can help you lose weight? It came out in 2015, made headlines around the world and seemed too good to be true.

Guess what? It was. Journalist John Bohannon later explained that it was a scam: he had manipulated the results by using a small sample and carefully combing the data for desirable findings.
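Part of the trick is simple arithmetic: test enough outcome measures on a small sample with no real effect, and something will cross the conventional p < 0.05 threshold by chance alone. A minimal sketch of that multiple-comparisons effect (the outcome counts below are illustrative assumptions, not figures from this article):

```python
# If a study with no real effect tests k independent outcome measures,
# each has roughly a 5% chance of a spurious "significant" result at
# the p < 0.05 threshold. The chance that AT LEAST ONE outcome looks
# significant grows quickly with k.
def chance_of_false_positive(k: int, alpha: float = 0.05) -> float:
    """Probability of at least one false positive across k independent tests."""
    return 1 - (1 - alpha) ** k

for k in (1, 5, 18):
    rate = chance_of_false_positive(k)
    print(f"{k:2d} outcomes measured -> {rate:.0%} chance of a fluke 'finding'")
```

With a single outcome the false-positive risk stays at 5 per cent, but with 18 measured outcomes it climbs to around 60 per cent - better than a coin flip that the "study" finds something headline-worthy in pure noise.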

We should be shocked that we can be so easily led astray by research, but it actually happens all the time. And in education, it happens more often than we would like.

School teachers and leaders rarely have to look out for outright fake studies, but when you are busy and looking for best-fit advice, it is important to dig beneath the headlines of even the best available research evidence.

Here’s a good example. In 2018, a group of highly reputable researchers from Sussex University released a study on reading, entitled ‘Just Reading’: the impact of a faster pace of reading narratives on the comprehension of poorer adolescent readers in English classrooms. The researchers found that having Year 8 pupils read novels at a faster pace over the course of a school term led to improved reading outcomes.

The research findings are incredibly attractive: “Simply reading challenging, complex novels aloud and at a fast pace in each lesson repositioned ‘poorer readers’ as ‘good’ readers.”

The researchers had described exactly how I taught English. Like continuous chocolate eating, having your typical classroom practice confirmed by research is everything you want to hear. I was primed to sit back and celebrate.

But when you scrutinise beneath the headlines of the ‘Just Reading’ study, you’re forced to think harder about the seeming implications for the classroom.

First, there are some red flags to consider. The size of the study warrants consideration - the evidence is drawn from only 20 classrooms. Data from only 343 students can lead to skewed results, just like the small sample size in the chocolate study.

Second, every pupil improved, but without a “business as usual” control group for comparison, how do we know what caused the improvement?

And third, when you dig into the analysis of classroom practice, it shows up further anomalies. The premise of the study is that quickly reading two novels over a term makes a significant difference compared with one novel, but three of the 10 faster-reading teachers didn’t finish the second book.

As a profession, we need to ask more critical questions of the evidence. We need to look beyond single studies. We need to ask whether a study has been replicated. We need to find out whether there is a broader range of studies that we can refer to - or a meta-analysis (a study of multiple studies) we can read.

Most of all, we need to remember that if it sounds too good to be true, it probably isn’t true.

Alex Quigley is national content manager for the Education Endowment Foundation, a former teacher and the author of Closing the Reading Gap and Closing the Vocabulary Gap

This article originally appeared in the 18 September 2020 issue
