We once visited a particularly memorable school leader, who told us that “the evidence says that feedback works, so I tell my teachers to bloody get on with their marking and stop whingeing”.
This anecdote serves as a timely reminder that slapping on a label marked "evidence" doesn't actually guarantee good advice. Indeed, even the hokiest hogwash and the most blistering balderdash will come served with a side order of "researchers have said".
So how can you avoid the bunkum and keep your eye on the evidence ball?
Systematic reviews are your friend
These papers are produced by researchers who spend many months assembling every available piece of research on a topic, carefully categorising it by relevance and weighting it by quality. Finally, after many hundreds of hours of sifting, they identify common findings across the whole sweep of the literature. If you're interested in the evidence in a field, type the topic into Google along with the phrase "systematic review" and see what you get. For example, "teacher behaviour management systematic review" or "growth mindset systematic review" will throw up interesting overviews of the literature in each field.
Beware the list
Almost anyone with Google can assemble a long list of dubious references that appear to support their claim – just ask a "flat Earth" fanatic! Any list of references can look impressive, but it tells you nothing about the quality of the research contained and, worse, you don’t know what’s been ignored or specifically excluded from the list.
The less you know about a subject, the more prone you can be to getting overexcited, seeing simplistic certainty where the reality is much more complicated. This is the Dunning-Kruger effect, where someone exposed to a new idea hugely overestimates their expertise in the field. To counter it, be sure to look for opposing views. An easy approach is simply to tack the word "criticism" on to your search. For example, "growth mindset criticism" and "core knowledge curriculum criticism" are both searches that can raise helpful questions to bear in mind when critically appraising these ideas.
Remain cautious about single studies
The media loves a study, with "scientists discover…" headlines a surefire clickbait hit for many tabloids. However, no matter how exciting or dramatic the finding, a single study should rarely be taken in isolation as a definitive guide to what works. Instead, balance it against any existing systematic reviews in the field, or, even better, wait for a new systematic review to weigh up the evidence and deliver a fresh verdict. This is just as true for "new research shows [X] works" headlines as for "new research shows [X] has no impact" ones.
Once you've found the evidence, you can focus on embedding it, evaluating its impact and building staff members' practice through effective, responsive CPD.
Use these tools and the quality of your evidence-based CPD and decision-making can rocket, beating the bunkum and surfacing the sense.
Bridget Clay is head of programme for Leading Together at Teach First. She tweets @bridget89ec. David Weston is CEO of the Teacher Development Trust. He tweets @informed_edu
This column is based on their book, Unleashing Great Teaching, published by Routledge