'Because of the success of science, there is, I think, a kind of pseudo-science. Social science is an example of a science which is not a science; they don't do [things] scientifically; they follow the forms -- you gather data, you do so-and-so and so forth but they don't get any laws, they haven't found out anything...You see, I have the advantage of having found out how hard it is to get to really know something, how careful you have to be about checking the experiment, how easy it is to make mistakes and fool yourself. I know what it means to know something, and therefore I see how they get their information and I can't believe they know it...'
From "The Pleasure of Finding Things Out," by Richard P. Feynman, Penguin Books 1999
It was reported this week that children in Luton are enjoying the benefit of a new programme designed to boost literacy. It may surprise you to learn that this at no point involves the actual practising of reading, writing or even listening. Instead it is based around a fifteen-minute exercise programme that pupils are required to perform before they start their lessons and no, wait, please don't leave, it's all real I assure you.
The headline from the BBC website caught my eye: 'Luton Schools test exercise routine to boost learning.'
Now that is already itself quite a rare thing in the reporting of educational research - a sober headline. Something is being tested, it says cautiously, in service to plain fact and nothing more. Luckily we only have to wait for the first sentence before the acid drops and normal service is restored.
'A routine of exercises developed by an academic who had to re-learn how to read has improved pupils' reading, writing and arithmetic.'
There it is, right there: 'has improved'. By implication, this has been proven, and it is known as well as we know anything. So how have we arrived at this understanding? Because, if that claim is true, then we need to be getting every sooty urchin out into the playground and rubbing their tummies before literacy hour, double-quick.
Dr Elizabeth McClelland, an Oxford academic, developed the programme after her own experience of re-learning how to read; inspired by her success, she started Move4Words, an organisation that promotes the programme. The claims certainly are impressive:
Their improvement in the number of children achieving the expected level (Level 4+) in English and Maths was three times the increase achieved by eight matched comparison schools....Average rates of progress were "outstanding" after Move4words, for 92 children (all those whose tracking data has been provided to us) from schools in relatively deprived areas.
There's a good deal more in this vein. There is no question whatsoever about the work, character, intentions or motivation of either Dr McClelland or her organisation. She seems, like most people in our sector, to be motivated by nothing less than an absolute desire to improve the lot of children, and to add to the sum of their future capital. My issue is with the conclusions that can be drawn from the data provided.
Dr McClelland's experiences are edifying; to drag oneself out of a reading disability is inspirational. But my concern is the rapidity with which opinion becomes advocacy, and then policy in education. A theory is born; it appeals to someone with a budget; confirmation data is sought, and obtained; the theory multiplies, fed on a cash agar; it replicates, again and again, until it occupies the host.
What are the main problems with the research behind programmes like these?
1. The dangers of cognitive bias - 1
There are scores of these, all well known in science. For example, the Hawthorne effect, where the subjects of an experiment modify their behaviour because they know they are taking part in one.
Put simply, if you tell people, especially children, that they're taking part in something designed to improve something as subjectively interpreted as 'concentration' they are usually disproportionately inclined to agree that the intervention has had the stated effect. This is why it's so important to test subjects blind whenever new medicines are under investigation.
2. The dangers of cognitive bias - 2
That's just the kids. The fact that the teachers are all aware of what the intervention is supposed to achieve makes their own delivery and subsequent interpretation of its effect very susceptible to subconscious bias.
The desire to succeed, to not be seen to fail, to assist the children in a brand new school initiative, is powerful and seductive. It is interesting that in many of these programmes, the same people who administer the intervention also collect the data that tends to confirm the expected outcome. Teacher assessment is particularly prone to this danger.
3. The lack of an effective control
The data I could see from the Move4Words programme compares the intervention schools with non-intervention schools, with children at similar stages in their school careers, and that's how the impressive results mentioned above are obtained.
'20% more 11-yr-old children reached the expected level (Level 4+) in both English and Maths in KS2 SATs when they followed the Move4words programme in the Spring and Summer term of Year 6, compared to the same schools’ performance in the previous three years (8 schools, 235 children)'
So we have a comparison between one set of pupils' tracked progress, which maps favourably against a different set of students from previous year groups. In other words it's comparing two entirely different sets of pupils. That concerns me. How is one to say that the improvement is down to the intervention and not the participants or teachers? In fact, taking that further, how is one to know that any effect obtained is due to the exercise intervention? Further still, how do we know that the comparison schools where Move4Words wasn't used, weren't doing other interventions?
As far as controls go, I believe that we shouldn't compare intervention schools with non-intervention schools. Why? Because almost anything you do has a positive effect. If you introduce a programme into a school, however speculative, the chances are it will have a temporary effect. Why? Back to the biases again: people tend to create the effects they expect.
So a better control, to my understanding, would be an intervention school set against another school with a different intervention. I have a further concern that in experiments involving people, good controls might be impossible to obtain, simply because it is never possible to compare two exactly identical individuals in exactly the same circumstance. Human psychology is simply too complex, too dense, to permit reduction.
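The cohort-comparison worry above can be put in numbers. The sketch below is not based on the Move4Words data; it just simulates test scores for year groups drawn from one and the same ability distribution (all figures hypothetical), so by construction there is no intervention effect at all, and then measures how big the 'this year vs previous years' gap in pass rates looks anyway:

```python
import random

random.seed(1)

THRESHOLD = 100  # hypothetical 'expected level' cut-off score

def cohort_pass_rate(n):
    """Percentage of n simulated pupils reaching THRESHOLD, all drawn
    from one fixed ability distribution (no intervention effect)."""
    scores = [random.gauss(100, 15) for _ in range(n)]
    return 100 * sum(s >= THRESHOLD for s in scores) / n

# Repeat the 'this cohort vs a previous cohort' comparison many times.
# Every gap here is pure cohort-to-cohort noise, yet with class-sized
# samples, gaps of several percentage points are entirely routine.
gaps = [abs(cohort_pass_rate(30) - cohort_pass_rate(30))
        for _ in range(1000)]
print(f"median chance gap: {sorted(gaps)[500]:.1f} percentage points")
```

With classes of thirty, the median gap between two identical-on-average cohorts comes out at several percentage points, which is why comparing this year's pupils against last year's tells you very little on its own.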
Like I say, I'm sure the people working on this are absolutely genuine and dedicated. I have no evidence to say that Brain Gym, or Move4Words, or any other exercise programme doesn't have this effect. It may well be that rubbing one's brain buttons assists focus, concentration, or a hundred other effects. But I don't think this can be shown from the data here. Of course, I could be wrong.
The danger here is that enormous baskets of cash are often lit like Chinese Lanterns and waved goodbye into the night sky. Time is wasted when kids could be doing something that works, as opposed to what we'd merely like to work. The question I always ask of people who like to chuckle over the brainless educational research of yesterday, like VAK learning styles, is 'What are we falling for now? What's this year's Brain Gym?' That's often a much harder question to answer.
Feedback on the project
'Pupils at nine primary schools in Luton are trialling a radical new physical activity programme that can drastically improve their reading ability'
That's also quite an endorsement. This is no small experiment; this is going mainstream. But it's incumbent on us, I think, to be very careful what we spend taxpayers' contributions on. The EEF, for example, is funding a similar programme by Primary Movement to the tune of £209,000.
I'll leave you with a quote from John Rack, head of research and professional development at Dyslexia Action, who might have a thing or two to say about this matter. Here's what he had to say in 2012 about Dr McClelland's work:
"I would say to do these things with an open mind, because in almost all cases we don't actually know what the active ingredient is... We certainly know there is room for innovation. But the track record shows that, basically, if you want to get better at reading, you need to train in reading. It would be nice to come up with a fix for the underlying problem, but so far nothing has stood rigorous investigation."