One of the greatest illusions we cling to is that there are simple solutions to complex problems. How perfect it would be if we could say a magic word and, "Shazam!", all our cares would dissolve.
The world of education is full of moonshine. Normally, such initiatives attempt to justify themselves with reference to research - "Here's the science bit", they implore. However, anyone with any training in scientific method could poke a hole in at least half of what pours down on schools. That does not stop institutions trumpeting their latest successes and drawing spurious links between stimulus A and outcome B.
I read an interesting article recently in TES about a school I will not name that claimed psychometrically testing pupils had resulted in a Damascene conversion in their behaviour.
The school took two groups, 60 pupils in total, from Year 10 who had a history of either disruption or underachievement; it then spun them through a set of assessments that tested their personality types, processing speed and emotional intelligence. After the tests, teachers "explained the results" to pupils, and "encouraged them to think about how they behave".
Here's the peach: before the project, temporary exclusions within the group numbered 11, and classroom removals stood at 47; a year later, those numbers were "one and three, respectively". Jackpot! Ladies and gentlemen, you can put down the stun batons and tranq darts. We've cracked it.
Now, I have no doubt that the school was well-intentioned. But I smell a science rat for the following reasons:
1. Sample size. Sixty pupils, over a couple of years. This isn't terribly large, is it? The problem is you could take a different set of 60 pupils and get wildly different results. That's one of the difficulties in drawing universal conclusions from singular experiences; it's like believing that because I was once bitten by a dog, all dogs bite.
2. Alternative explanations. It would appear that the children accumulated more behavioural black marks in Year 9 than in Year 11. You'd almost think that Year 9s were somehow more prone to poor behaviour... Oh yes, they are, and that is a fact, universally acknowledged. Year 11s knuckle down for the exams; they grow up; perhaps, just perhaps, they even respond to the behavioural policies of the school. It is simply not possible to claim that intervention A caused outcome B.
3. Unclear explanatory mechanisms. If the tests are the catalyst for the behaviour, how did they actually work? Did the pupils suddenly realise, "Holy smoke, I'm a level 3 extrovert with a tendency to impulsivity, I'd better sharpen up"? To posit that a psychometric test would by itself modify behaviour seems a bizarre claim.
4. Bias. It's very easy to succumb to confirmation bias, the tendency to seek out data that confirm one's hypothesis. It's one of the reasons why the natural sciences use controls and blind their trials. Also, when people know they are part of an experiment, they tend to believe that the stimulus, even if it is a sugar pill, is having an effect. The pupils are taking part in a programme, with concomitantly increased levels of interest shown in them, and scrutiny, and teacher time. In my experience, quite a lot of children would react favourably to this attention.
It's not that I think psychometric testing isn't a valuable experience (although it isn't) or that it necessarily has no power to affect behaviour (although I think it's unlikely). But concluding that the tests improve behaviour just does not follow from the data. In fact, one might say that the only way someone would conclude that A caused B in this case would be if they had some kind of vested interest in the outcome.
Perhaps the governor of the school that sponsored the experiment - the same governor who works at a company that "provides psychometric tests to business leaders and wants to expand into schools" - knows how this idea got off the ground. I imagine it was just a happy accident.
I have a theory of my own. Behaviour management is simpler than you think: it takes high expectations; rigour; clear boundaries; a strong, professional hierarchy that works together to enforce those boundaries; escalated sanctions; communication with pupils. The problem is, those are not the sexy, easy answers schools want - they want to hear about a programme or a project or a product they can buy, and to wish away the difficulties of teaching. Well, we can't.
It doesn't need sorcery dressed in the gown of science. Just hard work and giving a damn.