Every week Tom Bennett will be shouting at the laptop about some damn fool idea in education, or else he'll be writing about classrooms, students, or why teaching is the most important job in the world. This week, Tom questions the validity of some educational research.
Many of the more brainless ideas trialled in education in the last few decades have been supported by research of apparently impeccable pedigree. Unfortunately it often doesn't take a lot of poking to realise that some research is better than other research. Here are the most common problems I find in social science research:
Poorly phrased hypotheses or titles
This is when you have a title that nearly reduces you to tears of mirth or sadness, depending on how strong your stomach is: `How does emotional intelligence best increase performance in postgraduate studies?' `Using brain gym as a tool to promote multiple intelligences,' that kind of thing. Papers loaded with so many assumptions and presumptions that you would need a crowbar to separate them all from each other. This is why physical scientists weep when they look at these kinds of papers. If they tried to get funding for `The causal relationship between fairies and dream catchers,' there would be a riot in the Sorbonne. But some social scientists launch into their grand expeditions with, it seems, not a care in the world.
Papers so obviously designed to prove their point that the reader feels clobbered if they presume to disagree
This is one of the most common errors. `Research from the Academy of Flutes,' the article will start, `shows that flute usage, or flutage, adds on average two grades to a pupil's GCSE outcomes,' and so on. `We asked 200 Cambridge professors from the University flute society if they felt that flute playing was useful to their overall well-being. 110% said yes,' and so on.
Research that is unfalsifiable
Also common. Claims that `capitalism is inevitable, but so is Communism' may impress them in the shipyards, Mr Marx, but there's no way of showing this to be false, unless you want to wait until the end of time and watch every civilisation that ever rises and falls.
Analysis that reaches past the data
This is the work of the devil: when a paper takes the opinions of, say, 100 school children and presents them as a fait accompli, as if these opinions were representative of the whole population. Or worse, when it then claims this evidence shows that `children are not being listened to enough' and so on. Which leads me to my next, and least favourite, social science trope:
Mistaking facts and values
If I interview 100 teachers and find out that 90% of them would like longer summer holidays, that is a fact - that 90% of them think that. For a paper to then suggest that `this means teachers should have longer holidays' is an absurd leap from fact to value; the writer is peddling the latter, which is fine - have all the opinions you want - but don't dress them up as research. I think that dogs should shut the hell up when I'm trying to sleep, but I haven't kept a dream diary to back this up.
Social science is often not science. It is investigation; it is commentary; it often illuminates human affairs and provides valuable light and guidance in them. What it does not offer is reliable predictive power, or irrefutable explanatory mechanisms for the processes it describes. It offers commentary, case studies, opinion, and subjective analysis.
And that's fine, as long as we don't conflate the two. Why does it matter?
Lots of reasons. One is the quality of some of the research itself. Leslie K John of Harvard Business School said that `a third of all academic psychologists admit to questionable research practices' - for example, stopping data collection at convenient points, once the desired results had been found, or omitting other tested variables. One third! No wonder it gets a bad rep. A variety of scandals uncovered in social science research show how problematic this is for the integrity of the whole subject. Note how difficult it is for scientists to duplicate, and therefore test, the claims made by previous researchers in social science. That makes the open release of all data even more important; when it isn't done, researchers can, well, simply make it all up - and then you can say anything, and science is dead.
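To see why `stopping data collection at convenient points' is a sin and not a shortcut, here is a minimal simulation sketch (my own illustration, not anything from John's study - all names and numbers in it are invented for the demo). It assumes there is genuinely no effect, then compares an honest researcher who tests once at the end against one who peeks after every ten data points and stops the moment the result looks `significant':

```python
import math
import random

def z_significant(samples):
    """Two-sided z-test of mean 0 with known sd 1: |z| > 1.96 means p < .05."""
    n = len(samples)
    z = (sum(samples) / n) * math.sqrt(n)
    return abs(z) > 1.96

def false_positive_rate(runs=2000, max_n=100, peek_every=10, peeking=True, seed=0):
    """Fraction of runs declared 'significant' even though the null is true."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(runs):
        data = []
        found = False
        for i in range(max_n):
            data.append(rng.gauss(0.0, 1.0))  # null is true: there is no effect
            if peeking and (i + 1) % peek_every == 0 and z_significant(data):
                found = True
                break  # stop collecting the moment the result looks good
        if not peeking:
            found = z_significant(data)  # honest version: one test, at the end
        hits += found
    return hits / runs
```

Running both versions shows the trick: the honest tester's false positive rate sits near the advertised 5%, while the peeking tester's is several times higher - same data-generating process, same test, but the convenient stopping rule manufactures `findings' out of pure noise.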
And that barely scratches the surface. Next time you read a piece of educational research, see how many of these sins it commits.