A little while ago, my Giraffe arrived. No, that's wrong...perhaps it was the Tiger. No, wait...I've got it...the Panda, that document full of statistics that supposedly shows how your school is doing compared with everyone else's. But does it really? And do schools find it useful?
Statistics have become ingrained in our schools and the younger teachers on my staff find it astonishing that throughout my own early career nobody ever came and measured anything much. Today, from the moment a prospective teacher starts training, he or she is taught to prepare, analyse, quantify, justify and annotate every little detail. I'm often surprised that students on teaching practice find time to teach anything.
The supervisor of my current student teachers recently faxed her observations on the lessons she'd watched them teach. The notes were full of fashionable phrases: the focus of the lesson and its related strands; differentiating; managing performance and behaviour; referencing the plenary session. After reading them I realised I hadn't a clue what had been taught or what the children had learned. When she next came, I chided her about the sheets. She smiled wryly and said: "I'm afraid that's how we have to do it these days." I'm afraid I remain unconvinced.
In the same way, I remain unconvinced about the usefulness of my Panda. Like most heads, I can quote my SATs results from the past four years almost without thinking because, sadly, so much seems to hang on them. We'd done pretty well last year and I wasn't going to bother ploughing through the Panda, but as I went to drop it in a drawer, it fell open at a page with a summary in bold text that said our performance in mathematics was below the national average.
I thumbed further - and there was more - also in bold type. The percentage of children reaching level 3 was well below the national average and the performance of pupils in core subjects fell below the national average by the equivalent of 0.1 points.
By picking up the Panda and thumbing through it quickly, I did what any governor or parent might do to gain a quick overall impression - and it seemed to make pretty dismal reading. And yet our results had been very good. It didn't make sense. I decided that the fine-toothed comb would have to be brought out after all, and after poring over this document all Friday evening, I realised the Panda still wasn't making any sense.
What, for example, did "falling below by the equivalent of 0.1 points" actually mean? It appears that an arbitrary number of "points" is allocated for levels reached, with no explanation for why the number has been chosen. 0.1 is a tiny amount, and, in the Panda's own documentation, not to be relied upon. To the casual observer, therefore, "falling below the national average" in heavy type can give quite a damaging picture of a school. Near the back of the document, and therefore much less likely to be read, there is a page telling me that in all core subjects, in comparison with similar schools, we are well above average, and therefore really doing very well.
The document makes much of "trends". This kind of information is flawed at best, and, in the case of schools like ours with a rapidly changing population, it becomes totally unreliable. The children in one of the Year 6 classes in 1999, for example, had changed enormously since Year 2. More than half of them were not at the school four years previously, and, academically, most of the new ones were far less able. Ofsted teams seize upon these trends, however; it's easier to try to prove a "trend" than do an in-depth analysis of what's really happening at the school.
Statistics, it is often said, can be made to prove anything. Certainly, in education at the moment, we are at the mercy of the statisticians and their reams of paper. My Panda contained 44 pages and every school in the UK gets one. That's a lot of paper. I'm afraid you'll have to do the arithmetic; mine is below the national average.
Mike Kent is head of Comber Grove primary school, Camberwell, south London. E-mail: email@example.com