"Heads crash and burn" and "Children lose sleep over tests". It's beginning to feel like meltdown. Two of the most important features of successful schools - happy headteachers and happy schoolchildren - are starting to unravel before our eyes. From the headteachers' point of view, the message couldn't be clearer. In various studies, questionnaires and research projects they identify the same problems again and again: punishing league tables, the threat of Ofsted, a constant turnover of new initiatives and a mountainous workload.
From the child's point of view, there is the relentless testing and levelling from the moment they enter school. Small children don't like this. They cry and have nightmares. Bigger children don't like it. They become troublesome and rebel. But the two points of view are inextricably linked.
Headteachers are often in the untenable position of running schools that really do not best suit the interests of their children. They are under constant pressure to jockey for position in tables they know to be meaningless, and to come under harsh scrutiny through national data they know to be flawed. And ultimately, the most demoralising thing of all is knowing that they are sometimes working, flat out, on all the wrong things.
They are on some kind of moral collision course with the very people they are employed to protect. They know it isn't right to put children under stress, yet they cannot avoid it. They themselves are caught up in the relentless pursuit of statistical improvement.
Contextualised value-added scores (CVAs) are Ofsted's new way of judging a school's success before inspectors even set foot in the building. Despite widespread mistrust of the figures among school leaders, data remains central to sealing their fate. It is claimed that the new CVAs are more sophisticated and therefore, by design, more complicated. This in itself is worrying. Performance and assessment reports, for example, increasingly have that "spilled cup of coffee" look about them. Headteacher burn-out? Maybe it begins with looking at a mess of statistics and wishing that somewhere you could cut to the quick and find the bottom line: "You're sunk, mate" or "Last year, your cohorts of waifs and strays came good".
Listening to a statistician explaining school data analysis can feel like your very life-blood draining away. When you hear phrases like "univariate analysis", "confidence intervals" and "measures of uncertainty" you can't be blamed for wondering if you're in entirely the wrong profession.
Recently, I attended such a session. Picture the scene: the statistician, an eager young man in a sharp suit, removes his jacket halfway through as he warms to his PowerPoint presentation of extremely complicated graphs. He fails to pick up the growing dismay in his audience as he suggests with enthusiasm that they are now working on better ways of defining deprivation and grouping similar schools. On he goes happily layering graph upon graph, innocently claiming that the old Panda was flawed. He is genuinely surprised at the rising tide of anger around him and finally a little alarmed as the audience of responsible school leaders begins to behave like a lynch mob.
One head shrieks: "Flawed? But Ofsted has hung, drawn and quartered us on these figures!" Are we any more confident that CVA scores are going to be more accurate? And shouldn't we continue to question a fundamental weakness of such data analysis in the first place? Statistics based on performance at key stages 1, 2 and 3 in only three subjects are essentially a crude and narrow way of describing the parameters of a child's achievement. Add to this the number-crunching exercise that takes into account indices of social deprivation (notoriously difficult to quantify) and you end up with your school coming out the other end with a very dubious judgement, and a whole new set of blotches on a piece of paper.
Heads continue to be faced with "evidence" that they profoundly mistrust and find difficult to challenge. Despite repeated complaints and resistance from headteachers' unions, the Department for Education and Skills continues to deny reason and insists that this data is a key barometer that will measure a school's success. Even the tests themselves have been challenged at the highest level, by Ken Boston, chief executive of the Qualifications and Curriculum Authority. He has publicly stated that they should be scrapped.
Again, the DfES has swept aside his views, describing the tests as a reliable, objective and consistent method of measuring achievement. Many children have become disenchanted learners in schools that are under constant pressure to perform on the narrowest of playing fields. Fine young minds and fine dedicated leaders are under fire.
We're testing the wrong things, defining success in very limited ways, and ultimately we expect headteachers to commit to a system that strikes at the heart of their young learners.
Mix suspicion and despair and you get burn-out or, more simply, "I'm out".
Out, as in out of here. And what you're left with is a chronic shortage of school leaders, and rudderless schools. What's so surprising is that we are surprised at all.
Lindy Barclay is assistant headteacher at Redbridge community school, Southampton