The bad news arrived one Thursday morning in February.
In front of Richard Brown was a document which called into question all the hard-won improvements made at his school over the past seven years.
Minsthorpe Community College, a sprawling 1,800-pupil comprehensive in the former coalfields of West Yorkshire, had been slowly winning itself a reputation. It was oversubscribed, and its last Ofsted inspection had been glowing.
Crucially, results were improving, both at key stage 3 and at GCSE, to the extent that the specialist science college had been in the Specialist Schools and Academies Trust's "most improved club" for the past two years.
Yet the document, an Ofsted pre-inspection briefing, suggested that Minsthorpe would struggle to emerge with anything better than a middling verdict from its latest encounter with the inspectorate.
This came even though the school had judged itself to be good with some outstanding features.
The source of concern was the "contextual value-added" (CVA) score given to the school in the briefing.
CVA is emerging as the key to success or failure in inspections. Your school's future may depend on it. But what happens if you think it is wrong, or if other performance measures paint a better picture? And if something does go wrong, is it possible to pin down, from the data, where your school's weaknesses lie? The experience of Mr Brown at Minsthorpe is not encouraging.
On Ofsted's KS2-4 CVA, Minsthorpe rated 51. Schools faring best under CVA have a rating of 1, meaning they are in the top 1 per cent in England.
Minsthorpe's score meant it was achieving just below average in terms of the progress its pupils make, given their level of disadvantage.
Yet alternative data suggests that Mr Brown's school is among the most effective in the country.
A rival contextual value-added scoring system puts Minsthorpe in the top 17 per cent in England at KS2-4, in the top 30 per cent at KS2-3, and in the top 10 per cent at KS3-4.
And these are not back-of-the-envelope statistics. They come from the Fischer Family Trust, a not-for-profit data analysis company whose systems are used by schools across England to set their pupils targets and identify areas for improvement. The trust is endorsed by the Government. On other measures, such as Durham University's Yellis (Year 11 Information System) assessments, Minsthorpe is also ahead of the game.
Unfortunately for Mr Brown, this alternative evidence, and what the five inspectors saw in observations of 22 lessons carried out in just over a day at the school, was not enough to convince them to change the verdict suggested by CVA. Minsthorpe was judged "satisfactory, with some good and exemplary features".
Mr Brown admits that this does not mean failure. But he said it had profoundly disheartened staff. He said: "It's a downer. It knocks your confidence. We take [the judgement] on board, but ... we know the school is better than the inspectors are telling us it is."
One in 25 secondaries is in Minsthorpe's position, where the Ofsted CVA and the Fischer Family Trust alternative CVA measure are at odds. Yet it seems impossible to understand why schools fare better using one analysis than another.
Mr Brown is still mystified. His local authority cannot tell him. The Fischer Family Trust has no definitive answers. And Ofsted is similarly flummoxed.
The only reason he can come up with is that the trust's data acknowledges that the college is dealing with higher levels of disadvantage than those assumed by the CVA.
The CVA, he believes, wrongly includes more prosperous areas in the surrounding countryside in the school's catchment area. This means its pupils get less credit for the progress they make. But this is just a hunch.
Given this, would it not be prudent for the inspectorate to embrace a more cautious use of the CVA?
Even though Ofsted recently advised its inspectors against hanging their entire judgement on CVA, evidence from this inspection, and others, suggests their verdicts are being determined by it.
Ofsted's final verdict criticised Minsthorpe for "overestimating" its strengths in its self-evaluation, but made no mention of the fact that Fischer Family Trust data supported the school's verdict.
Mr Brown said the inspector gave him the impression that CVA made it very difficult for the school to get a "good" verdict. And this is borne out by the pre-inspection briefing the school received, which shows how CVA was used to question Minsthorpe's self-evaluation.
Mr Brown said: "We felt that the CVA influenced the way the inspectors approached the college. We felt we were always trying to convince them we were better than satisfactory."
Training materials for inspectors show how other criteria are used in judging a school's results, including comparing its "raw score" test and exam results against the national average. But CVA, which is a much more sophisticated indicator, appears to be the key.
Ofsted said this week that inspectors have been advised not to base their judgements entirely on data. It would not say whether its own CVA measure was a more accurate guide to school quality than the trust's, but said its own was a common data system used by all schools.
Mr Brown insists he does not want to be seen as a victim, but just wants to get to the bottom of why the CVA was so mediocre, and why the school is no longer judged such a success.
The answers he gets could give a clue to how schools across the country are treated in the future.
1 Ofsted contextual value-added, KS2-4, 2003-05
Percentile ranking (1 is top):
2003: 95 (well below average)
2004: 89 (well below average)
2005: 51 (average)

2 Fischer Family Trust contextual value-added, KS2-4, 2003-05
Percentile ranking (1 is top):
2003: 26
2004: 19
2005: 17
All significantly higher than expected

3 Yellis (Year 11 Information System, from Durham University)
Prediction of the proportion of pupils gaining five or more A*-C grades at GCSE in 2005, based on tests taken by Year 10 pupils in 2003: 33 per cent
Actual achievement of those pupils: 63 per cent