To most British teachers the idea of value-added results would be utterly unremarkable. They would also be unsurprised by the notion that such results can demonstrate the difference good teaching can make to pupils from deprived backgrounds and can therefore, sometimes, be fairer.
But when an academic in Ontario tried out a value-added measure on elementary schools, the local school boards and teachers' unions poured scorn on it.
Professor David Johnson's study, Ontario's Best Public Schools, 2005-06 to 2007-08, came to the innocuous conclusion: "Good teaching matters. We can detect it."
To factor out pupils' backgrounds, Professor Johnson built a composite model of each school's student body, considering parents' education, pupils' mother tongue and the proportion of pupils born outside Canada.
He then compared the school's actual results on the standard reading and English tests with how this modelled student body would be expected to perform.
This meant that some apparently weak schools were shown to have done much better than expected with their students, while some good schools, with advantaged intakes, were shown to have coasted.
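The logic of such a value-added measure can be sketched in a few lines: regress school results on a demographic composite, then treat each school's residual as its "value added". This is only an illustration of the general idea, not Johnson's actual model; the composite variable, scores and all numbers below are invented.

```python
# Toy value-added sketch: fit expected score from a demographic
# composite, then take actual minus expected as "value added".
# All data here are invented for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical composite: e.g. share of parents with post-secondary
# education, per school (invented values).
composite = [0.2, 0.4, 0.5, 0.7, 0.9]
scores = [55, 60, 70, 72, 80]  # % of pupils reaching the standard (invented)

a, b = fit_line(composite, scores)
value_added = [y - (a + b * x) for x, y in zip(composite, scores)]
# A positive residual means the school beat its demographic prediction -
# the "apparently weak school doing better than expected" case above.
```

By construction the residuals average to zero across schools, so the measure ranks schools against each other after netting out intake, which is exactly what makes advantaged "coasting" schools visible.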
UK teachers' unions have stressed the importance of alternatives to raw scores, but the Elementary Teachers' Federation of Ontario was unimpressed.
"I'd give it a failing grade," said its president. "Quite frankly, I find it insulting to make a comparison based on test scores from a multiple-choice test taken on one day in two different grades.
"Doing so can only demoralise students, teachers and administrators. Differences between schools, between two higher socio-economic or two lower socio-economic schools, cannot be reduced to one simple thing - teaching - because there are a myriad of other socio-economic and other factors operating in different schools."
Professor Johnson used three years of rolling data for each school to avoid the educational equivalent of the rogue poll.
"You want your sample size to be big enough," he said. "In small schools there may be only one class of 30 11-year-olds. If that year a teacher has three slower students that class will likely perform 10 percentage points lower on the tests than predicted."
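The sample-size point is straightforward to demonstrate: a single cohort of 30 pupils produces year-to-year pass rates roughly twice as volatile as a cohort of 120, purely by chance. The simulation below is a toy illustration, not part of the study; the 67 per cent pass probability and cohort sizes are illustrative.

```python
# Toy simulation of cohort-size noise: small schools' single-year
# results swing more than larger schools', purely by chance.
import random
import statistics

def simulated_pass_rates(n_pupils, p_pass, trials, rng):
    """One cohort's pass rate, repeated over many simulated 'years'."""
    return [
        sum(rng.random() < p_pass for _ in range(n_pupils)) / n_pupils
        for _ in range(trials)
    ]

rng = random.Random(42)
small = simulated_pass_rates(30, 0.67, 2000, rng)   # one class of 30
large = simulated_pass_rates(120, 0.67, 2000, rng)  # four classes

# The standard error of a pass rate scales as 1/sqrt(n), so the
# 30-pupil school is about twice as variable as the 120-pupil one.
print(statistics.stdev(small), statistics.stdev(large))
```

Averaging three years of data, as Johnson did, shrinks this noise the same way a larger cohort does, which is why a single year can throw up the "rogue poll" result he wanted to avoid.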
But this use of longer-term data was what irritated the Ontario Public Schools Boards' Association (OPSBA).
It complained that using data stretching back to 2005 misrepresented how students were doing, because performance on the tests conducted by the Education Quality and Accountability Office had improved over that period. In 2004-05, for example, 59 per cent of the province's 12-year-olds achieved the mandated level 3 or better in writing; by 2008-09 the figure was 67 per cent.
Are the teachers' unions and school boards simply unable to accept that good teaching can be detected and that it matters in how students perform? Not quite: Ontario's unions and OPSBA hold awards ceremonies for teachers every year.