The influential Pisa international education survey has been compromised by British heads who fail to answer questions honestly or accurately, a new academic study suggests.
The researchers believe this is because some head teachers either do not have the time to fill in questionnaires for the programme properly or have been trying to portray their schools in an overly flattering light.
Their findings have worrying implications because the background data submitted by heads is vital in providing context for Pisa test results and calculating many of the wider findings of the Pisa report, used by governments around the world to help set education policy.
Professor Jörg Blasius from Bonn University, in Germany, said: “In the UK you have some impression management. There are principals who are claiming ‘my school is very good, everything is perfect’, whether it is true or not.
“This is a problem for the data, because they are more positive [than in other countries].”
But the Organisation for Economic Cooperation and Development (OECD), which runs Pisa, insists that the data it uses is “high quality”.
The news comes less than three weeks before the next edition of Pisa, published every three years, is released.
The latest study opens up criticism on another front by looking in detail at some of the actual data gathered for Pisa.
The academics examined the school questionnaire answers from 71 of the 74 countries that participated in Pisa 2009 and could only find 16 countries where the data they examined appeared to be of “high quality”.
The UK was one of the ten countries identified by Professor Blasius as having data from the questionnaires that was particularly “questionable”.
His study looked at three multiple-choice sections of the questionnaire. The first section, headed ‘school climate’, includes questions on potential problems such as student disruption in classes, teacher absenteeism, bullying and poor student attendance.
It asks principals to tick one of four boxes to show how much these hinder student learning, ranging from “not at all” to “a lot”. Of the 445 UK heads who completed the section, 17 ticked the “very little” box for all 13 questions.
Another 16 heads deviated very slightly from that pattern with half ticking the “to some extent” box when asked about staff resisting change, and the other half ticking “not at all” when asked about students using alcohol or illegal drugs. A further six UK heads marked all items with “not at all”.
The questionnaires are supposed to be anonymous but Professor Blasius suggested some heads could have concerns about being identified by their Pisa number.
His paper also argues that because of the time needed to fill in the questionnaire some heads may have chosen to “simplify their task rather than to refuse to participate”.
Gabriel Sahlgren, research director at the Centre for Market Reform of Education, said: “This shows we can’t really trust the background information that principals provide, and many of the conclusions in the Pisa report are dependent on them having access to that data.”
The resource shortages section was answered by 440 UK heads, and the research found that 58 of them (13.2 per cent) said “not at all” for all 13 questions when asked how much teaching was hindered by shortages in a series of areas ranging from qualified teachers to computers and “library materials”.
But heads in other rich countries were far less likely to be uniformly positive about their school’s resources. In Norway only 0.5 per cent ticked “not at all” for all resources questions, in the Netherlands it was 1.7 per cent and in Germany 1.5 per cent.
“This is quite concerning,” Mr Sahlgren said. “In the UK it suggests that schools do not take Pisa that seriously, whereas in other countries they take it much more seriously.
“It calls into question whether these types of survey can even be managed to provide an accurate picture.”
In a third section of the questionnaire, on management practices, 78 of the 439 UK heads gave responses identical to those of several other schools, according to the research.
Brian Lightman, Association of School and College Leaders general secretary, said: “I have a real worry that Pisa is turning into another high stakes accountability measure rather than being what it was designed to be – to benchmark systems honestly and objectively. There is a real danger of perverse incentives influencing the outcomes.”
But Andreas Schleicher, the OECD deputy director of education who runs Pisa, said: “PISA data - both from the test and the questionnaires - are validated to high standards. This includes analysis to detect response biases in the questionnaires.
“Pending a more thorough review of the analysis in the unpublished research paper, our assessment is that the response patterns highlighted are in fact quite plausible and do not present evidence of falsification or cheating.
“It is also important to note that the school principal questionnaire responses are used only in the analysis of the PISA test results; they do not have any bearing on the test results themselves.”