Academics have raised fresh questions about the credibility of the world's most influential international education study.
Researchers have uncovered thousands of cases of identical information being submitted for different schools taking part in the last edition of Pisa (Programme for International Student Assessment), the survey used every three years to analyse and rank national school systems.
In a study seen by TES, German and Canadian academics write that their trust in Pisa's data has been "heavily compromised" by what they have found in the results of its school background questionnaires. But the Organisation for Economic Co-operation and Development (OECD), which runs Pisa, has insisted that the data it uses is "high quality".
The researchers' claims come less than three weeks before the release of the next (2012) Pisa results, which will influence education policy in many countries.
In July, TES published claims from other academics that the statistical model used to calculate Pisa's headline rankings meant they were "useless", "meaningless" and "utterly wrong".
The new research looked at 71 of the 74 countries that participated in Pisa 2009 and found only 16 where the data examined appeared to be of high quality. It did not analyse actual test results, but looked at the information collected alongside them, which is used to put the results in context and draw wider conclusions in the Pisa reports.
Ten countries are highlighted in the research for having particularly "questionable" data. Three of them were apparently extreme cases, where the academics suggested that the responses to school questionnaires had actually been fabricated by the national research institutes gathering Pisa data. But TES inquiries suggest that this may be explained by the same principals running several different schools.
However, the researchers have also found hundreds of examples of schools where principals have ticked the questionnaire boxes in such an implausibly uniform way that researchers doubt the data can be accurate.
They looked at three sections of the questionnaires, covering "school climate" issues such as levels of teacher absenteeism and student disruption, resource levels, and management practices. For each question, principals were asked to tick a multiple-choice box to indicate the extent to which the problem affected the school. But the researchers found hundreds of examples of school leaders ticking the same box for every question.
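The pattern described here, a respondent ticking an identical answer for every item, is known in survey methodology as "straight-lining". As a rough illustrative sketch only (not the researchers' actual method, and using invented example data), a basic screen for such responses might look like this:

```python
# Illustrative sketch: flag questionnaire respondents who gave an
# identical answer to every item ("straight-lining"). This is not the
# researchers' actual method; the data below is invented.

def is_straight_lined(responses):
    """Return True if every answer in a multi-item block is identical."""
    return len(responses) > 1 and len(set(responses)) == 1

# Hypothetical data: each entry is one principal's multiple-choice
# answers (on a 1-4 scale) across a block of questionnaire items.
schools = {
    "school_a": [2, 3, 1, 4, 2, 3],   # varied answers
    "school_b": [3, 3, 3, 3, 3, 3],   # same box ticked throughout
}

flagged = [name for name, answers in schools.items()
           if is_straight_lined(answers)]
print(flagged)  # ['school_b']
```

In practice, researchers would combine a check like this with other indicators, since a uniform block of answers is suspicious rather than proof of careless or fabricated responses.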
"Being the guardians of the school's image and reputation, the principals would be torn between providing factual and school-enhancing responses," the paper says. It also suggests that principals may not have had enough time to give considered responses and may not have trusted the survey's anonymity.
Countries with significant examples of such "questionable" data include the UK and the US.
Study co-author Professor Jorg Blasius, from Bonn University in Germany, said the school information provided by principals was crucial if the Pisa data was to be robust enough for the wider evaluations it was used for.
Gabriel Sahlgren, research director at the UK's Centre for Market Reform of Education, said: "This suggests that many conclusions from the Pisa report are invalid, and a lot of academic research that has been based on the data from Pisa is also called into question."
Andreas Schleicher, deputy director of education at the OECD, said: "Pisa data, both from the test and the questionnaires, [is] validated to high standards. This includes analysis to detect response biases in the questionnaires.
"Pending a more thorough review of the analysis in the unpublished research paper, our assessment is that the response patterns highlighted are in fact quite plausible and do not present evidence of falsification or cheating."
"It is also important to note that the school principal questionnaire responses are used in the analysis of the Pisa test results; they do not have any bearing on the test results themselves."