The Scottish Office project to test what children know in primary 1, by surveying two schools in each of the 32 education authorities, will produce unreliable data and should not be used for measuring value-added progress throughout primary, it is claimed.
Researchers involved in devising an alternative baseline assessment scheme in Aberdeen last week told a conference of headteachers that only objective testing of pupils early in P1 in reading and numeracy would provide fair, valid and meaningful evidence.
Aberdeen is in its second year of baseline testing all children, using an adapted version of a system devised by Durham University. The PIPS (Performance Indicators in Primary Schools) scheme is the most widely accepted method south of the border and involves 3,500 schools.
Baseline testing became law this year in England and Wales, forcing all primaries over the next five weeks to assess their new intake and inform parents.
Scotland is taking a slower route and piloting different approaches through early intervention initiatives. Only Aberdeen has adopted the PIPS scheme, although other councils are interested.
Under the Government's pilot, teachers are asked to write profiles of pupils six months into primary 1. But Dr Linda Croxford, of the Centre for Educational Sociology at Edinburgh University and a consultant to the city, believes the system is flawed.
"Pupil profiling should not be used for value-added analysis because the data is not truly comparable. It is based on teachers' judgments and is open to abuse. There could be built-in fiddling of the figures and it cannot be used in an audit sense," Dr Croxford said.
In Aberdeen, all primaries except the two involved with the Scottish Office project test pupils early on entry. The 20-minute tests assess basic skill levels and are sent straight to Durham for analysis. Pupils repeat the same test at the end of P1.
This term pupils in primary 4 will be given three half-hour tests to assess skill levels. Teachers will also assess personal and social development. The evidence will act as a baseline for others coming through the system.
John Stodter, Aberdeen's director of education, said the city had investigated a number of systems but backed PIPS because of its simplicity and ease of marking. "We also wanted a system that had status, credibility and good empirical research. The tests relate to real skills in reading, writing and number work and give new insights into individual pupil abilities," he said.
Mr Stodter said it was important to assess initial skills, abilities and attitudes. "Unless you have that very clearly measured it is difficult to say what the school is adding in terms of its commitment. But more importantly it allows you to mark the progress of individual children against past progress, against another group of children, against the city average, and it gives teachers a simple tool to group children and look at areas where they have not developed at appropriate rates."
Mr Stodter continued: "It can identify children who are making little progress and the beauty for me is that we have a system based on individual pupil attainment that can be aggregated up to make general comparisons, rather than a top-down system that has little to offer teachers trying to make a difference for individual children in classes."
The PIPS system south of the border is used to test pupils every two years, a feature Aberdeen is keen to see replicated with new Scottish test materials. Children would therefore be judged on short, objective tests at four stages in their primary education, allowing schools to map progress throughout primary and into secondary. Dr Croxford said individual teachers only had snapshots of progress, without the total picture.
Following the Government's admission last week that it is impractical to introduce hard targets in primary based on the national benchmarks of the 5-14 programme, the focus is more likely to switch to the long-term work on value-added. Mr Stodter said under Aberdeen's emerging system targets were determined on "real knowledge about real children at an individual level" and not "constructed externally".
Carol Fitz-Gibbon, professor of education at Durham University and a leading advocate of value-added measures, told the conference: "Nothing can come closer to measuring the effectiveness of teaching than a value-added measure. You can predict 50 per cent of achievement from baseline assessment but only 9 per cent from parental background."
Using free school meals as an indicator of disadvantage was "facile and simplistic," Professor Fitz-Gibbon said.