Aberdeen's education director, John Stodter, is a keen psychologist and is anxious to start with the child at the centre of the education process. He and his colleagues are harnessing new technology to help them focus on baseline assessment, already testing children on entry into Primary 1, in accordance with national policy. (Baseline assessment will be introduced across Scotland by 2001.) But Aberdeen is also going beyond that, testing children again at the end of P1, at the beginning and end of P4 and at the beginning and end of P7. The idea is to chart a graph of the child's progress throughout primary school and enable staff to spot any peaks or troughs that should be investigated.
To do this, Aberdeen has adopted an English software program called PIPS (Performance Indicators in Primary Schools), devised by Professor Peter Tymms at Durham University. PIPS was introduced in Aberdeen in 1997 and the five-year-olds who were tested then are now in P3. They will be the first cohort of children to be tested again when they enter P4 next autumn. But already the authority and school managers are delighted with their findings, which are analysed by Dr Linda Croxford of Edinburgh University.
Mr Stodter says PIPS gives the authority comprehensive data which will build up into a full database over four years. "It is unique in the amount of detail it provides on individual pupils, so teaching strategies can be tailored to individuals or groups or classes. And you can see what's going on, the contributory factors - whether gender, social deprivation, prior attainment or age - which affect policy for school entry."
He was surprised at the extent to which age made a difference on both the reading and maths tests of pupils entering Primary 1. Scores were 0.3 points lower for each month that a child's age was below the average for P1 pupils. "The younger you are the worse you do," he says, "and you don't catch up as you go on."
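The age effect he describes amounts to a simple linear adjustment. As a minimal sketch, assuming the relationship is linear as reported (the 0.3-points-per-month figure is the only number taken from the article; the function name and sample ages are illustrative):

```python
def expected_score_offset(age_months, cohort_mean_age_months, points_per_month=0.3):
    """Linear age effect: pupils below the cohort's mean age score
    about 0.3 points lower per month of difference (per the article)."""
    return points_per_month * (age_months - cohort_mean_age_months)

# A child four months younger than the class average would be expected
# to score about 1.2 points lower on entry.
print(expected_score_offset(56, 60))  # -1.2
```

A correction of this kind is what lets the authority separate age from other contributory factors such as gender or prior attainment when comparing entry scores.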
The second year's results showed pupils scoring higher where schools had been involved in early intervention work, because of the impact of discussions in nursery. Twelve children in one P1 class scored in the 60s to 70s, compared to only three children the previous year.
For the baseline assessment, when children start in P1, a teacher or senior manager takes each one through a 20-minute program on computer. This leads the child through a series of questions on basic reading and maths.
The literacy questions go from the child being asked to write its name and look at three pictures for vocal word recognition, for example "Can you point to some carrots?", to "Can you read this story?" On maths, questions range from "Which is the biggest?" of three pictures, to "How many apples are here?", to "What is 21 more than 32?" At each answer, the teacher uses the computer mouse to click on right or wrong and the program moves on to the next level, according to how the child is doing. A cheery cartoon character called Pip leaps up and down when the children do well and rewards them at the end of each section with a little halo above his head.
The child's record is printed out and retained on the computer. When he or she is tested again at the end of P1, the program will start from where the child left off.
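The flow described above, with each right-or-wrong click moving the child to the next level and the record saved so the end-of-P1 retest picks up where the child left off, could be sketched like this. This is a hypothetical simplification, not PIPS itself: the item texts are taken from the article, but the stop-at-first-wrong rule, the function names and the JSON record format are all illustrative assumptions.

```python
import json

# Hypothetical item bank: questions ordered by increasing difficulty,
# echoing the progression the article describes.
READING_ITEMS = [
    "Can you write your name?",
    "Can you point to some carrots?",
    "Can you read this story?",
]

def run_session(record, items, answers):
    """Advance through items from the saved level, stopping at the first
    wrong answer (a simplification of real adaptive rules).
    `answers` maps each question to True (right) or False (wrong),
    standing in for the teacher's mouse click."""
    level = record.get("level", 0)
    while level < len(items) and answers.get(items[level], False):
        level += 1
    record["level"] = level
    return record

def save_record(record, path):
    """Retain the child's record, as the article says the program does."""
    with open(path, "w") as f:
        json.dump(record, f)

def load_record(path):
    """On the end-of-P1 retest, resume from the stored level."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {"level": 0}  # first assessment: start at the beginning
```

In use, the entry test would call `load_record` (getting a fresh record), run a session, and `save_record`; the retest months later would reload the same record and continue from the saved level rather than starting over.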
The system is beautifully simple, according to Ruth Mathers, assistant headteacher of Ashley Road Primary. The little ones love Pip and look forward to their next chance to play with him, she says. More importantly, the information the school is gathering is proving valuable for teachers.
The test is done within the first two weeks of school, before any "teacher effect" can show. Teachers are given the initial results within a couple of days, showing them how each pupil did in reading and maths. A few weeks later they receive the fully processed results from Durham University.
The children's scores give teachers information they can put to immediate use. The reading graph for one class showed an astonishing cluster with nearly all the children around average ability, one way above the others and a couple well below. Simply by looking at that graph, Mr Stodter says, the teacher could decide that that class would benefit from whole class teaching for reading. Another class revealed three distinct groups of ability, which would benefit from group teaching.
Individual children also stood out, like the boy who scored 159 for reading, when the rest of the class ranged from 19 to 60. That child, says Mr Stodter, would be given extra learning support to extend his abilities, as would the few at the bottom of the range - the aim being to raise the attainment of all the children on the chart, not to close the gap between them.
You also have to look at the child behind the results, says Ms Mathers. For example, one boy whose father had died was suffering from emotional trauma at the time of the initial test and did poorly, but did tremendously well at the end of the year. And there were two elective mutes, who had trouble with the reading test but no problem with maths.
Education adviser Anne Park says PIPS gives the authority information it has never had before. "It lets the authority see if there is a problem with an individual child. Officers have never been in a position to do that." It is also affecting which schools the authority targets with classroom assistants for English as an additional language.
"I'm making a judgment for that school, based on knowledge for that group in that school at that time, always trying to pitch it that wee bit higher," says Ms Park.