At the end of this term secondary schools will be reporting to the parents of Year 9 pupils two assessment levels for English, maths and science. One of the reported levels will be the result of a teacher's assessment (TA) and the other of the key stage 3 tests. Given the different skills covered by the two assessments, and their different intentions, teachers will not be surprised if some pupils are awarded two different levels. But what will this tell parents? Will it provide ammunition for the proponents of tests to say that TA is biased, or for the adherents of TA to say that the tests are inaccurate? To avoid the idea that one level is "right" and the other "wrong" it is important to begin discussing with parents now what information about their children's progress the two forms of assessment can give. The tests and the TA measure the pupils' progress in using skills which are described in the National Curriculum for English. The differences between them are in their purposes, in their coverage of the curriculum and how validly they assess it, in how far they can be reliable and consistent, and in the contexts in which the two assessments are made.
The first issue is that the two modes of assessment have different purposes. The tests have been described by the School Curriculum and Assessment Authority (SCAA) as a "summative 'snapshot' of attainment at the end of the key stage, based on key aspects of a subject which can readily be tested". Teacher assessments are more formative. They take into account the pupils' progress throughout the year and can reveal a good deal about the pupils' strengths and weaknesses and what they might do to improve. Schools can highlight this by giving parents not only the overall TA level but also the levels awarded separately for speaking and listening, reading and writing.
The second point at issue is that the tests and TA in English cover different combinations of the skills described in the national curriculum. The most obvious difference is that the TA level includes an assessment of the pupils' oral work, whereas the test level is gained from two written papers. The tests focus on close reading of two passages, a short piece of writing and a one-hour task on the Shakespeare play which the pupils have been studying. The teachers' assessments, on the other hand, take into account the pupils' wider reading and their ability to do a range of reading and writing tasks. To judge the validity of the English assessments we must ask how far the skills they assess really are those required by the national curriculum. It can be argued that the coverage of the curriculum by TA gives it greater validity as an overall assessment of the English Order, because it can include a wider range of the required skills.
When we ask if an assessment is reliable we are asking whether a pupil would have gained the same level if he or she had been assessed on another occasion or in another part of the country. To claim consistency the assessment would have to maintain this reliability from year to year. Parents will want to know how reliable and consistent their children's assessments are. The tests, because they are more standardised than TA, can claim greater reliability. The 1995 English test papers went through a process of trialling and national pre-testing last year, and the results were analysed. The papers were then refined and descriptions of the different levels of performance developed. In May, Year 9 pupils sat the same papers, and their answer scripts are now being marked throughout the country using the same mark schemes. The tests have therefore been standardised in a way that TA cannot be. Continued from one year to the next, this process should maintain a consistent national standard.
English teachers do work with colleagues in their own and other schools to apply the standards required by the national curriculum. But this is a complicated judgment, made more complicated by the fact that what exactly the levels mean in terms of real pupils' performances is still being defined. SCAA is about to publish exemplar materials to illustrate pupils' work at the new Level Descriptions in the English Order. It will take time and experience of working with such materials for teachers to be sure that the standards they are setting for TA are in line with national standards. The tests, their mark schemes and the exemplar answers produced for markers will also play a part in defining those standards.
Parents could also be reminded that the contexts in which the pupils did their work for the two assessments were different. The work for the TA has been done over a period of time in class or for homework. Teachers will have worked with their classes to help them produce their best work in a range of activities. They will have given support and made suggestions for improvement as the work progressed. For the tests, on the other hand, the pupils had to work on their own and within time limits. This produced what we hope were typical performances. These points do not mean that the test was an "unfair" assessment or that the TA was "not the pupils' own work". It simply means that we should remember the different circumstances in which the work for each assessment was done. A difference between pupils' reported levels may be revealing something about the contexts in which they can perform best. Some pupils, for example, may perform better when they have plenty of time to complete a task, in which case their TA level may be higher than their test level.
There is an understandable concern among teachers that the test and TA results could be used to make unhelpful comparisons between pupils' performances. But the answer to unhelpful comparisons is to make them helpful by putting the information that we have in context and interpreting it carefully.
This year's dual system of reporting will give parents two pieces of information about their children's work in English. The process of working with parents to understand this information could usefully begin when the results are sent out.
Andrew Watts is Director of the Key Stage 3 English Project at the Midland Examining Group.