The publication of the Scottish Survey of Literacy and Numeracy (SSLN) coincided with the results of a major Organisation for Economic Co-operation and Development (OECD) study entitled Synergies for Better Learning. That report looks in depth at evaluation and assessment practice across the world and makes a number of recommendations for policy and practice.
In Scotland, we have made important strides in evaluation and assessment in recent years, although we have not yet produced the kind of single coherent framework covering both elements that the OECD recommends. Nonetheless, in Building the Curriculum 5 we do have a comprehensive assessment framework, and the University of Glasgow's recent Assessment at Transition report provides further useful guidance. Both raise the question of how we can ensure that assessment plays its full part in high-quality learning in every school and every classroom across the country.
The nature of the new curriculum requires a wide range of assessment methods. Assessment is perhaps the most technically complex aspect of a teacher's skills and all teachers should be able to design reliable assessment methods and interpret results. However, assessment has so far received relatively little attention in teacher education. One way of partly addressing this gap is through teachers' participation in moderation. Moderation not only promotes reliability but also involves the kind of professional dialogue that can and should be a major contributor to professional development.
We need to ensure that we assess what matters, not just what can be most easily measured. This requires assessment that covers the full curriculum and not just literacy and numeracy. Valid assessment depends on a deep understanding of what curriculum goals mean for learning. To quote Einstein: "Everything should be made as simple as possible, but no simpler." Too often, we oversimplify what is to be assessed. Curriculum for Excellence will be a pale shadow of its potential if we fail to assess the kind of deep learning that it espouses, with teachers exercising professional judgement in an environment of well-earned trust.
The best teaching also needs the kind of reliable evidence that can be obtained from well-designed testing. Data are the friends of good teaching and learning. However, we are well aware of the perverse effects that high-stakes testing can have. Teaching to the test, focusing on children whose performance can most easily be improved and playing safe in teaching methods are all well-documented dangers of the misuse of test results. These negative practices are by no means inevitable. Formative and summative assessments are not in conflict with each other. Both are essential if we are to realise the potential of assessment as a key support for learning. The first guides day-to-day decisions about the next steps in learning. The second allows us to assess cumulative learning over time and can also inform decisions about pace and depth of learning by providing good comparative data for schools in similar contexts. Similarly, although we know that the "mark" can become an end in itself, there are other ways of responding to "test" answers. In particular, thoughtful commentary that provides the kind of feedback that takes learning forward can be much more influential.
Technology increasingly allows us to use adaptive techniques, such as those used in Denmark, which adjust test questions to reflect the pattern of responses. Similarly, the SMART (School Measurement, Assessment and Reporting Toolkit) approach used in Australia draws on a range of test data not just to provide diagnostic information at the class and student levels but also to suggest teaching strategies and resources that reflect that diagnostic evidence. Nearer to home, Education Scotland is looking at ways of using the results of testing to guide support.
The proposed Senior Phase Benchmarking Tool is an encouraging step forward. By benchmarking a school's performance against a comparable "virtual" school, it should provide useful insights while avoiding the perverse effects that can accompany league tables. We now need to apply the same kind of creative thinking to assessment at the earlier stages.
At the national level, SSLN 2012 provides helpful and encouraging insights into the state of literacy in our schools. However, as this is the first survey in this form, we must be cautious about interpretation. We will have further performance information when the results of the next Pisa (Programme for International Student Assessment) survey become available next December. For many countries, Pisa results have given rise to what is now known as the "Pisa shock", with performance becoming a national crisis. Reactions to Scotland's Pisa story may have been less dramatic but results show that, although we have been consistently above the OECD average, we have not been improving.
We have made important progress recently in our use of assessment in Scotland. However, debate about the use of the results of tests has, for at least 20 years, inhibited the development of sensible policies about the place of testing at school level. Building the Curriculum 5 provides a comprehensive overview of assessment policy and practice and should guide our approach at the national, local and school levels. We need to recognise that there is no intrinsic problem with testing. The potential problem lies in how results are used or, more accurately, in fear of how they might be misused. That fear will only be allayed if we make the kind of constructive link between assessment and evaluation that the OECD advocates.
Graham Donaldson is a professor of education at the University of Glasgow and author of Teaching Scotland's Future.