Brian Lightman, general secretary of the Association of School and College Leaders, writes:
The performance tables published this week reveal that the number of schools below the floor target of 40 per cent has doubled. This staggering increase, together with the news that the gap between more- and less-disadvantaged pupils has grown, has prompted fundamental questions about the validity of performance tables and about the actual standards of achievement in our schools.
Can it really be that standards have plummeted in spite of the massive efforts of every school leader in the country to raise standards further? Does it mean that the government’s reforms have failed? Or does it simply mean that these tables do not present the whole picture?
My interpretation is the last one, for three reasons.
First, these tables do not contain full information about schools' results. Because they display only first-entry results, they omit the actual outcomes many students achieved. After all, it is those outcomes that matter to employers and providers of further and higher education. Perfectly valid debates about the pros and cons of multiple entry belong elsewhere.
Second, these tables contain the results of a set of exams that are substantially different from those with the same name taken in previous years. The removal of speaking and listening from the English examination is an obvious example. Comparing this year’s results with those from 2013 is like comparing apples and oranges. The only way to make a valid comparison would be to calculate this year’s headline data using 2013 rules.
Third, vocational qualifications are now weighted differently in performance tables, meaning that the same grades will attract a lower points score in 2014 compared to 2013.
We simply do not know whether standards have risen, remained stable or declined. This is extremely worrying for school leaders, who need to be able to evaluate the effectiveness of the strategies they are using to raise achievement. It is confusing for parents and employers, who need to be able to understand and trust our qualifications.
All of the above is the result of piecemeal changes to qualifications rather than a planned implementation programme with a realistic and manageable timescale and an appropriate communications plan. The rules must never be changed, as they have been recently, after students have commenced their examination courses.
The sum total is a great deal of confusion which, worryingly, is set to continue for several years to come while the reforms are still being implemented.
For school leaders, the challenge is now to ensure that parents have access to the full story. School websites need to display all of their results and explain them. They need to point to the valuable information about pupil progress, which is in the public domain. And parents need to be helped to look beyond these raw figures, to visit schools, talk to staff and students and find out the whole story.
But picking up the pieces of a flawed system cannot be the right way to hold schools to account. Parents have a right to the full set of information.
This is why ASCL has been working with a number of partners to publish a set of alternative performance tables that do present a fuller picture. In a self-improving system, schools need to be agents of their own accountability, holding themselves rigorously to account and demonstrating a complete commitment to open data.
Perhaps the light at the end of this tunnel will be the time when, as proposed in ASCL’s blueprint for a self-improving system, school leaders take ownership of accountability in that way.