Coasting

Is it fair to damn the latest initiative to boost school performance as crude and misleading, asks William Stewart
21st November 2008, 12:00am

Ministers once again stood accused of talking down teachers’ achievements as they launched the latest phase of their school improvement strategy. Teachers’ leaders complained that the “coasting” schools plan was putting secondaries into crude categories, labelling them and misleading the public.

It looked as if ministers had again managed to alienate the profession, risking a backlash like the one that met National Challenge earlier this year. But would that be fair? The latest strategy seems different from the original scheme in crucial ways.

First, National Challenge immediately risked subjecting schools - 638 at the time - to damaging and demoralising publicity. Ministers may not have named them, but by revealing how many there were and exactly what GCSE results qualified them for the National Challenge, they might as well have done.

This time, the Department for Children, Schools and Families has not said how many “coasting” schools there are, nor provided a simple definition or qualification that allows anyone to work it out.

Instead, the decision over which schools should benefit from extra support offered via the coasting scheme is, on the face of it, being left to local authorities.

They have been asked to look for a range of indicators including low pupil progress, below-average contextual value added, stalled exam results, poor Ofsted reports, complacent leadership and poor-quality subject leaders. Authorities have until the end of January to work with National Strategies advisers to agree which coasting schools should be identified in their areas.

Until then, there is no risk of headlines naming schools. But what about afterwards? When schools have been identified, will the decisions be made public?

Ministers say they will not publish a central list and that it will be up to local councils to decide.

Ed Balls, the Schools Secretary, said parents must be clear about what was happening at their child’s school. But that would not necessarily require a media announcement to get the message across. Even if it did, a story about one local school receiving extra support is unlikely to do the same damage as national coverage that brands schools “failures” or places them under threat of closure.

But some warn that even if the scheme does not damage individual schools, the rhetoric used is bad for the morale of the whole system. The Government has talked about schools with “an excuse culture”, a “lack of drive”, or where “too many pupils sit back”.

There is also the question of how schools are judged. National Challenge angered many by selecting schools on the basis of raw GCSE results without taking pupil backgrounds into account. The coasting scheme avoids that trap with its menu of indicators. But many of them are still data-based and come with a host of potential problems.

For example, the National Association of Head Teachers warns that contextual value added can discriminate against high-attaining schools. And those with high percentages of pupils with learning difficulties may struggle to make the same rates of individual pupil progress as other schools.

DCSF officials say “progress” should be the crucial factor in identifying coasting schools. Government analysis shows that nationally one in seven pupils does not progress by a national curriculum level in English between the ages of 11 and 14.

On the plus side, when it comes to how best to measure this, the end of compulsory key stage 3 tests means schools could be left with a relatively free hand.

But the Government’s focus on progress as an indicator does slightly undermine its claim that it has not already taken a view on which schools should be classed as “coasting”. Indeed, it has already decided there are “hundreds” of them out there, that they are more likely to be in shire counties, and it has even surveyed schools that “could be perceived to be coasting”.

Some question the whole use of the school as a unit of analysis when looking at pupil performance.

Professor Stephen Gorard of Birmingham University said that about 80 per cent of the variation in exam results between schools can be explained by variation in the background or prior attainment of pupils, rather than the actual schools.

“Why do people expect that a system designed to be comprehensive should produce such different schools?” he said. “It doesn’t.”

He believes a scheme that attaches extra funding to individual pupils, rather than whole schools, might be more effective.

Given that ministers still see schools as the best unit on which to base reforms, heads may be glad that none has been named and shamed for coasting - yet. The possibility that their record will be judged in the round, rather than on potentially misleading statistics, also remains. So the department seems to have learnt some lessons from the National Challenge experience.

But teachers’ leaders wish it had learnt more. Chris Keates, general secretary of the NASUWT, said: “These announcements only feed the failure-fixated culture around schools and undermine confidence.”

With headlines such as “We’re failing bright pupils” and “Minister attacks schools which could do better”, she may have a point.

TIMETABLE TO TRACK THE ‘COASTERS’

November 2008: The scheme is announced.

January 31 2009: Deadline for local authorities to identify coasting schools.

March 2009: Local launch conferences to be held.

April 2009: School improvement partners to start working with identified schools.

Summer 2009: Schools start to benefit from additional, tailored support.