Especially useful

23rd January 2004

David Jesson explains how he was able to measure fairly the value added by specialist schools

Value-added has come to the top of the Government’s agenda for improving schools’ performance.

Schools minister David Miliband’s speech to the North of England conference, emphasising the Government’s commitment to “intelligent accountability”, shows just how far we have come from the days when performance tables were foisted on the educational community. “Intelligent accountability requires that schools and parents be confident that performance is being compared on a like-for-like basis,” he said. This is precisely what value-added analysis attempts to do.

Value-added analysis attempts to make comparisons of schools’ performance fair by taking into account factors which are agreed to affect performance.

Foremost among these is the principle that schools are accountable for the progress their pupils make. This hardly seems revolutionary - but the path to measuring that progress adequately has been fraught with difficulty, not least because of the competing agendas of policy-makers and statisticians. Policy-makers want a simple and transparent system; statisticians warn of the dangers of over-simplification. Caught in the middle, bureaucrats come up with methods which satisfy few.

One way forward is outlined by the Specialist Schools Trust. Its new report, published last week, provides schools with an evaluation of their 2003 GCSE performance covering the percentage of pupils scoring five or more A* to Cs.

To develop this framework, the trust assessed the performance of every non-selective school. Potentially, this offers a way for every school to evaluate its performance.

What factors should be taken into account?

Prior attainment strongly affects later performance. This is clear from the annual Department for Education and Skills tables showing the chances of pupils with different levels of prior attainment (at ages 11 and 14) gaining five or more A* to C passes.
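
Those chances tables are, in effect, lookup tables: for each band of prior attainment they give the proportion of pupils nationally who went on to gain five or more A* to C passes. A minimal sketch of how such a table can be applied to a school’s cohort - the band labels and all figures below are invented for illustration, not the DfES’s published values:

```python
# A chances table maps bands of prior attainment to the national
# proportion of pupils who went on to gain five or more A* to C grades.
# All band labels and figures are invented for illustration; the real
# DfES tables are published annually.
chances = {
    "below level 4": 0.15,
    "level 4":       0.45,
    "level 5":       0.85,
}

# Pupil counts for one (hypothetical) school's cohort, by KS2 band.
cohort = {"below level 4": 30, "level 4": 90, "level 5": 60}

# Expected number reaching the benchmark if pupils matched national chances.
expected = sum(chances[band] * n for band, n in cohort.items())
total = sum(cohort.values())
print(f"Expected: {expected:.0f} of {total} pupils ({100 * expected / total:.0f}%)")
```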

The gender composition of schools is a further factor of importance in explaining differences. The past 10 years have shown a consistent advantage for girls over boys - the proportion of girls achieving this outcome has been some 10 percentage points higher than that of boys.

Over the years, other factors have been proposed: eligibility for free school meals, pupils’ language background, measures of schools’ ethnic balance, and so on. These are suggested because they are identified with additional problems some schools face in helping pupils make progress.

But inclusion of too many factors may provide schools with perverse incentives, or at least obscure the major relationship between pupils’ starting points and their final achievements. Until recently, the percentage of pupils eligible for free school meals was used to compare school performance. If a few extra eligible children were found, this could substantially affect the evaluation of a school’s performance - by placing it in a comparison group with more disadvantaged (and hence possibly less well-performing) schools. This had little to do with enhancing educational progress.

By contrast, some recent evaluations show many schools performing well in areas of socio-economic disadvantage. So you could argue: if some schools can do so, why should others be given an artificial advantage by having factors taken into account which may not be decisive in determining performance?

To estimate school performance fairly, you need to calculate an expected outcome based on the characteristics of a school’s pupils and compare it with what was actually achieved. The Specialist Schools Trust calculates schools’ expected performance by using three measures:

* the percentage of pupils gaining five or more A* to C GCSEs;

* the average key stage 2 points score of its GCSE cohort;

* each school’s gender composition.

Each specialist school’s 2003 GCSE results are evaluated in terms of the percentage of five or more A* to Cs. Pupils’ scores at KS2 are compared with their GCSE scores. The result shows which schools do better than expected and by how many points, which schools do less well than expected and which - the majority - do pretty much as expected.
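
The article does not publish the trust’s statistical model, but the mechanics can be sketched. The Python below fits a simple linear model predicting each school’s percentage of five or more A* to Cs from its cohort’s average KS2 points score and proportion of girls, then reports value added as the gap between actual and expected. The school names, figures and ordinary-least-squares model form are illustrative assumptions, not the trust’s data or method.

```python
# Illustrative value-added calculation of the kind described above:
# expected GCSE performance is predicted from KS2 prior attainment and
# gender composition, and value added is actual minus expected. The
# ordinary-least-squares model form and all data are assumptions.
import numpy as np

# Each row: (average KS2 points of the cohort, percentage of girls,
# actual percentage gaining five or more A* to C grades). Invented data.
schools = {
    "School A": (27.5, 48.0, 52.0),
    "School B": (26.0, 55.0, 58.0),
    "School C": (29.1, 50.0, 61.0),
    "School D": (25.2, 45.0, 39.0),
}

X = np.array([[ks2, girls] for ks2, girls, _ in schools.values()])
y = np.array([actual for *_, actual in schools.values()])

# Fit expected % = a + b*KS2 + c*girls by least squares (with intercept).
design = np.column_stack([np.ones(len(X)), X])
coefs, *_ = np.linalg.lstsq(design, y, rcond=None)

for name, (ks2, girls, actual) in schools.items():
    expected = coefs @ np.array([1.0, ks2, girls])
    print(f"{name}: expected {expected:.1f}%, actual {actual:.1f}%, "
          f"value added {actual - expected:+.1f} points")
```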

Progress charts from the DfES can then be used to identify which groups of children are failing to progress, thus providing a catalyst for further investigation.

Subject department performance then comes under the spotlight, since discovering how well pupils do in their individual subject areas is critical for identifying under-performance. For example, if geography produces results well below what pupils “should” achieve, not only is whole-school performance reduced but, if the problem is not addressed, pupils in future years are likely to be similarly sold short.
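
In the same spirit, the expected-versus-actual comparison can be run department by department. A hedged sketch, assuming per-department expected and actual mean GCSE points are already available - the department names, the flagging threshold and all figures are invented:

```python
# Per-subject version of the same comparison: each department's actual
# mean GCSE points set against an expected figure for the same pupils.
# Department names, the -0.5 threshold and all figures are invented.
expected_vs_actual = {
    "English":   (5.1, 5.2),   # (expected mean points, actual mean points)
    "Maths":     (5.0, 4.9),
    "Geography": (5.3, 4.4),   # well below expectation
}

# List departments from weakest to strongest relative performance.
for dept, (expected, actual) in sorted(expected_vs_actual.items(),
                                       key=lambda kv: kv[1][1] - kv[1][0]):
    gap = actual - expected
    flag = "  <- investigate" if gap < -0.5 else ""
    print(f"{dept}: {gap:+.1f} points{flag}")
```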

Thus a value-added approach provides one means of “intelligent accountability” from which all schools, and their pupils, can benefit. And, as is happening in many specialist schools, it can be one element in ensuring substantial improvements in performance.

Professor David Jesson is at the Centre for Performance Evaluation and Resource Management in the Department of Economics at the University of York
