Measuring the difference a school makes

14th November 1997, 12:00am

You can now identify any weak link in the learning chain: pupil, teacher, school or department. Anat Arkin reports

The John Bunyan school in Bedford has done an about-turn. Three or four years ago staff would talk about how they could improve departments. Now the focus has shifted to what they can do to raise individual pupils’ attainment using a mentoring scheme, a computerised integrated learning system, and a range of other measures to target their specific learning needs.

The catalyst for these activities was QUASE, a service provided by the National Foundation for Educational Research (NFER) to measure the value added by secondary schools. The service allows this upper school to compare its actual and predicted GCSE results and to see how the school as a whole, individual pupils and departments have all performed.

Armed with this information, which includes the results of attitude surveys of pupils and parents, staff are able to build on strengths and tackle areas of weakness. In maths, for example, where a dozen or more pupils used to fail GCSE every year, a concerted effort to get lower-ability-band pupils to complete their coursework has cut the number failing for lack of coursework to just two.

“The more data we have on the students, the more we can individualise what we do with them and target what is needed,” says Neil Smith, the school’s co-ordinator for improving achievement.

By taking account of pupils’ backgrounds, QUASE has also put the school’s examination results into context. With almost half the pupils speaking English as a second language and a catchment area that includes two of the poorest wards in Bedfordshire, this is vital, according to Neil Smith.

He points out that while cognitive ability tests taken by last year’s GCSE cohort when they joined the school in Year 9 predicted that only 11 per cent would get five or more A-C grades, 23.5 per cent actually reached this standard. The difference between predicted and actual results was even more dramatic in certain subjects, especially history, where five times more pupils than expected gained A-C grades.
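
In code, the comparison amounts to little more than dividing the actual pass rate by the predicted one. The short Python sketch below uses the figures quoted above; the variable names are illustrative and are not part of QUASE itself.

```python
# Illustrative only: how many times the actual five-or-more A-C rate
# exceeded the rate predicted from cognitive ability tests on entry.
# Figures are those quoted above; names are invented for the example.
predicted_rate = 0.11    # 11 per cent predicted to gain five or more A-C grades
actual_rate = 0.235      # 23.5 per cent actually did

ratio = actual_rate / predicted_rate
print(f"Actual rate is {ratio:.1f} times the predicted rate")   # about 2.1 times
```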

Inspectors from the Office for Standards in Education, who gave the school a favourable report two years ago, seemed impressed. “They were fascinated to see that, although our results would not appear to be very good, when they were set against what we were expected to get, we were doing fine,” says Neil Smith.

Many of the secondary schools that have adopted value-added approaches to evaluating their own work are, like John Bunyan school, near the bottom of their local authorities’ exam league tables. Those with good raw results often have little incentive to find out whether pupils are making as much progress as they could be.

But there are exceptions. Dartford Grammar School for Girls in Kent, whose GCSE results are way above the national average, has been using QUASE for the last three years to look at the performance of individual departments and subjects.

“We are not prepared just to rest on our laurels and accept that we get a 90-94 per cent pass rate,” says Roger Burgess, a deputy head. “There’s always room for improvement in any type of school and we are concerned that within individual subject areas we move towards more A and A* grades.”

Dartford Grammar uses another value-added analysis scheme, ALIS, to analyse its A-level results with the aim of improving them - although, with a pass rate of 99 per cent, Roger Burgess admits this could be difficult.

Developed in the early Eighties by Durham University’s Curriculum, Evaluation and Management Centre (CEM), which was formerly attached to Newcastle University, ALIS has been around for longer than any of the other value-added examination analysis schemes on the market. The centre now also provides similar services to track younger pupils’ progress and attitudes: YELLIS (the Year 11 information system), used by around 800 schools; MidYIS (the middle years information system); and PIPS (performance indicators in primary schools).

Weston Road High School, an 11-18 comprehensive in Stafford, has been using ALIS since 1989 and in the early Nineties was one of the schools that piloted YELLIS. The school has found these two systems so helpful in tracking the progress of its 14 to 19-year-old pupils that it has just started using MidYIS with lower years.

“The major benefit is that we have a foundation of information based on pupils’ prior attainments, which we did not have before, and which backs up our professional judgments,” says Geoff Cooper, Weston Road’s headteacher.

He and his staff use this value-added data to compare the school’s performance with other schools and to identify areas that need improving. While this analysis can show that pupils are doing better in some teaching groups than in others, Geoff Cooper warns against using value-added information to identify weak teachers.

“We do analyse it by teaching groups, but of course other teachers will have taught those pupils and there are also whole-school effects,” he says. “So you have to be extremely cautious about making judgements about the quality of teaching in relation to value-added information. Certainly, as a single indicator, I would not want to use it for that purpose.”

Schools using YELLIS and the other CEM value-added services, and those using the NFER’s QUASE, tend to report very similar benefits. Yet the two organisations rely on different statistical methods to analyse schools’ test and examination results. While CEM uses simple regression analysis to work out how much value schools have added, the NFER uses multi-level modelling. This is a more sophisticated statistical process that takes account of information about individual pupils, year groups and the school as a whole.
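
The simple-regression approach can be sketched in a few lines of Python. The data, column layout and school labels below are invented for illustration, and this is a sketch of the general technique rather than CEM’s own calculation: exam scores are regressed on prior attainment across all pupils, and a school’s value added is taken as the mean residual of its pupils.

```python
# Illustrative simple-regression value-added calculation. Each pupil's
# "value added" is the residual (actual minus predicted exam score) from
# a regression of exam score on prior attainment; a school's figure is
# the mean residual of its pupils. Data are invented for the example.
import numpy as np

# Columns: prior attainment score, GCSE points score, school id
pupils = np.array([
    [45.0, 38.0, 0],
    [52.0, 50.0, 0],
    [60.0, 55.0, 1],
    [48.0, 47.0, 1],
    [55.0, 44.0, 2],
    [62.0, 63.0, 2],
])
prior, outcome = pupils[:, 0], pupils[:, 1]
school = pupils[:, 2].astype(int)

# Ordinary least squares fit: outcome = slope * prior + intercept
slope, intercept = np.polyfit(prior, outcome, deg=1)
residual = outcome - (slope * prior + intercept)   # positive = better than expected

for s in np.unique(school):
    print(f"school {s}: value added {residual[school == s].mean():+.1f} points")
```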

“The variables at each of those levels have an impact on pupils’ performance, and multi-level modelling sets up a model which enables you to analyse the data at each of those levels,” explains Lesley Saunders, head of the NFER’s School Improvement Research Centre.
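
A minimal sketch of that idea is a random-intercept model with pupils nested within schools: prior attainment enters as a fixed effect, and each school gets its own intercept whose estimate plays the role of value added. The example below uses statsmodels’ MixedLM as one readily available implementation; the data are invented, and this is not the NFER’s own model specification.

```python
# Illustrative two-level (pupils within schools) random-intercept model.
# The estimated school-level effects are the "value added" once prior
# attainment is taken into account. Invented data; not the NFER's model.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "gcse":   [38, 50, 55, 47, 44, 63, 41, 58, 49, 52],
    "prior":  [45, 52, 60, 48, 55, 62, 47, 59, 50, 54],
    "school": ["A", "A", "B", "B", "C", "C", "A", "B", "C", "C"],
})

# Fixed effect for prior attainment; random intercept for each school
model = smf.mixedlm("gcse ~ prior", data=df, groups=df["school"])
result = model.fit()

print(result.summary())
print(result.random_effects)   # per-school departures from the overall trend
```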

Exponents of multi-level modelling say it is the only statistical model able to take account of the reality of pupils working in schools. But CEM’s director, Carol Fitz-Gibbon, argues that the advantages of using such a complex model are marginal and outweighed by the disadvantages.

“We use multi-level modelling for research purposes but we are utterly convinced that schools need to be able to replicate our analysis and see where the data comes from, so we have always stuck to simple analysis,” she says.

CEM, like the NFER, employs expert statisticians to analyse test and examination results. But some schools prefer to do their own thing. Hele’s School in Plympton, Devon, first became interested in the value-added approach to assessing pupils’ progress when the centre offered a slimmed-down version of ALIS at a reduced cost to Secondary Heads Association members. The school used this service for a few years and was pleased with the results.

“It was very useful for staff to start thinking about value-added and how well we were doing with particular cohorts of students,” says Dave Anderson, the examinations officer who is also responsible for information systems.

But Hele’s then switched to a value-added assessment module developed by education software provider SIMS. This made sense because the school already held data on GCSE and A-level results on its SIMS system. The programme’s flexibility was a further attraction.

“You could manipulate it to see what the effect of removing one subject from the list would be, and you could look at how the school was adding value in a much more detailed way because you were doing it on your own computer with your own data and comparing it with national figures,” says Mr Anderson.
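
That kind of what-if can be illustrated in a few lines of Python: recompute the cohort’s average points score with one subject excluded and compare it with the full figure. The data and function below are invented for illustration and are not the SIMS module itself.

```python
# Illustrative what-if: effect on the cohort's average points score of
# dropping one subject from the analysis. Invented data; not SIMS itself.
results = [
    ("maths", 6), ("english", 5), ("history", 7),
    ("maths", 4), ("english", 6), ("history", 8),
]

def average_points(entries, exclude=None):
    scores = [points for subject, points in entries if subject != exclude]
    return sum(scores) / len(scores)

print(f"all subjects:    {average_points(results):.2f}")
print(f"without history: {average_points(results, exclude='history'):.2f}")
```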

A further advantage was that Devon schools already using SIMS for administrative purposes did not have to pay anything extra for the value-added module. But even if Hele’s had paid the full price of £125 plus VAT, the software would still have been cheaper than most other value-added analysis services.

The National Foundation for Educational Research uses a sliding scale based on the size of a school’s Year 11 cohort to calculate the cost of QUASE; last year schools paid £700 on average. CEM charges £450 plus £1 per student for its basic YELLIS service, £225 plus £2 per student for ALIS, and £200 plus £2 per student for MidYIS.
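
For a sense of scale, the quoted tariffs work out as follows for a cohort of, say, 150 students (the cohort size is an assumption for the example; prices are those quoted above, excluding VAT).

```python
# Worked example of the quoted CEM tariffs for an assumed cohort of 150
# students. Base fees and per-student rates are those quoted above.
def service_cost(base, per_student, students):
    return base + per_student * students

students = 150
print(f"YELLIS: £{service_cost(450, 1, students)}")   # £600
print(f"ALIS:   £{service_cost(225, 2, students)}")   # £525
print(f"MidYIS: £{service_cost(200, 2, students)}")   # £500
```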
