Why value added works

8th July 2016
The Data Doctor

Value added is a hot topic right now. With levels removed, VA is now the Department for Education’s only measure of progress and, consequently, its profile has been raised.

Of course, VA has been around for years but its status has always been overshadowed by levels because a) floor standards were linked to levels of progress, and b) hardly anyone understood VA. But now, with floor standards for both primary and secondary schools based upon VA thresholds, everyone understandably wants to know how it works.

VA works as follows: each pupil’s score at key stage 2 is compared against the national average KS2 score achieved by pupils with the same key stage 1 Average Point Score (APS). The difference between the two is the pupil’s VA score.

But we could just as easily do away with KS1 APS and replace it with something else – a colour, a letter, a symbol. Anything that enables you to group and identify pupils by their prior attainment.
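To show how simple the mechanics are, here is a minimal sketch in Python. The pupils, labels and scores are invented purely for illustration, and the group averages are calculated from the sample itself, whereas in the real measure they are national averages published by the DfE. The point is only the shape of the calculation: group pupils by a prior-attainment label, average each group’s KS2 results, and subtract that average from each pupil’s own score.

    # A minimal sketch of the VA idea, using made-up data. The grouping label
    # here happens to be KS1 APS, but it could be any identifier (a colour,
    # a letter, a symbol): all that matters is that pupils with the same
    # label are compared against each other.

    from statistics import mean

    # Hypothetical pupils: (name, prior-attainment group label, KS2 score)
    pupils = [
        ("Amy",   "APS 15", 101),
        ("Ben",   "APS 15",  97),
        ("Cara",  "APS 17", 104),
        ("Dev",   "APS 17", 108),
        ("Elsie", "APS 17", 103),
    ]

    # Step 1: average KS2 score for each prior-attainment group
    # (in the real measure this average is taken over all pupils nationally).
    groups = {}
    for _, label, score in pupils:
        groups.setdefault(label, []).append(score)
    group_average = {label: mean(scores) for label, scores in groups.items()}

    # Step 2: a pupil's VA score is their own result minus their group's average.
    for name, label, score in pupils:
        va = score - group_average[label]
        print(f"{name}: {va:+.1f} (scored {score}, group average {group_average[label]:.1f})")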

Analogy time: imagine you enter a 10K race. When you register you are asked what pace group you’d like to be put in: slow, medium, fast. You’re a keen runner so you choose to go in the fast group and are handed a green vest to wear. Obviously, the medium-pace runners wear orange vests, and the slower group wear red (everyone loves a RAG-rating system).

You feel good that day, having trained hard, and run your race in 41 minutes. You’re thrilled because you’ve run a PB and you’re 10 minutes faster than the average time that day. Unfortunately, that’s not what the race organisers are interested in; they’re interested in how your time compares against the average time for the green vest group, which happens to be 37 minutes. Despite being above average overall, you are four minutes slower than your group’s average.

At one end we have a vest colour and at the other a finishing time. I do not need to convert the colour into a time or the time into a colour to complete my analysis. The data at either end does not need to be in the same format.

Value added is no different. We just need a way of grouping pupils by prior attainment so we can compare their results against those of similar pupils.

I just don’t recommend we use vests.

This is a change for the better. Historically, there has been a conflict between the two progress measures because they were not linked – VA did not depend on pupils making so-called expected progress. Primary schools, therefore, could easily end up with negative VA scores despite the majority of pupils having made two levels of progress between KS1 and KS2.

Now, rather than a blanket expectation based on levels of progress between key stages, pupils are compared against other pupils nationally with similar start points. There is no longer a universal expected rate of progress – it differs from pupil to pupil, subject to subject, year to year.

Students get to run their own race.

James Pembroke founded Sig+, an independent school data consultancy, after 10 years working with the Learning and Skills Council and local authorities. www.sigplus.co.uk
