Why value added works

8th July 2016
The Data Doctor

Value added is a hot topic right now. With levels removed, VA is now the Department for Education’s only measure of progress and, consequently, its profile has been raised.

Of course, VA has been around for years, but it has always been overshadowed by levels because a) floor standards were linked to levels of progress, and b) hardly anyone understood VA. But now, with floor standards for both primary and secondary schools based upon VA thresholds, everyone understandably wants to know how it works.

VA works as follows: each pupil’s score at key stage 2 is compared against a national average KS2 score for pupils with the same key stage 1 Average Point Score (APS).

But we could just as easily do away with KS1 APS and replace it with something else – a colour, a letter, a symbol. Anything that enables you to group and identify pupils by their prior attainment.
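To make that concrete, here is a minimal sketch of the principle in Python. The bands, scores and national averages are invented for the example (the real calculation uses the DfE’s prior attainment groups and published national figures), but the mechanism is the same: compare each pupil’s KS2 score with the national average for pupils who started from the same point.

```python
# Illustrative sketch of the value added principle.
# All figures below are invented for the example, not real DfE data.

# National average KS2 score for each prior-attainment group. The key could
# be a KS1 APS band, a colour, a letter - anything that groups pupils by
# where they started.
NATIONAL_AVERAGE_KS2 = {
    "low": 98.0,
    "middle": 103.0,
    "high": 108.0,
}

pupils = [
    {"name": "Pupil A", "prior_attainment": "middle", "ks2_score": 105.0},
    {"name": "Pupil B", "prior_attainment": "high", "ks2_score": 106.0},
]


def value_added(pupil):
    """KS2 score minus the national average for pupils with the same start point."""
    expected = NATIONAL_AVERAGE_KS2[pupil["prior_attainment"]]
    return pupil["ks2_score"] - expected


for pupil in pupils:
    print(f"{pupil['name']}: {value_added(pupil):+.1f}")
# Pupil A: +2.0 (above the average for similar pupils)
# Pupil B: -2.0 (below it, despite the higher raw score)
```

Note that Pupil B has the higher raw score but the negative VA, which is exactly the situation the analogy below describes.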

Analogy time: imagine you enter a 10K race. When you register you are asked what pace group you’d like to be put in: slow, medium, fast. You’re a keen runner so you choose to go in the fast group and are handed a green vest to wear. Obviously, the medium-pace runners wear orange vests, and the slower group wear red (everyone loves a RAG-rating system).

You feel good that day, having trained hard, and run your race in 41 minutes. You’re thrilled because you’ve run a PB and you’re 10 minutes faster than the average time that day. Unfortunately, that’s not what the race organisers are interested in; they’re interested in how your time compares against the average time for the green vest group, which happens to be 37 minutes. Despite being above average overall, you are four minutes down relative to your group.

At one end we have a vest colour and at the other a finishing time. We do not need to convert the colour into a time, or the time into a colour, to complete the analysis. The data at either end does not need to be in the same format.
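In code, the race version of the calculation looks like this (the green-vest average comes from the example above; the other two figures are made up):

```python
# The race analogy in code: a colour at one end, a time at the other.
# The green-vest average (37 minutes) is from the example above; the
# orange and red averages are invented.
AVERAGE_TIME_BY_VEST = {"green": 37, "orange": 48, "red": 60}  # minutes

my_vest = "green"
my_time = 41  # minutes

# Positive means slower than the average for your vest group.
gap = my_time - AVERAGE_TIME_BY_VEST[my_vest]
print(gap)  # 4: four minutes down on the green group, despite a strong overall run
```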

Value added is no different. We just need a way of grouping pupils by prior attainment so we can compare their results against those of similar pupils.

I just don’t recommend we use vests.

This is a change for the better. Historically, there has been a conflict between the two progress measures because they were not linked – VA did not depend on pupils making so-called expected progress. Primary schools, therefore, could easily end up with negative VA scores despite the majority of pupils having made two levels of progress between KS1 and KS2.

Now, rather than a blanket expectation based on levels of progress between key stages, pupils are compared against other pupils nationally with similar starting points. There is no longer a universal expected rate of progress: it differs from pupil to pupil, subject to subject, year to year.

Students get to run their own race.

James Pembroke founded Sig+, an independent school data consultancy, after 10 years working with the Learning and Skills Council and local authorities. www.sigplus.co.uk
