Why you can’t measure progress

There is simply no point in measuring progress in the name of ticking boxes, writes the TES' data doctor

James Pembroke

When plans for assessment without levels were first announced, many set about simply recreating them, devising new systems of broad, best-fit bands and associated point scores in order to categorise pupils and count steps of learning.

These schools failed to realise that this perpetuated the fundamental problem of pupils being rushed through content with gaps in their learning. And they have either knowingly or unwittingly put their students’ learning at risk, too – the systems they have implemented do not support the main aims of the new curriculum, which are to consolidate learning and deepen understanding.

Why has this happened? One factor is a lack of clarity on why levels were removed and the opportunity this presents: to concentrate on assessment for learning, to develop simplified, insightful approaches, and to collect data to support this process that can have a positive impact on pupils’ learning.

The other factor is fear, which is best encapsulated by the following all-too-common question: “But we have to measure progress, right?”

Wrong. You have to show progress, yes, but measure it? How? I take issue with the progress measures we employ in our tracking systems.

Who are these numbers for? What do they tell us? Do they really have any impact on pupils’ learning?

Progress is catching up, filling gaps, overcoming barriers and deepening understanding. Can all this really be represented by a simple linear point scale? Or are we just doing it because we think we have to, to satisfy the data cravings of external agencies?

Like the original story of Cinderella, in which the stepsisters cut off toes in their desperation to make the slipper fit, we have attempted to carve up and shoehorn the curriculum into an ill-fitting predefined scale for the purposes of quantifying progress.

Our incessant desire to develop a numerical approximation of progress will often override the need for accuracy, meaning and usefulness.

And yet these measures that schools are so desperate to have can easily backfire on them. A system measuring progress in terms of curriculum coverage is at odds with a curriculum where pupils move through at broadly the same pace and are differentiated by the support they require and their depth of understanding.

What is the value of a system that shows the majority of pupils to have made “expected progress”? What is the risk of that system if the only way to show better progress is to push pupils on to the next year’s content?

The Commission on Assessment without Levels and the Workload Review Group have both provided plenty of support and justification for schools seeking to overhaul assessment; and Ofsted have further reassured schools by stating that they have no preferred approach to tracking pupil progress.

We need to free ourselves from complex systems and meaningless numbers. Remember: bad data is not better than no data at all.

James Pembroke founded Sig+, an independent school data consultancy, after 10 years working with the Learning and Skills Council and local authorities

This is an article from the 27 May edition of TES.
