Ann Mroz

There’s not a lorra lorra point to exams if the result is blind data

Exams measure only what the government values – they cannot provide a full picture of a pupil’s achievements or a school’s strengths

What is the point of exams? Are they to work out how much a student has learned or to gauge how much they have been taught? To equip students with evidence of their achievements or to demonstrate the success of the education system?

There has been much consternation this year about the rise in the number of unconditional offers made to sixth-formers by universities – some 23 per cent of teenagers have had at least one (up from 1 per cent in 2013), according to Ucas. There is concern that young people given these guaranteed offers of places will have “taken their foot off the pedal” and therefore fallen short of the results they could have achieved.

It’s easy to see why schools are worried: they are judged by A levels and GCSEs, and lower results hurt them. It’s easy, too, to see why the government is so concerned: it measures its own success against these results. And it’s especially important this year with the second wave of reformed exams.

Higher education minister Sam Gyimah has condemned universities as more interested in getting bums on seats than in helping students. Putting aside the fact that it is the government’s own changes that are making universities chase these “bums”, why assume unconditional offers aren’t helping students?

After all, pupils are getting what they want – a university place offered on potential – as well as relief from the pressure and stress of cliff-edge exams. What’s not to like? By the time they receive their offers, they should have gone through most of the syllabus (or would have done had the government not overstuffed it with its new reforms). All an unconditional offer deprives them of is extra revision. Most people could live without that.

What is more important, pupils learning things or us knowing that they’ve learned them? It feels as though the pendulum has swung way too far towards the latter.

Academic knowledge is not a proxy for all learning; those tests do not tell us much about how well a student can collaborate, adapt or self-regulate. They look at a very narrow band of school experience. Thus, they do not tell us much about how effective a school is overall.

And how useful are GCSEs and A levels as an indicator of readiness for university and the world of work anyway?

It’s a question that barely gets asked. Because these exams are not really about the pupils. If they were, many would not be pushed into taking them (or dissuaded from doing so). Others would not get off-rolled and their choice of subjects would not be restricted.

Also, if the government were really serious about work-readiness, then it would have come up with a better policy than forcing students who hadn’t achieved a C or better in English and maths to resit these GCSEs at college. Surely it could create a better mechanism than the regime of endless retakes. Only 12 per cent of teenagers without passes in these subjects at 16 go on to secure them at 19 (see pages 52-55).

The definition of insanity is often said to be doing the same thing over and over again and expecting a different result. Making young people do this demonstrates that the government is barking mad.

In reality, GCSEs and A levels measure school effectiveness only in the areas that the government values. They’re a badge slapped on a student to prove that the state has fulfilled its responsibility. Whatever follows is now the student’s own fault.

Has there ever been a time when an exam was less for the student? Their results are not a personal achievement but a contribution to an average. In this wonderful new world, children are but a unit of measurement. Won’t anyone think of the datapoints?