Resit jackpot doesn’t add up

9th September 2011, 1:00am


Well, bugger me. For once my department did OK in the annual A-level lottery and we copped some good results. I should be cock-a-hoop because I taught a few winning students who lucked out in the jackpot resit rollover. But I’m keeping the champagne on ice because their triumph is as much a cause for concern as for celebration.

How could two hard-working students gain a D and an E in the January exam, then an A and an A* in the resit? Either that makes me the world’s smartest teacher, or there is something rotten in the state of the exam system. Such bizarre anomalies in student performance can only be explained by one thing: the English exam system is a joke with a tediously predictable punchline - same board, same students, only a different marker.

Of course, since their first attempt the students may have improved their exam techniques, but I wouldn’t have thought that adding the occasional semi-colon or referring to Patrick Stewart boldly going where no BBC Macbeth has gone before would make such a monumental difference to their grades.

The only other explanation is that my final act of desperation - urging the kids to spatter the key words of the question across every paragraph like a form of lexical dysentery - finally paid off so that any marker would see they were on task. Either way, it doesn’t look good for the integrity of the exam.

I should be jubilant with their new grades, but I’m not. I remember too well those awkward meetings back in March where I tried to console tearful parents about their children’s disastrous results.

I suspected then, as now, that our teaching wasn’t entirely at fault. But a teacher blaming an exam board is about as convincing as my husband downing two gallons of Old Speckled Bishop’s Arse, regurgitating it on the mat then blaming it on a bad pint.

So despite the happy-ever-after ending, we’re worried we’ll face the same aberrations next year. When an exam board employs randometers rather than professional markers to decide on a student’s grade, it’s not a case of “how well have I taught this?” but “do I feel lucky?” Of course, the board would argue that the cogent application of assessment objectives eliminates the markers’ subjectivity, but - to steal a proverb - if there’s many a slip twixt cup and lip, there’s a bloody cognitive avalanche between these descriptors and how they are applied. This would be hilarious if it wasn’t so tragic for our students.

So we’re thinking about changing exam boards. And since there is no chirpy exam board comparison site to guide us, we’re sounding out colleagues from other schools about alternatives.

Whoever we choose, it won’t be an easy move. Changing exam boards is like changing bras - you hang on to your old one because it’s comfy and doesn’t chafe your nipples. But when it makes your results droop like a spaniel’s ears and your CVA look like shit, then it’s time to uplift and separate.

Anne Thrope (Ms) is a secondary teacher in the north of England.
