Monster made to measure

Some of our best efforts and intentions are doomed from the start. Some fail because they run counter to common sense. Making Iain Duncan Smith party leader was a great example.

Other potentially worthy notions make it through a swamp of prejudice and dogma only at the expense of their effectiveness: using top-up fees to bankroll universities is a case in point.

Others fall foul of the Heath Robinson law of efficiency: there is no point in setting up hugely complex mechanisms if the end product is ultimately trivial and can be done better and simpler.

Remember Heath Robinson? He devised fiendishly intricate machines with wires and pulleys and gyros, gizmos, mirrors, pins and thingummyjigs. They circumnavigated entire houses to operate a steel digit that would pick your nose less effectively than you could manage yourself.

In the sane world these things never get built. In the FE sector we employ armies to ensure they do. And the latest monster is now among us in the form of a Learning and Skills Council consultation called Measuring success in the learning and skills sector. This proposes costly and time-wasting systems to find ways of measuring what really counts as success in FE.

Since an inspectorate of one sort or another - the Further Education Funding Council, and now the Office for Standards in Education and the Adult Learning Inspectorate - has been with us since we crawled out of the primordial pit of local education authority control, you would think they'd have cracked it by now.

External inspection is designed to improve standards and create success.

If, after 10 years, a billion pounds and a trillion reports, Ofsted hasn't found any success to measure, or a way of measuring it, then it has failed to do what it set out to do and should be abolished as pointless. And then we really would need to look for new ways of measuring success. Since Ofsted is still with us, presumably it is working, colleges are getting better and better, and the reports are measuring it.

Ah, I hear you say; but they measure success from a limited viewpoint and don't take account of the different challenges colleges face. Colleges full of Simons and Emmas in Surrey get better reports than colleges with Traceys and Sujindhers in Birmingham.

Sadly, you are probably right, but the new set of measures is not designed to replace Ofsted or correct its in-built bias. Instead, it is searching for new things and predictably it has hit on the old "value-added" chestnut.

In a value-added system, you are judged not on what you achieve in absolute terms, but on how well you use the material you have to hand.

In a value-added world, Poland would be judged the most-visited country anywhere. Belgium would be the most exciting, Iraq the most peaceful and Germany the place to go for a good laugh. A worthy attempt to correct natural injustice, but hardly convincing.

And value-added is not the only proposal by any means. There is one which seeks "to develop a summary of qualification level success for each provider, taking into account curriculum profile". I think I know what this means and it would be a great outcome.

Put simply, if your curriculum profile involves teaching level 3 key skills to bored teenagers, you are unlikely to match the success rates of a college which hands out governors' certificates for a three-day course on aromatherapy.

Now I understand this and you probably do, but will the woman who employs our leavers to sharpen her widget-grinders? Will we ever find a way of expressing such a complex situation in a simple enough form to mean anything to anyone not steeped in our peculiarities? And if the measures aren't for them, who are they for?

Well, I think I know what the problem is. Whoever drafted the Learning and Skills Act made a monumental error when they gave the LSC the task of driving up standards and measuring quality.

The poor old LSC has spent a fortune in the past three years trying to find out how it can carry out that role without being Ofsted, without creating a bureaucratic juggernaut and without shattering the fragile partnerships it is building with its providers. In the process, it has created a true horror, hated throughout the sector, mocked and jeered at by everyone who experiences it, known as "provider performance review".

It started as a three-times-a-year exercise, came down to two, is now at one and will shortly, I confidently expect, disappear into the orifice whence it emerged. In the process, those LSC staff with intellect and compassion suffered a collective crisis of guilt at having to inflict this pestilence on already over-regulated colleges.

Measuring success in FE is the response to that guilt and the product of liberal, well-meaning Heath Robinsons keen to let the world know about the real FE. They want fairness, justice, equality, truth, and all the other things we aren't used to in colleges, to shine through at last. To assess their chances, see above.

Graham Jones is principal of Sutton Coldfield college
