Performance indicators must be clear and fair
To be honest it would have come as a bit of a shock if further education providers were no good at getting people into jobs or expanding their educational horizons (page 31). Vocational training and transition is their bread and butter.
Even so, a truly remarkable 95 per cent of providers were judged by those who passed through their portals to be good or outstanding (99 per cent if one includes those judged satisfactory) at getting them into work, a better job or on to the next educational rung.
There is more good news in the Framework for Excellence headline outcomes: more than two-thirds of providers (68.7 per cent) were judged by their students to be providing a good or outstanding standard of education. This is a well-deserved pat on the back for FE's lecturers, trainers, assessors and facilitators.
Things get more complicated when one looks at providers' qualification success rates (QSRs). Here we find that although nearly 69 per cent of learners said their providers delivered a good or outstanding education, just over half of providers were actually good or outstanding when it came to making sure their students gained the qualifications they were aiming for.
Nearly one in five providers were assessed as having an inadequate success rate, far more than the 7 per cent rate of inadequate college provision assessed by Ofsted.
The mismatch is intriguing. Are we to assume that students at the fifth of providers with poor achievement rates blamed themselves, rather than the college or provider, for their lack of achievement? Perhaps they did. It is not too difficult to imagine a college serving a particularly disadvantaged community with poor achievement rates that is held in the highest regard by its students, who think the education provided - including the teaching and support - is outstanding, as well it might be.
This raises an interesting issue for next year's framework indicators, which will be published for individual education providers. The college in a deprived part of town that bends over backwards to recruit and retain its often low-achieving students may have an inadequate QSR grade, and possibly a good or outstanding grade for its education from its learners. A neighbouring college serving a better-off area is likely to have a better QSR and may also have a good or outstanding score from its learners, despite some evidence that higher-achieving students are more critical. In a direct comparison the second college would appear the better institution. Except, of course, it would be next to impossible on the indicators alone to say which college was doing the better job in its respective circumstances.
So before the publication next year of individual indicators - and it is just as well these were postponed for a year - a greater degree of sophistication and sensitivity is needed. Efforts must be made to include some measure of value added, to reflect properly the differing circumstances of different providers.
Likewise, care must be taken to devise measures that include and accurately reflect the achievement of independent learning providers. It proved impossible this year to award grades in every category to private providers due to a lack of data, meaning the independent sector is underrepresented in the national figures.
There is time and the will to improve the indicators for next year. As a sector further education has nothing to fear from performance indicators as long as they are fair and transparent.
Alan Thomson, Editor, FE Focus; E: email@example.com.