Performance league tables seem to be here to stay, but how much do they reveal about what schools have achieved and how can they be made to do this job more effectively? The TES reports.
Performance data from regular inspections of colleges should replace crude exam results when compiling league tables, say the heads and organisations representing the sector.
The first annual report of the chief inspector on the newly independent sector is published today - rating everything from management quality to exam results on a scale of one (top) to five (bottom). Eighteen categories, based on detailed inspections of colleges, are used, including 10 covering curriculum areas and assessment.
Terry Melia, chief inspector for the Further Education Funding Council, said: "It would be easy to aggregate these scores and rank achievements, based on quality of teaching and outcomes. It would be far easier, too, to have value-added factors included this way."
Colleges were quick to pick up on his point in league tables week and suggest the inspections as a more efficient measure than exam data. The report covers a quarter of all FE, tertiary, specialist and sixth-form colleges which were inspected in the year 1993-94.
Within two years, all colleges will have been inspected.
Under the FEFC style of inspection, a staff nominee works constantly with the inspectorate and a named inspector keeps a regular watching brief on the college all year round. This allows a more regular review of performance than is possible in schools.
The annual report already gives very strong indications of trends in the FE sector which the Government league tables are largely unable to reveal. On the academic side, the two measures tell virtually the same story: sixth-form colleges outshine the rest of the sector. This is not surprising.
What is more revealing is the fact that they also surpass the FE and tertiary colleges on a wide range of issues. They achieved more grade ones and twos on quality assurance, accommodation, governance and management, and teaching standards. This raises serious doubts about the value of mergers, amalgamations and the push towards ever bigger, more cost-effective institutions.
Dr Melia said it went beyond the question of academic performance.
"Small institutions and monotechnics come out better. Some big FE colleges with 20,000 students are doing excellent work but they are clearly more difficult to manage."
The report on 150,000 students observed in 11,000 lessons shows the majority of colleges performing extremely well. In 60 per cent of lessons, strengths outweighed weaknesses.
In only 8 per cent of lessons did weaknesses outweigh strengths. The range of courses and responsiveness to students and employers was judged to be good in seven out of 10 colleges.
"Although the new further education sector has got off to a good start, neither the colleges nor the council that funds them can afford to be complacent," said Dr Melia.
"Colleges need to address the unacceptably high wastage rates from some of their courses and poor exam results, particularly among those undertaking GCE and GCSE programmes in some general further education colleges."
Other concerns in his report are poorly developed quality-assurance systems in many colleges, inadequate management of key information affecting recruitment and the high proportion of time many teachers have to spend on administration.
Weaknesses were noted particularly in special-needs provision, links with industry and commerce and some practical areas related to National Vocational Qualifications.
Ruth Gee, chief executive of the Association for Colleges, said the inspectors' approach reflected what colleges were about - the Government's league tables did not.
Madeleine Craft, secretary of the Sixth Form Colleges' Association, said: "The debate is moving on from exam league tables. The debate now really needs to be raised about the wider performance of the colleges."