Dawn of the star system

2 December 1994

All colleges will now be judged against six performance indicators, but some are already choosing to submit to more detailed measures of success. Neil Merrick reports.

Education sometimes appears in danger of being saturated by statistics. Whether it is through league tables, budget returns or databanks, schools and colleges rarely have to look far before they are confronted by a new set of figures or indices.

Mounting pressure to develop performance indicators (PIs) to measure the success (or otherwise) of institutions often means that teachers and principals look upon such data with suspicion. But there seems little doubt that PIs are coming and that they will be here to stay.

Last month the Further Education Funding Council confirmed that, in future, all colleges will be assessed against six basic indicators (see box). Most of the information is already held by the FEFC, so colleges should not need to spend too much time gathering extra data.

The results will appear in future FEFC inspection reports alongside the average or norm for similar institutions. Geoff Hall, the FEFC’s director of education programmes, said the PIs would establish a series of benchmarks which would allow meaningful comparisons to be made.

“The judgments made by inspectors will be accompanied by hard data. If a college is really out-performing its family group then it can truly say that it’s a successful college,” he said. “But the reverse is also true. The few colleges which are strugglers will find it difficult to disagree with the picture which is being created. There is no reason for not pulling their socks up”.

According to the FEFC, colleges welcomed the fact that the number of PIs was being limited to six, at least until 1995-96. But that has not prevented some from developing more extensive PI projects, either as consortia or through calling in outside consultants. FE College Services, which developed a complex database with management consultants KPMG, has enrolled 54 colleges. Each provides data on 144 PIs ranging from energy usage to the number of library books borrowed.

From January, colleges will be able to compare their PIs against other institutions which have each paid £940 to join the project. The information will appear in league tables, but colleges will only be able to identify themselves; the identity of other colleges will not be revealed. Dr Eric Avery, principal of Tile Hill College, Coventry, said the project would include the sums colleges spend on marketing. “I don’t know anyone else who has access to that sort of information”.

Consultants more used to developing management systems for the private sector have been quick to spot the opportunities presented by the multi-million pound FE industry. Nottingham businessman Ben Johnson-Hill has adapted his “Best Practice” project, which has its roots in the declining textile industry, and created a system with no fewer than 750 PIs.

Mr Johnson-Hill claimed that, whereas most benchmarking systems were selective, his project measured everything from the average number of hours each week that teachers spend in contact with students to the cost of postage. Colleges could choose whether their data was compared against the norm for local colleges or similar-sized colleges nationally. Colleges had identified savings of up to £500,000 per year, but that was not the main reason why they had chosen Best Practice, he added.

One FE college had been advised to move to a new site as a result of research undertaken by his company. “The cost-saving is the least important outcome,” said Mr Johnson-Hill. “The greatest benefit in the majority of colleges is that we are challenging their strategy”.

Other colleges are going it alone. Research at Blackpool and the Fylde College in Lancashire is examining students’ postcodes to discover if the college is attracting a cross-section of students or whether they come predominantly from the same areas. Principal Mike McAllister said the study would be used to assess the effectiveness of college marketing.
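
The article does not describe the college's method, but a minimal sketch of this kind of postcode analysis, using invented enrolment records and grouping by the outward part of each UK postcode (e.g. FY1 for central Blackpool), might look like this:

```python
from collections import Counter

# Invented enrolment records: (student id, home postcode).
enrolments = [
    ("S001", "FY1 4AB"),
    ("S002", "FY1 9ZZ"),
    ("S003", "FY4 2XY"),
    ("S004", "PR3 1AA"),
]

# The outward code (e.g. "FY1") identifies the postcode district,
# a coarse but convenient grain for spotting geographic clustering.
districts = Counter(postcode.split()[0] for _, postcode in enrolments)

total = sum(districts.values())
for district, count in districts.most_common():
    print(f"{district}: {count} students ({100 * count / total:.0f}% of intake)")
```

A heavy skew towards one or two districts would suggest the college's marketing is reaching only part of its potential catchment.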

In a recent report on careers guidance for 16 to 19-year-olds, inspectors from the FEFC and the Office for Standards in Education regretted the fact that schools and colleges did not monitor students’ destinations and failure or non-completion rates consistently. If they did, said the inspectors, it would provide a basis for developing PIs to measure the effectiveness of their careers guidance.

While colleges have been mostly concerned with developing PIs as a management tool, some schools have been pressing ahead with projects to measure the value-added to students’ exam results.

Professor Peter Mortimore of the Institute of Education, London, admitted that progress had been slow. This was mainly because of the absence of sound baseline information about pupils’ attainment on entry, without which their exam scores could not be meaningfully assessed.

“The concept of value-added has been around for some time, but it’s not in most schools’ culture to think in these terms,” he said. “For all sorts of complex reasons there have not been many schools with the energy, time and skill to create a systematic approach.”

Professor Mortimore added that many teachers had been traumatised by the crude comparisons made in the raw exam league tables preferred by the Government. But this had also encouraged some schools to develop value-added measures, if only to present an alternative analysis.

The A-level Information System (ALIS) operated by the School of Education at Newcastle University is supported by both schools and colleges. Other institutions have meanwhile chosen to make more basic comparisons between GCSE and A-level scores.

A study of GCSE results in Lancashire, carried out by Professor Mortimore and Sally Thomas, found that 10 per cent of the variation in pupils’ scores was attributable to the value added by schools once their backgrounds were taken into account.
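
Read as a standard variance decomposition (a sketch of the usual multilevel framing, not necessarily the authors' exact model), the 10 per cent figure corresponds to the share of variance in intake-adjusted scores that lies between schools rather than between pupils:

```latex
% Sketch: intake-adjusted score variance split into a between-school
% component and a within-school (pupil) component.
\sigma^2_{\text{total}} = \sigma^2_{\text{school}} + \sigma^2_{\text{pupil}},
\qquad
\frac{\sigma^2_{\text{school}}}{\sigma^2_{\text{school}} + \sigma^2_{\text{pupil}}} \approx 0.10
```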

The FEFC has so far steered away from asking colleges to contribute data to a value-added model of PIs. It has, however, asked the new Further Education Development Agency to look at three key areas. FEDA, which is taking over the work of the Further Education Unit and the Staff College next year, will look into developing a national system for assessing the value added for 16 to 18-year-olds taking A or AS-levels, as well as measures for older students who are taking A-levels but who may not have other qualifications.

Finally, FEDA will be trying to break new ground by finding a method of developing value-added measures for vocational qualifications. Geoff Hall said the problem here was the absence of any obvious baseline score which was comparable to a GNVQ or NVQ.

Stephen Crown, chief executive of FEDA, said the new agency was keen to take on the value-added research. “Performance indicators and value-added measures are useful for measuring the success of a teaching programme or helping staff to understand individual students.” It was vital that value-added PIs were developed for both academic and vocational courses to ensure parity of esteem, added Mr Crown, but there were also technical issues to be resolved.

“If you want to produce aggregate measures at institutional level to put in comparative tables you adopt a different approach than if you want a more detailed student-by-student or course-by-course analysis to use internally within an institution.”

And, he added, “If you want value-added measures which are going to persuade the general public that they have credibility then you need to keep them pretty simple”.

* Ben Johnson-Hill Associates, 7 Gregory Boulevard, Nottingham (tel: 0602 691863).

Six of the best

* Achievement of funding target: a college’s effectiveness, indicated by measuring its provision of education and training programmes against the targets which were set in its strategic plan.

* Student enrolment trends: a college’s responsiveness, indicated by its percentage change in full and part-time enrolments compared with the previous academic year.

* Student retention: the effectiveness of a college’s programmes, as indicated by the percentage of students who, having enrolled for a course on or before November 1, still remain on it by the summer term.

* Learning goals and qualifications: an indicator of student achievements, measured by the numbers completing courses and achieving the qualification for which they were aiming.

* NVQs or equivalent: measuring the contribution a college makes towards national targets for education and training. Data will be collected through returns provided by training and enterprise councils.

* Average level of funding: an indicator of the value for money a college provides, measured by dividing its recurrent funding from the FEFC, including the demand-led element, by the total number of units earned by the college during the year (a worked illustration follows the box).

The PIs are broadly the same as those proposed by the FEFC in May. Colleges asked that the titles be amended so that they did not overstate the scope of each indicator.
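
Using invented figures purely for illustration, the average level of funding calculation described above reads:

```latex
% Average level of funding (ALF) with invented figures:
% recurrent funding (including the demand-led element) per unit earned.
\text{ALF} = \frac{\text{recurrent funding incl. demand-led element}}{\text{total units earned}}
           = \frac{\pounds 9{,}500{,}000}{500{,}000 \text{ units}}
           = \pounds 19 \text{ per unit}
```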
