Government’s score casts FE as second fiddle

11th January 2002, 12:00am

Last month, the Department for Education and Skills published the results of its “value-added” pilot scheme in secondary schools. Hailed as a fairer way of measuring schools’ achievements by both politicians and headteachers, it was widely regarded as a significant step in the right direction, and many in FE are keen to see a value-added “performance indicator” appear in the college league tables. But will value-added work as smoothly in FE as it appears to do in secondary education?

The Scottish Further Education Funding Council doubts it, having received a damning report on value-added from the University of Edinburgh’s Centre for Educational Sociology. After reviewing all the value-added models currently employed in the UK and elsewhere, the centre’s researchers concluded that none of them is capable of producing a reliable indicator of college performance, and that we are a long way from creating a scheme that would work. Such a complete rejection of the value-added PI is a sign that implementing a uniform system in FE is not as straightforward as most people, including the DFES, believe.

So why is the value-added statistic still so attractive? Current league tables are notoriously unfair. Based upon a simple “outcomes” PI, they merely represent the average grade achieved by a college’s student body, and hence reflect the qualities of the students, not the qualities of their colleges or their teachers. As well as failing to represent college quality, they encourage colleges to select only the most able students and reject those of lower calibre. As a consequence, a peculiar class system has evolved: colleges that happen to be well placed in the table attract the students who maintain their position, while those further down become associated with failure and cannot attract the more able students they need to climb the table.

Not surprisingly, Education Secretary Estelle Morris is looking for an alternative. Her aim is to base the league tables upon value-added indicators, so that colleges are rewarded for the progression of their students, whatever their ability. But which value-added model should she favour?

The most serious contenders are models that quantify “progress” by assessing the difference between a student’s qualifications on entry and exit. The most primitive of these concentrates upon the “distance travelled” between these two points. It’s easy to apply, and can be calculated in-house, but in this case, colleges would benefit from admitting low achievers, since well-qualified students cannot “travel” as far.
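A minimal sketch makes the bias concrete. The points tariff below is hypothetical (the article does not specify how grades are scored); the point is only that "distance travelled" mechanically favours low-scoring entrants:

```python
# Hypothetical grade tariff -- not from the article, purely illustrative.
GRADE_POINTS = {"E": 0, "D": 1, "C": 2, "B": 3, "A": 4}

def distance_travelled(entry_grade, exit_grade):
    """Progress measured as the difference between exit and entry points."""
    return GRADE_POINTS[exit_grade] - GRADE_POINTS[entry_grade]

# A weak entrant moving D -> B "travels" further than a strong
# entrant moving B -> A, even though both improved.
print(distance_travelled("D", "B"))  # 2
print(distance_travelled("B", "A"))  # 1
```

Under this scheme a college maximises its score by admitting poorly qualified students, exactly the perverse incentive the paragraph above describes.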

Favoured by the SFEFC report, and the DFES, are “comparative input-based” approaches. These exploit national achievement data to establish a target for each student entering a college. The target is the average achievement of all students with the same entry qualifications as the entrant. Should all the entrants in a college achieve their targets, the college is awarded a value-added index score of 100. If a college’s students under-achieve on average, the score will be lower than 100; if they over-achieve on average, it will be higher than 100. In the DFES pilot of this model, half of the 155 colleges involved scored between 97.6 and 102.3 points - a range that was deemed statistically identical to the “average institution”. However, the other 50 per cent were outside this range. At the extremes, two colleges shared the low score of 91.2 and one got the high score of 108.6. Should the model be adopted, the former would take their place at the bottom of the league table, and the latter would be at the top. It is encouraging to note that when you arrange the colleges on value-added criteria, the list barely resembles existing league tables. Colleges that achieved relatively low average grades can appear high on the list if their students’ progression exceeded the average, and vice versa.
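The mechanics can be sketched as follows. The national target figures, the points scale, and the scaling of the index around 100 are all assumptions for illustration; the article describes the approach but not its arithmetic:

```python
# Hedged sketch of a comparative input-based value-added index.
# Assumptions (not from the article): achievement sits on a numeric
# points scale, and the index is scaled so that a college whose
# students exactly hit their targets scores 100.

# Hypothetical national averages: achievement by entry qualification.
NATIONAL_TARGETS = {"2 GCSEs": 3.0, "5 GCSEs": 5.5, "7 GCSEs": 7.2}

def value_added_index(students):
    """students: list of (entry_qualification, actual_achievement) pairs."""
    targets = [NATIONAL_TARGETS[entry] for entry, _ in students]
    actuals = [actual for _, actual in students]
    # 100 means the cohort exactly met its targets on average.
    return 100 * sum(actuals) / sum(targets)

cohort = [("2 GCSEs", 3.0), ("5 GCSEs", 6.0), ("7 GCSEs", 7.0)]
# Targets total 15.7, actual achievement totals 16.0,
# so this cohort scores just above 100.
print(round(value_added_index(cohort), 1))  # 101.9
```

Note that the index rewards progression relative to peers with the same starting point, which is why re-ranking colleges on it reshuffles the existing league tables.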

Yet, although the data looks promising, the SFEFC report warns that FE should pause before employing the system. “No single variable strongly and consistently predicts attainment in FE in the way that prior attainment predicts attainment in school,” it declares. “The association between entry qualifications and course outcomes is weaker and more variable.” It appears that, on the whole, FE students don’t conform to the norms of the value-added index. Good students leave to take up jobs, mature students join with poor qualifications but new-found motivation, part-time students dabble in A-levels, and a large number of students take vocational qualifications for which there is no national dataset. For any number of reasons, the outcomes of the FE cohort are far less predictable, and this undermines the whole value-added system. Considering that a future performance-related pay scheme might be linked to these new value-added league tables, the fact that the University of Edinburgh calls the DFES’s favoured method “unreliable and unstable” should make even the most passionate value-added champion hesitate.
