What exactly is the purpose of the FE area reviews? Seriously, I think we should be told.
In March, the FE commissioner Dr David Collins wrote to FE colleges about the need to provide “better quality through the reduction/elimination of poor provision”. It’s another Lingfield moment. The Lingfield report recommended removing the requirement for qualified teachers in front of FE students, claiming that this would lead not only to improved professional status for FE teachers but also, by implication, to better teaching and a better learning experience. It was an awe-inspiring example of logical contortionism. But that doesn’t mean it made any sense. What we can conclude from it is that under former education secretary Michael Gove’s ministrations, FE teachers were held in such low esteem that unqualified status was rated more highly than qualified.
Now the FE commissioner is playing the "quality" card. And he may be right when it comes to the area reviews; we do need to start talking about quality in FE. Let’s be under no illusions, though: the area reviews are not really about so-called "performance" – they are a vehicle for cuts, for rationalising and reducing what is on offer. They are about mergers and, inevitably, closing campuses. So "quality" is going to be an issue.
In Birmingham, we’ve gone from having 12 FE and sixth-form colleges to two huge colleges (the legacy of 20 years of takeovers and mergers), one outlier (in dire financial circumstances) and a couple of surviving sixth-form colleges. Rumour has it that the area review will result in the birth of "The Birmingham College". If that is what happens, can anyone really believe that a huge college spanning the entire city across numerous buildings is going to be any more efficient, or that the provision is going to produce courses of a better quality than those currently on offer?
The FE commissioner’s letter in March set out a number of features that might be taken as indicators that any given college could be in trouble and heading for a poor inspection, including:
- Success rates are 5 per cent below benchmark and not improving.
- Teacher observation grades of good or better are below 80 per cent.
- Student surveys/focus groups show levels of satisfaction below 90 per cent.
- Quality improvement plans or post-inspection action plans do not state clearly, for each issue, the college's starting position, the targeted outcome, the actions that will be taken to achieve that outcome, milestones along the way, monitoring arrangements and the individual responsible for overseeing delivery.
Now, begging Dr Collins’ pardon, but this approach to quality is the same approach that brought us the Stafford Hospital tragedy, doping in international athletics and the Volkswagen emissions scandal. Each of those examples involved "optimisation behaviour": the institutional manipulation of performance data to present a favourable outward-facing picture.
So how does this apply to FE?
Simply, if performance data is taken at face value, then the approach is flawed. Furthermore, if the dominant approach to overseeing the quality of FE provision continues to rely on the same performance data, then real quality will continue to be undermined by the achievement-led funding regime as FE teachers and managers are forced to resort to "optimisation behaviour" and gaming in order to hit targets and garner funds.
Area reviews are only likely to intensify an approach to quality that is reductive and, quite frankly, lacking in integrity. In recent research I carried out, a lecturer explained how an institutional focus on performance data could act against raising standards:
“So, in this meeting…the manager passed a list of students to the teacher and said, ‘You need to tell me that they will all pass.’ The teacher began to speak but was interrupted: ‘No, don’t tell me. I don’t even need to give this to you because they will all pass, won’t they?’”
Indeed, there is no room for discussion and no opportunity to argue, so no one does – at least not within the meetings.
This approach to performance data is an aspect of the architecture of FE that has become ingrained over the past 20 years. I believe scenes like these are played out in FE colleges across England. Yet I have never heard the FE commissioner, Ofsted or any other government agency ask questions about the social conditions in which performance data is produced. The conclusion, then, is that the FE area reviews will once more affirm the current approach to funding and the cultures around performance data that have arisen in response to it.
Dr Rob Smith is a teacher and researcher at Birmingham City University. He tweets at @R0b5m1th