Andy Gannon, director of policy, PR and research at the 157 Group, writes:
Much has been written about how the last few years have seen a decline in the standards applied to educational policymaking. Whether the complaints be about lack of evidence, heavy-handedness or lack of transparency, many are feeling increasingly distanced from the processes of Whitehall.
Struggling as I have been this summer with responses on behalf of 157 Group to the Department for Business, Innovation and Skills consultations on both traineeships and loans, I have had the chance to reflect a little more on precisely what I perceive the issues to be from a practitioner perspective. My conclusions raise some interesting questions.
We live in a policy world of extremes and contradictions. A guiding mantra is that the state should do less, and yet we have an army of officials who need to do more, despite reduced numbers, and who need to be seen to be making change.
Politicians seem genuinely to believe in the individualisation of educational programmes, and yet are badgered by a scrutiny process tied to national statistics at a level that can only make delivery ever more needlessly uniform.
The problem our beleaguered (while well-intentioned) policymakers have is a distinct lack of the right tools to undertake the tasks they want to do. They seek laudably to change values - the regard in which society holds vocational education, for example, or the degree to which employers see themselves having any responsibility for skills development. And they seek to change behaviours - people’s willingness to commit their own money to their learning, for example, or schools genuinely providing full and impartial guidance to their pupils.
But the tools at policymakers’ disposal are ill-equipped to change either values or behaviour. They can tinker around with the minutiae of funding or accountability regimes, or they can announce big delivery initiatives. The latter results in a sense among educators of being constantly ‘done unto’, and ever criticised.
Funding and accountability minutiae only ever drive behaviours towards a game to be played, rather than a joint commitment from both the policy world and practitioners to a powerful and shared vision.
The traineeships consultation, for example, asked, among other things, whether learners were more likely to continue studying English and maths beyond their traineeship if it had been funded integrally or separately. I can picture the scene:
Student: “Now tell me, tutor – did the college receive an additional payment for my English course or was my funding all delivered in one go?”
Tutor: “Well, the Skills Funding Agency is currently trialling a new funding methodology whereby we received a separate upfront recruitment payment for your English programme, while also drawing down an element of weighted funding based on your prior achievement as part of the overall payment for your study programme. And we will be given a bit more money if you accept that apprenticeship place you have been offered.”
Student: “Excellent – then I will definitely go for GCSE next year.”
Faced with the insoluble dilemma of trying to change education for the better with only blunt target instruments (akin, perhaps, to trying to prevent forest fires by setting a target volume of water which each fire brigade should not exceed), it is perhaps unsurprising that public consultations have become too much like a SurveyMonkey exercise.
With both this summer’s major BIS consultations, I found myself scratching my head. Both had very clear proposals, but contained no description of the problem these proposals were presumably designed to solve – still less any evidence of its scale. I found myself pondering, “Do we have rafts of trainees being poorly advised because their providers are not funded based on destinations?” and “Are 19-year-olds keen to take out loans because they cannot access level 2 provision at the moment?”
But the questions to which I was invited to respond took as a given the notion that what was being proposed is a good idea (this article is not about whether I agreed with that notion or not). As a consequence, the questions themselves exemplified a limited way of seeking views. Lots of closed questions (“Do you agree…?”) were followed by lots of detailed ones about minutiae that very few people could truly engage with. The questions asked me to predict the impact of something which has not yet been tested, and, as I do not yet have my certificate in clairvoyance, I felt ill-equipped to do so.
It is tempting to suggest that such an approach is born of the frustration, or inexperience, of those charged with implementing policy which has little or no potential to effect real change in either values or behaviour.
If we were to put this bigger conundrum to senior officials and ask, “Is it frustrating to be trying to change a whole system with only a few paragraphs of the funding guidance to play with?”, I suspect that they may wearily concur. But, within their own pressurised environment, they may well reply, “I would be very interested to have that discussion with you in more detail. But can we get this consultation out of the way first?”