As the Executive prepares to respond to the national education debate, Pamela Munn supports the validity of the process
CYNICS may argue that consultation is no more than an attempt to seek legitimacy for changes that have already been decided. Indeed, governments and other public bodies are, in my view, under an obligation to explain the purpose of consultation and the ways in which responses have been used or rejected if they want to counter this cynical view.
The national debate on schools in the 21st century was an unusual consultation in many respects. It was not asking for responses to a specific policy proposal. Rather it was asking in a fairly open way some important questions about what schools in the future should be like. What should pupils learn? How could pupils learn more effectively? What were the best and worst things about the current system? What were the priorities for improvement?
The purpose of the consultation was to develop a policy agenda over the medium term going beyond the life of the next Scottish Parliament. People outside the normal "policy community" were encouraged to respond. Pupils, teachers, parents and others were invited to form discussion groups and to send in their conclusions. A number of events took place across the country and the results were reported.
The Scottish Executive Education Department prepared support packs to aid discussion. These consisted of a response form and a number of short briefing papers giving information about various aspects of the current school system and providing some international comparisons.
Now, of course, one cynical reading of this whole exercise is that it is an attempt by the Labour and Liberal Democrat coalition to "sew up" the forthcoming elections as far as education policy is concerned. While they can claim that their school policy is the result of a national consultation, critics will argue that it effectively stifles debate and distracts attention from policy failures.
A less cynical response is to see the debate as a genuine attempt to develop innovative ways of formulating policy in the context of devolution and to broaden civic participation in that policy-making process. Much depends on the Executive's response to the issues raised and on its commitment to sustaining civic engagement in policy-making. It is in these contexts that the validity and reliability of the analysis of the responses is especially important.
There have been many studies of the role of research in policy-making.
There are fewer accounts of the ways in which routine social science practice can help government to identify policy priorities and protect it from charges of partisanship. The analysis of the responses to the national debate provides an interesting case study of these matters. It went through several stages. More than 1,500 responses were analysed, most of them on the response sheets provided but some in other forms, such as letters, articles and posters. The questions on the response form were checked in advance of publication to remove ambiguity as far as possible. One response which used the form was analysed in terms of key content by all six researchers involved in the exercise and the result discussed.
A larger sample of responses was analysed by the six researchers to develop a series of categories. These categories were straightforward descriptions of the content. Nothing more sophisticated was attempted in the time available. Two researchers independently reviewed all the categories generated and developed a coding frame which collapsed some categories. The two coding frames were debated and discussed by all six researchers and a final version agreed. Coding rules were developed to minimise ambiguity of meaning and coding sheets designed so that the analysis of each response could be traced back to the original.
Any content that did not fit the coding frame was written out in full. All of these were collected and discussed at a regular meeting, and, if a pattern was emerging, a new category was added. Those responses that did not suggest a pattern were categorised as "other". Responses were analysed in batches of 10 per researcher and a descriptive report written to complement the coding sheets. Researchers also wrote their own impressions of the tenor of the response or highlighted aspects that had not been captured elsewhere.
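The coding procedure described above can be sketched in a few lines of code. This is purely illustrative: the categories, keywords and responses below are invented for the example, not drawn from the frame the research team actually agreed.

```python
# Illustrative sketch of coding free-text responses against an agreed
# coding frame, with unmatched content routed to an "other" category
# for later review. All categories and keywords here are hypothetical.

CODING_FRAME = {
    "curriculum": ["learn", "subject", "curriculum"],
    "teaching": ["teacher", "teaching", "classroom"],
    "resources": ["funding", "buildings", "resources"],
}

def code_response(text):
    """Return the frame categories a response matches,
    or ["other"] if nothing in the frame applies."""
    lowered = text.lower()
    matched = [category for category, keywords in CODING_FRAME.items()
               if any(word in lowered for word in keywords)]
    return matched or ["other"]

# A batch of (invented) responses, coded in one pass; the "other"
# pile would be collected and reviewed for emerging new categories.
responses = [
    "Pupils should learn more practical subjects.",
    "Teachers need smaller classes.",
    "School dinners are terrible.",
]
coded = [code_response(r) for r in responses]
others = [r for r, c in zip(responses, coded) if c == ["other"]]
```

The key design point mirrors the process in the text: nothing is forced into an ill-fitting category, so the size of the "other" pile becomes a running check on how well the frame covers the material.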
AFTER a detailed analysis of more than 160 responses, the first 300 were coded accordingly and then reviewed, with minor adjustments made to incorporate as many variables as possible. After confidence in the coding frame had been established, the frequency of all responses was analysed by System 3. This confirmed the robustness of the frame, as very few responses were coded as "other".
The job of such a study is to look for patterns, as it is clearly impossible to provide a nuanced account of every single response. Nevertheless, responses that stood out were carefully analysed and reported, as indeed was the cynicism of some teachers about the purpose of the whole exercise.
Everyone who sent in a response received a copy of the report, and it is available on the web. The SEED stood back from the evaluation and awaited the report. It could thus legitimately claim that the analysis was independent of political influence when a journalist presented an "alternative" report which drew on a small sample of focus groups.
Academics have had no formal role in shaping the response to the debate.
Rather their role has been in ensuring that politicians, civil servants, the inspectorate and other interested parties are aware of the issues raised.
Pamela Munn is dean of the Moray House School of Education at Edinburgh University. She led the research team analysing the national debate responses.