Answers on a postcard

17th November 1995, 12:00am

Lindy Hardcastle finds a school that listens and learns with an annual survey

John Herbert, chair of governors at Mount Grace High School in Hinckley, Leicestershire, has a very clear idea of the proper role and function of governors. Early retired, with time, energy and managerial skills to offer, he resists the temptation to become over-involved in the day-to-day running of the school. But he believes that governors should improve the quality of education for the children in their school. If they are not making a difference, they are not doing their job.

High schools in Leicestershire cater for pupils between the ages of 11 and 14. These schools, though they are often happy, productive places, have sometimes suffered in terms of development and motivation due to lack of accountability. They have lacked ways of measuring their success. If testing at 14 becomes established as a reliable and consistent tool for measuring performance, this may change. Pupils’ progress between the end of key stages 2 and 3 could provide a “value added” figure for the school, though even then academic achievement should never be the only criterion of a school’s success.

In the meantime, Barbara Vann, head of Mount Grace, with active support from John Herbert, has introduced a new concept of “value added”, measured by the results of the school’s annual survey of staff, parents and pupils, which boasts an excellent reply rate of 90 per cent. Originally compiled with the help of Leeds Metropolitan University, the survey is now in its third year and a vital part of the school’s development planning. John Herbert collates and analyses the replies.

The 30 questions remain the same from year to year to enable comparisons. The replies are sorted into sections on teaching and learning, satisfaction with staff, communications, discipline and pastoral care, facilities, governors and general satisfaction.

Parents are asked to tick boxes, responding yes, no or not sure, and comments are requested. A final question asks what single thing would most improve the school. Predictably, the pupils’ survey indicated a desire for a swimming pool.

Responses are analysed on a computer spreadsheet, question by question, year by year, and also subdivided by male and female, for both parent and pupil responses.

The “value added” assessment can be made in two ways: by checking if the level of satisfaction of a group of parents or pupils has increased as they have progressed through the school, and by seeing if the current year is responding more positively than the previous corresponding years. Every written comment is recorded, and referred back to if any particular question shows a significant level of dissatisfaction.

The big question, of course, is how all this information is used. Taking the aggregated responses of all pupils and parents, each question is considered by teams of staff in terms of management issues raised and action to be taken. These then become part of the school development plan. An 80 per cent positive response is considered satisfactory. The management action indicated for questions which produced a high level of satisfaction was “Congratulate staff”.

Areas with a satisfaction level of 50 to 80 per cent are seen as needing further investigation or improvement, and less than 50 per cent requires management action. Sometimes the action indicated might not be a change in school procedure, but an improvement in communications with parents. In other cases, staff development might be the answer, or a review of school policy and practice. Such action has included better communication with parents about homework by using homework diaries and the establishment of a homework club.

Where action has been taken, particular attention is paid to the relevant questions the following year, to see if satisfaction has improved.

“As the project rolls forward, the information we gather enables us to gauge the results of our initiatives,” said John Herbert. “To quote Burns, it lets us ‘see ourselves as others see us’.”

He sees reporting back on the survey as an important component of the annual report to parents. Clearly parents regard the survey as an effective way of communicating their concerns to the school, and believe it brings about change.

They value the opportunity to raise broad issues anonymously, and trust the school to respond positively. Interestingly, only 18 per cent of parents replied to the comparable, though less detailed, pre-Ofsted survey, which would seem to indicate that they are confident their views are already known and acted upon.

So are parents now “running” Mount Grace school? John Herbert thinks not. There are many other factors, including the policies of the governors and the professional judgment of the staff, which have equal weight in development planning. But it is far from being a mere paper exercise.

“The main aim is to make the school more responsive to the needs of pupils and parents,” John Herbert explained. “We hold strongly to the ideal of a partnership in education and this is one way in which we can make that partnership real.”
