Assessment used to mean long hours of marking and collating data - then losing it in a filing cabinet. Which is why we should praise IT, says Gerald Haigh

When I became a head teacher in 1978, the school had a tradition of giving the children a battery of published tests each year. These were hand-marked and the results converted manually, using pages of tables, into standardised scores. It took hours of teacher time and generated cabinets full of data.
I was deeply sceptical. What we created was a data stockpile, amassed at vast expense in time, which was so labyrinthine as to be unusable. And no one was prepared to argue that any child was either better understood or better taught as a result.
So I stopped the tests, arguing that anything we lost would be more than made up for by the extra time available for lesson planning and preparation.
Now that schools are heavily involved in the collection of data, whether they like it or not, the old questions about its usefulness are at least as valid as they were. Today's answers, though, might be different. For one thing, a lot of the data is statutory and common across schools, which makes it more useful.
More to the point, though, information and communications technology (ICT) is providing the means of making data accessible and intelligible. This, in turn, makes it a lot easier to use the information to influence the way that teachers teach.
The collection, storage, handling and easy retrieval of test and assessment data are meat and drink to a computer. A lot of it can be done with Excel or other standard data-handling tools (visitors to BETT 2000 interested in this kind of generic software, from suppliers such as Microsoft, will find that Management Software offers special prices for education).
There are limits, though, to what can be done like this. Says one primary head: "Reading ages are difficult, for example, because a non-educational system probably won't understand that a reading age of seven point eight is actually seven years and eight months."
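The head's point is that reading-age notation is years-and-months, not a decimal: the digits after the point run from 0 to 11. A minimal sketch of handling this correctly (the function names and string format are illustrative assumptions, not any vendor's actual representation):

```python
def reading_age_to_months(age: str) -> int:
    """Convert a reading age like '7.8' (7 years, 8 months) to months.

    The part after the point is a month count (0-11), not a decimal
    fraction -- a generic spreadsheet treating '7.8' as the number 7.8
    years would overstate this child's reading age.
    """
    years, months = age.split(".")
    if not 0 <= int(months) <= 11:
        raise ValueError(f"months part must be 0-11, got {months}")
    return int(years) * 12 + int(months)


def months_to_reading_age(total_months: int) -> str:
    """Format a month count back into the 'Y.M' reading-age notation."""
    return f"{total_months // 12}.{total_months % 12}"
```

So a reading age of "7.8" is 92 months, where a naive decimal reading of 7.8 years would give roughly 94 - exactly the kind of error a non-educational system makes.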
Therefore, most schools are either using or working towards an assessment and recording tool from one of the major school management systems. The most common, because of the ubiquity of the overall management system, is SIMS Assessment Manager from Capita Education Services. RM Management Solutions, Capita's most significant competitor in this market, has its own Assessment Manager module as part of its Integris modular system.

But as independent ICT consultant Lynne Taylor says: "The software is the tool. What matters is the process - and the way the data is used." (See Lynne Taylor's column, p 73.)

At Stoke Park School in Coventry, for example, Taylor has worked with deputy head Steve Allen and colleagues on making the most of SIMS Assessment Manager. Together they have honed the process into three stages. The first is to define rigorously, and limit, the headings under which data is to be collected. Building on this, they have made the system easily accessible so that teachers can enter useful data themselves. Finally, teachers have been given ownership of the data: they can find it, look at it, and use it both for pastoral guidance and to inform teaching.
Part of this has been the development of a school-wide system of simple performance indicators that can be used internally by teachers and in interim termly reports to parents. This gives pupils a letter grade in each subject for achievement, and a numerical grade from one to five for effort, behaviour and homework. Across, say, 13 subjects (a maximum of 65 in each category), a well-behaved pupil who worked moderately hard but did little homework might have a behaviour score of 60, an effort score of 50 and a homework score of 39.
The SIMS software will then present the results in a range of useful ways. It is easy to compare one subject with another: to see, for example, that a child is either generally neglectful of homework or is only failing to do it for one subject. Teachers can see and list the pupils whose scores are deemed unacceptable or have changed up or down since the last report. Allen explains: "For any score less than 40, alarm bells would start to ring."
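The arithmetic behind those headline figures is straightforward to sketch. Assuming, as the article describes, a grade from one to five per subject in each category, the totals and the 40-point "alarm bell" threshold could look like this (the subject list, data shapes and function names are illustrative assumptions, not SIMS's):

```python
SUBJECTS = ["English", "Maths", "Science", "History", "Geography",
            "French", "Art", "Music", "PE", "Drama", "IT", "RE",
            "Technology"]  # 13 subjects, so each category totals at most 65


def totals(grades):
    """Sum each 1-5 grade across all subjects.

    grades maps subject -> {"effort": n, "behaviour": n, "homework": n}.
    Returns the three category totals for one pupil.
    """
    out = {"effort": 0, "behaviour": 0, "homework": 0}
    for marks in grades.values():
        for category in out:
            out[category] += marks[category]
    return out


def alarm_bells(total_scores, threshold=40):
    """Return the categories whose total falls below the threshold --
    the point at which, in Steve Allen's words, alarm bells start to
    ring."""
    return [cat for cat, score in total_scores.items() if score < threshold]
```

A pupil scoring three for homework in every one of the 13 subjects totals 39, and "homework" is flagged - matching the worked example above.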
Senior management, believes Allen, expects action from year and department heads. "We can say that these children are underachieving, and ask what is being done about it." Equally, of course, there is an expectation that high-achieving or improving pupils will be recognised in assemblies.
All this information is available to staff on department computers via the SIMS system. So the history department, for example, could look at (but not change) the performance details of a child across all other subjects and answer the teacher's age-old worry: "Is my class the only one he's messing about in?" Similarly, a head of year or a deputy head fielding a query from a parent can immediately call up a full performance profile of an individual child. "Before we did this," says Allen, "we had to ask the parent to come back later, then send a round robin to all the staff asking for an up-to-date report."
Now the same performance profile, printed straight from the system, becomes a quick report to parents. However, Allen says: "We don't actually call it a report - it's an assessment update."
A parent can see their child's performance at a glance, and since figures from the last report are given in brackets, see progress or deterioration in any of the categories in any subject.
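Showing the current grade with the last report's figure in brackets is a simple formatting job once both sets of figures are in the system. A sketch of one such row (the layout and names are assumed for illustration, not Stoke Park's actual template):

```python
def update_line(subject, current, previous):
    """One row of an assessment update: each current grade followed by
    the figure from the last report in brackets, so a parent can see
    progress or deterioration at a glance."""
    cells = "  ".join(
        f"{category} {current[category]} ({previous[category]})"
        for category in ("effort", "behaviour", "homework")
    )
    return f"{subject:<12}{cells}"
```

A pupil whose history effort grade rose from three to four would show as "effort 4 (3)" on that subject's line.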
The interesting thing is the way Stoke Park teachers used Assessment Manager to create a tool of clarity and simplicity that was accessible and intelligible to staff, pupils and parents. And the Department for Education and Employment commended the school in its Busting Bureaucracy competition earlier in the year.
Teachers experienced in this area emphasise, though, that the mechanical business of handling the software is not the main issue. At least as important are the management decisions. So visitors to BETT looking at aids to assessment and recording might bear the following points in mind, all of which are to do with professional decisions rather than the technicalities of this or that system:
* It is all too easy to collect too much data. If teachers are to "own" the system they have to understand it and be willing to use it.
* Management needs to decide who is to look at data with a view to pinpointing problems and teaching objectives - time has to be found for this. Then teachers have to be led to think about their practice, for if a group is shown to be underachieving, it follows that something different has to happen in the classroom.
* The issues around, say, the under-performance of a group or department have to be addressed. The software might show that all of one teacher's groups are doing less well than other groups, but it will not show the teacher how to tackle this. It will not be enough for the head to say: "You must do better." It is a training issue.
* The whole issue of target setting - for individuals, and for teaching groups - has to be examined in the light of the performance data which is being collected. Major suppliers have been extending the target-setting possibilities of their software - this is something to ask them about at BETT.
* Similarly, the method by which reports to parents are prepared has to be viewed within the context of the way performance data is being collected. As Stoke Park demonstrated, with a little thought, reports can emerge from the system with little extra effort. The saving in time, and consequent improvement in morale compared to the traditional report writing process can be dramatic - again, something to explore with suppliers.
There is also a decision to be made about the way data moves from the teacher's notes into the assessment software, and there are several options. One is for the teacher to enter assessment grades on an Optical Mark Reader sheet, which is "read" by the system - many schools use SIMS for assessment as well as attendance in this way. Another is the teacher's hand-held terminal, linked by radio to a central computer - Bromcom's wNet system works this way, with its "folders" and mini-laptops for registration, communication and assessment data collection kept constantly up to date.
Xtol's keypad response product, the Series 8 Group Response System, which links keypads to a computer by radio, is another option. Long used in management training, it is, as reported in Online in September, being trialled at Cardinal Wiseman School, Greenford, where pupils record multiple-choice answers on the keypads; the responses are transmitted to a computer, and the software marks them to produce instant results. RM is also showing an interest in the system and is seeking teachers' views on its future.

Assessment grades can also be entered on the nearest networked computer, perhaps in the departmental office, which is how Stoke Park School uses Assessment Manager. Or teachers can enter grades on their own laptops and transfer them to the system by disk or other link.
Each data entry system has its supporters. The decision is less to do with technical efficiency than with the management style of the school and the way that staff wish to work.
BECTA Stand: D70 www.becta.org.uk
QCA Stand: D90 Tel: 0171 509 5555
Bromcom Computers Stand: C20 www.bromcom.com
Capita Education Services Stand: D20H34 www.capitaes.co.uk
RM Stand: D50E50 www.rm.com
Xtol Stand: L10 www.xtol.co.uk
Management Software Stand: L15 www.mansoft.co.uk