Can you see clearly?
There's a mythical beast called the Data Driven School. Before you roll your eyes, take a minute to reflect that all schools are data driven in the sense that everyone working there responds to a torrent of information. Bad behaviour on the school bus - "Put someone on duty." Years of poor French results - "Get a retirement deal for old Smithy."
But we're talking here about rows of numbers in a database. The trick is to get the right numbers. On their own, numbers do nothing. Many schools found that when registration software was introduced, attendance figures actually got worse - because absences were more accurately recorded. You couldn't have a more striking example of the obvious principle that whatever improves school performance, it isn't computers.
So, why does every school now have a suite of software that stores a range of data about the performance of its pupils? Call up Sharon on the computer and you can look at her attendance, her Sats results, her detention record, ethnic grouping - even her shoe size, if you want.
"So what?" asks the hard-nosed teacher, the one who sits at the back of in-service sessions with folded arms and averted gaze. As we all well know, what improves school performance is effective work by teachers. So what can the management information system do to earn its keep and help teachers to be effective?
The quick answer is that, aside from saving some admin time, the system is able to collect, store and intelligibly present the sort of pupil performance data that will show strengths and weaknesses, enabling you to plan what to do next.
David Miliband, the schools minister, has had some cogent things to say about this. Talking of data and assessment, and their function as tools for school improvement, he observed: "To fulfil all of these requirements, the data upon which we base our accountability mechanisms must reflect our core educational purposes. They must be seen to be objective. And they must allow for clear and consistent comparison of performance between pupils and between institutions."
What does that mean in practice? At one level it's simple: "Let's look at the maths results. Down? We'll have a maths drive next term. Behaviour slipping? Let's screw things down a bit tighter."
But this isn't what Mr Miliband meant. He pointed to something much more specific: "Data that helps teachers develop themselves, that helps school leaders promote high performance, that helps parents support their children's progress." In other words, what we want is data in detail, and we want it from the management information system. But if it's going to play its part in school improvement, three conditions have to apply:
* First it has to contain the right sort of detail - and that means "right for us, in our school, right now".
* Second, it has to be up to date so that an accurate snapshot becomes a possibility. Without that, we'll miss trends and early warnings. A school does not progress or decline at a steady rate or on a uniform front. In reality, bits of it get better and bits get worse at different rates and at different times. The trick is to catch both trends early.
* And third, the data has to be easy to access, easy to read and easy to understand. And that means specifically easy for any user: head, governor, class teacher, parent, or student.
So, if English results are down, we in the English department might want to know precisely which pupils are failing to grasp what: that Darren in Year 7, say, is having trouble with the punctuation of direct speech. We may also wish to know which pupils, in which groups, are grasping the concept unusually quickly, as there are things to be learned there. And if it's behaviour, we might want to know which pupils in which teaching groups are giving exactly what kind of grief to which teachers - and to what extent the same pupils are meek lambs in other groups.
Can we expect teachers to provide the data for that kind of analysis? The answer is that they collect it already, all the time. Darren's English teacher knows exactly where Darren's learning gaps lie, and each teacher can tell you which children indulge in the kind of wind-breaking, pen-losing nonsense that directly affects everyone's learning. Each teacher, too, knows that some colleagues manage things better than they do, and often wonders about their methods.
What is rare is to record any of this stored-in-the-head data in an agreed form that can be added to a pupil's profile on the management information system. Those schools where staff do this end up with a dynamic database of information that feeds into planning and policy at all levels - from deciding what to do in tomorrow's lesson to planning the 16-plus curriculum five years hence. This, folks, is the Data Driven School.
The basic building block is the pupil profile. It holds as much information, in as much detail, as the school wants and needs. We all know it's just as bad to have too much data as too little. What's certain is that it has to be information that's useful within the context of the particular school.
Put all of the pupil profiles together, and it's then possible to look at patterns and trends in whatever way you want: group to group, week to week.
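To make "group to group, week to week" concrete, here is a minimal illustrative sketch in Python. The pupil names, field names and scores are all invented for the example - they are not the schema of any real management information system - but they show how a pile of individual profile records can be rolled up into a week-by-week trend for one teaching group:

```python
# Illustrative only: pupils, groups, weeks and scores are made up.
from collections import defaultdict

# Each record is one pupil's assessment result for one week.
profiles = [
    {"pupil": "Darren", "group": "7E", "week": 1, "score": 42},
    {"pupil": "Darren", "group": "7E", "week": 2, "score": 47},
    {"pupil": "Aisha",  "group": "7E", "week": 1, "score": 61},
    {"pupil": "Aisha",  "group": "7E", "week": 2, "score": 58},
]

def weekly_group_average(records, group):
    """Average score per week for one teaching group."""
    by_week = defaultdict(list)
    for r in records:
        if r["group"] == group:
            by_week[r["week"]].append(r["score"])
    # One average per week, in week order - the raw material of a trend line.
    return {week: sum(scores) / len(scores)
            for week, scores in sorted(by_week.items())}

print(weekly_group_average(profiles, "7E"))  # → {1: 51.5, 2: 52.5}
```

The same few lines, pointed at a different group or a different field, give the group-to-group comparison; the point is that once the profiles are stored in a common shape, every slice of the data is cheap to produce.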
Managers are able to make decisions about curriculum planning, about set groupings, about focused intervention, about staff development. Classroom teachers can read the progress of individuals and groups, and plan with some precision the work that will bring them all on.
For some, this will seem a tall order: lots of data collected by long-suffering teachers, then stored and never used. Ensuring that what you have is not just data but useful information is a function of leadership and management.
Among the most useful data is that produced by assessment. Yet more management decisions to make. Teachers assess all the time. They continuously weigh up the impact of what they're providing, reading everything that's coming back - pupils' questions and answers, body language and behaviour. A step further on, they set tasks to provide a quantified assessment: a quick mental maths test, a timed essay or a reading task. Each provides data that informs future teaching - it's called Assessment for Learning, and Mr Miliband had things to say about that, too.
What does this mean for the teacher? Is every bit of your assessment data going to go into the management information system to inform the pupil profile? Or do we assume that much of it is of the moment and that it's enough to make summary assessments every so often? The supplier of your system can't dictate that - though they'll give good advice based on wide experience.
In the end, you must decide, and the system on offer must respond.
Similarly, it's school managers who have to ensure the quality of whatever assessment data goes into the system. As far as is humanly possible a pupil's grade has to mean that this is the best judgment possible, against clear and agreed criteria. If we don't get that bit right, we've lost the plot. The information system won't add quality to the data that's put into it. What it will do is enshrine our mistakes and misjudgments, multiplying them together, summarising them and presenting them with seductive clarity so that head, teachers and parents end up discussing differences in performance that actually don't exist.
The information system has to be exactly what it says on the tin - a tool that helps managers of learning to gather and sort out detailed information. And if the system is integrated with a virtual learning environment - the resources for teaching and learning - then it will suggest and produce the appropriate learning material, too. But that, for the moment, is another story.