In theory there's no limit to the amount of assessment data teachers can enter into a school's MIS. They could pile everything in - mental arithmetic tests, spelling checks, a specimen exam question, a page of maths homework. The system could aggregate and summarise it all - and you'd think the sheer quantity of data would end up producing a wonderfully accurate measure of a child's ability.
The key word there, though, is "quantity". You'd certainly have that. What you probably would not have is quality. A moment's thought and you realise that many of these bits of assessment are likely to be sketchy affairs with no pretence to the kind of accuracy required, for example, of a GCSE or a Sats question. To put a whole lot of hastily arrived-at scores into the system would be to produce an aggregate whose impressive size and apparent importance were in inverse proportion to its real value.
That's why the thoughtful school manager enters a limited number of very carefully arrived-at assessment scores. Ninestiles in Birmingham has its half-termly national curriculum levels, while over at Shirelands language college, in Sandwell, the approach is very similar: assessment entries are based on six key assessed pieces of work in each subject each year.
"There's a tendency to throw all sorts of gibberish into an assessment module on the naive view that once it's in there it's OK," says the headteacher, Mike Grundy. "We wouldn't dream of doing that in a folder or mark book."
The philosophy holds good at Stoke Park school in Coventry. Simon Smith, Stoke Park's business manager, says: "We keep it to four manageable indicators per term in each subject and the marking policy across the school is linked to them."
The problem for school managers, of course, is that, on the face of it, inaccurate and useless information on a computer screen looks just the same as information that's accurate and valuable. And once it's dressed up as a coloured graph, with the school crest hanging over it, the deception becomes insidious.
The lesson from successful schools is that if you want to be confident of the information coming out of the MIS, you have to be painstaking about the quality of the data going in. That means constructing assessment tasks with care and a sound understanding of assessment principles - which in turn means that many schools aim for no more than a handful of assessment data entries per pupil per year.
And if you fancy you've been hearing that message for 20 years in the form of "garbage in, garbage out", then you're absolutely right.