I suspect my reluctant comprehension is similar to that of teachers who gradually realise that their passion for a subject is not always shared by the students they teach. The difference is that, while students have little excuse not to engage with their studies, schools and teachers have understandable reasons to be wary of assessment.
The hard truth is that a lot of assessment used in our schools is misconceived, misapplied and frankly redundant. In too many cases, it seeks not so much to inform as to conform to processes whose purposes remain obscure to the teachers who have to apply it.
We shouldn’t feign surprise, therefore, when the whole rigmarole of inputting, analysing, recording and monitoring data is regularly cited by teachers as among the biggest factors contributing to their workload. (Isn’t it telling how often assessment is referred to in these workload surveys as “admin”?)
For several years now, periodic Department for Education surveys have found that data issues are blamed by teachers for adding to their burdens. Despite the good intentions of government and schools to tackle the problem, it remains a live concern.
In our own YouGov survey of teachers published earlier this month, a third of teachers said that tackling their data burdens would have the biggest impact on their workload.
But our survey found something else: teachers overwhelmingly recognise data’s value. They know that, deployed effectively, it can help them and their students. Six in 10 agreed that data could help them do their jobs more effectively. Fewer than a fifth (18 per cent) disagreed. Teachers are not anti-data; they are opposed to the bad practices that too often accompany its use.
Part of the problem is, of course, the use of assessment as an accountability tool for schools. That is unlikely to change any time soon – but it’s worth bearing in mind that any assessment designed to determine a student’s performance can have unintended consequences if used principally to assess an entire school.
Is it any wonder in these circumstances that the pressure to hit school targets tends to take precedence over the needs of individual children or the better understanding of their teachers?
The seductive appeal of systems
Then there is what may be termed the seductive appeal of systems and processes. This is the tendency, common in many organisations, to cleave to procedures that may have been developed, with the best of intentions, to address a particular issue but which have over time become an end in themselves rather than the means to reach a solution. The process becomes the point. Its original purpose is often lost – and perhaps the original problem long since solved – but endless tweaking and constant demands for time and resources help disguise the fact that few can remember why it was developed in the first place.
Internal assessments that aren’t benchmarked to any national metric – of the type Ofsted has recently warned about – are a classic of the genre. Parent reports, littered with confusing numbers or terms, unattached to any objective measure of attainment, are another. They can give the illusion of rapid improvement, but they usually flatter to deceive.
Finally, there are bad practices, which have accumulated on assessment over the years like barnacles on a ship. They rarely do anything for a school’s performance, but they can be a huge drag on a teacher’s time. Multiple assessments, subjective assessments, unexplained assessments and assessments that fail to yield usable classroom information are far too common.
There is such a thing as good data
The sad fact is that there are too many such practices. We in the assessment business have been remiss in not challenging them more frequently. They need to be called into question because, as the teachers in our survey rightly recognise, there is such a thing as good data, and schools and students can benefit enormously from its use.
Harnessed effectively, data can inform classroom practice, improve pupil performance, shine a spotlight on struggling students and reduce teacher workload by focusing the effort where it’s most needed.
There is another casualty, too. Bad assessment practices have not only restricted the effective uses of data. They have also, all too frequently, obscured its potential.
Some schools I know are using assessment innovatively, to map more closely the interplay of pupil attitudes and academic performance, for instance, or to streamline the way they communicate with and channel parents’ engagement. But, if a school and its teachers are bogged down in data, they will hardly have the time – or the inclination – to think imaginatively about what they can do with it.
So, yes, I understand why assessment and data are seen as the villains. And yes, let us remember that getting it right, especially for schools in challenging circumstances, is rarely rapid – it takes time, trial and error, and patience.
But let’s also remember that effective data, properly applied, should be seen as an essential part of the solution, not part of the problem.
Greg Watson is chief executive of GL Assessment