After months of preparation, the School Information Dashboard – occasionally, but probably incorrectly, known as “Insight for the BGE” (broad general education) – has finally been released to the world at large. Did you miss it? If so, don’t worry. I fear you are not alone.
Despite the nervous anticipation of headteachers and directors of education across the country – worried that the data would be scrutinised in fine detail by parents and the press – the dashboard seems barely to have made its presence known.
I’m sure it’s not just me who is hugely surprised at the apparent lack of publicity it has generated, because there is a wealth of information contained within it that is important for anyone with an interest in education. Some of it is good. Some of it, I would contend, is spurious.
Maybe one of the problems is that it is a bit of a pain to find in the first place. Eventually, after looking through the ParentZone area on the Education Scotland website, I found it via a link to a Tableau Public data-sharing site (bit.ly/SIDashboard). Even there, the multiple banners at the top and the big block of text beneath were seemingly designed to confuse rather than enlighten.
However, once on to the site and looking at the secondary school characteristics, I was pleasantly surprised by the user-friendliness of the front page. There is instant information available on matters such as attendance, school condition and hours of PE per week, plus, in lovely round diagrams, “data” on the Curriculum for Excellence (CfE) levels reached by pupils in S3 (more on this later).
Further down the page, there are characteristics of the pupils and catchment areas, including Scottish Index of Multiple Deprivation (SIMD) quintiles, English as an additional language (EAL) percentages and additional support needs (ASN) levels – although the bar charts are slightly misleading. The 0-10 per cent category, being so big and blocky, may make people think they are seeing patterns that they are not. There are also nice charts showing attendance (authorised/unauthorised) by year groups and graphs of pupil numbers. So far, so good.
Scrolling through the tabs at the top of the page, users can also see how pupils are faring next to a virtual comparator. Although this is a concept most commonly seen on the senior phase Insight dashboard (where it has come in for criticism for taking a set of pupils out of the context of their schools), there is some value in it being used here.
Having had a sneak preview of the virtual comparators for my own school, there was no doubt they were very similar in terms of socio-economic profile (even if not, unfortunately for us, in terms of results).
So, if it all appears to be such a lovely package, why were my fellow headteachers, our bosses and I getting ourselves worked up, meeting councillors for pre-emptive strikes about the bad news and preparing press statements explaining why black was white and that our schools were actually doing pretty well without the need for such public displays, thank you very much?
Here lies the crux of the matter. Going back to the CfE levels on the opening dashboard, a quick comparison with other schools across my authority (and some random ones I picked across the country) shows some severe discrepancies in the data. My own school, in an area of multiple deprivation with very high levels of EAL and ASN, shows high levels of literacy and numeracy – much higher, in fact, than the school in the leafy suburb with professional parents down the road.
We’re good, but we’re not that good. Either that or we are really good to start with, then make a right balls-up from S3 onwards, as pupils decline from the lofty heights of level 3 before being presented for Higher maths and English.
Too easy to misinterpret
I know, because we have sat down and talked about such things, that my teachers are confident in their assessment of their pupils' abilities, but it doesn't appear to be the same across our authority. Or maybe it's because we did our level assessments in May and other schools did them in February, or in December the previous year. Certainly, despite much talk of moderation, the activities that we have engaged with do not seem to have worked – and much less so across the country as a whole.
We have no idea how teachers in schools 50 miles north, south, east and west of us have arrived at their own analysis. The picture is even worse for some of our primaries. In secondary schools, we are maybe more attuned to the need for robust data, but it definitely appears that some of our primary colleagues are a more cautious lot. At least one of our feeder schools looks, at first sight, to be going backwards – from 70 per cent achieving the expected level in P1 to 50 per cent in P4 and 20 per cent in P7. I should be thanking them for the miraculous gains it appears their children make as soon as they hit secondary.
On a national scale, things are no better: those who know little about interpreting statistics are unfortunately being invited to watch an apparent decline in standards across primary school, and highly variable progression in secondary, depending on where you are.
Of course, these things are not what the information shows, but it is easy to misinterpret the presentation.
And, as with most things statistical, you put rubbish in, you get rubbish out. Hopefully, over time and with refinement, it may all become more accurate. But with the apparent lack of implementation of a national moderation strategy, we may well be arguing the toss for years to come.
The writer is a headteacher in Scotland