Closing the gap? Scotland lacks the data to know

Evidence on pupil performance is ‘remarkably thin’ and the new standardised tests won't help, MSPs have been warned

Emma Seith

The Scottish government will not be able to say if the attainment gap is closing because the education system is “relatively data poor”, MSPs have heard.

The SNP’s key priority is to boost the school performance of disadvantaged pupils and to begin to close the gap between them and their more affluent peers.

However, the Scottish Parliament’s Education and Skills Committee heard today from Lindsay Paterson, professor of educational policy at the University of Edinburgh, that surveys used to measure national educational progress, such as the Scottish Survey of Literacy and Numeracy (SSLN) and its predecessor, had been scrapped.

He warned that the only survey left was the Pisa (Programme for International Student Assessment) rankings of 15-year-olds and this meant it was "impossible, at present, to know reliably if we are closing the attainment gap".

Keir Bloomer, convener of the Royal Society of Edinburgh’s education committee, who was also giving evidence to the committee’s inquiry into the new Scottish National Standardised Assessments (SNSAs), echoed this, saying that the Scottish education system remained “relatively data poor”, especially in the years before the senior phase of secondary.

He added that, in a few years, when MSPs are expressing their views on whether or not the attainment gap is closing, they will be basing those opinions on "remarkably thin evidence".

Mr Bloomer argued that an enhanced version of the SSLN – which was a sample survey looking at performance in literacy and numeracy in alternate years, published for the last time in 2017 – should be reinstated.

Comparisons must be carefully drawn

Both Professor Paterson and Mr Bloomer told the committee the original purpose of the SNSAs had been to get better information at a national level on how pupils were performing – particularly in primary.

However, that purpose had changed over time and now the tests were seen as a diagnostic tool to support teachers' judgements about where pupils were in their learning.

That was fine, said Professor Paterson, but pupils sitting the tests at different times invalidated the data which, he warned, could not be aggregated to draw conclusions about national performance – or the performance of local authorities and schools.

He pointed out that a P1 child would develop hugely over the course of their first year in school, so their performance on a test would be quite different depending on whether it took place at the start or the end of the school year.

The organisation behind the SNSA – ACER International UK – made a similar warning in its first report on the tests last year.

It said there was “clear evidence” that the time of year children sit the assessments can result in a “marked increase” in their performance – and any comparisons between pupils or groups of pupils must be carefully drawn.

Emma Seith is a reporter for Tes Scotland

Find me on Twitter @Emma_Seith
