Report on Scottish National Standardised Assessments: key findings

MSPs have today revealed a host of concerns about Scotland’s new system of national testing

Concerns – first highlighted by Tes Scotland in February – have been raised that a five-year “data gap” was created when the Scottish government ended the annual Scottish Survey of Literacy and Numeracy, the last report for which was published in 2017.

That issue is highlighted in an inquiry report, published today by the Scottish Parliament’s Education and Skills Committee, on the Scottish National Standardised Assessments (SNSAs). The SNSAs started in 2017-18, prompting concerns that there would be a five-year gap before the first “national performance data” from them appeared.

But what other key findings are there in today’s report?

Scottish national assessments: the report

  • “Valuable data” that was produced by the SSLN “is no longer available” in SNSAs.
  • The 2016 SSLN results survey “showed declining performance in literacy on many of the [SSLN] measures”.
  • The committee recommends a review that would consider whether to reinstate the SSLN.
  • On the SNSAs, “the evidence from certain witnesses to this inquiry reflected that the Scottish government announced policies quickly without meaningful collaboration with certain key stakeholders or establishing an in-depth evidence base for elements of these policies”.
  • The committee finds that a “shift in policy intention has contributed to a lack of clarity about who the SNSAs were developed to provide information for, policymakers or teachers”.
  • More local authorities than the government anticipated are continuing to run their own standardised assessments alongside the SNSAs.
  • The committee has become “concerned that there appears to be an inconsistency from, and between, organisations at a strategic level as to the purpose of the SNSAs” – it cites, for example, an Education Scotland submission that states “the assessments are not summative” and then discusses data being used to assess the performance of a particular school.
  • The “desire to prevent any misconception that the assessments are part of a high-stakes accountability measure…has proved unhelpful in providing a clear understanding of the assessments, indeed the word ‘confusion’ was often cited to the committee”.
  • The committee wants clarity on when information on SNSA results can be shared with parents, and who can make this decision, as education secretary John Swinney suggested that “decisions can rest with the individual teacher, whereas some evidence suggests a prescriptive approach is being taken at local authority level in some areas”.
  • Certain government decisions have “contributed to low-medium stakes assessments becoming ‘politically high stakes’”, including “the decision not to publicise the new policy to parents” and “confusion over the purpose of the assessments”.
  • The government, local authorities and schools should prepare for the release, under freedom-of-information legislation, of data on the performance of schools or local authorities based on the SNSA, which “could increase any feeling of anxiety amongst teachers and parents and lead to the unintended consequence of the assessments becoming high stakes”.
  • One “potential weakness” of the SNSAs is if “a sizeable number of teachers” decides that the assessments are “not telling them anything new”, and yet “a substantial amount of teaching time is being used to administer assessments and assess the output”.
  • The government should assess the workload implications of the SNSAs for teachers and other school staff.
  • The committee supports the view of Professor Andy Hargreaves, one of the government’s International Council of Education Advisers, that feedback from some teachers suggests the SNSAs “are not adding value to their judgements” and that the government should “reconsider the content of the assessments based on this feedback”.
  • The committee is concerned about the evidence from Acer – the company helping the government to deliver the SNSAs – that there was “limited engagement with current teachers during the development of SNSAs”; the government should plan “direct engagement” with teachers that would help “prevent any mismatch between the benefits of the SNSAs in theory and the practical experience of classroom teachers”.
  • Education directors’ body ADES should provide “tangible examples of where SNSA data has contributed to improvement”.
  • The government and local authorities' body Cosla should work on an anonymous sample survey to estimate how many “relevant classroom teachers” have not yet done “SNSA-specific training”, and to gauge “whether the teachers feel equipped to run the assessments”.
  • The SNSAs are “reliant on access to good-quality ICT”, so an analysis of the capacity of schools’ ICT to accommodate the SNSAs should have been carried out before the SNSAs were introduced in 2017-18.
