Mr Morton's report, presented publicly for the first time at Monday's parliamentary education committee inquiry, indicates that the authority's problems were much more deep-seated than poor data or information management. The findings, compiled with help from the Scottish Executive, are:
The delivery of certification in 2000 had not been properly "scoped", with some assumptions appearing to have been based on the SQA's past track record.
Planning and preparation were poor. In some cases adequate staffing and other resources were not put in place.
There was some risk identification but only limited risk assessment, and no clear understanding of the true likelihood and consequences of specific failure.
There was no adequate contingency planning.
There was poor project management, including failure to test software fully in advance of use.
Management information was not robust, lacking specification, availability and reliability - and, where it did exist, it was not used effectively or fully acted on.
The SQA was unable to confirm that entry and registration data for pupils being presented was complete and accurate from the start, which had a knock-on effect on support processes such as moderation and appointments.
Centres (schools, colleges, etc) were not adequately engaged in the data collection process and centres were not always advised when problems began to emerge.
There was little evidence of proper checks and balances to prevent errors and omissions from being carried from one stage of the process into the next, which compounded the problems.
In the absence of proper planning and preparation, largely ad hoc solutions were attempted.
The SQA's organisational form has not been responsive enough to change.
Management structures are cumbersome, and practices are inconsistent with providing clear direction and ensuring effective performance.
There was confusion about the different functions of the business units, which led to a lack of coherence, duplication of effort and a degree of rivalry between units.
Accountability was lacking, alongside a tendency to make decisions by committee.
Staff best equipped to make decisions were not empowered to do so and some were clearly over-stretched, particularly in the operations unit (which had responsibility for the conduct of the exams).
Separate offices in Dalkeith and Glasgow compounded communication problems and exacerbated cultural divides inherited from the predecessor bodies.
There was a lack of customer focus and some staff were not sufficiently aware of, nor sure how best to respond to, customer needs.
Training and development opportunities were not always taken up, and there was too much reliance on past practice and individual knowledge, which caused further problems when key staff were unavailable.
Staff concerns about difficulties were not responded to effectively or in time.
There was evidence of concern over bullying.
Performance management did not operate as it should have.
Communication, both externally and internally, was poor.
The SQA board sought assurances, but the absence of effective management information called into question the usefulness of the assurances it was given.
Mr Morton's recommendations are almost self-evident in the light of his findings - better customer relations and communication, better planning for risk and contingencies, simplified procedures, clear management information and upgraded staff skills.
The SQA has also taken more immediate steps to ensure no repeat of this year's problems for the 2001 exam diet. This includes new resources, simplified transmission of data between schools and the SQA, and a communications strategy to satisfy the needs of schools and other customers. "The first priority will be the efficient and effective collection and management of data," Mr Morton said in his submission.
He also plans to restructure the organisation and will present details to the SQA board tomorrow (Saturday). It is now estimated that 2.7 per cent of exam results and 16,748 out of 147,000 candidates were affected by missing or incomplete data.