In the week before the 2001 certificate exams began, senior executives at the Scottish Qualifications Authority were confident that they had the plans and resources in place to ensure their computer system would work the way they required.
At the heart of last year's SQA crisis, when 17,000 students received the wrong results in their school-leaving examinations, lay the handling of the data that had to be fed into the computer system. The inputting of nearly 4.5 million pieces of data and 30,000 different course codes caused chaos.
With the introduction of Higher Still with Higher, Intermediate 1 and 2 exams and innumerable internal assessments, the volume of material to be processed was hugely different from previous years. Data was not being processed and backlogs were building up. In the effort to plug gaps and correct errors, some data was even keyed in twice.
The result was a national scandal. Heads rolled at the SQA and confidence in the Scottish examination system evaporated.
This year teachers, pupils and parents want desperately to have confidence in the system, but it's hard. Almost every week there have been press reports quoting insiders saying the system is in a "shambles", it urgently needs testing, there are problems with the computer interface between the SQA and schools and the SQA is "three months behind schedule". Already the date for exam certificates to be posted has been put back by a week to August 13.
So, why should people believe any reassurances they are given, after last year's broken promises?
Colin Urie, head of the SQA's information technology unit, and Billy MacIntyre, head of data management and interim director of awards, present a convincing picture of being on top of the situation. Brought in to unravel last year's chaos, they would like the mantra for this year to be "No backlogs", and say: "Everything we have received, we've processed."
"If you process it, it's in the system and you don't need to worry about it," says Mr Urie. "If you process it and there's a problem, you have time to fix it. It is absolutely not true that we are three months behind. Judge us from the facts and figures available now."
Mr MacIntyre cites as an example Standard grade, for which there are 470,000 course entries this year. "One of the big influxes of data over the past month is Standard grade estimates and internal assessment mark forms. The receipt date from schools was the end of March. These have been through our data preparation bureau and are now being processed. Within the next few days we'll know where the gaps are."
Fears of a deluge of internal assessment results for Higher Still all coming in by the single deadline of May 31 do not seem to worry them, though they emphasise the importance of schools and colleges getting results in on time. The volume of results will, they say, be more or less the same as the volume of entries, and the entries have all been processed.
"We will get a peak but we have the capacity to deal with it. We have processed everything we have up to now," says Mr MacIntyre. "There are 300,000 entries on the system already processed, out of almost a million. They are coming in on a daily basis."
The key, says Mr Urie, is to be able to simply slot the incoming results into the space already on the system for each candidate. "That should be straightforward, because the box already exists."
The system is, he says, capable of processing 100,000 entries in one batch overnight. "We've tested it with well over 100,000 results and it's worked."
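The "box already exists" model Mr Urie describes can be sketched roughly as follows: each course entry creates a placeholder record when it is processed, and an incoming result is simply slotted into that record, with any result lacking a matching entry flagged as a gap. This is an illustrative sketch only; the record structure and function names are assumptions, not the SQA's actual system.

```python
# Illustrative sketch of slotting results into pre-existing entry "boxes".
# Data model and field names are assumptions for illustration.

entries = {}  # keyed by (candidate_id, course_code)

def register_entry(candidate_id, course_code):
    """Processing an entry creates an empty box awaiting a result."""
    entries[(candidate_id, course_code)] = {"result": None}

def load_results(batch):
    """Slot a batch of (candidate_id, course_code, mark) results into
    their boxes. Returns results with no matching entry: the gaps that
    need fixing before certification."""
    unmatched = []
    for candidate_id, course_code, mark in batch:
        box = entries.get((candidate_id, course_code))
        if box is None:
            unmatched.append((candidate_id, course_code))
        else:
            box["result"] = mark
    return unmatched
```

On this model, processing entries early is what makes result-loading straightforward: the expensive matching work is done before the May 31 peak arrives.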
Both men deny allegations that the system requires urgent testing, principally on the grounds that the computer system worked last year: it was the human inputting of data which failed.
One area which does require testing is the "derived grades", using estimates the SQA receives from exam centres and comparing them with actual results, to identify anomalies where perhaps candidates have had a bad day and, hopefully, pre-empt appeals. "That didn't exist for Higher Still courses last year," says Mr MacIntyre. "Testing plans are in place."
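The derived-grades check described above amounts to comparing each centre's estimate with the candidate's actual result and flagging large discrepancies. A minimal sketch, assuming grades are numeric bands where lower is better; the function name, band encoding and threshold are all illustrative assumptions, not the SQA's procedure.

```python
# Hedged sketch of a derived-grades anomaly check: flag candidates whose
# actual band is markedly worse than the centre's estimate, suggesting a
# possible "bad day". Threshold and band scale are assumptions.

def flag_anomalies(estimates, results, threshold=2):
    """Return candidate IDs whose actual band is at least `threshold`
    bands worse (numerically higher) than the centre's estimate."""
    anomalies = []
    for candidate_id, estimated_band in estimates.items():
        actual_band = results.get(candidate_id)
        if actual_band is None:
            continue  # result not yet received for this candidate
        if actual_band - estimated_band >= threshold:
            anomalies.append(candidate_id)
    return anomalies

estimates = {"C001": 1, "C002": 3, "C003": 2}  # centre-estimated bands
results   = {"C001": 4, "C002": 3, "C003": 2}  # actual exam bands
print(flag_anomalies(estimates, results))
```

In this toy example only C001, estimated at band 1 but awarded band 4, would be flagged for review before any appeal is lodged.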
Another outstanding job is the reformatting of certificates after criticisms last year that they were too complicated and difficult to understand.
Reported problems with the computer interface between the SQA and exam centres are met with puzzlement. "What does it mean?" asks Mr Urie. "It's a nice soundbite, but it doesn't reflect reality. The interface is a simple file format. Phoenix or Seemis (the schools' administration systems) creates a file, that is sent to us and we load it on to our system.
"The proof of the pudding is that with the same file formats as last year we haven't had any serious problems: data got input and we've proved we can process it.
"Ingres is the relational database system we use and we're happy with it. We have no plans to move away from it.
"Seemis and Phoenix use databases under their system: they're just filing systems.
"We chose Ingres because it's been used by the Scottish Examinations Board for years, very successfully. We had people who had built the SEB system."
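The interface Mr Urie describes — a school administration system writes a flat file, which is then loaded into the SQA's database — could look something like the sketch below. The column layout and delimiter are illustrative assumptions; the article does not specify the actual format.

```python
# Hedged sketch of a "simple file format" interface: the sending system
# (e.g. Phoenix or Seemis) writes delimited records, the receiver parses
# and validates them before loading. Field layout is an assumption.
import csv
import io

KNOWN_COURSE_CODES = {"H031", "C056"}  # illustrative subset of ~30,000 codes

def load_file(text):
    """Parse a delimited entries file, separating valid rows from rows
    with unrecognised course codes (which would need correction)."""
    valid, rejected = [], []
    for row in csv.DictReader(io.StringIO(text)):
        if row["course_code"] in KNOWN_COURSE_CODES:
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

sample = "candidate,course_code,mark\nC001,H031,72\nC002,X999,58\n"
valid, rejected = load_file(sample)
print(len(valid), len(rejected))
```

The point of such a design is that the interface itself carries no logic: any system that can write the agreed file format can feed the database, which is why Mr Urie dismisses talk of "interface problems" as a soundbite.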
So no drastic changes have been made to the SQA's computer system this year. What needed to be addressed, says Mr MacIntyre, were data management, certification and confirmation of reports, and that has been done.
In the short to medium term there is the possibility of a web-based system, which would give schools and colleges constant access to their SQA accounts, as with Internet banking. A pilot project, SQA Net, is under way with further education colleges but there are issues of cost and security.
"It's not the priority for now," says Mr MacIntyre. "Three years down the line we might move to web-based technology if it's in our interests and the interests of our customers. Our priority now is to stabilise the system we've got.
"I'm confident that everything we could have done we have done, and we have strong foundations for successful delivery in August."
Mr Urie agrees: "I'm comfortable that we have plans and resources in place to say that the IT components are going to work the way we require them to."