When the FE Choices website - which contains the latest performance data for colleges and training providers - came in for strong criticism following last month's launch, FE minister John Hayes was quick to defend the government's "steps to increase openness and transparency in public services".
But TES has learned that significant errors in the data have since been quietly corrected without the approval - or, indeed, the knowledge - of either the minister or senior civil servants at the Skills Funding Agency (SFA), prompting fears of a "cover up".
A source at the SFA, who personally worked on the data, said that there had been "a clear case of maladministration" within the Data Service, the agency responsible for publishing FE data, which was commissioned to carry out the work. "This will impact data for every provider," the source said.
TES understands that a number of errors came to light within 24 hours of the website going live, but no statement was issued to alert users to the inaccuracy of the data. More than 60 figures have since been amended.
Because the FE Choices data are classed as official statistics, the source has alleged that the actions were in breach of the UK Statistics Authority's code of practice, which states that agencies should "correct errors discovered in statistical reports, and alert stakeholders, promptly".
After TES alerted the SFA about the issue, a notice was posted on the website on Tuesday morning to notify users of the problems with the data. The agency has now launched an investigation.
In a letter sent to Mr Hayes, the SFA source wrote: "It is clear that the Data Service has been delinquent in its commitments under the (code of practice). No stakeholder has been informed (not even the project sponsor)."
The source added: "When the data was corrected, there was no notice placed to say that a change had been made. This, to me, shows no regard for its users and is a clear case of a 'cover up'.
"Government websites bring with them integrity, honesty and accountability. The actions of this executive department have undermined these key values with a clear case of maladministration."
The original set of data includes more than 60 figures in the "learner satisfaction" category that have since been altered. These relate to the national average scores, summarising providers' performance in areas such as the quality of teaching on offer, support available for learners and "how good or bad" the provider is.
Just over half of the amendments saw the marks decrease. As a result, some website users would have been given a misleading perspective of how colleges and providers had performed in relation to their rivals. For instance, the original data stated that the lowest mark given to a provider by learners on level 1 courses for "teaching/training" was 4 out of 10; this has since been changed to just 1 out of 10.
Among adult learners at all levels, the lowest-performing institution for "listening to learners' views" was initially said to have been rated 7 out of 10; this was later amended to 3. The average college performance in the category was also marked down, from 8.6 to 8.1.
"The original data showed that many independent training providers performed well in comparative terms across all areas," a spokesman for the Association of Employment and Learning Providers (AELP) said. "They will want to use the data to market themselves to prospective employers and learners and, therefore, they will be concerned if its accuracy is being questioned. We hope that any identified issues are ironed out before the next set of data is released."
AELP's concerns are shared by the Association of Colleges (AoC). "The FE Choices data has been made available to help students, parents and others make better-informed comparisons of further education and skills providers," said AoC deputy chief executive Lesley Davies. "However, to be useful to these groups, the information does need to be accurate and fair.
"We would hope that the agencies involved address any questions that have been raised about the data in order to reinforce our collective confidence in its accuracy and utility."
The SFA statement published on FE Choices on Tuesday finally confirmed that there had been a problem. "It has been brought to our attention that some figures for the learner satisfaction indicator have changed since the original release on 26 January," it said. "We are investigating this issue and will provide more information in due course."
The whistleblower, who was on a temporary contract at the SFA, was asked to leave the agency on Friday. The data errors have caused major concern among SFA executives, who are worried that learners and providers could lose faith in FE Choices as a result of the damaging allegations. The promised investigation into the episode should go some way towards restoring the sector's confidence.
What has changed?
Figures that have been corrected on the FE Choices website relate to the national average statistics for the "learner satisfaction" categories, which are:
- help in the first few weeks;
- respect from staff;
- advice on what to do next;
- listening to learners' views;
- acting on learners' views;
- how good or bad is this organisation?
Original headline: SFA under fire over claims of a 'cover up' at FE Choices