Nearly 10,000 entries were left out of the published statistics, even though the students had entered for the exams this summer.
This is because the students, or their schools, effectively turned down the chance to be counted. They were therefore not included in the official "failing" figures. Only those actually awarded a U grade were counted as having failed.
If the missing entries had been counted as fails, the published national pass rate would have fallen from 97 to 96 per cent, and the number of failing grades would have risen by a third.
Statistics from the Joint Council for Qualifications, an umbrella body for the UK's five exam boards, show that students were entered for a total of 837,052 A-levels this summer. Yet only 827,737 grades were published, a drop of 9,315.
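The arithmetic behind those claims can be checked against the published figures. The sketch below assumes the rounded 97 per cent pass rate applies to the 827,737 published grades; since the published rate is rounded, the results are approximate.

```python
# Back-of-the-envelope check of the A-level figures reported above.
# Assumption: the rounded 97 per cent pass rate applies to the
# 827,737 published grades.
entered = 837_052   # total A-level entries this summer (JCQ figures)
graded = 827_737    # grades actually published

missing = entered - graded
print(missing)      # 9,315 entries with no published grade

pass_rate = 0.97                    # published pass rate (rounded)
passes = round(graded * pass_rate)  # approximate number of passing grades
fails = graded - passes             # grades counted as fails (U)

# Recompute the pass rate with the missing entries counted as fails:
# the denominator becomes all entries, not just published grades.
print(f"{passes / entered:.1%}")    # just under 96 per cent

# Proportional growth in the fail count if missing entries were fails
print(f"{missing / fails:.1%}")     # over a third more failing grades
```

On these rounded inputs the adjusted pass rate comes out at roughly 96 per cent and the fail count grows by somewhat more than a third, consistent with the article's figures.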
In one subject, "other modern languages" - languages other than French, German, Spanish, Welsh or Irish - more than 5 per cent of those entered for the exam received no published grade.
In further maths, the disparity stood at 4 per cent; in science subjects other than biology, chemistry and physics it was 3 per cent; and in law, it was 2 per cent.
It is already widely known among teachers that students who drop out of an A-level course after disappointing results at AS level can help to boost A-level pass rates, as they are not counted as entering the A-level exam.
However, the new statistics show that even some of those who complete the course and sit their final exams are not counted in the official data. These figures are based entirely on success rates among those students who choose to receive a grade at the end of their course.
Dr Jim Sinclair, director of the joint council, put forward two reasons for the difference between the entry figures and the final statistics.
First, a student may have failed to turn up for all their exams, forfeiting the chance to be awarded a grade. Second, a student may decide not to accept, or "cash in", their grade because they did not want it on their record. Hence it would not feature in the pass/fail rate.
Professor Alan Smithers, of Buckingham University's Centre for Education and Employment Research, said: "It does look as if the results do not include those candidates who are failing themselves. The exam boards ought to be explicit about the basis of the statistics, and point out what is and is not being included."
Dr Sinclair said that the entry data figures were only "snapshot" statistics, collected every year in early June.
In reality, the figures changed regularly up until two weeks before the A-level results were published in August, as schools and colleges amended the details of their entries.
This snapshot was taken solely to enable comparisons with entry figures from previous years at the same point in the year. It should not be taken as a definitive set of data, he added.