Imagine being able to control your computer just by looking at it. Not only could you point and click, but you could run a spreadsheet or go on the internet, and, with software that puts a qwerty keyboard on screen, write text.
All of that is possible right now - the hardware and software exist, and programs are being developed across Europe, gathered together under the title of Cogain: Communication by Gaze Interaction.
At the ACE Advisory Centre in Oxford, which develops technology to help young people with physical and communication difficulties, staff are trying to find ways of tailoring the system for people - children and adults - with specific disabilities.
Although gaze interaction seems like science fiction, the basic idea is simple enough. A light source - actually a safe level of invisible infra-red light - shines into the eye from the computer, and is reflected back by the retina. A camera, also mounted on the computer, takes a high-speed series of pictures of the eye, and keeps track of precisely where the eye is looking.
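In practice, systems like this need a calibration step: the user looks at a few known points on the screen, and the software learns how positions measured in the camera image correspond to screen coordinates. The sketch below is a minimal, hypothetical illustration of that idea - it assumes the tracker already supplies a 2-D offset vector per frame (e.g. between the pupil centre and the infra-red reflection), and fits a simple affine mapping by least squares; real trackers use more sophisticated models.

```python
import numpy as np

def fit_calibration(eye_vectors, screen_points):
    """Fit an affine map from eye-image vectors to screen pixels.

    eye_vectors:   (N, 2) array of per-frame offsets from the camera image
    screen_points: (N, 2) array of the known screen points the user fixated
    Returns a (3, 2) coefficient matrix mapping [vx, vy, 1] -> [sx, sy].
    """
    X = np.hstack([eye_vectors, np.ones((len(eye_vectors), 1))])  # affine term
    coeffs, *_ = np.linalg.lstsq(X, screen_points, rcond=None)
    return coeffs

def estimate_gaze(eye_vector, coeffs):
    """Map one new eye-image vector to an estimated screen position."""
    return np.array([eye_vector[0], eye_vector[1], 1.0]) @ coeffs
```

With four or more calibration points the least-squares fit is well determined, and each subsequent camera frame can be converted to a screen position with a single matrix multiply.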
From that point, it's a matter of devising software that makes use of this eye-tracking ability. At one level, the tracking alone is useful. In education, it's been used to help discern how people with dyslexia scan text on the page.
The step up from that, though, is to apply control technology so that the interaction between the light source, the eye and the camera doesn't just observe, but makes things happen. In effect, the eyes take over from the mouse: sweep your eyes across the screen and the cursor follows. When it's in the right place, instead of clicking a button, you either blink or just wait a second or two, a method that's called "dwell".
It seems straightforward, but it's an enormous challenge for the software developer, because although we may think that we really can gaze steadily at something, or sweep our eyes gently across a scene, in fact the eye is constantly in a state of involuntary jerky motion, in jumps called saccades. You're not aware of them, because the brain seems to switch vision off momentarily during the movement itself. There are also people - sometimes the very ones who need this kind of help - who have conditions that cause even more eye or head movements. The software has to be clever enough to iron all of that out and arrive at a smooth summary of the user's intentions.
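One common way to "iron out" that jitter, sketched here purely as an illustration (the parameter values and the approach are assumptions, not the Cogain software's actual method), is an exponential moving average with a jump threshold: small involuntary movements are damped, while a large, deliberate saccade snaps the cursor to the new position at once.

```python
SMOOTHING = 0.2        # 0..1: lower = smoother cursor, more lag (illustrative)
SACCADE_JUMP_PX = 80   # jumps bigger than this are treated as intentional (illustrative)

class GazeSmoother:
    def __init__(self, alpha=SMOOTHING, jump=SACCADE_JUMP_PX):
        self.alpha = alpha
        self.jump = jump
        self.x = self.y = None

    def update(self, x, y):
        """Feed one raw gaze sample; return the smoothed cursor position."""
        if self.x is None:
            self.x, self.y = x, y                     # first sample: start here
        elif (x - self.x) ** 2 + (y - self.y) ** 2 > self.jump ** 2:
            self.x, self.y = x, y                     # big jump: treat as a deliberate saccade
        else:
            self.x += self.alpha * (x - self.x)       # small motion: damp the jitter
            self.y += self.alpha * (y - self.y)
        return (self.x, self.y)
```

The trade-off is visible in the two constants: heavier smoothing makes the cursor calmer but slower to settle, which matters for users whose conditions add extra involuntary movement.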
The good news is that all of those problems can be overcome: people with cerebral palsy, and the head movements that sometimes go with that condition, have successfully used the software. The task for ACE is to assess potential users and make the system work for them.
Eye control has already enabled young people and adults with virtually no movement except that of their eyes to write, do their work, and communicate across the world by email. It's one of the latest examples of what amounts to a 20-year revolution in ICT support for people with special needs.