THE Loebner prize of a gold medal and $100,000 has been on offer in America for the past 10 years to the person who can design a computer whose conversation is indistinguishable from a human being's. So far the money's safe.
Alexander Fiske Harrison was one of the judges this year, and his report on one of the best programs, Alice, reveals that so far the "conversation" is still just a bag of grammatical tricks. He quotes a typical exchange:
Me: "So you know the rules of grammar?"
Alice: "Interesting deduction. I haven't heard of the rules of grammar. Would you like to know more?"
Me: "Can you invent?"
Alice: "Let me think. What's your sign?"
I think we humans can congratulate ourselves on being pretty safe. Or at least we could if the above exchange didn't resemble many of my conversations with learners. I quote today's exchange in a Communication 4 class.
Me: "So you see, Sandra, you haven't identified the audience closely enough. Can you be specific?"
Sandra: "My legs are really sore."
I think Alice could have come up with something more convincing from her bag of grammatical tricks.
I once spent the summer teaching my budgie to speak sentences and phrases that could be linked in any order and still make sense. It was my first foray into artificial intelligence and the result was quite scary - a budgie who could chat on the telephone and run up huge bills. But, I kept reminding myself when I got involved in a heart-to-heart with him, it wasn't a meaningful conversation, just a bag of tricks.
The trouble is, budgerigar or bundle of chips, the bag of tricks is getting better all the time. In an experiment on doctor-patient interaction, a computer was programmed to pick up key words from the patient and reflect them back in an interested, caring fashion. Nine out of 10 patients preferred the computer.
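The keyword-and-reflect trick described above is the same one behind Joseph Weizenbaum's famous ELIZA program. A minimal sketch might look like this - the keywords and canned replies here are invented for illustration, not drawn from the actual experiment:

```python
# A toy keyword-reflection chatbot in the ELIZA tradition: scan the
# patient's words for a known keyword and mirror it back in a caring
# template. Keywords and replies below are illustrative only.

REFLECTIONS = {
    "tired": "You say you feel tired. How long have you felt this way?",
    "worried": "What is it that worries you most?",
    "pain": "I'm sorry to hear you're in pain. Can you tell me more?",
}

DEFAULT_REPLY = "Please, go on. I'm listening."

def respond(utterance: str) -> str:
    """Return a reflective reply based on the first keyword found."""
    lowered = utterance.lower()
    for keyword, reply in REFLECTIONS.items():
        if keyword in lowered:
            return reply
    return DEFAULT_REPLY

print(respond("I've been so worried about my test results"))
```

No understanding anywhere, just pattern-matching - yet, as the experiment suggests, a patient on the receiving end of such dutiful reflection may well prefer it to a distracted human.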
Paradoxically, as we try to turn machines into humans, we seem bent on turning humans into machines. In skills training, there is always the danger of teaching just a bag of tricks. I remember trying to reassure a particularly doubtful lad that he could master the hateful restricted response questions that constitute such a large part of Communication 3. "It's 99 per cent technique and 1 per cent talent," I told him. His shoulders sagged. "That lets me out, then," he said.
As budgets tighten, there is a temptation to save on what are seen as the soft skills. If you're at college to learn about computers, cars or carpentry, then why bother with communication skills, enterprise skills, job-seeking skills, or time management?
If you're a course leader you know why they're important: they nurture the parts that make us different from machines, that make us humans irreplaceable - and employable.
You also know that these soft skills are often seen as a luxury you can't afford. Yet if our young people are going to be more employable than machines, we're going to have to promote these skills. If we treat learners as systems, we throw away everything that is important to our learners and to our employers.
There are already warning signs. The New Deal programme is undermined because, say employers, the young people they take on can't turn up on time, won't dress appropriately and can't communicate. Soft skills such as these have always underpinned courses taught in further education - either through specific units and modules designed to enhance these social skills, or as part of the culture of attending college. We must ensure that decisions about funding do not turn these soft skills into soft targets.
Our learners are not systems. They are not machines. Thank goodness that $100,000 will be accruing interest for a while yet. But as we debate the merits of soft skills and the best way to turn young people into well-rounded, employable human beings, bear in mind that during the Loebner tests several humans were mistaken for computers.
Dr Carol Gow is a lecturer in media communication at Dundee College.