>The point is that we don't understand consciousness in people,
>so we can't build it into machines.
Yes.
At the moment, I am happy to sit on whatever side of whatever fence I am sitting on, and say that the acquisition of a language system, through being in the environment of language users, requires whatever complexities of brain are needed to achieve what we, for lack of a better term, call consciousness.
All of which means, not that we can say of a 'conscious' machine 'it can talk', but that 'it learned to talk.'
We can programme machines to talk, but so far, and precisely for the reason Robin states above, we can't make a machine that we can put into any environment such that it will end up talking in any amount of time.
Anyway, that's my test, and yes, it's a personal one. It is _not_ a Turing test, which, incidentally, I do not consider a valid test of consciousness, also for the reason above.
Wade T. Smith morbius@channel1.com | "There ain't nothin' you
wade_smith@harvard.edu              | shouldn't do to a god."