Stephen's Web ~ You Are Not a Parrot

Stephen Downes

Knowledge, Learning, Community

I'm disappointed by this article, even though it is described as fierce. It profiles arguments from linguist Emily M. Bender against the idea that AI can do more than merely mirror human interactions. It begins with the 'Octopus' story, which is in essence a version of the well-known Chinese Room argument. The idea is that we cannot know the 'meaning' of a sentence if we only know the 'form' of the sentence; all the mimicry in the world won't get at what's behind the words. The 'Parrot' example, also from Bender, makes the same point. The problem, to my mind, is that such arguments (and there are many in this article) beg the very question they purport to prove. If we take seriously the idea that human cognition takes place in a neural net, as opposed to some magical place where meanings reside, then we have to take seriously the idea that human cognition is essentially stochastic: a set of sensations and reactions based on the probability that a current set of perceptions is the same as some previously experienced set of perceptions. Complaining that today's computers can't quite catch up to humans doesn't change that point.
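To make the 'stochastic' idea concrete, here is a toy sketch (my illustration, not anything from the article or from Bender): a tiny bigram model that learns nothing about meaning, only the observed probability that one word follows another, and then generates text by sampling from those frequencies. The corpus string and function names are invented for the example.

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Record, for each word, every word observed to follow it.
    Repeats in the list encode the observed frequencies."""
    words = text.split()
    table = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        table[prev].append(nxt)
    return table

def parrot(table, start, length=8, seed=0):
    """Generate by repeatedly sampling a successor in proportion to
    how often it followed the current word in the training text."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        options = table.get(out[-1])
        if not options:
            break  # dead end: this word was never followed by anything
        out.append(rng.choice(options))
    return " ".join(out)

corpus = ("the octopus heard the sailors talk and "
          "the octopus talked like the sailors")
table = train_bigrams(corpus)
print(parrot(table, "the"))
```

The model mimics form without any access to meaning, which is exactly the property the 'stochastic parrot' critique targets; the open question the post raises is whether human prediction-from-prior-experience is different in kind or only in scale.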


Stephen Downes, Casselman, Canada

Copyright 2024
Last Updated: Apr 20, 2024 03:47 a.m.

Creative Commons License.