By JAY THOMPSON [Kenyon Review] – Since, Searle says, computers perform only syntactic, switch-based operations, they’re the equivalent of John S. in the box, using inputs to arrange lexical content they don’t “understand,” but can sequence.
Google cross-references your search string and image tags to bring you that picture of a sneezing panda; the chatbot weighs your idioms and mood and aims straight for the middle in its prefab reply. But the brain, Searle stresses, does more than enact formal computations on symbols; thought requires semantics, the mysterious contentful something that Searle mostly characterizes by what it isn’t.
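Searle’s point is easy to make concrete. A toy program like the sketch below (a minimal illustration only; the rule table and the `reply` function are invented for this example, not anyone’s actual chatbot) can produce plausible replies by pure pattern-matching, shuffling symbols it has no grasp of:

```python
# A toy "Chinese Room": replies come from purely syntactic
# pattern-matching over a fixed rule table. Nothing here models
# meaning; the program only sequences symbols it can't "understand."

RULES = {
    "hello": "Hello! How are you today?",
    "how are you": "I'm doing well, thanks for asking.",
    "panda": "Here is a picture of a sneezing panda.",
}

def reply(message: str) -> str:
    """Return the canned response whose trigger appears in the input."""
    text = message.lower()
    for trigger, response in RULES.items():
        if trigger in text:        # syntax: a substring match, nothing more
            return response
    return "Tell me more."         # prefab fallback, aimed at the middle

if __name__ == "__main__":
    print(reply("Hello there!"))           # -> "Hello! How are you today?"
    print(reply("Show me a panda video"))  # -> panda reply, no comprehension
```

The program passes a crude conversational test the same way the man in the box does: by following rules for manipulating tokens whose content never enters into it.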
Some philosophers call the Chinese Room argument a fallacy, akin to dismissing the theory of electromagnetism because you see no glow when you shake a fridge magnet. Others simplify Searle’s scenario: would, say, a 3G network precisely simulating the neural activity of a brain in pain itself feel pain?
The question goes further, though: what the hell is understanding?
To understand understanding (lord), we have to wrestle with the idea of intentionality. Intentionality means the ability to be about or toward something, and, Searle claims, it’s unique to the brain. Language as it’s spoken or written means only what the listener or reader understands of it; its intentionality is said to be only derived. But our fancies, propositions, and longings themselves aren’t just empty vases. They feel to us like they have intentional content.
Continued at the Kenyon Review Blog.