Weird thoughts about LLMs, part 2
Jun. 8th, 2024 03:29 am
I saw an argument that the Chinese Room may have a consciousness separate from the human within it. The human doesn’t “mean” what they say, but the room means it. Therefore, an LLM might have a consciousness and mean what it says.
I think the point comes through more clearly if, instead of the Chinese Room, you use “No is Yes” as the comparison. The girl says “No” and means “Yes.” It would be absurd to argue that her words create an independent consciousness that means “No.” The word “No” has simply been unmoored from its original meaning.
You can argue that there’s something within an LLM that “thinks.” Presumably, it goes “I want to make this text similar to the text I’ve read.” But the LLM doesn’t mean “No” when it mimics a text that says “No,” because the word “No” has been unmoored.
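(For concreteness: the “wanting” here is roughly the standard next-token training objective. Here’s a minimal sketch of what “make this text similar to the text I’ve read” cashes out to; `model` and `token_ids` are hypothetical placeholders, not any particular system.)

```python
import torch.nn.functional as F

def next_token_loss(model, token_ids):
    # The model is scored only on how well it predicts the training text,
    # one token at a time. Nothing in the objective refers to what any
    # token "means".
    logits = model(token_ids[:, :-1])   # predictions for each position
    targets = token_ids[:, 1:]          # the text as it was actually written
    return F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        targets.reshape(-1),
    )

# "No" gets emitted whenever it makes the continuation look like the
# training text, not because the system intends a refusal.
```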