Shinoobie t1_j3553l2 wrote

It's a kind of special pleading when we say calculators don't 'think' of answers. Something about it makes us uncomfortable, so we create a linguistic distance by talking about it differently, just like when people say 'human beings and animals' as if we aren't animals. We just want the distinction between what it's doing and what we're doing to be wider than the one that really exists. That kind of distancing is doing a lot of work in the comments here.

ChatGPT absolutely has sensors and inputs, and it absolutely 'reflects' on things and knows about itself. Saying otherwise is literally like saying dogs don't have intelligence or a self because they can't answer the question "Are you a dog?" A persistent sense of self may not be present in planarians, but they are objectively alive.

I'm not saying it is sentient or alive by any stretch of the imagination, but it absolutely has more of the puzzle put together than people seem to be giving it credit for.