guyonahorse t1_ja1ftau wrote
Reply to comment by InevitableAd5222 in Why the development of artificial general intelligence could be the most dangerous new arms race since nuclear weapons by jamesj
Well, ChatGPT's training is pretty simple. It's trained on how accurately it can predict the next word in a training document, so it learns to imitate the text it was trained on. All the training data counts as "correct", which amusingly means it picks up bad traits too, since it imitates bad text just as faithfully as good text. Also amusing is the question of qualia: does the AI actually have emotions? Is it saying something because it's angry, or just because it was trained to imitate angry text in a similar context?
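To make the "imitate the next word" idea concrete, here's a toy sketch (not ChatGPT's actual architecture, just an illustration of the principle): a bigram count model that predicts whichever word most often followed the current one in its training text. If the training text contains bad phrasing, the model reproduces it just as readily.

```python
# Toy illustration of next-token training: count which word follows
# which, then predict the most frequent follower. Real LLMs use neural
# networks over huge corpora, but the objective is the same in spirit.
from collections import Counter, defaultdict

def train_bigram(tokens):
    counts = defaultdict(Counter)
    for cur, nxt in zip(tokens, tokens[1:]):
        counts[cur][nxt] += 1
    return counts

def predict_next(counts, token):
    # Return the most frequent follower of `token` in the training text.
    return counts[token].most_common(1)[0][0]

corpus = "the cat sat on the cat mat".split()
model = train_bigram(corpus)
print(predict_next(model, "the"))  # prints "cat"
```

The model has no notion of truth or intent; it only reflects the statistics of its training data, which is the point the comment above is making.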
But yeah, "general intelligence" is super vague. I don't think we want an AI capable of getting angry or depressed, but those traits evolved naturally in animals because they benefit survival. Pretty much every dystopian AI movie is based on the AI deciding that to survive it has to kill all humans...