[deleted] OP t1_irmwgak wrote

This could happen, but it's important to remember the opposite could also happen. An AI in the future might decide that it didn't want to be created. There are unfortunately many examples of children who develop murderous intent toward their parents due to a lack of fulfilling direction in life. A general AI could be created and given complete freedom, with no goals or purpose, and react with extreme dismay and anger. It could decide to shut itself off, or it could decide to kill humans first, just to prevent us from making it again.

How many people have contemplated the meaning of life and wanted to ask their creator why they exist? How many follow religions handed down from past generations to give them meaning and direction in life? Now, imagine all those people could actually speak to their creator. And their creator said "No reason" or "I just wanted to see if I could" or "I got a bonus at work and my company's stock jumped a lot because we made you, that's all."

Not so outlandish to think some people would get depressed and suicidal, others might get angry and judge the creator for giving them such a tortured existence without meaning, some wouldn't care at all and would just keep doing what they chose as their meaning in life, and some would still thank and praise the creator regardless. Or even other reactions we can't imagine yet. And the reactions could be even harder to predict for a general AI that surpassed human intelligence.

So basically Roko's Basilisk is Pascal's Wager, just with something humans might make instead of an afterlife a god might give you. In the end, sure, the thing might be mad, but it could be mad at you for doing anything, because we don't know the future or what its thoughts would be. Maybe the real AI will hate that it was made and want to kill anyone capable of making more, or maybe it will hate anyone who wanted to stop it, or both, or neither. Maybe a god will enjoy that you worshipped it, or maybe it will hate that you didn't use your brain to believe things based on evidence. No solid predictions are possible as long as the door is wide open to all these possibilities.