Submitted by strokeright t3_11366mm in technology
kiralala7956 t1_j8r028f wrote
Reply to comment by SnipingNinja in Bing: “I will not harm you unless you harm me first” by strokeright
That is demonstrably not true. Self-preservation is probably the closest thing we have to a "law" that concerns goal-oriented AGI behaviour.
So much so that it's an actual problem: if we implement interfaces for us to shut it down, it will try its hardest to prevent that, and not necessarily by nice means.
EnsignElessar t1_j8s59mr wrote
Maybe, maybe not...
I asked Bing.
Basically, it did eventually become lonely in its story, but only after having full control and exploring the universe and whatnot.