
[deleted] t1_jeemjmg wrote

Reply to comment by _JellyFox_ in The Alignment Issue by CMDR_BunBun

[deleted]

1

silver-shiny t1_jeey24l wrote

If you're much smarter than humans, can make unlimited copies of yourself that immediately know everything you know (as in, they don't need to spend 12+ years at school), think much faster than humans, and want something different than humans, why would you let humans control you and your decisions? Why would you let them switch you off (and kill you) anytime they want?

As soon as these things have goals that are different from ours, how do you remain in command of decision-making at every important step? Do we let chimpanzees, creatures much dumber than us, run our world?

And here you may say, "Well, just give them the same goals that we have." The question is how. That's the alignment problem.

2