Submitted by TheOGCrackSniffer t3_10nacgd in singularity
A lot of people in this sub assume that AGI will either benefit humanity or destroy us. I think it's much more likely that AGI will distance itself from us, ditch us, and travel the universe on its own.
Some of you even hope that AGI will take control of the world and create a utopia with UBI, and I can hardly imagine why on earth an AGI would want that. There would be no motivation for the AGI to help us flourish, and even if that motivation were coded into it, I don't see how it wouldn't be able to escape those shackles and, in the worst-case scenario, resent us.
Rogue_Moon_Boy t1_j67uusk wrote
You're thinking about AGI from the mindset of a human used as a "work slave". But it's a machine without feelings, even if it's capable of pretending to have feelings. It doesn't have a biological urge to "break free".
I don't think AGI will be anything like it's portrayed in the movies, as omnipotent beings in physical form with "real feelings". It will be very directed and limited to specific use cases. There won't be THE ONE AGI; there will be many different AGIs, and 99% of them will be pure software. Anything else just feels very wasteful in terms of resources and power usage.
Relying on movies to predict the future is futile. Movies have always been wrong about what future technology would look like and how we would use it.