[deleted] t1_j49fzwy wrote
Reply to comment by Magicdinmyasshole in Made a sub for brainstorming about ways to address impending psychological fallout from the realization that the singularity or something like it is probably going to happen in our lifetime. by Magicdinmyasshole
You're not taking one big thing into account.
We're not going to turn into some Star Trek type society. One thing ignored by almost all science fiction (because it's hard to write and potentially boring) is intelligence amplification. At some point humanity basically needs to say: hey, Mr. Superintelligent AI, can you give us one of those chips? Or: hey, can you upload our consciousnesses into an inorganic body or a mainframe? And so on.

Certainly, some people will remain as they are now, but the vast majority will be nothing short of gods compared to current humans. There won't be any interest in money, sex, eating, or arguably anything you're interested in now. When you're that smart, your goals are likely entirely unpredictable. Not to mention that you may have one superintelligent-AI-made chip in your brain, but the AI will have trillions or even quadrillions, while running on nuclear fusion or some form of energy not yet conceptualized. So humanity has no place as scientists at that point either, because, highly augmented or not, you're still an amoeba compared to the AI. Perhaps we'll become explorers of some sort, but it won't be like Star Trek because, again, augmented humans will have significantly different goals and interests.
Magicdinmyasshole OP t1_j49ijir wrote
Yup, I'm mostly okay with that future or something like it. I also think this will move very quickly, and it might be nice for people who are fucked sideways by it to have some resources that help them feel better about the absurdity of it all.