Submitted by oldmanhero t3_zrsc3x in singularity
DungeonsAndDradis t1_j14f1je wrote
I think, day to day, progress is middling. But that's just because we're actively living it.
Maybe 20 years in the future, when we look back at 2017 to 2022, we'll realize we were at the "big bang" of AI.
So right now we're getting strapped into the AI rocket and running through the launch checklists. Any moment now (when looking back at this time, from a time in the near future) the AI rocket will launch.
And then it's all over for Human civilization as we know it. Whether that is a good "civilization is not recognizable to someone from 1980" or a bad "civilization is not recognizable to someone from 1980" is a coin toss at this point.
JVM_ t1_j14q4cg wrote
Good scenario: AI allows humans to focus on enjoyment of life and leisure activities while AI handles the information and physical requirements of life.
Bad scenario: Someone or some group commandeers AI to enslave, abuse, harass, or eliminate(?) vast swathes of humanity to preserve the resources for themselves. What if a Middle Eastern country no longer needs imported workers and deports them? What do we do if 5, 10, 15, 20 percent of knowledge-based workers - globally - are no longer required (or we only need 1-2 percent to do the job that's done by 20% today)?
Given humanity's past, the bad scenario seems more likely.
savedposts456 t1_j18wf6e wrote
That’s a pretty good breakdown. I’m hoping the elites implement a UBI to prevent societal collapse. It would be a lot easier than managing a self-sufficient bunker and defending against endless waves of kill bots.
banuk_sickness_eater t1_j196v7v wrote
>I’m hoping the elites
Let me stop you right there.
Inevitable_Snow_8240 t1_j19xop8 wrote
I just wanna live forever.