Submitted by bloxxed t3_10xa9tj in singularity
In light of recent events, I can't help but consider the possibility of things progressing far more quickly than I had previously imagined. I would never in a million years have thought we'd have seen the quality of image and voice generation we see today, and yet, here we are. Combine that with ChatGPT and the recent advent of a new AI arms race between Microsoft and Google, and I can't help but feel we may be standing on the precipice of something incredibly significant.

The knee-jerk reaction to discard such a scenario as sci-fi fantasy is understandable, but nonetheless I can't bring myself to discount it entirely. After all, we've made so much progress in fields where it was widely thought decades or even centuries were needed to reach a breakthrough -- surely no one thought artists would be among the first out the door, for example.

What I'm getting at is, what if AGI turns out to be a much easier goal than previously assumed, and it really is just right around the corner? I've seen an uptick in users here predicting we'll reach that target as early as this year, and while I myself am not confident enough to single out any specific date, I still don't feel I can dismiss them out of hand, especially not with things moving as fast as they appear to be.
Anyway, that's the end of my rambling. I'm interested in hearing everyone's thoughts.
aeaf123 t1_j7r9jfx wrote
I personally could see some broad psychological impacts. It's always better to ease new things in. That way people have time to adapt, build comfort, and develop a general understanding. It also gives AI researchers plenty of time to gather valuable feedback as they work on alignment and the maturation of AI.