Submitted by Beautiful-Cancel6235 t3_11k1uat in singularity
We are all getting whiplash from the breakneck speed of AI development and adoption/integration. This is likely fueled by an AI arms race, which, coupled with our government's general inability to effectively regulate anything, explains this crazy rate of progress.
What could slow this down? Someone mentioned the physical limit on how many transistors fit on a chip, but I don't understand that, and it seems like they're making super chips anyway.
What about alignment? Is there a scenario where, as AI becomes more and more powerful, scientists find they can't control or align it, and its power can only be used for limited (and hopefully noble) projects like solving climate change?
DungeonsAndDradis t1_jb5ek3g wrote
If history is any guide, this will only accelerate (towards extinction, I think).
To answer your question, the only thing that would slow down AI research is a large-scale, civilization-affecting catastrophe. A massive meteor strike. A deadly plague. Nuclear war. A CME (coronal mass ejection) that takes us back to the 1800s.