Submitted by dracount t3_zwo5ey in singularity
Dickenmouf t1_j1xeosa wrote
Reply to comment by Calm_Bonus_6464 in Concerns about the near future and the current gatekeepers of AI by dracount
I wonder if AI might be the answer to the Fermi paradox. If AGI is inevitable, and likely to improve exponentially once it arrives, then maybe most civilizations that create it don’t last long afterward. Whether through self-destruction, annihilation by the AI, or absorption/enlightenment, the result is the end of the progenitor species. And a highly advanced AI might have no interest in seeking contact with less intelligent lifeforms.