ArgentStonecutter t1_iuuzmah wrote

The singularity is the result of human society coming under the control of a superhuman intelligence, whether biological, artificial, or a mix of the two. It entails rapid technological advancement, with society's evolution and goals becoming literally incomprehensible to the ordinary humans of today.

The singularity is a post-human era. Unless it is avoided (e.g., by Drexlerian confinement, or if Egan is right that superhuman intelligence isn't a real thing and so no singularity develops), physical extinction is not the worst possibility for humanity. Mere human-level intelligences could be treated as simple computational devices, like traffic-light sensors.

The paper spends some time on the possibility of avoiding the singularity, and all of it is framed in terms of preventing the unlimited development of superintelligence. That is what distinguishes a singularity from mere rapid technological growth.