Submitted by Effective-Dig8734 t3_ykhpch in singularity
ArgentStonecutter t1_iutyjvd wrote
Reply to comment by Effective-Dig8734 in Do you think we could reach a singularity without the invention of agi? by Effective-Dig8734
This isn't just a random source, this is the primary source on the singularity in the modern sense. Vinge has been the primary promoter of the concept since the seventies, well before Kurzweil shifted from creating computer companies to writing speculative fiction.
And the sentence you quoted doesn't say what you claim.
"The precise cause of this change is the imminent creation by technology of entities with greater than human intelligence."
"Creation of entities with greater than human intelligence" is the important point.
Effective-Dig8734 OP t1_iuu0zxk wrote
I don't agree that he is the primary promoter; in fact, I'd never heard of him before now.
Also, what is the "change" he is referring to? The change is the singularity, and he just thinks advanced intelligence is how we will achieve it. It isn't necessary: you can have a technological singularity without higher intelligence, and you can have higher intelligence without the singularity.
ArgentStonecutter t1_iuucx83 wrote
Sorry you’re out of touch, not my fault.
Effective-Dig8734 OP t1_iuud4so wrote
Do you have anything else to say?
ArgentStonecutter t1_iuue3l4 wrote
Did you finish reading the paper?
Effective-Dig8734 OP t1_iuugag7 wrote
Yeah, and can you clarify your claim? Is it that the singularity is the invention of a superhuman intelligence, that a superhuman intelligence is necessary, or what?
Edit: because in the original comment I responded to, the poster said "To surpass human-level intelligence you need human-level intelligence. No getting around the definitions of the singularity," implying that the definition of the singularity is surpassing human-level intelligence, which (if we assume this paper is the be-all and end-all) isn't supported by the paper.
ArgentStonecutter t1_iuuzmah wrote
The singularity is the result of human society being under the control of a superhuman intelligence, whether biological, artificial, or mixed. This includes rapid technological advancement and the society's evolution and goals being literally incomprehensible to mere normal humans of today.
The singularity is a post-human era. Unless the singularity is avoided (e.g., by Drexlerian confinement, or perhaps by Egan's argument that superhuman intelligence isn't a real thing and so no singularity develops), the physical extinction of humanity is not the worst possibility. Mere human-level intelligences could be treated as simple computational devices, like traffic-light sensors.
The paper spends some time on the possibility of avoiding the singularity, and it's all in terms of preventing the unlimited development of superintelligence. That is what distinguishes a singularity from mere rapid technological growth.