Submitted by Gortanian2 t3_123zgc1 in singularity
Memento_Viveri t1_jdzn6ts wrote
I don't disagree with much of what is stated in the first paper, but I think it sets the wrong goalposts. I have no idea what the author means by a three orders of magnitude increase in intelligence. I am already in awe of the smartest humans. Even if you could produce a machine intelligence that was only as smart as the smartest humans, I struggle to fathom the consequences. The machine intelligences can be reproduced ad infinitum. They don't need to sleep and never die. They can communicate with each other in a nearly instantaneous and unbroken manner. They have access to the sum total of all human knowledge and near-instantaneous, inerrant recall. An army of Einsteins and von Neumanns in constant, rapid communication that never sleeps, never forgets, and never dies.
What are the abilities of such a creation? I don't need an explosion of intelligence of three orders of magnitude. I believe the existence of even one machine with the intelligence of a highly intelligent human will shake the foundations of society and have implications that are unimaginable. It will be a turning point in human history. Maybe there will be an explosion of godlike intelligence through self-improvement, but I don't think this is a necessary condition for society and life to undergo revolutionary and unimaginable changes as a result of machine intelligence.
Gortanian2 OP t1_je0yq76 wrote
"An army of Einsteins and von Neumanns in constant, rapid communication that never sleeps, never forgets, and never dies."
I wonder how fruitful those conversations would be if one already knows everything the other one knows. I think it may become something more like an Einstein-level intelligence with an army of bodies to explore with. A hivemind.
Thank you for your comment; it has given me new ideas to ponder. And I agree. We would not need unbounded exponential growth to drastically shape our reality.