ElvinRath t1_j4dw91r wrote
Reply to comment by SoylentRox in Does anyone else get the feeling that, once true AGI is achieved, most people will act like it was the unsurprising and inevitable outcome that they expected? by oddlyspecificnumber7
Oh, yeah, sorry.
I was answering the OP and used "original research" because he mentioned it, but I was really thinking "independent" (a term I use later in my post), meaning "without human intervention" (or at least no more intervention than "Hey, go research this").

No human intervention is the requirement for the concept of the singularity (well... or augmented humans that can comprehend in seconds what currently takes years, but that's probably not a human anymore... :D )
SoylentRox t1_j4edj94 wrote
I'm not sure you wouldn't get a singularity even with a small amount of human involvement, say, 1000 times as much research as we do now with the same number of people working on it.