Submitted by Effective-Dig8734 t3_ykhpch in singularity
tedd321 t1_iuuazvh wrote
No, I think we need one. There are theoretically infinitely many ways to get there. It looks like right now we're starting with many narrow AIs. Eventually there'll be some super tool that combines the best of them.
AGI is a tool that will spawn many narrow AIs. A platform where anyone can train a single neural network to do any task is a great way to get to the singularity: every human can help direct and create more AIs.
Even human-level intelligence is usually focused on a specific task. But tasks are often made up of many subtasks, so generalizing is an important skill.
But what exactly is general intelligence? Do humans even have it? If by AGI you mean a human-like intelligence, then yes, we need one (or many): an intelligence that is not inside a normal human, that we can put to work without any consequences, and that eventually becomes better than a human.