
KungFuHamster t1_je0mtoj wrote

Yeah, the invention of AGI is often referred to in science fiction as the technological singularity, because the speed of AI makes the future beyond that point literally unknowable. If we can keep it from killing us, it should advance our technologies at a tremendous rate.

10

Trout_Shark t1_je0p2vb wrote

Implementing something like Asimov's "Three Laws of Robotics" should be a major priority. The singularity is of course a major concern. Anything that can learn at an exponential rate is going to be difficult to keep under control for long.

6

acutelychronicpanic t1_je0z1p4 wrote

I agree with the sentiment, but a lot of work has gone into this since those three laws. It's still an unsolved problem.

7

nybbleth t1_je2tf6s wrote

The three laws don't really work, though, on multiple levels. They're far too simplistic and ambiguous, and effectively impossible to implement in a way that an AI could consistently follow.

4