[deleted] t1_j8ezrat wrote

Yeah lol sure. Because we have been so good at controlling technology so far lol

1

BenjaminHamnett t1_j8g28jr wrote

The fear is that the first self-bootstrapping ASI will become godlike and stifle rival ASI projects.

I don’t believe this, but it seems possible, especially if it’s something that grows exponentially with scale at some point. First-mover advantage may not even require sci-fi godlike powers. But this kind of thinking is what this sub is for.

1

[deleted] t1_j8g679v wrote

Turkey is building their own

1

BenjaminHamnett t1_j8gkhgg wrote

What? Every country and corporation is building their own. They will likely succeed somewhat.

The comment you are replying to reflects the common belief on this sub that one AI will be able to edit its own code and enhance itself into something with unpredictable godlike powers. Once that happens, a super AI may not allow rival AIs to surpass or catch up.

Maybe one caused the earthquake?

1

[deleted] t1_j8gknoh wrote

No, I just think that the potential for human abuse is high, pre-singularity.

1