
mootcat t1_ivez45m wrote

IMHO humanity will not be able to maintain anything close to its current level of control over global mechanisms if we are to have any shot at surviving what is to come.

A major improvement would simply be a single, focused intelligence determining things like resource allocation, controlling weapons of mass destruction, and preventing the abuse of positions of power.

If we carry the same methodologies and power structures into an AGI-assisted future, we will find utter destruction even faster, or a dystopia beyond anything we can imagine.
