Submitted by apple_achia t3_ynmu55 in singularity
mootcat t1_ivez45m wrote
Reply to comment by apple_achia in In the face on the Anthropocene by apple_achia
IMHO humanity will not be able to maintain anything close to its current levels of control over global mechanisms if we are to have any shot at surviving what is to come.
A major improvement would simply be a single, focused intelligence determining things like resource allocation, controlling weapons of mass destruction, and preventing the abuse of positions of power.
If we carry the same methodologies and power structures into an AGI-assisted future, we will reach utter destruction even faster, or a dystopia beyond anything we can imagine.