Submitted by razorbeamz t3_z11qz3 in singularity
FomalhautCalliclea t1_ixaqme7 wrote
Reply to comment by Drunken_F00l in When they make AGI, how long will they be able to keep it a secret? by razorbeamz
Paradoxically, I think a materialistic realist AGI would provoke more turmoil and disbelief than a metaphysical idealist neo-Buddhist one: many people who already hold that opinion would feel coddled and comforted.
Even worse, whatever answer the AGI produced, it could be a trapping move, even outside a malevolent case: maybe offering pseudo-spiritual output is the best way of convincing people to act in a materialistic and rational way. As another redditor said below, the AGI would know the most efficient way to communicate. Basically, the alignment problem all over again.
The thing is, that type of thought has already crossed the minds of many politicians and clergymen: Machiavelli and La Boétie themselves argued, in the 16th century, that religion was a precious tool for making people obey.
What fascinates me about discussions of AGI is how they tend to generate conversations about topics that already exist in politics, sociology, collective psychology, anthropology, etc. But with robots.
TheLastSamurai t1_ixarxnz wrote
That sounds horrible