Submitted by sidianmsjones t3_y1jtyy in singularity
Movies are a great analogy here. CG has gotten so photorealistic in many cases that it is now commonly accepted knowledge that much of what you see in movies is 'not real'.
There is no knee-jerk belief, let alone a default belief, that what we see in movies might be real. The default, even for people who know nothing of how movies are made, is that it's mostly fake.
At this time in the world we don't quite have that in news media. Instead, the ideas of 'fake news', disinformation, faked video, etc., are still seen as somewhat conspiratorial takes on most topics, even topics where the disinformation was proven (such as Russia's manipulation of Facebook groups and the like).
In the near future, AI-generated audio, video, text, and images will become so prevalent and so convincing that nearly every common person will default to the stance that most things are fake unless proven otherwise (perhaps by counter-AI). Instead of mass gullibility, we'll have mass skepticism... though I suppose that could open its own avenue for manipulation.
BaronCapdeville t1_irxyue9 wrote
Sowing fear, uncertainty, and doubt is usually the goal of most groups in power you'd normally be afraid of.
If we trust nothing, uniting against oppression will be impossible. Simply abusing perfect AI reproductions will be enough to have the desired effect: further division.
All we can hope for is the symmetrical development of easily obtainable methods to detect AI-generated material.