> but I think overall society grew substantially less gullible over time.
Absolutely not. All that happened is most people became aware that "Nigerians offering you money are scammers." But they still fall for other get-rich-quick schemes so long as they diverge a little from that known pattern, and they'll confidently walk into the scam despite being warned, saying, "It's not a scam, dumbass. It's not like that Nigerian prince stuff." If anything, people seem to be becoming more confident that they're in on some secret knowledge and everyone else is the one being scammed.
I'm inclined to disagree, just because Photoshop has had a measurable effect on making the population skeptical of photos, which were once practically treated as the gold standard of evidence. It's still easy to find people who have fallen for photoshopped images, but it's also easy to find people expressing doubts and insisting they can "tell by the pixels". Sometimes even legitimate photos get accused of being photoshopped, which seems healthy.
The other side of this is that AI tools are being treated like magic, to the point that people are denying well-documented events happened at all, such as the shooting of Charlie Kirk - conspiracies abound!
Also, bizarrely, a subsection of the population seems to be really into blatantly AI-generated images - just hop onto Facebook and see for yourself. I wonder if it has something to do with whatever monkey-brain thing makes people download apps with a thumbnail of a guy shouting, or watch videos with a thumbnail of a face with its mouth open, since AI-generated photos seem very centered on a single face making a strange expression.
People with a profound initial bias will, in general, believe anything that supports that bias and reject anything that challenges it, in both cases without any real consideration or thought whatsoever. So I don't think examples of individuals being "misled" to extremes by e.g. AI-generated images or video are entirely accurate. Rather, they were already at those extremes and will just eat up anything that appeals to them.
To take a less politically charged example, imagine there is fake content 'proving' that the Moon landing was faked. Is that going to meaningfully sway people who don't have a strong opinion one way or the other? Probably not, and certainly not in meaningful numbers. And in general I think the truth does come out on most things. When people find they have been misled, particularly by somebody they thought they could trust, it tends to result in a major rubber-banding in the opposite direction.
I call it the antibody effect. My favorite example is clickbait headlines like, "Five things you MUST do if you're doing this thing. You'd never guess #3!" It used to be everywhere and now it's nowhere.
AI is starting to show this effect - people stay away from em-dashes, and there's that yellowish tinge and that telltale composition people now avoid in art. Some of this is a loss, but we can probably live without it.
Instead of blocking channels, try going through your watch history and deleting anything you've watched in the past that's similar to those. YouTube leans heavily on your history for recommendations, so if it's recommending those, it's because you've watched related stuff.
My YouTube feed never recommends any of that garbage.