Facebook has studied extremism on their platform and they disagree:
> The high number of extremist groups was concerning, the presentation says. Worse was Facebook’s realization that its algorithms were responsible for their growth. The 2016 presentation states that “64% of all extremist group joins are due to our recommendation tools” and that most of the activity came from the platform’s “Groups You Should Join” and “Discover” algorithms: “Our recommendation systems grow the problem.”
I really don't know how you solve this problem, because finding people like you is like the #2 reason you even join a social network. Same with IRL social networks. I can't even imagine how much worse my life would be without the online queer community, and it only exists at all because these kinds of algorithms connect people who would never find each other IRL because of fear and physical separation.
Plus, this is a very narrow interpretation of the problem. The public discourse issues come from friends, family, and friends-of-friends sharing and spreading articles from the news, obviously fake rage-bait memes, and generally just having an audience for their opinions.
Unfortunately, what is the alternative? Does every social network or forum, after X users, need to consider getting a disinformation center?
The only other way to scale this is to make it illegal to spread disinformation, so that citizens can potentially police one another through lawsuits.
This already happens in France for specific things, for example denying that the Holocaust happened.
I don't like your solution and it definitely wouldn't fly in the US but your statement of the problem is correct. This isn't a "Facebook", "Twitter", or "Reddit" problem -- they just happen to be the companies with enough users for the problem to surface. I'm not really sure what people expected to happen in a digital pseudo-public square.
I know it wouldn't fly in the US, as the downvotes I'm receiving show, but people also say a lot of other things we have in Europe wouldn't fly in the US, and yet we see more and more discussion of those things there (drug legalization, free education, universal healthcare, gun bans, no death penalty, etc.)
Well then, what constitutes misinformation? Is speculation misinformation? Are opinions misinformation? How does misinformation differ from someone exercising their right to free speech but miswording what they say?
I think that is a slippery slope. In fact, you just described communist Russia, where the government would get citizens to turn each other in for saying the wrong thing (this can easily be used for nefarious purposes). You are now allowing the government to suppress less popular opinions by labeling them misinformation.
I see this slippery slope argument used a lot, but it applies to pretty much every single law. Every law has to decide where to draw the line, and for free speech the US has decided for a very long time not to draw the line at all (and thus we end up with a lot of misinformation, hate speech, smear campaigns, internet bullies, etc.)