One is 'put up or shut up' for appeals of moderator decisions.
That is, anyone who wishes to appeal must also consent to having all of their activity on the platform that is relevant to the decision revealed publicly.
It could certainly prevent later accusations of secretiveness or arbitrariness, and it would probably make users think harder in marginal cases before submitting an appeal.
This is something that occurs on Twitch streams sometimes. While it can be educational for users to see why they were banned, some appeals are just attention seeking. Occasionally, though, it exposes the banned user's, or worse, a victim's, personal information (e.g. mental health issues, age, location), which can lead to both users being targeted and to bad behaviour by the audience. For example, if Bob is banned for bad behaviour towards Alice (threats, doxxing), making that public doesn't just impact Bob; it could also put Alice at risk.
This also used to be relatively popular in the early days of League of Legends, with people requesting a "Lyte Smite". Players would make inflammatory posts on the forums claiming they were banned wrongly, and Lyte would come in with the chat logs, sometimes escalating to a perma-ban. I always admired this system and thought it could be improved.
There's also a lot of drama around Lyte in his personal life, should you choose to go looking into that.
But under the put-up-or-shut-up model those users would be left alone with their pride, because everybody else would see that user's mistakes and abandon them. So the shame doesn't have to be effective for the individual; it just has to convince the majority that the user is in the wrong.
Right. To put it another way, this "put up or shut up" system, in my mind, isn't even really there to convince the person who got moderated that they were in the wrong. It's to convince the rest of the community that the moderation decision was unbiased and correct.
These news articles about "platform X censors people with political views Y" are about generating mass outrage from a comparatively small number of moderation decisions. While sure, it would be good for the people who are targeted by those moderation decisions to realize "yeah, ok, you're right, I was being a butthole", I think it's much more important to try to show the reactionary angry mob that things are aboveboard.
The most high profile, and controversial, "moderation" decisions made by large platforms recently have generally been for obvious, and very public, reasons.
It is expensive to do, because you have to ensure the content being made public doesn't dox or hurt someone other than the poster. But I think you could add two things to the recipe:

1. Real user validation, so the banned user can't easily make another account. Obviously not easy and perhaps not even possible, but essential.
2. Increased stakes: protest a short ban, and if you lose, you get an even longer ban.
I've never understood the idea that PMs on a platform must be held purely private by the platform even in cases where:
* There's some moderation dispute that involves the PMs
* At least one of the parties involved consents to releasing the PMs
The latter is the critical bit, to me. When you send someone a chat message, or an email, there's obviously nothing actually stopping them from sharing the content of the message with others if they feel like it, either legally or technically. If an aggrieved party wants to share a PM, everyone knows they can do so -- the only question mark is whether they faked it.
To me the answer here seems obvious: allow users to mark a PM/thread as publicly visible. This doesn't make it more public than it otherwise could be, it just lets other people verify the authenticity, that they're not making shit up.
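To make that concrete, here's a minimal sketch (purely hypothetical names and data model, not any real platform's API) of what a per-thread "publicly visible" flag gated on participant consent might look like:

```python
from dataclasses import dataclass, field


@dataclass
class Message:
    sender: str
    body: str


@dataclass
class Thread:
    participants: set[str]
    messages: list[Message] = field(default_factory=list)
    # Participants who have opted in to making this thread publicly visible.
    consented: set[str] = field(default_factory=set)

    def mark_public(self, user: str) -> None:
        """Record a participant's consent to expose the thread."""
        if user not in self.participants:
            raise PermissionError(f"{user} is not a participant in this thread")
        self.consented.add(user)

    @property
    def is_public(self) -> bool:
        # Policy choice sketched here: one participant's consent is enough,
        # since they could already leak the content anyway; the flag only
        # adds platform-verified authenticity.
        return bool(self.consented)


# Example: Alice opts in, so the thread can be shown as verified-authentic.
thread = Thread(participants={"alice", "bob"})
thread.messages.append(Message("bob", "..."))
thread.mark_public("alice")
assert thread.is_public
```

Whether one participant's consent should suffice, or whether both are needed, is exactly the kind of policy knob a platform would have to decide; the point is only that the platform, not a screenshot, vouches for authenticity.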