Hacker News

He frames this as a behavior problem, not a content problem. The claim is that your objective as a moderator should be to get rid of users or behaviors that are bad for your platform, in the sense that they may drive users away or make them less happy. And that if you do that, you supposedly end up with a fundamentally robust and apolitical approach to moderation. He then proceeds to blame others for misunderstanding this model when the outcomes appear politicized.

I think there is a gaping flaw in this reasoning. Sometimes, what drives your users away or makes them less happy is challenging the cultural dogma of a particular community, and at that point, the utilitarian argument breaks down. If you're on Reddit, go to /r/communism and post a good-faith critique of communism... or go to /r/gunsarecool and ask a pro-gun-tinged question about self-defense. You will get banned without any warning. But that ban passes the test outlined by the OP: the community does not want to talk about it precisely because it would anger and frustrate people, and they have no way of telling you apart from dozens of concern trolls who show up every week. So they proactively suppress dissent because they can predict the ultimate outcome. They're not wrong.

And that happens everywhere; Twitter has scientific-sounding and seemingly objective moderation criteria, but they don't lead to politically uniform outcomes.

Once you move past the basics - getting rid of patently malicious / inauthentic engagement - moderation becomes politics. There's no point in pretending otherwise. And if you run a platform like Twitter, you will be asked to do that kind of moderation - by your advertisers, by your users, by your employees.




> Challenging the cultural dogma [doesn't work]

That is a byproduct of Reddit specifically. On '90s-style forums, this kind of discussion happens just fine because it ends up being confined to a few threads. On Reddit, all community members must interact in the threads posted in the last day or two. After two days those threads are gone and all previous discussion is effectively lost. So maybe this can be fixed by having subreddits sort topics by continuing engagement rather than just by age and upvotes.

A good feature would be for Reddit moderators to be able to set the desired newness for their subreddit. /r/aww could strive for one or two days of newness (today's status quo), but /r/communism could have one year of newness. That way the concerned people and concern trolls would be relegated to yearly threads full of good-faith critiques of communism and good-faith responses, and everyone else could read the highly upvoted discussion. Everything else could fall in between. /r/woodworking, which is now just people posting pictures of their creations, could split: set the newness to four months and be full of useful advice; set the newness for /woodworking_pics to two days to experience the subreddit as it is now. I feel like that would solve a lot of issues.
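One way to sketch this "newness" knob: score each post by its upvotes, decayed by how long the thread has been idle, with a per-subreddit half-life. Everything here (the `Post` fields, the `newness_days` parameter, the half-life decay) is a hypothetical illustration of the idea, not anything Reddit actually implements.

```python
import math
import time
from dataclasses import dataclass

DAY = 86400  # seconds per day

@dataclass
class Post:
    upvotes: int
    last_comment_ts: float  # unix time of the most recent comment

def ranking_score(post: Post, newness_days: float, now: float) -> float:
    """Upvotes decayed by thread inactivity.

    The score halves every `newness_days` of silence, so a large
    value (e.g. 365 for /r/communism) keeps long-running threads
    visible, while a small value (e.g. 2 for /r/aww) reproduces
    today's fast-churn front page.
    """
    idle_days = (now - post.last_comment_ts) / DAY
    return post.upvotes * 0.5 ** (idle_days / newness_days)

def front_page(posts, newness_days, now=None):
    """Sort posts by decayed score, highest first."""
    now = time.time() if now is None else now
    return sorted(posts, key=lambda p: ranking_score(p, newness_days, now),
                  reverse=True)
```

With a two-day newness setting, a heavily upvoted thread that went quiet ten days ago sinks below a modest thread from yesterday; with a one-year setting, the old thread stays on top. Same data, different community tempo, all from one moderator-controlled number.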


The whole idea of "containment threads" is a powerful one that works very well on older-style forums, but not nearly as well on Reddit. "Containment subs" aren't the same thing at all, and the subs that try to run sub-subs dedicated to containment issues usually find that they die out.



