I like Yishan's reframing of content moderation as a 'signal-to-noise ratio problem' instead of a 'content problem', but another reframing follows from that: moderation is also an outsourcing problem, in the sense that users outsource the filtering of content to moderators (be they all other users through voting mechanisms, a subset of privileged users through mod powers, or an algorithm).

Yishan doesn't define what the 'signal' is, or what 'spam' is, and there is probably an element of subjectivity to both, varying across platforms and across users within a platform. Successful moderation therefore happens when moderators know what users want, i.e. what the users consider 'good content' or 'signal'. This reveals a couple of things about why moderation is so hard.

First, this means that moderation actually is a content problem. For example, posts about political news are regularly removed from Hacker News because they are off-topic for the community, i.e. we don't consider that content to be the 'signal' that we go to HN for.

Second, moderation can only be successful when there is a shared understanding between users and moderators about what 'signal' is. It's when this agreement breaks down that moderation becomes difficult or fails.

Others have posted about the need to give users the tools to do their own moderation in a decentralised way. Since the 'traditional'/centralised approach creates a fragile power dynamic which depends on this shared understanding of signal, I completely understand and agree with this: as users we should have the power to filter out content we don't want to see.

However, we have to distinguish between general and topical spaces, and to ask which communities live in a given space and what binds different individuals into collectives. Is there a need for a collective understanding of what's on-topic? HN is not Twitter: it's designed as a space for particular types of people to share particular types of content. Replacing 'traditional' or centralised moderation with fully decentralised moderation risks disrupting the topicality of the space and the communities which inhabit it.

I think what we want instead is 'democratised' moderation: some way of moderating that removes the reliance on a 'chosen few', is more deliberate about which kinds of moderation need to be 'outsourced', and allows users to participate in a shared construction of what 'signal' or 'on-topic' means for their community. Perhaps the humble upvote is a good example and starting point for this?
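
To make that concrete, here's a minimal sketch (Python; the names Post, SIGNAL_THRESHOLD and visible_posts are hypothetical, not any real platform's API) of how upvotes already act as democratised moderation: the community's aggregate votes, rather than a chosen few, decide what clears the 'signal' bar:

    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        upvotes: int
        downvotes: int

    # Hypothetical knob: the net score a post needs to stay visible.
    SIGNAL_THRESHOLD = 0

    def visible_posts(posts: list[Post]) -> list[Post]:
        """Filter and rank by net community score, so 'signal' is
        whatever the community collectively votes above the bar."""
        kept = [p for p in posts
                if p.upvotes - p.downvotes >= SIGNAL_THRESHOLD]
        return sorted(kept, key=lambda p: p.upvotes - p.downvotes,
                      reverse=True)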

Finally, in the interest of technocratic solutions, particularly around spam (which I would define as repetitive content): has anyone thought about rate limits? Like, yeah, if each person can only post 5 comments/tweets/whatever a day, then you put a cap on how much total content can be created, and you incentivise users to produce more meaningful content. But I guess that wouldn't allow for all the sick massive engagement that these attention-economy walled-garden platforms need for selling ads...
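
For what it's worth, a per-user daily cap is trivial to sketch. Here's a minimal in-memory Python version (the names DAILY_LIMIT and try_post are hypothetical, not any real platform's API):

    from collections import defaultdict
    from datetime import date

    DAILY_LIMIT = 5  # the 5-posts-a-day cap proposed above

    # user_id -> (day, number of posts made that day)
    _post_counts = defaultdict(lambda: (date.min, 0))

    def try_post(user_id: str) -> bool:
        """Record a post and return True if the user is under
        today's cap, otherwise reject with False."""
        day, count = _post_counts[user_id]
        today = date.today()
        if day != today:  # new day: reset the counter
            day, count = today, 0
        if count >= DAILY_LIMIT:
            return False
        _post_counts[user_id] = (day, count + 1)
        return True

A real deployment would of course need persistent storage and some anti-sybil measure, since a per-account cap just pushes spammers towards creating more accounts.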
