Facebook documents reveal how social network moderates content

Moderating content is a daunting task, and it isn't going to be solved with hundreds of manuals. For instance, the policy of allowing Facebook users to live stream "attempts to self harm" is simply too hands-off to work as a principle for getting help to a person attempting suicide. Perhaps Facebook will evolve its own rules and mores, social habits, and ideas for assisting users.

But this doesn't sound like the appropriate starting point. A comprehensive engagement with users, enlisting their help in identifying and responding to controversial and life-threatening acts, would allow a natural ethos of behavior to evolve. That takes time and patience, of course, but Facebook is reacting to a "crisis" that will never abate without users' involvement.

The training manuals, which were published by the Guardian on Monday, reveal how the social media group’s 4,500 global moderators judge when to remove or allow offensive content.

Source: Facebook documents reveal how social network moderates content
