The Human Consequences of Content Moderation

While debate rages around the world about how censorship could or should be used to moderate content posted on social media platforms, spare a thought for those people who actually have to view the stuff that’s not fit for you to see.

In a report for The Verge, an American technology news and media network operated by Vox Media, reporter Casey Newton spends time with the moderators hired to decide what is suitable for publication on Facebook. Based in Phoenix, Arizona, the staff work for a firm to which, The Verge claims, Facebook outsources the huge burden of reviewing contentious online posts.

The report, “The Trauma Floor: The secret lives of Facebook moderators in America”, follows the daily routine of some of those people. As Casey Newton explains, they view violent and graphic content and must decide what does or does not contravene censorship guidelines:

Those challenges include the sheer volume of posts; the need to train a global army of low-paid workers to consistently apply a single set of rules; near-daily changes and clarifications to those rules; a lack of cultural or political context on the part of the moderators; missing context in posts that makes their meaning ambiguous; and frequent disagreements among moderators about whether the rules should apply in individual cases.

The report highlights the difficulty of making correct censorship decisions in practice, even when following detailed rules, the low pay the staff receive, and the heavy toll the work takes on their mental health.

Censorship designed to maintain public order is a contentious matter. Recent controversial decisions include Facebook’s banning of English anti-Muslim activist Tommy Robinson and the removal of a Netflix comedy in Saudi Arabia on the grounds of protecting public order and values.

The spread of so-called “fake news” and the ability of organisations, agencies and governments to use social media to manipulate public debate are hot topics worldwide. The EU, for example, is seeking to impose more penalties on social media platforms if they make the ‘wrong’ decisions about filtering content.

The UK’s House of Commons Digital, Culture, Media and Sport Committee has been tackling the relevant issues, publishing a report on disinformation and ‘fake news’. However, there has been little discussion of the practical challenge of monitoring millions of individual posts, or of the time involved in doing so. Politicians may not have fully considered the implications, and so do not acknowledge that platforms will inevitably err on the side of caution, potentially over-censoring content.

As the Digital, Culture, Media and Sport Committee concludes in the report:

This has highlighted a worldwide appetite for action to address issues similar to those that we have identified in other jurisdictions. This is the Final Report in our inquiry, but it will not be the final word. We have always experienced propaganda and politically aligned bias, which purports to be news, but this activity has taken on new forms and has been hugely magnified by information technology and the ubiquity of social media.

The big tech companies must not be allowed to expand exponentially, without constraint or proper regulatory oversight. But only governments and the law are powerful enough to contain them. The legislative tools already exist. They must now be applied to digital disinformation and ‘fake news’: using tools such as privacy laws, data protection legislation, antitrust and competition law. If companies become monopolies they can be broken up, in whatever sector.

Facebook’s handling of personal data, and its use for political campaigns, are prime and legitimate areas for inspection by regulators, and it should not be able to evade all editorial responsibility for the content shared by its users across its platforms.

Eric Priezkalns has previously argued that the EU’s rules are unrealistic, unenforceable, and contrary to the spirit of the open internet. Censorship is a thorny problem on which it will prove difficult to achieve consensus.

Marianne Curphey
Marianne Curphey is an award-winning freelance writer, blogger and columnist. She is a former Editor of Guardian Money online, City News Editor of The Guardian, Insurance Correspondent of The Times and Deputy Personal Finance Editor at The Times.