The price other people pay

I have just finished reading The Secret Rules of the Internet. It is an extremely interesting and insightful account of how moderation actually works inside large social media companies. It is rather long, but that appears to reflect the considerable amount of research that went into the writing. I feel rather embarrassed that I hadn't seen it before.

The sub-heading speaks of the murky history of moderation and how it's shaping the future of free speech. However, it is by no means a dreary or predictable rant in favour of zero controls or maximum latitude to hurt, insult or offend.

I will not try to summarise the entirety of Secrets. It deserves to be read in full by everyone with a serious interest in internet policy. Look in particular at what it says, almost as an aside, about s.230 of the Communications Decency Act, 1996. And while there is a little good news, one aspect in particular prompted disturbing reflections.

Near the beginning of the text we find an account of a moderator who had to look at a video someone had posted of something horrible being done to a child in a hotel room. Ten years later the image still haunts her. The issue of the welfare of the people who do the actual moderating is a major theme.

We are reminded that, ostensibly in the name of free speech, there is a human price that has to be paid in order to provide platforms for what are, in reality, some really sick individuals. But it’s not a price that is routinely paid by the golden, wealthy elites who own or have senior positions in the companies that give this stuff an airing. Neither is it normally paid by those who campaign so ferociously to defend the status quo. On the contrary. Moderation remains a relatively low-wage, low-status sector, often managed and staffed by women.

And guess what? A lot of it takes place offshore.

Secrets cites Digital Refuse, by Sarah Roberts, in which the author shows us that the same places that are sent the unwanted physical waste of the more affluent world are now also being sent "our" virtual toxins.

child abuse and pornography, crush porn, animal cruelty, acts of terror, and executions — images so extreme those paid to view them won’t even describe them in words to their loved ones…

…there they sit in crowded rooms at call centers, or alone, working off-site behind their screens and facing cyber-reality, as it is being created. Meanwhile, each new startup begins the process, essentially, all over again.

There is a reference to a Wired story from 2014 in which Adrian Chen documented the work of front-line moderators operating in modern-day sweatshops. In Manila, Chen witnessed a secret army of workers employed to soak up the worst of humanity in order to protect the rest of us.

However, in offices located elsewhere, we are told

To safeguard other employees from seeing the… images… (the moderators were) sequestered in corner offices; their rooms were kept dark and their computers were equipped with the largest screen protectors on the market. 

Even so…

Members of the team quickly showed signs of stress — anxiety, drinking, trouble sleeping — and eventually managers brought in a therapist. As moderators described the images they saw each day, the therapist fell silent. The therapist… was “quite literally scared.”

I wonder how many therapists there are in Manila?

It is not all doom and gloom. Reassuringly, we are told

Some large established companies like YouTube, Pinterest, Emoderation, Facebook, and Twitter are beginning to make headway in improving moderation practices, using both tech and human solutions.

Let's hope that is correct, but again we have to take it on trust because everything is shrouded in secrecy.

Poor people in poor countries, desperate for work, are not often described as being the foot soldiers, the poor bloody infantry, of free speech. Yet that is the reality.

In another part of the forest and under a variety of guises, there has been an entirely proper focus on the supply chain companies use to manufacture or deliver their products or services. Typically these initiatives have been designed to eliminate child labour, slavery or environmental harms. Isn’t it time Silicon Valley was pressed to do something about the huddled masses who daily have to face the unfaceable?


About John Carr

John Carr is a member of the Executive Board of the UK Council on Child Internet Safety, the British Government's principal advisory body for online safety and security for children and young people. In the summer of 2013 he was appointed as an adviser to Bangkok-based ECPAT International. Amongst other things John is or has been a Senior Expert Adviser to the United Nations, ITU, the European Union, a member of the Executive Board of the European NGO Alliance for Child Safety Online, Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com
This entry was posted in Internet governance, Regulation, Self-regulation.