UCLA’s Sarah T. Roberts has written a deeply probing and incisive work about the global workforce that labors to remove as much objectionable online material as possible from the world’s most popular social media platforms before the general public sees it.

In “Behind the Screen: Content Moderation in the Shadows of Social Media,” Roberts, an assistant professor in the UCLA Department of Information Studies, examines the shadowy world of commercial content moderation, or CCM. Roberts coined the term in 2010 when beginning her intensive research on the poorly paid and unknown individuals who are hired by social media firms — often as contractors — to detect and remove online material that reflects the worst aspects of humanity, like extreme acts of violence and sometimes even criminal behavior.

While the ability to eliminate this content from the internet seems straightforward, the conditions, laws and cultural mores under which CCM workers have to operate are anything but. In her book, Roberts delves deeply into the complicated environment that CCM workers, social media and tech companies, and public users dwell in, with a sympathetic and uncompromising eye on a workforce whose physical, mental and emotional health is constantly in jeopardy. She also takes to task the efforts — or lack thereof — of social media firms to ensure the safety of their CCM employees and the public.

Roberts served as a consultant on “The Cleaners,” an award-winning documentary by filmmakers Hans Block and Moritz Riesewieck about content moderators in the Philippines. In 2018, she convened the first known symposium on commercial content moderation, gathering scholars, experts, journalists and other stakeholders at UCLA to closely examine the social, political, legal, and cultural impacts of CCM.

Ampersand had the opportunity to speak with Roberts about the many changes that have taken place since she began her ethnographic study of CCM, the ever-evolving legal ramifications of free and not-so-free expression, and the widespread public interest in, and the necessity of, all internet users understanding how the internet is “sanitized” for their alleged protection.

What have been some of the greatest changes in commercial content moderation since you began looking into it in 2010?

The biggest change is that this is now a phenomenon that is much more familiar to most people in the general public, people who follow social media, people who use social media, and perhaps most importantly, to people in a position to exercise regulatory power. So that means government, our legislative bodies and others, including advocacy groups and industry itself.

When I began this research, many people who were in the social media industry were not completely involved in this particular part of the production of social media. They may not have even understood its importance to their own firm. So, all of those things have really shifted in time, due to a number of factors. I like to think that academic research and writing on the topic is one of the levers that has been pulled to exercise that change. I think also journalists that work in that area and advocacy work have also played a role.

Since that time, the CCM workforce has increased. Have measures to protect and, in some cases, rehabilitate these moderators improved, in your opinion?

It’s a complex question because one of the primary findings in my research was the way in which the workforce was a truly global one. And not only was it global but you will find people doing this type of work in many different working conditions. So, if you are a moderator working onsite in Silicon Valley versus a person doing this work on a digital piecework platform like Amazon Mechanical Turk, those experiences can be extremely different. You may be simply working out of your home somewhere out in the world with no peer support and not going to an office. But even for those [CCM] workers who go to an onsite facility — which may look more like a call center in many cases nowadays — the level of support, the pay, and the conditions of the work vary so greatly from company to company and from place to place in the world. So, it’s hard to make really blanket statements about the kinds of conditions these workers will find.

That having been said, I was very heartened by the fact that about a month ago, Facebook made a significant announcement that they intended to raise the rate of pay across the board for anyone doing commercial content moderation in any of their third-party facilities in the United States. And they also made a number of other announcements around a variety of standards that they already had and/or intended to improve.

But Facebook is one company among many, many companies that require this kind of work. Of course, because it was their first move in this direction and they have a bit of an easier time around dealing with partners who are in the United States — that was a fairly limited improvement. I think that the notion is we will see those improvements roll out through Facebook [moderation] centers around the world, but that has yet to be announced.

And it also leaves open the question of what the status quo is in other places. Facebook has just raised the bar but will others follow suit? I think we have to be concerned about that as well.

Overseas moderators often have to be trained to understand facets of cultures that are not their own. Has this training been effective?

I think that that is a primary challenge in this sub-industry of social media. It makes us think about the gamut of cultures and languages, their attendant politics and identities, and other kinds of issues that are played out often along lines of language and culture. It is very easy to see, for example, that just because someone speaks Spanish, it doesn’t mean they’re going to understand references and incidents specific to Northern Mexico.

Even when those linguistic competencies are in place — special regional knowledge, people’s understanding of the platforms, the firms’ willingness to allow for political conflict to play itself out on the platform itself — these moderators have a very significant role to play in terms of the decisions that they make.

As the author of the foundational work on CCM, how do you hope “Behind the Screen” opens a global discussion on what needs to be done to ensure the safety of the internet without compromising the physical and mental health of those who monitor it?

I think that the book operates on a couple of registers, that is my hope and goal. First and foremost, my goal was to give voice to a cadre of workers who have been structurally and systematically silenced, often as a precondition of their work. They may have signed nondisclosure agreements; they’re warned about talking about working conditions to the press and to people like me. Many of them risked quite a bit to be willing to contribute to my research and, in so doing, they really gave me this firsthand account of what it is they do and the complexities of what it is they do.

I think it also goes beyond a story of sensationalistic worker exploitation — or, that is part of the story and it’s an important part to understand, but it’s inadequate as the whole tale. What I try to do with the book is to give that aspect of the workers’ work-life experience the full breadth that it needs to be apprehended and understood. This includes the pride that they take in their work, for example, or the meaning that they make out of what may seem like fairly meaningless work.

I wanted to create a volume that could be placed in the hands of the entire range of people. You don’t have to be steeped in technology, you don’t have to be an academic to read and understand this book, in my opinion. My goal was really to write a book that is academic and based on academic research, but could resonate across the broadest possible audience. If I have achieved that even in part, I’ll feel that I’ve succeeded.

In fact, a colleague just told me that she had ordered the book for herself while visiting her parents, but her mother, who is in her mid-70s, had intercepted the mail and is now reading the book. To my mind, that was a wonderful kind of result, that someone like her mom would be compelled by this story.

I wanted to reach fellow academics and students, but also people who might be regulators, to help them understand the nature of this phenomenon in [the tech] industry and its impact. Another constituency [that I hoped to reach] is of course, people who are in a position to make change around these issues, and that includes people in Silicon Valley, for example. People who are in firms who I happen to know are reading the book because I’ve received communications from [them] telling me they are reading it. That makes me feel wonderful – that makes me feel that the right people are reading it.

Read the full Q&A with Roberts to learn what she thinks about commercial content moderation and media literacy, and about the advantages and disadvantages of freedoms of speech and expression in the United States.