Sarah T. Roberts: New Book Looks “Behind the Screen” of Commercial Content Moderation

Internet scholar publishes groundbreaking volume on hidden workforce that monitors objectionable online material.

Amid the intricacies of social media and its minefield of violent and heinous content posted by users worldwide, Sarah T. Roberts has authored a revolutionary work on the global workforce that labors to remove as much objectionable online material as possible before the general public sees it.

With her recent book, “Behind the Screen: Content Moderation in the Shadows of Social Media,” Roberts, an assistant professor in the UCLA Department of Information Studies, examines the shadowy world of commercial content moderation (CCM), a phrase she coined in 2010 when beginning her intensive research on the poorly paid and largely unknown individuals throughout the world who are hired by social media firms to detect and remove online material that reflects the worst aspects of humanity.

While the ability to eliminate offensive – and sometimes downright evil – content from the internet seems straightforward, the conditions, laws, and cultural mores under which CCM workers have to operate are anything but. With “Behind the Screen,” Professor Roberts delves deeply into the complicated environment that CCM workers, tech companies, social media firms, and public users dwell in, with a sympathetic and uncompromising eye on a workforce whose physical, mental, and emotional health is constantly in jeopardy. She also takes to task the efforts – or lack thereof – of social media firms to ensure the safety of their CCM employees and the public.

Roberts served as a consultant on “The Cleaners,” an award-winning documentary on CCM in the Philippines by filmmakers Hans Block and Moritz Riesewieck. In 2018, she convened “All Things in Moderation” at UCLA, the first known symposium on the subject, gathering scholars, experts, journalists, and other stakeholders to closely examine the social, political, legal, and cultural impacts of CCM.

Before joining the faculty at the UCLA Graduate School of Education & Information Studies in 2016, Professor Roberts was on the Faculty of Information and Media Studies and was a faculty affiliate in the Department of Women’s Studies & Feminist Research at the University of Western Ontario in Canada. She earned her doctorate in library and information science at the University of Illinois at Urbana-Champaign, and her master’s degree in library and information studies, her bachelor’s degree in French and Spanish language and literature, and a certificate in women’s studies at the University of Wisconsin-Madison.

Roberts is a sought-after source across print, television and online media, including WIRED Magazine, Al Jazeera America, TV Ontario, TRT World, El Mundo, The London Free Press, and The Los Angeles Times. She has served as a media consultant for Showtime, Netflix, Radiolab, and VICE.

Professor Roberts is a member of the American Society for Information Science and Technology, the Association of Internet Researchers, the International Association for Media and Communication Research, the Society for the History of Technology, and the Union for Democratic Communications. She is a 2018 Carnegie Fellow and a 2018 winner of the EFF Pioneer Award.

Ampersand had the opportunity to speak with Professor Roberts about the many changes that have taken place since she began her ethnographic study of CCM, the ever-evolving legal ramifications of free and not-so-free expression, and the widespread public interest in – and the necessity for all users of the internet to understand – how it is “sanitized” for their alleged protection.

Ampersand: What have been some of the greatest changes in commercial content moderation since you began looking into it in 2010?

Sarah T. Roberts: The biggest change is that this is now a phenomenon that is much more familiar to most people in the general public, people who follow social media, people who use social media, and perhaps most importantly, to people in a position to exercise regulatory power. So that means government, our legislative bodies and others, including advocacy groups and industry itself.

When I began this research, many people who were in the social media industry were not completely involved in this particular part of the production of social media. They may not have even understood its importance to their own firm. So, all of those things have really shifted in time, due to a number of factors. I like to think that academic research and writing on the topic is one of the levers that has been pulled to exercise that change. I think also journalists that work in that area and advocacy work have also played a role.

&: Since that time, the CCM workforce has increased. In your opinion, have measures to protect and, in some cases, rehabilitate these moderators improved?

Roberts: It’s a complex question because one of the primary findings in my research was the way in which the workforce was a truly global one. And not only was it global but you will find people doing this type of work in many different working conditions. So, if you are a moderator working onsite in Silicon Valley versus a person doing this work on a digital piecework platform like Amazon Mechanical Turk, those experiences can be extremely different. You may be simply working out of your home somewhere out in the world with no peer support and not going to an office. But even those [CCM] workers who go to an onsite facility – which may look more like a call center in many cases nowadays – the level of support, the pay, and the conditions of the work vary so greatly from company to company and from place to place in the world. So, it’s hard to make really blanket statements about the kinds of conditions these workers will find.

That having been said, I was very heartened by the fact that about a month ago, Facebook made a significant announcement that they intended to raise the rate of pay across the board for anyone doing commercial content moderation in any of their third-party facilities in the United States. And they also made a number of other announcements around a variety of standards that they already had and/or intended to improve.

But Facebook is one company among many, many companies that require this kind of work. Of course, because it was their first move in this direction and they have a bit of an easier time around dealing with partners who are in the United States – that was a fairly limited improvement. I think that the notion is we will see those improvements roll out through Facebook [moderation] centers around the world, but that has yet to be announced.

And it also leaves open the question of what the status quo is in other places. Facebook has just raised the bar but will others follow suit? I think we have to be concerned about that as well.

&: Other nations seem to exercise more policy control over content, in varying degrees. What are the advantages and disadvantages of our freedoms of speech and expression in the U.S.?

Roberts: I think it’s great that you bring up the comparative issue because certainly, other countries have been leading the charge in putting pressure on firms to respect their legal norms when operating within their markets. One example that comes to mind is the case of Germany, which has very stringent anti-hate speech laws, particularly as it pertains to Nazi war glorification material and anti-Semitism, but also on other grounds.

In 2018, Germany enacted a law, NetzDG, that applies to all social media platforms of two million users or more – which is frankly, quite a few of them – that says that their legal norms have to be respected with regard to any content circulating within Germany. What this meant was that the companies had to scramble to respond and that is where this issue of linguistic and cultural competency came into play in a very direct way. Call centers very quickly sprang up in Berlin and other parts of Germany to respond.

In the case of the United States, one of the things that I try to point out in the book and one of the arguments that I try to make repeatedly is the fact that while any American users of social media use those platforms under a sort of tacit impression that they are participating in a space that allows for free speech as a fundamental principle, that’s actually not at all true.

And the reason that that’s not true and it’s never been true is because the platforms that we all use, that we consider social media for sharing information and engaging with each other, are in fact designed to solicit advertisers and to monetize activity to appeal to their true primary customers – advertisers, businesses, merchants, and others.

And so, this means that these platforms have never, ever been in the business of relinquishing control over what materials circulate on their platforms. What they have done is use that control to their own benefit – and this is why I talk about commercial content moderation as a phenomenon that should be understood fundamentally and primarily as a function of brand management for the platforms themselves.

Since the platforms have existed, they have made all sorts of content-related decisions in a variety of contexts. At the same time, in order to get all of us to participate, particularly in the American context where we have a very powerful relationship to notions of freedom of expression, there has been this tacit and sometimes overt claim that we should engage on these platforms in order to experience freedom of speech. And it’s just not the truth.

In these commercial and privately-owned spaces, that principle has never been applied significantly. We might think about the difference between the public square in a community and a shopping mall. Now, we may all go to the shopping mall to do our shopping, we may go there to have a cup of coffee with a friend or to do our exercise and mall walks, or we may go there to distribute pamphlets about a political issue.

Well, when we start doing things that impact the business of that shopping mall, we will find ourselves removed from the shopping mall. So, when somebody is trying to solicit or distribute political information, that might be the time they are asked to leave, and that is the kind of discretion that platforms can also exercise in the very same way because they are privately-owned spaces designed for commerce in a very similar way.

Therefore, one of the goals I had with the book was to truly articulate this process of commercial content moderation as a function of brand management for the platform so that we can really collectively understand what the real rules of engagement are and to understand that at the end of the day, we operate at the pleasure of the platform. The decisions they make will always come down to their best interest. Once we understand that, we can have a better and more robust discussion about how we all collectively feel about that relationship.

&: Overseas moderators often have to be trained in understanding facets of culture in other nations that may not be part of their own culture – has this been really effective?

Roberts: I think that that is a primary challenge in this sub-industry of social media. It makes us think about the gamut of cultures and languages, their attendant politics and identities, and other kinds of issues that are played out often along lines of language and culture. It is very easy to see that just because someone speaks Spanish, it doesn’t mean they’re going to understand references and incidents specific to Northern Mexico, for example.

Even when those linguistic competencies are in place – special regional knowledge, people’s understanding about the platforms, the firms’ willingness to allow for political conflict to play itself out on the platform itself – these moderators have a very significant role to play in terms of the decisions that they make.

&: How does an awareness of CCM intersect with the need for critical media literacy?

Roberts: Being that I am a professor in the Department of Information Studies at UCLA, which is itself a department within the Graduate School of Education & Information Studies, I am absolutely thinking all the time about the ways in which this heretofore unknown and underdiscussed practice and set of processes impact what people see and perceive online.

When we think about preparing future information professionals, librarians, and other information intermediaries, or preparing people for leadership positions in public schools or higher education, I think that having these kinds of discussions and doing this research can present a much bigger and, frankly, more honest picture about the pitfalls and limits of reliance upon online information sources, most of which are advertising platforms at their core.

We can help by preparing the graduate students that we have to be real stewards of knowledge and to, in turn, help the rest of the public to more accurately make decisions about things like the veracity of information or the reasons for which some information remains [available] while other information does not, or why those decisions might be considered business decisions alongside informational decisions. I think all of that goes to this concept of increasing all of our collective critical information and media literacy.

&: “Behind the Screen” takes a hard look at the emotional and psychological impact on CCM workers who spend countless hours viewing objectionable content in order to do their jobs. As for the general public, there is lag time between the uploading of objectionable material and moderators’ detection of it – what responsibility do tech companies bear for that, or do they?

Roberts: Again, this is somewhat site-dependent in terms of where in the world we’re talking about. One of the things that I discuss in the book is the existence of Section 230 of the Communications Decency Act of 1996. Over 20 years ago, that gave platforms freedom from liability, in large part, for material that could be illegal or harmful. It also gave them the power and discretion to remove said material.

In the United States, Section 230 is the operating principle and paradigm under which social media platforms solicit and circulate user-generated content. I think it could be argued that Section 230 is what allowed for the proliferation of these massive, behemoth firms behind the platforms in the first place.

But Section 230 is a U.S.-based law and standard. What happens is that the companies are highly global and they endeavor to capture the entire world as their user base, and Section 230 is not relevant in many places. The firms have sort of been operating for a long time under the paradigm of U.S. law and have, in effect, been exporting those principles in how they run their platforms and run their businesses. But I think there will be greater pressure to again, obey local norms, to respond to new legislation that may come from, for example, the level of the European Union.

It is absolutely true that one of the changes that came about since I began this work is the proliferation of computational tools and automation that can be used, for example, to get some of the worst of the worst material [removed] before it is seen by any users at all. Obviously, the larger the firm and the greater capacity it has for financial expenditure and the greater brain trust it has in engineering, the better equipped it is to build the tools to do this sort of thing. But if you ask any industry insider, which I have done over the years, about the likelihood of those computational tools being able to full-scale replace the human [CCM] workers at any point in the foreseeable future, that likelihood is very, very low. There will always be some human oversight needed in the process, which means in some cases, the introduction of these computational tools will actually increase the workload and increase the need for humans rather than lessen it.

&: As the author of the foundational work on CCM, how do you hope “Behind the Screen” opens a global discussion on what needs to be done to ensure the safety of the internet without compromising the physical and mental health of those who monitor it?

Roberts: I think that the book operates on a couple of registers, that is my hope and goal. First and foremost, my goal was to give voice to a cadre of workers who have been structurally and systematically silenced, often as a precondition of their work. They may have signed nondisclosure agreements; they’re warned about talking about working conditions to the press and to people like me. Many of them risked quite a bit to be willing to contribute to my research and, in so doing, they really gave me this firsthand account of what it is they do and the complexities of what it is they do.

I think it also goes beyond a story of sensationalistic worker exploitation – or, that is part of the story and it’s an important part to understand, but it’s inadequate as the whole tale. What I try to do with the book is to give that aspect of the workers’ work-life experience the full breadth that it needs to be apprehended and understood. This includes the pride that they take in their work, for example, or the meaning that they make out of what may seem like fairly meaningless work.

I wanted to create a volume that could be placed in the hands of the entire range of people. You don’t have to be steeped in technology, you don’t have to be an academic to read and understand this book, in my opinion. My goal was really to write a book that is academic and based on academic research, but could resonate across the broadest possible audience. If I have achieved that even in part, I’ll feel that I’ve succeeded.

In fact, a colleague just told me that she had ordered the book for herself while visiting her parents, but her mother, who is in her mid-70s, had intercepted the mail and is now reading the book. To my mind, that was a wonderful kind of result, that someone like her mom would be compelled by this story.

I wanted to reach fellow academics and students, but also people who might be regulators, to help them understand the nature of this phenomenon in [the tech] industry and its impact. Another constituency [that I hoped to reach] is of course, people who are in a position to make change around these issues, and that includes people in Silicon Valley, for example. People who are in firms who I happen to know are reading the book because I’ve received communications from [them] telling me they are reading it. That makes me feel wonderful – that makes me feel that the right people are reading it.

To learn more about Professor Roberts’ work, visit these links:

“Behind the Screen” in the LA Review of Books

Interview with NPR (KERA, Dallas)

Interview with Wisconsin Public Radio’s Central Time 

Emmy nomination for “The Cleaners”

Photo by Stella Kalinina