Sarah T. Roberts: NSF Grant to Support Study on Online Content Moderation

International research will complement the work of the new Center for Critical Internet Inquiry at UCLA.

Sarah T. Roberts, an assistant professor in UCLA’s Department of Information Studies, has been awarded part of a $1.5 million Future of Work Grant from the National Science Foundation for her research on decision-making processes and supports for tech workers who moderate online content. Roberts, a groundbreaking media scholar, will serve as co-principal investigator of the study alongside Donghee Yvette Wohn of the New Jersey Institute of Technology and Libby Hemphill of the University of Michigan iSchool, and will receive $300,000 of the NSF grant.

For almost a decade, Roberts’ research has centered on commercial content moderation, a term she coined. Her work documents the challenges social media and tech companies face in monitoring objectionable and dangerous material on their sites, a task carried out by a human workforce employed largely in developing countries and in some regions of the United States. Even with algorithms that help detect and remove problematic content, the work takes a great toll on members of this poorly paid moderation workforce. Few or no measures are taken to protect these workers’ psychological, emotional, and, ultimately, physical health, whether through access to professional care or insurance benefits to obtain such care.

“As an all-women team of three PIs from three different institutions, we will take a multidisciplinary, collaborative approach to documenting, and then working on developing solutions to, some of the primary challenges of decision-making for human actors doing the commercial content moderation of social media,” says Roberts.

The NSF project, entitled “Augmenting Social Media Content Moderation,” includes outreach and engagement activities with academic and industry members, as well as with policy-makers and the public, to ensure that the project’s findings and tools support the broad range of stakeholders affected by user-generated content and its moderation. Its main objectives include: improving understanding of the decision-making process through interviews with content moderation workers across a variety of domains; assessing the socioeconomic impact of technology-augmented moderation through interviews with industry personnel; testing interventions to decrease the emotional toll on content moderation workers and optimize their performance through a series of experiments drawing on theories of stress alleviation; designing, developing, and testing a suite of cognitive assistance tools that help live-streaming moderators manage their emotional and cognitive capabilities; and employing a historical perspective to analyze tech companies’ content moderation policies in order to inform legal and platform policies.

Professor Roberts, who co-directs the newly established Center for Critical Internet Inquiry, or C2I2, at UCLA with Safiya U. Noble, UCLA associate professor of Information Studies, sees the NSF research as an ideal match for the new Center’s aims: highlighting the intersection of the technological and the social in pursuit of better and more socially just outcomes for all.

“The new Center is a perfect vehicle to serve as a home for this research agenda, and Dr. Noble and I also anticipate that it will be an excellent clearinghouse for dissemination of our research to a variety of potential audiences,” says Roberts. “I am excited to see my decade-long research on commercial content moderation find an institutional home, alongside Dr. Noble’s pathbreaking work on algorithmic bias, in C2I2, here at GSE&IS, and at UCLA, more generally.”

Professor Roberts hopes that the NSF-funded research will help the public at large develop a better understanding of how content moderation is done and of how to support the workforce that labors to keep the internet safe, open, and an asset to society rather than a plague, or something much more sinister.

“Not only do we see this research as having benefit to the content moderators themselves, whose work is frequently psychologically difficult and requires significant cognitive skill, but we expect our findings will be highly relevant to industry and advocates alike,” notes Roberts. “It’s exciting to be a part of this research team, and to receive an award from the NSF Future of Work initiative at a time when that emphasis feels critical. We are grateful for the funding of our proposal and look forward to getting to work.”

Photo by Stella Kalinina