Meet the people who scar themselves to clean up our social media networks

Posted in News

June 15, 2018 at 04:45AM

Sarah T. Roberts is an assistant professor in Information Studies at the University of California, Los Angeles, a former assistant professor at the University of Western Ontario, and a recipient of a Carnegie Fellowship for her research that unveils the practices of commercial content moderation around the globe. She served as an academic adviser for the documentary film The Cleaners. Her book on the phenomenon and its workers is forthcoming from Yale University Press.

It is nighttime, many floors above bustling street level, in an urban office high-rise in the Philippines.

The building is nondescript and largely unremarkable. It could be any one of hundreds of such skyscrapers towering over Manila’s modern business parks. Inside them are Business Process Outsourcing (BPO) firms—more commonly known as call centres—that serve the rest of the globe, doing everything from answering service requests to processing HR claims to writing computer code for the world’s businesses, large and small.

But here, in this office building, on this night, a young worker is toiling at something else. Her task is rote. A seemingly endless stream of images flows across her screen; she pauses only briefly as each new picture appears before issuing a subtle, expedient movement directing her cursor to one of two buttons: ignore or delete. She makes a choice, punctuated by the percussive plastic click of her mouse. Then her process begins again.

She will spend eight hours of her day repeating this task over and over, perhaps thousands of times: approving or deleting images from the social media networks so many of us now use—2 billion users worldwide for Facebook alone—that might be pornographic, depict people or animals being harmed, show criminal activity, war zones, child abuse, beheadings. Some images may be innocuous and of no concern at all; others may be shocking, disturbing, and horrifying. She has no way to predict what she will see next, and so she must be ready to contend with whatever the updated stream will bring, as soon as it is presented.

The images and videos she reviews are unpredictable because she works in a third-party call centre on behalf of North American social media platforms, screening the constant and massive influx of user-generated content (UGC) that is uploaded to them in the form of text postings, photos, videos and, to a growing extent, live streams. The content she sees has likely been flagged, or reported as inappropriate, by users like you or me. Such flagged postings are aggregated and streamed to workers who view each one and then decide whether it should stand or be removed. To most of the world, she is invisible; for the platforms that contract with her employer, her work is mission-critical. Her livelihood comes in the form of contract labour that makes her largely replaceable even as it may change her irrevocably. And for us, a world away, this anonymous woman in a Philippines office park is an arbiter of taste, an implementer of social norms, an enforcer of rules. She is a commercial content moderator, and her decisions, made in mere moments, go directly toward enforcing the policies set by those social media platforms, most of which are headquartered on the other side of the globe and many time zones, levels of influence, and dollars away.

While the Internet’s social communities have long self-organized to use the labour of volunteers to keep the behaviour and content of their members aligned with their own norms (and some, such as Reddit or Wikipedia, famously still do), the rise of these practices as organized, for-pay work on a large and global scale parallels the widespread adoption of mainstream corporatized social media over the past decade and a half. Put simply, big social media requires an army of workers to monitor its dissemination channels for reasons of brand protection and management, regulatory compliance and the maintenance of control. Commercial content moderators are key agents in the creation and curation of the social-media environments in which we spend so much of our social and interpersonal, and increasingly our work and civic, lives. Yet the existence of commercial content moderators has been pushed to the margins, away from the more glamorous aspects of the internet industries; it’s likely to be treated as social media’s dirty little secret, if it’s even acknowledged at all. Commensurate with its status at many social media firms, commercial content moderation has been bid out and shipped around the globe whenever possible, out of sight and out of mind for users and the platforms alike.

At a time when the outsized role that social media plays in our public and political discourse has come under fire from civil society advocates and governments alike, we are increasingly discovering that we lack a full picture of the elements that go into creating the ecologies of our online life. The impact of the work that this young woman does—along with perhaps upwards of 100,000 others in various forms around the world—is invisible to almost all.

How we can identify and measure the impact of these practices, both on the workers and on the platforms, is the question I have been pursuing over the past eight years in my work as a professor, a researcher and an academic focused on the phenomenon of commercial content moderation. My first and most important task in this pursuit has been simply to unveil the existence of this legion of workers—to draw out its contours and its characteristics, and to follow the circulation of the work around the globe as it is outsourced to economies where labour comes cheap and a young, eager workforce is plentiful. The journey has taken me from the expected quarters of Silicon Valley to places I never suspected I would go, like the high-rise office blocks overlooking Manila’s business parks, or the TGI Fridays outlets below them where happy hour starts at 7 AM as the workers come off their night shift, moderating social media during the waking hours of the West.

While concern for the well-being of commercial content moderators, wherever they reside, has always been at the forefront of my research, unveiling the working conditions of CCM workers has led directly to engagement with the policies under which they make their decisions. At the very least, these moderation policies often favour the platforms and the values they espouse, which can have deep political implications and can affect people’s lives.

The Cleaners, a documentary film that opens on Jun. 22 in Canadian theatres, is incredibly timely. It focuses on Filipino commercial content moderators and the outcomes of their decisions at a moment when major stories like the exposé on Facebook and Cambridge Analytica, or Google and its involvement with drone warfare, are provoking the public to ask tougher and more complex questions about how social media works. Understanding who makes these media, and knowing about the working conditions and work tasks of those people, is key to grappling with these higher-order questions of where social media should fit into our social and political fabric. The Cleaners, like my own research, leads audiences on an important journey to connect the dots worldwide, leaving those who view it to ask, at the least, “What responsibility do we have for creating the necessity for this shadow industry by virtue of our own social media use?”

It is a question not easily or comfortably answered, but it is one we certainly all must ask, and at all echelons of our engagement—personal, professional and political. To be sure, some of these questions are being asked already and the platforms are increasingly on the defensive, with the rollout of the European Union’s powerful General Data Protection Regulation (GDPR) legislation on the horizon, and with Mark Zuckerberg himself appearing for the first time before the United States Congress in the wake of the ongoing Cambridge Analytica scandal. Further, such interrogations by the public, journalists, activists, regulators and academics like me, among many others, are prompting change in the way Silicon Valley is doing business. On Apr. 24, Facebook released a comprehensive set of its content moderation guidelines—which had until that point been secret, highly guarded information used by commercial content moderators to take action on user-generated content—to the public for the first time. This, coupled with a newly unveiled right to appeal content moderation decisions, marks the first look at a new era of transparency for Facebook where its content moderation practices are concerned. Will other firms follow suit?

Much less has been said, however, about transparency with regard to the human lives themselves. Where else in the world are these commercial content moderators? Under what conditions do they toil? What kind of psychological, emotional and monetary support do they receive for the work that they do, and what kind of support might they need after their period of employment is over? Burnout is common among CCM workers, and yet once they move on from their jobs as social media screeners, there is no aftercare and no follow-through. Creating best practices for support not only during employment, but after, is the next hurdle that must be surmounted in improving conditions for CCM workers. While I hope that The Cleaners and my own years of research have moved the needle on this topic, we await a time when the social media platforms that rely on these workers can all answer these questions publicly, because it will mean change for the better.

We are not there yet. It is still too easy to click, and ignore, and delete.