Sarah T. Roberts: Conversation with Facebook’s Ellen Silver


June 21, 2018, via https://ampersand.gseis.ucla.edu

Facebook now counts more than 2.2 billion users around the world, who post millions of photos, news stories, comments, links and videos every day. So how does a social media company that started at Harvard and is now headquartered in Silicon Valley connect its global community of users (87 percent of whom live outside the United States) while safeguarding against objectionable, or even dangerous, content?

Answers to that challenging question and others emerged in a recent conversation between Sarah T. Roberts, assistant professor of information studies at UCLA, and Ellen Silver, Facebook’s vice president of operations, presented by the UCLA Tech and Innovation Initiative on May 10.

“When Facebook started, we were on one college campus … Harvard,” said Silver, whose duties include oversight of the team of people who monitor user-generated content on the world’s largest social media platform. “We eventually moved to other college campuses and the types of challenges and issues we may have seen in those environments were very local and towards [a college] community … spam and maybe something like nudity. From that, we’ve moved to really challenging things such as hate speech, terrorist propaganda, or bullying.

“As you go from the United States and towards more international types of issues and types of expression … it is much more complex and nuanced in being able to understand the intent of what is being communicated and what is being shown,” Silver continued. “And a lot of external context is needed to understand what is being shared, as a community.”

Among other topics covered during the conversation, which was held before more than 100 people in Korn Convocation Hall, were the early days of commercial content moderation, how the practice has evolved, and the skills and job satisfaction of the people who do this kind of work.

Roberts, who worked in information technology prior to her academic career, noted that grassroots efforts in content moderation began in the 1990s.

“Usually it was self-organized,” Roberts said. “These were [moderation] communities that people participated in voluntarily. Some of them had reputations for … a real laissez-faire attitude toward the kind of things that were permissible. Others had reputations that were more draconian, and there was everything in between. But even with those laissez-faire models, I would argue that a form of moderation policy has always been present.”

Silver, who heads Facebook’s Community Outreach Team, said that Facebook’s intent is to provide a forum for its 2.2 billion users, “to connect, to express themselves, to share,” with consideration for the numerous cultural, language-related, and ethical challenges that might arise from such a diverse online population.

“We know that there are going to be some things that offend or shock, because we are in a global community,” Silver said. “A couple of weeks ago, we launched our updated and revised community standards, that goes into a lot more detail of what is or is not allowed. Our job is to help enforce those policies consistently and accurately, because it creates those experiences for the broader community.”

To ensure Facebook is a positive experience for its users, Silver said the company’s goal is to hire a diverse content moderation workforce, with an emphasis on individuals who have linguistic and cultural skills, so that it can better monitor content posted by users worldwide.

Roberts began her study of commercial content moderation while working on her doctorate in library and information science at the University of Illinois at Urbana-Champaign. She said that the idea of content moderation, even in that scholarly space, was unheard of nearly a decade ago.

“From the smallest kind of website that might be doing [e-commerce] to the biggest ones out there — of course, Facebook was already important in 2010 — any site that opens itself up to user-generated content and then circulates it, has to have some mechanism to intervene,” Roberts said. “The funny thing for me was, it was hidden in plain sight. I started asking around, ‘Have you ever heard of this?’ These were people who were steeped in [online technology]. And to a one, they said, ‘I’ve never thought of that.’ The second thing they said was, ‘Don’t computers do that?’”

Roberts described human cognition as “essential” to monitoring user-posted content. Silver echoed those sentiments, saying that while automation can detect clear-cut violations such as terrorist threats, more ambiguous misuses such as hate speech require the human touch. Both noted that content moderation workers often take great pride in their work and feel that they are providing a vital service to society.
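Silver’s division of labor between machines and people implies a triage architecture: automated classifiers act on high-confidence violations, while ambiguous posts are escalated to human reviewers who have the context to judge intent. The sketch below is a minimal illustration of that routing logic, not Facebook’s actual system; the classifier, thresholds, and queue labels are all hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str

def classifier_score(post: Post) -> float:
    """Hypothetical policy-violation score in [0, 1]; a real system would
    call a trained model here. This stub only illustrates the interface."""
    banned_phrases = {"terrorist propaganda"}  # illustrative stand-in
    return 1.0 if any(p in post.text.lower() for p in banned_phrases) else 0.4

def triage(post: Post, remove_at: float = 0.95, review_at: float = 0.3) -> str:
    """Route a post three ways: auto-remove clear violations, escalate
    ambiguous cases (e.g., possible hate speech) to human review, and
    allow the rest. Thresholds are assumed values, not Facebook's."""
    score = classifier_score(post)
    if score >= remove_at:
        return "auto_remove"   # high-confidence, machine-detectable violation
    if score >= review_at:
        return "human_review"  # needs cultural and linguistic context to judge
    return "allow"

if __name__ == "__main__":
    print(triage(Post("1", "Watch this terrorist propaganda clip")))  # auto_remove
    print(triage(Post("2", "You people do not belong here")))         # human_review
```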

“Some of the people that I’ve talked to who have been successful in this work … find an altruistic orientation to the work that they do,” Roberts said. “They have this sense that they are doing this kind of work, seeing what they’re seeing … to help people in crisis. Sometimes, content moderators are the ones who call in local law enforcement to help somebody. Sometimes, it’s just keeping some of these [objectionable] things out of other people’s view.”

Roberts, who recently was featured in and served as a technical advisor on the Sundance-premiered documentary “The Cleaners,” said that content moderation workers often are not recognized as essential players within social media companies, even though their occupational hazards rival those of some of the most dangerous and psychologically damaging professions.

“You never know who’s going to be good at it,” Roberts said. “People don’t necessarily know what will be difficult for them. Sometimes, people go in really stoic — ‘This won’t affect me, I’m strong.’ And it just hits them, hits them in the wrong way.

“A lot of people say it’s like being a journalist who is subjected to a lot of war-zone footage, or a police officer who sees a lot of tough things on the job,” Roberts said. “But the difference for the moderators that I’ve talked to is that [content moderation] doesn’t have the social status. [There is something] empowering about being able to talk about this work so openly. I hope we can bring some awareness about the work that people are doing.”

Above: UCLA Assistant Professor of Information Studies Sarah T. Roberts (right) in conversation with Facebook VP Ellen Silver on the challenges of monitoring online content from users of the platform worldwide. The event was hosted by the UCLA Tech and Innovation Initiative on May 10.