Don't Blame Your Indian Content Moderator
Sabrina Ahmad is a recent MSc graduate from the Oxford Internet Institute and a content designer in London, UK.
Most Internet users are insulated from the depths of human depravity online thanks to a curation process known as commercial content moderation (CCM). To whom does the psychologically taxing task of sorting through gruesome imagery fall? Increasingly, to poorly paid Indian college graduates. Indian CCM firms are reportedly hiring at an all-time high to meet the content moderation demands of platforms like Facebook and Google. It is an immense responsibility, one that bears on the well-being of billions of people.
While India and other developing economies may supply the labor that makes large-scale CCM possible, there has been skepticism about the quality of their services. In her research, UCLA Professor Sarah Roberts found a sharp dislike of Indian CCM firms and the work they produce. Multiple interviewees complained about the moderators’ output, calling the cultural difference “such a disaster” that they ultimately had to “cancel” the outsourcing. Roberts even admitted that her interviewees cast a “downward eye” toward the CCM workers in India.
Examples of the supposed ineptness of Indian CCM teams abound. In 2016, an American ex-YouTube content moderator described how the company’s Indian team had incorrectly moderated a video of fighting schoolboys: YouTube’s guidelines at the time prohibited footage of minors fighting, but the Indian moderators mistook the schoolboys for adults and let the video stand.
In another example, Indian content moderators were accused of failing to apply the Western norms of sexuality stipulated by the platform. One interviewee in Roberts’ research described an incident in which Indian moderators repeatedly removed pictures of bikini-clad individuals even though such images adhered to the guidelines. Applying a policy based on “United States or Western Europe” culture simply doesn’t work, the interviewee said.
More often than not, however, culture is used as a scapegoat to paper over deeper problems with platform content moderation policies.
Endeavoring to understand how culture affects CCM, I spent the past year at the Oxford Internet Institute interviewing Indian content moderators and executive leadership at Indian CCM firms, as well as U.S.-based policy specialists at global platforms that outsource moderation to India. Data from this research demonstrates that allegations of cultural bias are largely unsupported.
Few people outside the CCM community appreciate the range of content that moderators cover. These Indian moderators serve clients well beyond the big players like Facebook and Google, including, for example, the world’s second-largest e-commerce platform and niche dating websites for people with sexually transmitted infections. Policing each of these platforms requires rigorous training, which acts as an acculturating process intended to help Indian moderators understand Western clients. One executive at an Indian CCM firm described it this way:
Aditya: We have a strict three-weeks training before we deploy [moderators] onto any live project. [...] For example if you are working with any US clients, then our trainers will talk to those content moderators regarding the culture […] As a moderator you have to know the in-and-out of the countries, regions, technology, accent, everything.
A moderator at an Indian CCM firm corroborated this claim:
Kirit: Yes, actually [we] get good guidelines, it’s almost like 80 to 90 pages of guidelines. Our director is very particular about guidelines, so we get every day a quality report [...] Some [clients] are happy and some are not, because not every employee is as good as another, no?
This training appeared adequate for the majority of moderation work, though it fell short in some cases, particularly around sexual content. The point came up in an interview with a content moderator at a major video streaming platform:
Samantha: Yeah, at some point we actually removed our India team altogether […] It’s just cultural norms are different. In India what would be considered sex, and sexy, and too like over the top, we’re like, “But she’s got her clothes on, what are you talking about?”
The issue, however, may be less about culture than about the policies themselves, which often lack clarity. Indian moderators are no more susceptible to bias than moderators from anywhere else, and they are given comprehensive training to counteract it to the extent possible. Although the major platforms’ policies are U.S.-centric, even American moderators find them difficult to decipher. Facebook reviewers, for example, often make different calls on hate speech content and don’t always abide by the complex guidelines.
The double standard in accusing Indian moderators of cultural bias is clear. When a complaint is made about an individual Indian moderator’s work, it is attributed to their culture. When complaints about the application of guidelines are leveled against content moderators in the aggregate, they are attributed to complicated guidelines or insufficient training. The substandard quality of an Indian moderator’s work is thus dismissed on “cultural” grounds without consideration of other factors. As Roberts found, American CCM firms, frustrated by work increasingly going to non-U.S. vendors, have even tried to leverage their fluency with American culture and customs, using arresting taglines like, “Outsource to Iowa. Not India.”
The issue isn’t that Indian reviewers are incapable or lack the intelligence to moderate effectively. It’s that assumptions about the quality of their labor have been based on supposed cultural incongruence, without further inquiry. Content moderation, both human and algorithmic, will always be fallible. But platforms must identify where the heart of the problem really lies, and it is not in the culture of their outsourced labor.
* All names used in this blog are pseudonyms.