You know German filmmakers Moritz Riesewieck and Hans Block are sharp-minded characters when you learn they work collaboratively under the label “Laokoon,” named after the legendary Trojan seer Laocoön who in Greek mythology revealed the Trojan Horse as a dangerous fraud. In their works, Riesewieck and Block aim to reveal the Trojan horses of our time. This leads us to their incredibly timely and eye-opening film, The Cleaners, which goes behind the scenes of social media content moderation in unprecedented ways. The duo traveled to the Philippines to get to know the people who are paid (very little in American terms) to weed through controversial posts on Facebook and other networks, and also talked to many of the thought and economic leaders of social media technology.
The Cleaners was called “riveting… pack[s] a devastating wallop” in The Verge, and “equal parts fascinating and frightening” by the Toronto Star.
We talked to Hans and Moritz about how things have changed–if they have–in social media since the 2016 election and the start of this film, what they thought of the Facebook Congressional hearings, about German and American views on “freedom of expression,” about how the moderators see themselves as superheroes, and how the filmmakers made it safe for these people to talk about their grueling work despite worries about job and personal security.
What was your own entrance to using social media, and what was the moment or breaking point that led you to want to tell the important and complicated story in The Cleaners? Was there a particular case or story that made you go “Ah ha! We need to look into this and make a film”?
Moritz: We had a different social network in Germany before Facebook called StudiVZ, a similar concept. And then it completely disappeared when Facebook entered the market. At that moment I was quite reluctant, because I always found it weird to promote yourself with everything you’re doing every day. But I wanted to keep up with a colleague during travels, so I decided to install it and make a profile. I used it, became more and more of a user, and found it a wonderful tool to stay in touch with a lot of people and follow their lives. And see what’s happening, find new people, get in touch. That was the starting point for me at least, before all the problems with Facebook.
Hans: Same with me. I was an average user of social networks but liked the utopian idea of connecting people around the world, giving everyone a voice. That was fascinating.
The moment we started researching for the film, it was five years ago, in 2013, there was a cruel post on Facebook that was not deleted for a long time. It was a video of a young girl being raped by an older man. It stayed there for a while, it was shared 16,000 times. It was shocking. We asked ourselves, what happened? Why is material like this online? And: Why don’t we see it more often?
We started researching and got in touch with Sarah T. Roberts, a media scientist who was researching content moderation [for her upcoming book Behind the Screen: Content Moderation in the Shadows of Social Media], and she told us that there are “humans behind those decisions.” Because we’d thought it was maybe an algorithm, some kind of artificial intelligence doing a filter job. She said no, they’re humans and most of the work is being outsourced to the developing world, to countries you never knew [were involved]. One of the main spots was Manila, the Philippines.
That was interesting to us for several reasons. First, how hard it is to handle a job like this, where you sit in front of a screen and review 8-10 hours of the worst you can imagine. Second, that they decide about what we are supposed to see and not supposed to see. These are very important questions in times when the social media networks have become more and more popular, and for the younger generation, this is now the first tool to inform themselves.
So we started researching and went to Manila to discover a very secretive industry.
What were the challenges in filming these censors, these moderators, in the Philippines? How long did you have to spend getting to know them before they agreed to talk to you, and how did their companies try to thwart or prevent it? What creative liberties did you have to take to protect their privacy and film them?
Yeah, it was not easy, first, to identify which of these outsourcing companies that we’d never heard about actually work for Facebook. Because they ask their employees to use code words instead. For Facebook, the workers are supposed to say that they work for the “Honey Badger Project.” [laughter] It’s the favorite animal of Steve Bannon but has nothing to do with this project. Anyway, we were surprised to find out about that by coincidence. We collaborated with students in Manila who helped with the research; a lot of people told them they work for the “Honey Badger Project,” and when the students asked what that means, it was secretly shared with one of our team members. These companies do everything they can to hide the fact that, on the one hand, they work for these particular [social media] companies we all know, and on the other hand, how disturbing this job is.
At first, it seems like a nice way to earn their own money right after college. People are literally recruited from the streets. But they don’t know what they are applying for. Only once they are in training do they see that it means viewing disturbing content. It means, literally, all the horror of the world; they have to face it for 8-9 hours without skipping anything, and they have to return to it every day.
In most cases, they are the breadwinners; their families rely on it. And nobody dares complain about the job because it seems nice, in a nice environment, safe, so to speak. The salary is between 1-3 dollars, which is quite okay compared to some other jobs there.
So yeah, there were a lot of challenges, because we did not want them to lose these jobs if they didn’t want to leave. We needed to find a way to avoid security checks around these offices, because they have airport security scanners. So no chance at all to bring a hidden camera in. And if we did, would it have been worth it? What can you see with hidden camera footage? So we found an abandoned office in Manila where this [same] work was originally done; there were still these work “cabins” [cubicles], workstations inside, in blue!
We got access to the same software and made a precise copy, so we could then invite the workers who were willing to share their experiences to come to this office and demonstrate how they do this work. What we see in the film is precisely that — except for the fact that [in the film] they speak it aloud, “delete” and “ignore,” and explain their decisions; everything else in those scenes is drawn from their experiences.
And we can show that now because these people have since decided to leave their jobs. That’s why they are somewhat protected. And we talked to lawyers to find a gray area which allows us to have them share experiences from their jobs without disclosing hard facts. We never mention any company names or details from the contracts or their written guidelines.
So the employees signed a non-disclosure agreement but there were still things they could talk about?
Hans: Yeah because they all sign one, that’s how it is in the industry, but they are allowed to speak out because no one has the right to keep you from talking about your psychological problems, for example. That’s the gray area we used to get around it within the legal frame.
Moritz: But it’s really frightening. A colleague of ours was [secretly] photographed while talking to a subject, and this photo was sent throughout the company along with a warning that anyone who talks to us will lose his or her job. The private firms they hired to monitor their employees even scanned their Facebook accounts. There were a lot of reprisals creating an atmosphere of fear around these office towers. Which is by design.
Of course, that helps to keep the workers doing what they do every day without questioning too much. It keeps them from talking to journalists and it has worked for several years. Now we are probably the first to break the silence.
The newsman in the film [Ed Lingao] seems like a rare public figure willing to speak out against Duterte. Did he have concerns about appearing in your film?
Hans: I don’t think so, because in the Philippines he is a journalist, a public person who speaks out. That was not a problem. But what was very interesting to us when we first went to Manila was that this was kind of a blueprint of something that has happened all around the world, in the States or in Europe: there are new mechanisms for gaining power in politics. Duterte is the product of social media, so to speak. He stopped talking to journalists; he had a media blackout. Mocha Uson, a famous porn star in the Philippines who is also a singer and dancer, very popular, was the one who was able to interview Duterte. She ran the campaign for him; that was the reason why he won the election a few years ago.
Moritz: His main campaign slogan was “I will clean up.” That is what he was elected for, the promise to find easy solutions to complicated problems, to clean up society–what that means is that around 20,000 people have already been killed in the streets because of his politics. So when the content moderator in the film says* that what he does for the online sphere is similar to what Duterte does in the streets, this is really frightening, because this is a narrative that then makes this job much more than just a 9-to-5 job, it’s then a political mission to normalize most of the content on social media, and that is not what it should be.
*[“And just like our president is doing everything he can to keep the Philippines safe, I’m doing just the same in my job. I protect the users who use the app.“]
Hans: Maybe that is also another reason why Manila is a main spot for content moderation. We asked ourselves, why Manila? It has to do with the history of the country: it was occupied by Spain for many years and then the Americans came. We found out that some of the outsourcing companies promote the idea that, because of that colonization, the Filipino people share the same values as the Western world. That it’s perfect for the job, which is weird in a way, and not true at all.
And you get at that in your film, with the religious strictness and other belief systems.
Hans: Absolutely, something we didn’t expect at all when we first went there, that something like 90% of Filipinos are Christian, 80% of those Catholic. Sacrifice is a very strong term in that culture. The more you sacrifice the more social credits you can gain.
And that’s why this is in a way the perfect destination for that job, because it gives them meaning, as they “sacrifice” themselves for the sinners of the world in the digital sphere.
Given the 2016 elections and the fallout over Russian interference, what do you make of the changes that Facebook has instituted? Are they more open books than they used to be? If you follow the elections in the US that we just had compared to the 2016 election, how has Facebook changed–if they have–the way they handle political content and censorship?
Moritz: It’s a difficult question. First and foremost, we don’t want to blame only the companies themselves. We watched the [Facebook] Congressional hearings in full. We were actually surprised by the lack of education about the internet on the side of the politicians. What we saw there was almost real-life satire. It was like, now it’s no wonder the companies don’t have to fear anything about regulation. [laughs]
Whenever Zuckerberg was confronted by these massive problems, his answer was always the same: “Oh yeah, we are already working on that. I will follow up with my team. We can fix it.”
No, you can’t — you don’t even get the problem. It’s not about fixing details of your business. It’s about the question of why you should be allowed to do this business in the way you are doing it.
This is now the digital public sphere, it is important how the rules are designed and how they are executed. And you can’t just handle it in a way that is the most profitable. You need to handle it as a public servant. This is what it mainly should be like. The politicians just seemed to accept these statements, to just let them go. Probably because [these CEOs] are so renowned and prestigious, so important for the economy. But I don’t know the actual reasons.
They always try to fix problems in a way that is like, “Okay, now we change this guideline, we define it a bit differently. Now we will hire 10,000 more people so we have better quantity.” That’s not problem-solving; it’s just a race they try to win with the public, because they always try to pretend they are able to handle all these problems, and that’s not the case. The case is that these sites provoke society to split, that people are much more driven by affection [emotion] than they are by arguments. They don’t provide the public with an infrastructure made for discourse — it’s made for opinion. For confronting people even more. This is all by design.
So this would be changeable if they wanted but they don’t want to of course, because it’s not profitable.
Hans: There’s another important point. There’s a narrative that is really strong in Silicon Valley, which is that these tools are neutral tools. It’s just technology; technology has no bias. And that’s not true. There are people behind the decisions. For the content moderators, for example, it is very important who you are when you decide what is visible and what isn’t. And there are algorithms that favor some political opinions more than others.
I think we have to reflect more, as a society and as users of these social media networks, that this is a very powerful, influential tool that we are using. We have to become not passive but active members of the digital world. Fight for our rights and for more transparency behind those decisions. I think this is a process that has started, but we have to put pressure on those companies so that we know what is going on behind closed doors.
Moritz: Talking about biases: in a lot of cases, the content moderators in Manila told us that they can’t just follow the guidelines. It’s assumed they have objective guidelines that just need to be executed, but that’s not the case. There are so many gray areas. Think of art, satire, caricature, and the question of which guidelines to apply. In all these areas they need to decide in seconds about things that journalists usually take hours or days to make a decision about.
So what else could they do than use their gut feelings, they told us, because they fear losing their jobs if they’re not able to decide properly. Gut feelings mean nothing other than how you are ideologically and religiously influenced, what your mindset is. It’s really important who these people are in front of the screens. It’s all about the humans who decide and who are affected by these decisions.
Do you also follow Twitter as closely, for these same questions? Their CEO Jack Dorsey has had to deal with–some would say not deal with–the same problems as Facebook. What are the differences between the two platforms?
Moritz: First of all, these content moderators work for all the big social media companies, including Twitter. We decided to focus more on Mark Zuckerberg, on the executive side, because he’s a very public figure; everyone knows him. It’s different for the CEO of Twitter, who is a bit more behind the scenes. And of course Facebook has many more users worldwide.
Hans: And what we learned was that the guidelines for both Facebook and Twitter are very similar. There are some regions where they decide a bit differently, but generally they’re similar. So it’s the same problems. It would’ve been too complicated to try to say, this is Facebook, this is Google, this is Twitter.
Moritz: We really wanted to dig deeper. It’s not about this guideline or that guideline so much, it’s about the underlying principles. It’s about the idea of how to design this digital public sphere and make it a healthy environment. And is it fair to say that you want to create a healthy environment for a public of 3 billion people–what does that mean?
And talk about freedom of expression–I know for you guys here it’s even more of an important issue. Sometimes we feel so close to the US because of pop culture, from our German perspective. But then we found, when we participated in film festivals and talked to audiences about one scene from our film–where right-wing activist Sabo states that it should be his right to spread his words, I would say hatred, against refugees and minorities on social media, that it is covered by the First Amendment–most Americans in the audiences agreed, because the First Amendment is such an important part of the Constitution that people don’t want to change it; they want to believe in it.
On the German side, it’s the opposite; when German audiences are asked about the same scene, they completely disagree. They feel that it is not covered by freedom of expression, that it shouldn’t be his right to spread such hatred on social media. And of course, that has to do with our [German] history. But if we can’t even agree on the definition of “freedom of expression” between two countries that are such close friends, imagine the very different countries also participating in the same social network. As long as we haven’t agreed on this, the idea that we are a global village is fake and misleading.
You spoke earlier about access. What was it like gaining the trust of these content moderators who have come from such harsh backgrounds?
Hans: That may have been one of the most challenging parts of making the film. To gain the trust of the workers, to take the time, not to pressure them. When we first tried to get in contact with these workers we realized it didn’t make sense to immediately speak about the job. We really tried to build a relationship, a friendship with them first. We spent a long time in Manila before we started talking about the work, so they would feel safe to share. It was also very important for us, since we are Western white guys coming to the Philippines making a film about them, that they have the chance to speak, that it’s the perspective of the workers sharing their stories with us, that we are not forcing them–they are comfortable to do so.
Moritz: So we let them participate in the treatment of the film. It was very important for us that they always reflect their lives, their stories, and that they participate in the development process. For example, we didn’t want someone to say “My job is similar to what Duterte does” if he didn’t know what that means. We discussed that with them, we talked politics a lot, but it was their point of view. They insisted, “That is my position.” So we integrated it. We were quite surprised at how proud they were of what they do. “You can’t even imagine what your news feeds would look like without us; it would be a mess.” They’re proud for some good reasons: proud that they’re the ones who guarantee it can be a so-called safe space for the rest of the world, and proud to sacrifice themselves, as we already mentioned, for religious and cultural reasons. This was the main reason they want to do this job.
So for them, it’s that they want people to know they’re helping the world, they’re contributing, rather than censoring?
Hans: Yeah, they describe themselves as policemen and lawyers, as superheroes. One moderator that we didn’t include in the film was a big fan of Wonder Woman, and she said, “It’s the same. I’m like Wonder Woman, sitting in a hidden office, trying to help the world be a better place.”
–Craig Phillips with Danielle Orange.
Moritz and Hans say:
Who is censoring your social media feed?
Watch #TheCleanersPBS tonight at 10p on @PBS. — Independent Lens (@IndependentLens) November 12, 2018