The Cleaners – So You Copped a Facebook Ban?

Earlier this week I was on the receiving end of a three-day Facebook ban. I'd already been banned once for 24 hours (that was for calling out a racist on their racist behaviour), so the next step was three days.

The first time I was banned on Facebook, I had directed some unsavoury language at the person making racist comments. It was – as per Facebook's rules – targeted harassment. I understood exactly why I had a 24-hour ban, and made sure that if I were ever shouting down anyone spewing hate speech again, I would be less direct in my anger. So far, that has worked fine.


Then, on Monday the 13th of August, I replied to a comment on a thread about James Gunn. See, the hot water James Gunn keeps finding himself in got a little hotter as photos of him at a paedophile-themed party surfaced. The gist of the thread was that paedophile-themed parties are in poor taste (no argument there). I jumped in to comment that, yeah, they're in poor taste, but what of Prince Harry and his Nazi costume? (Had I not been given a ban, I was also going to point out that serial killer-themed parties are a thing.)

To make my point, I dug into the Google-sphere and plucked up the notorious image of Prince Harry wearing a Nazi costume to a themed party. Ironically, I didn't want to use The Sun's version of the image because I didn't want to spread that toxic newspaper any further across the internet.

Less than a minute later, I was logged out of Facebook. I thought that was strange, went to log back in, and was presented with a notice from Facebook explaining the ban. I knew immediately what had happened – the photo of Prince Harry in his Nazi attire had been picked up by Facebook's image scanning, and the swastika caused the 'Nazi propaganda' alarm to ring. I knew I hadn't been reported by someone, because the reporting function on Facebook simply doesn't work that quickly.

And, y’know what, I was fine with the ban.

Why wouldn’t I be? Facebook have already stated that they want to remove hate speech, fake news and other toxic dialogue from their platforms. If I happen to get swept up in the wave of censorship on Facebook, then so be it. Sure, I’d innocently posted a photograph of a person wearing a Nazi costume, but that image alone could be perceived as being pro-Nazi or anti-Nazi. Context means nothing when you’re sharing an image of a swastika. 

Alongside my three-day ban, I've lost the ability to post on The Curb's Facebook page. This includes scheduled posts that I'd organised for #AUSgust and for reviews that I'd written. This is mildly frustrating, but really, it's almost no different from what's happened to Alex Jones. The main difference between Alex Jones and me is that Alex Jones was targeted for removal from Facebook and YouTube for his continual lie-driven hate speech, whereas I was simply the victim of an algorithm that exists to scrape away the hate on the platform.

(It's worth noting that I posted the same picture on Twitter and nothing happened. But we all already know that Twitter is a platform that thrives on hatred. After all, it's had countless opportunities to ban the President for his hate speech, but has declined to do so because… well… reasons.)

The documentary The Cleaners screened at this year's Revelation Film Festival. It looks at the outsourcing of censorship on social media, following the anonymous people who sit in offices in Manila, day in, day out, actioning the reported images on social media platforms. They have their rules and guidelines for what to remove and what to keep active.

The Cleaners looks at the impact that scouring through these images has on the people who are outsourced to 'fix' the work of major corporations like Google, Facebook and YouTube. It looks at the ethics of shovelling damaging work off to countries where the pay is low and the work is scarce. The moderators witness everything from absurd artistic renditions of a naked Donald Trump with a small penis, to beheadings, to child pornography, to acts of war – and their role is to censor and remove such images and events from social media.

Take the photo of Phan Thi Kim Phuc, aka ‘Napalm Girl‘, for example. This is one of the most powerful images about war to ever exist. It’s unforgettable. It speaks volumes about the horrific nature of war. Yet, because it contains nudity and violence, it falls foul of the rules and guidelines of Facebook.

So, it’s removed. 

The censor knows the cultural impact of the photograph, knows the value of it being in the public domain, and understands why it is being shared, but because of the rules, it simply cannot remain on the platform. So, while there are campaigns to 'free the nipple' and mothers are upset as another photo of them breastfeeding is removed, the flipside is that the algorithms and censorship of these social media platforms will never allow such a thing to exist. Is it hypocritical, especially when men are allowed to show all the nipples in the world? You bet.

But, let's not forget that even though there are some 2.5 billion monthly active users, the fact remains that Facebook, Twitter, Instagram, WhatsApp, Snapchat, YouTube, and whatever other social media platforms exist on the fringes of these behemoths are free to use. Yeah, we may laugh or rant and rave at how old tweets come back to haunt us (hey Tomi Lahren and James Gunn!), and sure, we cringe at the idea of our data being mined incessantly, but we remain glued to our feeds all the same. The reveal of Cambridge Analytica may have caused a downturn in users on Facebook, but it continues to be the social media app of choice that people can't quit.

And how can they? Earlier this year I left Facebook for a few months. I needed a break. I needed to stop lying in bed at 1am scrolling through the feed waiting for a new update from somebody I've never met on the other side of the world, ignoring the fact that I needed to be awake at 6am to (ironically) go work as a data manager. So, I shut down my page and left the platform. I gave up Twitter too.

What I found was that I was more alone than ever. Sure, I'd gone through a separation, but the people I interacted with on a daily basis on Facebook or Twitter were suddenly gone. I'd sent an email or two to see if I could maintain a connection that way, but inboxes were rarely checked and the responses came late. The limits of my social circle (as in, it being non-existent) were suddenly evident. I knew nobody outside of Facebook and Twitter. Even though I existed in society, I was adrift in a world of strangers. It terrified me that I'd quietly become reliant on social media to keep track of what was going on in the world.

Now, it is possible to break free of social media and live a life without being tethered to constant notifications, without being part of a system that may well bring about World War III. It is. But, when you're running a website and trying to keep up to date with society as a whole, you become extremely reliant on these platforms to deliver you news. Long gone are the days of the RSS feed. Onwards with the connected digital future.

And Facebook knows how integral it is to society. It's well aware of the effect it has on elections. It has an issue with hate speech, and it has to do something about it. The UN advised that Facebook played a role in spreading hate speech against the mostly Muslim Rohingya minority in Myanmar. Footage of violence against the Rohingya appears in The Cleaners, highlighting the difficulty that censors have to grapple with when it comes to live footage. How do they deal with footage showing the escalation, and the potential, of violence?

One censor talks about a live-streamed suicide attempt. He can't cut the feed while a person stands on a chair with a noose around their neck, as they haven't breached any of the rules and regulations of the platform. Yet, once the person steps off the chair and commits to the act of suicide, the feed must be cut as it shows violence. The effect of sitting there, waiting for a person to potentially kill themselves, is terrifying – and it speaks directly to how impersonal the whole apparatus of social media is.

Running under the guise of connecting everybody everywhere, the reporting features on Facebook, Twitter, Google, and YouTube are extremely impersonal. Press on a tab, hit report, tick a box that fits the material you want to report, and a day or two later you get a response as to whether the post you reported broke the rules or not. When this happens to you, it's even more impersonal. There was no way for me to say 'well, I was posting the picture of Nazi Prince Harry to make a point about how we all have things in our past that we wish didn't exist'. There is no nuance. There is no conversation. There is simply a yes or a no.

Just as there's no way for that censor in Manila to get authorities to the house of a person who is in the midst of committing suicide. They simply have to sit there and watch. The fact that these corporations see no issue with farming out the task of removing dick pics, beheadings, suicides, and nipples from our news feeds shows how easy it is to dehumanise groups of people. Just because the pay is low and the work needs to be done doesn't change the fact that these are human beings who are made to look at these images so we don't have to see them. There are no counselling services available. There is simply a quota of images they need to go through for their work to be considered 'complete' for the day.

Sure, Facebook or Twitter may release statements about why certain posts were or weren't removed, but for the layperson, they simply tend to hand down a judgement and that's that, mattress man. Nothing more. Nothing less.

Some recommend reporting regularly and blocking often – to help send a message back to the algorithm as to what is 'good' and what is 'bad'. This has spawned a minor bout of activism on Twitter, where it has been suggested that users block Fortune 500 companies to send a message to the powers that be at Twitter that Alex Jones and his dangerous conspiracy-driven dialogue isn't welcome anywhere.

Social media platforms are used by the left, the right and the undecided to argue their points. The notion of 'free speech' has long been removed from social media. Our feeds are curated by ever-changing algorithms that we'll never fully understand. We want racism and hate speech removed from social media, but it's a double-edged sword. Anonymity has opened the floodgates for people to feel comfortable voicing their hatred. Yet, for those who simply want to exist and connect with others, the algorithms put in place to trap those sharing images or saying things that Facebook deems inappropriate or illegal may very well catch them too.

I know that if I had simply mentioned Prince Harry and Nazis, I wouldn't have copped a three-day Facebook ban – but that photo did me in. Would keeping The Sun's header have kept me safe? That's not a test I'm willing to take. I'm fine with the ban, as it shows the algorithm is doing its job of keeping Nazi material off social media. It's a greatly flawed system, but even when a photo of Nazi material is posted to make a point, its removal shows that the system does sometimes work. Sure, I want to shake up the system and not play by its rules, saying that I'm not going to toe the line. But the website is free, and the rules are there to be followed, because unfortunately, they have rapidly become the rules of society – whether we like it or not.

And here we are – the conundrum of social media.

We know that it is a sarlacc pit for our data, retaining it for longer than the 1,000-year digestive cycle of that fictional beast. We know that it's bad for us, but we've grown to rely on it so much that we simply can't let it go. We know it does as much harm as it does good. We know that we have become the product. We know that social media fractures society further than it should.

But we simply do not leave. 

Further reading:

Wired: Facebook’s Fight Against Fake News Keeps Raising Questions
The Verge: Who is Responsible for Taking Down Nazi Gifs?
The Verge: Why Facebook Banned Alex Jones – and Twitter Didn’t
Gizmodo: Legal or Not, Facebook’s Banning the Sale of Kodi Boxes
Gizmodo: Facebook Forced to Block 20,000 Posts About Snack Food Conspiracy After PepsiCo Sues: Report

Andrew F Peirce

Andrew is passionate about Australian cinema, Australian politics, Australian culture, and Australia in general. Found regularly talking online about Sweet Country, and reminding people to watch Young Adult.
