My experience at the pointy-end of YouTube content moderation

In August I published a video called Trump, QAnon and The Return of Magic. It’s about the increasing popularity of magical thinking, decision-making that’s based entirely on feeling, with few concessions to reality. While I poke fun at these beliefs throughout, the video is ultimately a plea for empathy and patience with people who believe in conspiracy theories like QAnon.

The video was well received and decently popular. It was a staff pick at Vimeo and led to my first assignment with The New York Times and a companion video about what to do about Q.

QAnon has been banned from all major platforms. Anons (people who follow Q) are clever and resourceful at exploiting systems to their own ends, and if they can’t post their own videos, they’ll try to suppress videos that are critical of their beliefs. So I wasn’t surprised to wake up one morning to an email notifying me that The Return of Magic had been removed from YouTube. Anons might have mass-reported it, or it might have gotten swept up in a QAnon dragnet. Either way, it was gone.

The email notification from YouTube

I received no information about why the video had been removed other than that it was “cyberbullying and harassment.” I couldn’t comprehend how anything in the video could qualify as either of these. I immediately appealed and said the video had likely been removed in error. I claimed it was of high social and educational value and said if there was a particular clip that was problematic, I could remove it.

A day later my appeal was rejected without elaboration. There is only one appeal, so after that there are no paths forward within the system. You are on your own. If you want your video back online, you have to wage a guerrilla PR campaign. So I did.

After my appeal failed, I went public

Since I have a moderate social media following, I was able to get tips from other creators, who said I should put the word out and tag @TeamYouTube. My posts generated a bit of buzz for a few hours, and then TeamYouTube replied:

TeamYouTube’s reply

The next morning the video was back online without notice. Later in the day TeamYouTube replied:

TeamYouTube’s follow-up

Within 48 hours, my video had been removed, my appeal had been rejected, I had gone public, and the video had been reinstated. Aside from being a distraction, it all worked out fine, and yet I was left with some sobering insights into the YouTube content moderation system. Here are the issues I see.

· Getting censored generates publicity. The most glaring backfire effect of censorship is that it publicizes those who get censored. People love a fight, and censorship creates a fight. I got a lot of engagement posting about this. When we censor, we often amplify speech we intend to suppress.

· The YouTube system is a black box. How does this stuff work? Did an actual human look at my case? Who are these people? Is it a smoky room full of men with cigars? Is it Rehoboam from Westworld? Nobody outside YouTube knows. People can’t trust something this opaque.

100% real footage of YouTube’s content moderation AI. Or Rehoboam from “Westworld.”

· Feedback was extremely vague. Any effective system needs to include us in a feedback loop where we are educated about the rules of the system and how to use it. This applies to everything from the stock market (you made money, that’s good) to video games (you died, that’s bad). This can prevent users from ever needing to be censored because they’ve been persistently nudged back within bounds. The only feedback I received was that the video was “cyberbullying and harassment,” which — besides being wrong in my case — is very ambiguous. How was I breaking their terms?

· Getting censored is radicalizing. I’m too old to get that worked up over things like this, but I can easily imagine how being silenced with no recourse leaves people feeling victimized and angry. We have long memories with this stuff and we hold these grudges for a good long time. Much like military oppression will generate terrorism, censorship will generate digital soldiers with an endless appetite for conflict, vandalism, and mayhem. Each time we censor, we create enemies.

Alas, I think the unfortunate reality is we are stuck with censorship. The boundaries on major platforms should not simply be the boundaries of free speech, meaning anything that is legal to say can be posted. There are thousands of people out there who want to argue endlessly that the Holocaust never happened. There are yet more who claim the Sandy Hook mass shooting was fake and the devastated parents of murdered children are “crisis actors.” And there are absolute hordes who think everybody and their dog is a pedophile, and they will swarm arbitrary targets and hurl the most vile accusations imaginable.

Much of this activity might actually be legal, within the confines of free speech, but I see no good reason why YouTube, Facebook, Twitter, Instagram and the rest should platform things like this. (I’ll discuss this in more depth in my next video, which is about Joe Rogan and Alex Jones. Subscribe on YouTube or join my mailing list if you’d like to be notified about new videos.)

The tool of censorship is necessary, but we need to use it in the most responsible and transparent way we can. This requires us to make the best system we can. I think this system must involve people who are not YouTube employees.

By continuing to censor using opaque and unaccountable systems, we will generate more and more ill will and create more and more radicals. Platforms like YouTube need to get ahead of this trend and build trusted governance. My fear is that if companies don’t create these systems themselves, the tech incompetents in Congress will one day foist some Kafkaesque dystopia upon us.

I don’t intend to disparage the moderation work being done at YouTube and other platforms. I am entirely certain the world is immensely better with them than without them. And TeamYouTube spotted my issue and corrected it quickly and courteously. Kudos to them. I recognize that the many good decisions here are invisible and the minority of bad ones are viral. But I think the current system needs strong reform now, before things get out of hand.

Now is the time to start creating a system that is radically more transparent, accountable and responsive. Now is the time to start devising a new kind of internet governance. Can we have a robust series of interventions that nudge people away from ever reaching the point where censorship is the only choice? Can these arguments and decisions be public? Can we have juries drawn from a diverse range of stakeholders, like other YouTubers? Can these huge monopolies cede important authority to their communities?

I realize this undertaking is a galactic-scale can of worms. It will be difficult, it will sometimes be embarrassing, it will sometimes be ugly. But it will evolve, and one day it will be something worlds better than what we have. It can be rolled out across the industry. If tech companies don’t do this, I fear crusaders will storm the gates and build something worse.

Follow me on Twitter
Subscribe on YouTube
Buy my documentary, This is Not a Conspiracy Theory
Buy an Everything is a Remix t-shirt
Subscribe to my very occasional newsletter

Watch THE RETURN OF MAGIC for yourself
