The New Big Brother
February 13, 2018
In the aftermath of Logan Paul’s buffoonish video of a dead body in Japan’s suicide forest, YouTube and other social media platforms have been forced to reckon with the immoral and inappropriate material that they unknowingly distribute. This reckoning comes after months of pressure and scrutiny over the content on these platforms, from Twitter’s banning of Milo Yiannopoulos to the outrage over channels profiting from child exploitation. Logan Paul’s video may not have been social media’s first controversy, but it was the one that finally pushed YouTube to change how it monetizes and regulates content.
This is dangerous.
Historically, platforms have failed at restricting content. This is because censorship, though it may seem like an easy solution, is not an effective one. Previous attempts by YouTube to regulate content through keyword filtering led the platform to demonetize thousands of LGBT-related videos. While trying to restrict violent and sexual content, Facebook removed photos of breast-feeding moms and the historically important “Napalm Girl” photo from the Vietnam War. All of these decisions were met, rightfully, with public uproar. Every attempt at censorship has led to little improvement in content at the expense of free speech. However, YouTube’s most recent regulation is its most harmful. In an effort to regulate monetized channels, the platform demonetized all small channels. These smaller channels are now paying for Paul’s actions, literally. This decision will limit the ability of small channels to gain visibility for years to come.
Censorship always hurts the least powerful. The small YouTubers whose channels will suffer are only the first victims, and probably the least consequential. The restriction of speech on platforms usually comes at the behest of governments, which use these platforms as tools to control the flow of information. Nations like China, Egypt, and Israel control social media as a way to control their populations. Egypt, for example, uses its grip on social media to shut down conversations about governmental torture and brutality. At the Indian government’s request, Twitter banned users sympathetic to Kashmiri independence. Facebook blocked a journalist who helped report the Panama Papers for criticizing the Maltese government. In Israel, the government pressures platforms to uphold Israeli law by censoring Palestinians. The censored material ranges from photos and poetry to, in the case of Tamara Abu Laban, the words “forgive me” posted in Arabic as a status. For that crime, Abu Laban was arrested, fined 11,500 shekels, and sentenced to five days of house arrest.
Governments are not the only danger: platforms often put the tools of censorship in the hands of users, who frequently target speech that they dislike. Twitter in particular has a problem with users seeking to suppress such speech through mass reporting of accounts and content. Although this tactic is prevalent on both sides of the aisle, it is journalists and members of the #MeToo movement who have taken the largest hit. The Ukrainian news site Liga was blocked from Facebook following false reports of nude content on its page. Rose McGowan was suspended from Twitter in the early days of her campaign against Harvey Weinstein. Even more damaging for Twitter is the perception that it applies its rules selectively, with white supremacist accounts remaining up and harassment reports often going unanswered. It took Milo Yiannopoulos’ targeted racial abuse of Leslie Jones for Twitter to permanently suspend his account.
But while some cases are cut and dried, others are not. People perceive things differently. What is simply political speech to one person may be offensive to another. The distinction isn’t always clear, as in the case of Alex Zaragova, whose article on sexual harassment was removed from Facebook for its opening line: “Dear dudes, you’re all trash.” Was it humor or just an eye-catching opening? An attempt to draw attention to the complicity of many men in the sexual harassment of women? Or was it blatant sexism? Where do we draw the line? How do we distinguish the political from the hateful, the propaganda from the opinion?
These are questions that have no simple answer. That is why increased censorship is such an ineffective solution to disturbing content. Any attempt to restrict speech only complicates the problem. Instead, platforms should work to enforce the rules that they already have in place. Twitter must start taking harassment reports seriously. Facebook must double down on its commitment to warn users of fraudulent content. Platforms must practice transparency and due process. Users need to know that the rules are applied consistently and fairly. Likewise, platforms need to take responsibility for the decisions, both good and bad, that they make. Only then can the dream of a free and safe internet be realized for all people.