As 2018 was ushered in, Logan Paul, a 22-year-old YouTube star with 15 million followers—many of them teens and younger—posted a video of a dead body hanging from a tree to his channel.
It was eventually removed (by Paul, not YouTube), and he issued two apologies this week in an effort to move past the outrage, but something about this felt different from other YouTube scandals. It exposed the chasm between content and creator, and the consequences of the raw impulse to document everything. To viewers, it was insensitive and sickening. To Paul, it was “a moment in YouTube history.”
YouTube isn’t just experiencing growing pains. It has reached a point where it fosters cognitive dissonance and accountability is absent. Advertisers have pulled money after ads were placed next to radical content; elsewhere, the LGBTQ community spoke out about YouTube apparently restricting content from certain channels as inappropriate. YouTube is attempting to trot out original content and live TV, with ad revenue to match. Its YouTube Red originals are apparently seeing millions of views. But its house is not in order.
‘Please the algorithm’
A few months ago, I fell asleep to a YouTube video about aliens. It was one of those curiously automated clips categorizing the different “kinds” of aliens who have visited Earth. The tone was light, and the video featured cartoonish illustrations. When I awoke approximately 20 minutes later, YouTube’s algorithm had brought me to a channel that trades in videos about hidden symbols in entertainment and secret reptilian celebs, as well as more nebulous videos about “hidden messages” in Pepe the Frog and repurposed news stories about Vladimir Putin. (The channel lists its country as Russia.)
It’s not surprising this content exists, but the swiftness with which I arrived there caught me off guard. YouTube has always leaned on its algorithms to “learn,” but they’ve taken over in more ominous ways.
While creator-first content and community-shaping personalities still thrive, cracks have started to show. Most troubling, its algorithm surfaced thousands of videos that contain abusive content aimed at kids or involving kids. Elsewhere, pedophiles were allegedly using YouTube to post illicit clips of children. (YouTube declined to comment for this article. Instead, it pointed to two blog posts by CEO Susan Wojcicki about combating harassment and abuse in the community.)
In a BuzzFeed report from December, it was revealed that YouTube channels depicting child abuse or endangerment under the guise of “family” entertainment were lucrative. In late 2017, the popular channel Toy Freaks, which featured Greg Chism and his two daughters engaged in distressing activities, was shut down, and Chism was investigated though not charged. Elsewhere, the weird subgenre of superhero videos that is targeted at children but contains very adult (and often violent) topics was flagged by rapper B.o.B., among others. Reached for comment on those videos in July, a YouTube rep said: “We’re always looking to improve the YouTube experience for all our users and that includes ensuring that our platform remains an open place for self-expression and communication. We understand that what offends one person may be viewed differently by another.”
In that same BuzzFeed report, a parent who runs a “family” channel and had his account demonetized because of questionable videos offered a telling quote: “In terms of contact and relationship with YouTube, frankly, the algorithm is the thing we had a relationship with since the beginning. That’s what got us out there and popular. We learned to fuel it and do whatever it took to please the algorithm.”
This is the beating heart of YouTube’s weirdness: it’s dishing out TV-style ad $$$ with absolutely none of the responsibility or accountability of TV https://t.co/On8XhnU43y

— Tom Gara (@tomgara) November 22, 2017
To reverse this new kind of worship, YouTube ostensibly requires more humans, and it has said it’s hiring a squad of 10,000 moderators to review offensive content. But, as proven by Facebook’s attempts at moderation, what humans see in the course of reviewing content has lasting effects. And the guidelines YouTube gives its human reviewers are apparently flawed: BuzzFeed recently interviewed people tasked with reviewing and rating videos, and found that it was often unclear what they were rating; YouTube’s guidelines put emphasis on high-quality videos, even if they were disturbing.
Free speech vs. hate speech
While Twitter has finally started purging hate accounts, YouTube’s scale and automation make that difficult. False reports and conspiracy videos about the Las Vegas shooting showed up at the top of search results in October. An August NYT report outlined the growing far-right presence on YouTube, one that’s young and “familiarizing viewers with a more international message, attuned to a global revival of explicitly race-and-religion-based, blood-and-soil nationalism.”
Felix Kjellberg, aka PewDiePie, one of the most popular personalities on YouTube with more than 50 million subscribers and broad brand recognition, yelled the N-word during a gaming livestream and paid two men to hold up a sign that said “Death To All Jews.” The Wall Street Journal found nine videos on his channel that include Nazi imagery or anti-Semitic jokes. This led to his contract with Disney being terminated and his show Scare PewDiePie canceled. Many fans claimed this was a violation of free speech, but roughly a year later his subscriber count has not dwindled. (Kjellberg did not respond to a request for comment.) In a piece for BuzzFeed, Jacob Clifton contended that Kjellberg isn’t some lone monster; he’s merely one “symptom of a larger illness” that’s sweeping online platforms.
In July, poet and musician Joshua Idehen tweeted a lengthy thread about the “YouTube thingy” and the state of gaming culture, racism, and harassment on YouTube, including PewDiePie’s stunt.
Hi, I’m a YouTube Thingy. I talk abt gaming, geek culture, the only two genders and Hitler’s good ideas. Feminism is cancer
— Joshua Idehen (@BeninCitizen) July 12, 2017
He told the Daily Dot he used to be an “edgelord,” a term for those people (and it’s typically men) who engage in provocative behavior and ideologies for laughs, but that he “only made it out with my humanity thanks to many, many good women.” Asked what YouTube should do to combat this sweeping tide, he offers: “Ban Nazis. Ban hate speech. Ban targeted harassment. Hire a dedicated moderation staff with all that world domination money.”
YouTuber Megan MacKay says “we’re getting a glimpse of the dark side of the democratization of content creation.”
“When everyone can pick up a camera and make a video without being beholden to anyone, we’re bound to eventually get content that crosses the uncrossable line,” she says. “I think clarifying and enforcing the terms of service is really the only way major platforms can ensure the safety of their communities while stemming the flow of hate speech and alt-right garbage, but even when these corporations take a performative stand against this type of content, they seem to fail or waffle when it comes to actually doing something about it. I can’t speak to why exactly they drop the ball, whether it’s fear of losing big-name users or conflicts with scale, but it’s a major problem that only ever seems to be halfheartedly addressed.”
This was reflected in YouTube’s official response to the Logan Paul video on Tuesday, which falls in line with so many of its other responses: A YouTube spokesperson said, “Our hearts go out to the family of the person featured in the video,” but beyond that, they just reiterated the Community Guidelines. The statement doesn’t offer any solutions or answers, and Paul will likely continue making money.
But YouTube has always banked on drama, and in the process has fostered fanbases that know no borders. In November, fans of Alissa Violet and Ricky Banks trashed a Cleveland bar online, and threatened innocent residents of the city, after the two were thrown out of the venue. Jake Paul, the younger brother of Logan Paul, is also wildly popular with children and teens, but people who don’t enjoy his sophomoric, braying brand of humor weren’t too happy about living next to his chaotic prank house. He was called out for being a racist after he joked that a fan from Kazakhstan might “blow someone up” and a bully after fellow YouTubers the Martinez twins accused him of abuse. (His “pranks” on the twins involved airhorns and leaf blowers.) After Hurricane Harvey, he showed up in San Antonio to “help” and his rabid fans created even more chaos in a Walmart parking lot.
In a 2015 profile of Logan Paul, who got his start on Vine, he expressed his desire to move past “clean” content and expand his fanbase: “I want to be the biggest entertainer in the world. That’s my deal. I’ll do whatever it takes to get that.” Two years later, he posted a video of a dead body and YouTube’s response was essentially the shrug emoji.
YouTube has built a massive global audience in its decade-plus of existence, one that hinges on self-starting. We’re discovering what happens when self-starting has no bounds, when an open platform has to keep revising its Community Guidelines instead of ripping them up and starting over. In her December blog post, Wojcicki claimed that YouTube’s goal for 2018 “is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.”
But what if some of those bad actors are your most popular creators? And is one step really enough?