YouTube should manage content responsibly

Written by Carolyn Kuimelis

In recent months, YouTube has been under fire for letting inappropriate—and sometimes disturbing—content slip through its filters. On Dec. 31, one of YouTube’s most subscribed-to vloggers, Logan Paul, uploaded a video to his channel in which he filmed a body hanging from a tree in a Japanese forest infamous for suicides. He starts the video by assuring his viewers that the title (“We found a dead body in the Japanese Suicide Forest”) is “not clickbait,” and that viewers should “buckle up.” Unsurprisingly, Paul faced immense backlash within hours of posting the vlog, even as views on the video skyrocketed. The YouTube community condemned him for his extreme disrespect and misuse of his influence. Unfortunately, Paul’s video is indicative of a larger problem surrounding online platforms. With exaggerated clickbait titles and grossly offensive videos becoming ubiquitous on the platform, YouTube is fostering an environment of extreme entertainment—one that has no morals and relies on shock value to get views.

It’s no secret that controversy elicits curiosity; people have been using flashy headlines to entice readers since the invention of the tabloid newspaper. But because anyone can enable their YouTube channel for monetization (meaning they receive money when someone watches an advertisement played on their video), set up an advertisement on their upload and create a flashy thumbnail, it seems that YouTube has turned into the perfect place for a competition over who can create the most over-the-top content and thereby accumulate the most ad revenue. YouTube has over one billion users—almost one-third of all people on the internet. That’s a lot of clicks. YouTube’s increasing popularity, combined with the decline of traditional cable television as an entertainment source, makes it an attractive career option for aspiring entertainers. According to Forbes, the highest-paid YouTubers made a combined total of $127 million in 2017. With more and more people relying on YouTube as their main income source, it’s no wonder that vloggers often resort to outrageous clickbait thumbnails to get views. Titles like “LIGHTING MY BROTHER’S POOL ON FIRE” and “EXTREME PRANKS GONE WRONG” make up the majority of videos on the trending page.

Of course, not all vloggers are money-crazy monsters; there are YouTubers who use their platforms to make a positive change. Last November, YouTuber Colleen Ballinger raised over $50,000 for childhood cancer prevention, and many YouTubers use their influence to promote good values and clean content. Still, the nature of YouTube promotes a culture in which content creators will do almost anything to get views, and many, like Logan Paul, forget when to put the camera down.

Of course, YouTube does attempt to do its part by removing inappropriate content from the website. The problem is, it just isn’t very good at it. Paul’s video was deleted in under 24 hours, but not by YouTube. Despite the company’s policy prohibiting violent or gory content posted in a shocking, sensationalized or glorifying manner, the video accumulated over 6.3 million views before Paul himself removed it from his channel. YouTube didn’t release a formal statement until after the controversy had already erupted.

For parents, YouTube is the perfect way to keep a young child occupied: it’s portable, engaging and there are thousands of videos aimed directly at children. Because of this huge market, there has been a vast proliferation of short, computer-animated kids’ videos. Additionally, many channels pirate popular T.V. shows they know children will recognize. The problem arises, then, with disturbing knock-off nursery rhymes and T.V. clips. Videos of a fake Peppa Pig eating her father or drinking bleach appear to an innocent child as inviting and safe. It may be easy for some videos to slip through YouTube’s filters, but when children unwittingly watch their favorite cartoon characters committing such atrocities, there is a problem.

Another way in which YouTube controls its content is through demonetization of videos that do not adhere to the Community Guidelines. Videos featuring drugs, violence, inappropriate language or sexually suggestive content, for example, are stripped of advertisements, so the creator earns no profit from views on the video. The trouble is that the guidelines seem to prioritize the wrong things. After the Las Vegas shooting, YouTuber Casey Neistat posted a video announcing a campaign to raise money for injured victims. He promised to donate all ad revenue from the video to his GoFundMe campaign. His video, however, was soon demonetized because its content related to a tragedy.

YouTube is popular because its format allows content creators to interact with viewers in a way that traditional media doesn’t. With the direct interaction that social media brings, it’s easy for popular YouTubers to gain a following of loyal subscribers. Thus, it is of the utmost importance that YouTubers—especially ones with young, impressionable fans—understand the influence they have over their viewers and think beyond monetary incentive when deciding what to post (or film). Unlike most mainstream celebrities, Logan Paul didn’t have an army of publicists filtering his content for the sake of his image. But common sense and basic morals should have told him how grossly disrespectful and harmful filming and posting the video was, especially given his wide digital influence. Not everyone is going to make the right call in deciding what constitutes inappropriate content. In those cases, YouTube has the responsibility to remove potentially harmful content—before the damage is done.