YouTube this afternoon detailed new policy changes going into effect after the latest controversy involving YouTube creator Logan Paul, who had advertising temporarily suspended on his account today over videos featuring the abuse of dead animals. YouTube has reportedly been working on these policy guidelines since the start of the year; CEO Susan Wojcicki said earlier this month that one of her company’s goals in 2018 was to create clear policies to punish creators who do “something egregious that causes significant harm” to the community at large.
The changes are straightforward: YouTube says it reserves the right to strip a channel of its ability to serve ads and its access to premium monetization programs like Google Preferred and the YouTube Partner Program, as well as the right to cease recommending a channel’s videos across its network, if that channel proves harmful to the broader YouTube community. YouTube describes its community as including “advertisers, the media industry, and most importantly, the general public.” It’s not clear what exactly fits the definition of harmful in these cases, but it seems as if YouTube will take cues from both the greater YouTube community and the general public.
The Google-owned video site, as a platform-owning tech company, has always held these powers over channel owners. However, YouTube has been relatively hands-off over the years as it’s ballooned into one of the largest media destinations on the internet, in large part thanks to the symbiotic relationship it has with creators, who share in YouTube’s ad revenues. The company’s longstanding approach — combined with recent sea changes in public sentiment toward Silicon Valley and its responsibility to police its platforms — has resulted in a reckoning.
Offensive and immature creators like Paul, and to a lesser extent YouTuber Felix “Pewdiepie” Kjellberg, are now forcing the company to make tough decisions about what types of content and off-site and on-video behaviors it deems unacceptable. The changes introduced today more transparently codify YouTube’s newer, stricter approach toward behavior the company feels harms its reputation and makes all of YouTube look lazily unregulated.
“We’ve long had a set of community guidelines that act as rules of the road for what creators can share on our platform and a set of ad-friendly guidelines for what they can monetize. We also have a system of strikes we use to enforce those guidelines which can ultimately result in a channel’s termination,” writes Ariel Bardin, YouTube’s vice president of product management, in a blog post. “But in very rare instances, we need a broader set of tools at our disposal that can be used more quickly and effectively than the current system of guidelines and strikes.”
In no uncertain terms, Bardin describes — without ever naming Paul — the damage that can be dealt by large, immensely popular channels acting without any regard for the consequences. “When one creator does something particularly blatant — like conducts a heinous prank where people are traumatized, promotes violence or hate toward a group, demonstrates cruelty, or sensationalizes the pain of others in an attempt to gain views or subscribers — it can cause lasting damage to the community, including viewers, creators and the outside world,” Bardin writes, referencing how channel owners like Paul have millions of fans, many of whom are impressionable young children.
“That damage can have real-world consequences not only to users, but also to other creators, leading to missed creative opportunities, lost revenue and serious harm to your livelihoods,” Bardin adds. “That’s why it’s critical to ensure that the actions of a few don’t impact the 99.9 percent of you who use your channels to connect with your fans or build thriving businesses.”