Banned From YouTube: Common Slip-Ups

There are lots of reasons why your video might get pulled from YouTube, and not all of them are blindingly obvious. Here are some common violations that could see your clip removed from the site by Google.

Last week, I tested the waterproof LifeProof Nuud iPad case for Lifehacker by chucking it into a bath with my three- and five-year-old daughters. The accompanying video, which contained contextually relevant mild nudity, was removed from YouTube within minutes of being uploaded.

“The YouTube Community has flagged one or more of your videos as inappropriate,” the stock email from Google explained. “Once a video is flagged, it is reviewed by the YouTube Team against our Community Guidelines. Upon review, we have determined that the following video(s) contain content in violation of these guidelines, and have been disabled.”

I have to admit this took me aback somewhat. While I understand there’s a lot of sensitivity around this issue, there’s clearly nothing insidious going on in the video — it’s just two kids having fun in a bathtub while testing the durability of an iPad case. To me, it would have been weirder to force them to wear swimsuits for their nightly bath.

Anyway, the incident got me thinking about the type of videos that get banned on YouTube and the reasoning behind these decisions.

Dems da rules

Google can choose to remove your videos — or even terminate your account — if you breach its community or copyright guidelines, which are broken down into their own separate strike systems.

While copyright violations are pretty obvious, the community guidelines are a bit greyer, and they’re the reason our LifeProof video got flagged.

Some of the content that YouTube specifically highlights as unsuitable includes pornography and sexually explicit content, animal abuse, drug/substance abuse, instructional bomb making, graphic or gratuitous violence, “gross-out” videos of dead bodies, hate speech and spam.

In addition, there is zero tolerance for predatory behaviour, stalking, threats, harassment, invading privacy, or the revealing of other members’ personal information.

“Violations of the Terms of Use may result in a warning notification or termination of your account. If your account is terminated you are prohibited from creating any new accounts,” Google warns.

Here’s how YouTube’s community guideline strike system works, as explained on Google’s YouTube support page:

Community guidelines strikes

When we remove content for violating our Community Guidelines, the uploader will typically receive a Community Guidelines strike (which is distinct from a copyright strike).

Receiving strikes

If you receive a Community Guidelines strike, you’ll receive a notification via email and in your Account Settings with information about why your content was removed (e.g. for sexual content or violence). If you feel that a video was removed without just cause, you can appeal the strike on your account.

We understand that users make mistakes, and don’t intend to violate our policies. That’s why strikes don’t last forever — if you don’t receive another strike for six months, your initial strike will expire. If you receive a strike, make sure to review the reason your video or comment was removed to learn from your mistake.

Here’s a bit more information about what happens with each strike you receive:

  • First Strike: The first strike on an account is considered a warning.
  • Second Strike: If your account receives two strikes within a six-month period, you won’t be able to post new content to YouTube for two weeks. If there are no further issues, full privileges are restored automatically after the two-week period.
  • Third Strike: If an account receives a third Community Guidelines strike before the first strike has expired, the account will be terminated.

Sometimes a video is removed for the safety of the person who posted the video, due to a first-party privacy complaint, court order, or other non-malicious issue. In these cases the uploader will not receive a strike and the account will not be penalized.
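For the programmatically minded, the strike lifecycle described above can be sketched as a simple state model. This is purely a hypothetical illustration of the published rules (strikes expiring after six months, a two-week posting ban on the second strike, termination on the third), not YouTube’s actual implementation:

```python
from datetime import date, timedelta

STRIKE_LIFETIME = timedelta(days=182)  # "six months", approximated

class Channel:
    """Hypothetical model of the Community Guidelines strike rules."""

    def __init__(self):
        self.strikes = []            # dates of currently active strikes
        self.terminated = False
        self.upload_ban_until = None

    def receive_strike(self, today):
        # Expire strikes older than six months before counting the new one.
        self.strikes = [d for d in self.strikes if today - d < STRIKE_LIFETIME]
        self.strikes.append(today)
        if len(self.strikes) == 1:
            return "warning"                        # first strike
        if len(self.strikes) == 2:
            self.upload_ban_until = today + timedelta(weeks=2)
            return "two-week upload ban"            # second strike
        self.terminated = True                      # third strike
        return "account terminated"


ch = Channel()
print(ch.receive_strike(date(2013, 1, 1)))  # warning
print(ch.receive_strike(date(2013, 2, 1)))  # two-week upload ban
print(ch.receive_strike(date(2013, 3, 1)))  # account terminated
```

Note that because expired strikes are dropped before each new one is counted, a channel that goes six months without an incident effectively starts with a clean slate, which matches the “strikes don’t last forever” wording above.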

Naturally, you should also only upload videos that you made or that you are permitted to use:

“This means don’t upload videos you didn’t make, or use content in your videos that someone else owns the copyright to, such as music tracks, snippets of copyrighted programs, or videos made by other users, without necessary permissions.”

Here’s Google on how its YouTube copyright strike system works:

Copyright strike basics

YouTube removes content when we receive complete and valid removal requests. When content is removed, a strike is applied to the uploader’s account. If you receive three copyright strikes, your account will be suspended and all the videos uploaded to your account will be removed. Users with suspended accounts are prohibited from creating new accounts.

Quick facts about copyright strikes

  • Receiving a copyright strike can limit your access to special YouTube features.
  • Copyright strikes are often confused with Content ID matches, which can result in a video being blocked. They are not the same.
  • You can view your strike information in the Copyright Notices section of your account.
  • Please note that deleting the video that received the strike will not resolve the strike.

While the above rules might seem pretty clear-cut, there’s plenty of wriggle room on either side, which makes for some curious anomalies in Google’s decision-making. For example, Robin Thicke’s saucy music video Blurred Lines was recently banned from YouTube for inappropriate content. Meanwhile, the barely distinguishable Tunnel Vision from Justin Timberlake was let off with just a warning. Clearly, the guidelines are extremely fluid and subjectivity appears to be a significant issue.

You could also find yourself inadvertently violating YouTube’s copyright rules. Our colleague Alex Kidman once received a copyright strike for a video in which the product he was reviewing happened to be showing content from the Pay TV music channel MAX.

“YouTube’s auto-robots picked that it had a split second of some concert or another playing on MAX within it, and wanted a signed statement that I had clearances for that,” Kidman explained to us.

“Almost certainly fair use for the purpose of review, etc, but not worth arguing/having the channel pulled for it. The easy tip from my side there is to make sure there’s nothing on a screen that might have its own copyright if you’re also using ads. YouTube seems distinctly less fussed if you’re not.”

I have no idea which inappropriate content area our LifeProof clip was supposed to fall into — if it was flagged as pornography, I think that says more about the viewer than the video.