YouTube says Nashville body cam video can stay online

YouTube said Tuesday that police body camera video of Monday's school shooting in Nashville, Tennessee, would normally violate its policy against graphic violence, but that the platform will allow the video to remain up with certain safeguards.
The Google-owned company said the video is in the public interest because it helps people understand what happened in the shooting.
The company also said it monitors its platform for videos, live streams and comments that glorify violence in violation of YouTube rules.
“Following the tragic attack in Nashville, Tennessee, certain images released by the Nashville Police Department have been age restricted with a warning interstitial due to their graphic nature and will remain on YouTube as they are in the public interest,” Jack Malon, a YouTube spokesperson, said in a statement.
“Additionally, to ensure people are connected with high-quality information about this current news event, our systems highlight videos from authoritative sources in search and recommendations, including by appearing on our homepage as well as on the Top News shelf above related search results,” he said.
Facebook displayed the video in a similarly restricted way Tuesday, with a disclaimer that it contained graphic content and a requirement of two clicks to view it. Meta, Facebook’s parent company, did not immediately respond to a request for comment.
The Nashville Metropolitan Police Department posted about six minutes of footage to its YouTube and Facebook pages Tuesday morning, combining views from two officers’ body cameras. The YouTube video had over a million views as of Tuesday afternoon.
The video shows the moment Officers Rex Engelbert and Michael Collazo confronted and killed the gunman who killed six people, including three 9-year-old children, at Covenant School. Part of the shooter’s body is blurred in the moments during and after the shooting.
YouTube’s policy prohibits content depicting “traffic accidents, natural disasters, aftermath of war, aftermath of terrorist attacks, street fighting, physical assault, self-immolation, torture, dead bodies, demonstrations or riots, robberies, medical procedures or other similar scenarios in an attempt to shock or disgust viewers.” The ban also covers “crime footage” when it has no educational value for viewers.
Meta has a nearly identical policy: it prohibits especially graphic content but allows it with certain restrictions to help people condemn violence or raise awareness.
YouTube has put several hurdles in front of viewing the footage: users must confirm they are at least 18 years old, and they must click through an interstitial message warning that the content has been identified as inappropriate for some audiences.
YouTube has used similar restrictions in the past, including in January when Memphis police released body camera footage of the assault on Tyre Nichols.
Mass shooting videos have been a difficult problem for tech platforms like YouTube, Facebook and Twitter.
Most platforms have banned the reposting of videos created by the shooters themselves, which are produced to encourage and glorify violence.
These include videos from shootings in Buffalo, New York, and Christchurch, New Zealand, both of which were streamed live by the shooters themselves.
Tech companies have even created an industry group, the Global Internet Forum to Counter Terrorism, to help coordinate their defenses against extremists.
However, body camera footage and security footage of notable incidents have been treated by the platforms as potentially more useful for public discourse, despite often depicting violence, killings, police brutality and other sensitive material.
In 2020, for example, at the height of the nationwide Black Lives Matter protests, YouTube left up videos showing the killing of George Floyd in Minneapolis.