
Facebook rules reportedly allow livestreaming of self-harm

Facebook users are allowed to livestream acts of self-harm because the social networking giant "doesn't want to censor or punish people in distress who are attempting suicide," according to allegedly leaked internal documents revealed Sunday by The Guardian.

The images may be removed from the site "once there's no longer an opportunity to help the person," unless the incident has news value, according to the documents. The policy was found in a cache of more than 100 internal documents and manuals that The Guardian says give insight into how the social network moderates content on its site, including violence, hate speech, terrorism, pornography, racism and even cannibalism.


Facebook Live, which lets anyone with a phone and internet connection livestream video directly to Facebook's 1.8 billion users, has become a centerpiece feature for the social network. In the past few months, everyone from Hamilton cast members to the Donald Trump campaign has turned to Facebook to broadcast in real time.

But in the year since its launch, the feature has been used to broadcast at least 50 acts of violence, according to the Wall Street Journal, including murders, suicides and the beating of a special-needs teenager in Chicago earlier this year.

The feature presents a dilemma for the social networking giant: how to decide when to censor video depicting violent acts. But Facebook's response to graphic content has been inconsistent.

The company has taken flak for removing material with social significance, such as a livestream showing the aftermath of the shooting of a black man at a traffic stop in July, and a post of an iconic Vietnam War photo that was taken down because it included child nudity. (Both of those posts were later restored.)

To address the issue, CEO Mark Zuckerberg said earlier this month that Facebook will hire 3,000 more people over the next year to monitor reports about violent videos and other objectionable material, adding to an existing team of 4,500 people who already review millions of reports every week.

Reports of self-harm on Facebook are on the rise, a trend that concerns the social network, according to The Guardian. One document reviewed by the newspaper showed 4,531 reports of self-harm in a two-week period last summer; a comparable period this year produced 5,431 reports.


"We're now seeing more video content -- including suicides -- shared on Facebook," the company reportedly said in a policy update shared with moderators. "We don't want to censor or punish people in distress who are attempting suicide. Experts have told us what's best for these people's safety is to let them livestream as long as they are engaging with viewers.

"However, because of the contagion risk [i.e., some people who see suicide are more likely to consider suicide], what's best for the safety of people watching these videos is for us to remove them once there's no longer an opportunity to help the person. We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up."

The documents also purportedly explain how Facebook moderators are supposed to deal with posts that contain revenge porn, threats against President Donald Trump and images of animal abuse, among a laundry list of other questionable activities.

Facebook couldn't confirm the authenticity of the documents but said the safety of its users is its chief concern.

"Keeping people on Facebook safe is the most important thing we do," Monika Bickert, head of global policy management at Facebook, said in a statement. "In addition to investing in more people, we're also building better tools to keep our community safe. We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help."



This article originally appeared on CNET.
