Facebook reveals secret guidelines used to police extremist content


The company is also giving users the right to appeal its decisions on individual posts so they can ask for a second opinion when they think Facebook has made a mistake.

"We made a decision to publish these internal guidelines for two reasons", noted Facebook's VP for global product management Monika Bickert, in a blog post. "Hopefully, doing this will help people understand where Facebook draws the line on issues" and second, allow people to give network feedback so the developers can improve on decisions and guidelines.

Content that is subject to removal from Facebook is not necessarily illegal. To keep the standards from falling behind the times, the social network's safety team seeks input from outside experts and organizations every week, Bickert explained.

As its policies evolve, Facebook will release a searchable archive of the guidelines so users can track changes over time. What has not changed - and will not change - are the underlying principles of safety, voice and equity on which these standards are based.

It's all part of Facebook's attempt to exert better, and more transparent, control over what happens across the site, particularly in the wake of controversies around its involvement in the 2016 U.S. presidential election. The prohibitions do not stop at illegal content: Facebook also bans "attempts by individuals, manufacturers and retailers to purchase, sell or trade non-medical drugs, pharmaceutical drugs and marijuana", even in places where marijuana is decriminalized or legal. "Where the intention is unclear, we may remove the content", the section reads.

"One challenge is identifying potential violations of our standards so that we can review them", Bickert wrote.

Facebook bans all threats and calls to violence, and says its review team works to distinguish between "casual statements" and "content that constitutes a credible threat to public or personal safety". To begin with, appeals will be limited to posts that were removed for nudity or sexual activity, hate speech or graphic violence.

If Facebook removes a photo, video or post, it will alert the user that the content was removed for violating the site's Community Standards.


In cases involving government requests to remove content, Bickert said, formal written requests are required and are reviewed by Facebook's legal team and outside attorneys.

In May, Facebook will launch "Facebook Forums: Community Standards" in Germany, France, the U.K., India, Singapore, and the U.S. to gather feedback on its community standards directly from users.

As well as making clear exactly what sort of content is likely to attract the attention of censors, Facebook is introducing a new appeals process, giving people the ability to push back if their content is removed. "But we know there will always be people who will try to post abusive content or engage in abusive behavior", Bickert wrote.

Those limits on appeals apply for now, at least.

Facebook's Community Standards page has more details on what is allowed and what is not on the platform.

How Will Facebook's Appeal Process Work?

"We do not tolerate harassment on Facebook", the company wrote in one of the sections. This means you can memorialize an account of someone who passed away. We will share more details about these initiatives as we finalize them.

Facebook has been under fire from several governments recently, after Cambridge Analytica was found boasting about how it used the platform to sway not just U.S. elections but also certain state elections in India by manipulating the content that people see on Facebook.
