A few hours ago we learned one of the most surprising pieces of news of the day: Facebook's policies for offensive content have been leaked. In short, how the company decides which images and videos (among other things) pass the filter and which do not (and which can lead to a user's account being blocked).
The aim of this article is to give concrete examples of how the social giant moderates content, so as to offer a clearer picture of Facebook's main guidelines for determining whether an image, a video or a status update is fit to be published on its platform.
In total, more than 100 internal manuals used to train moderators have been leaked, along with spreadsheets and flowcharts that offer an unprecedented view of the social network's guidelines for moderating the content published on it. According to The Guardian, the documents cover issues such as violence, hate speech, terrorism, pornography, racism and self-harm.
Graphic violence, a key issue
The social network firmly opposes videos and photos showing people or animals dying ... when sadistic attitudes are present, either in the content itself or in the comments that accompany it. Facebook defines sadism as "enjoying the pain or humiliation that a living being is suffering."
Facebook does not allow the death of a human or animal to be celebrated, whether in comments on the video by the person who uploaded it or by those watching it. In short, moderators look for sadistic manifestations both among those taking part in the conversation and from whoever originally shared the content.
These manifestations come in several forms: on the one hand, comments expressing pleasure at what is shown ("I love seeing animals suffer"); on the other, comments that celebrate the violence on display.
Images of abortion, meanwhile, are allowed as long as there is no nudity. This will surely be a subject of debate among the moderators themselves and the users of the social network.
What does not fall into this category is framed in three groups: giving an opinion on a violent act involving a public figure (for example, the execution of a dictator), supporting the death penalty, or rejoicing at the death of a criminal sentenced to death.
When it comes to animal cruelty, Facebook allows photos and videos that document it, provided they serve to raise awareness. Particular emphasis is placed on adding warnings to such images so as not to upset users' sensibilities.
Images that receive these protections include photos or videos of animal mutilation, as well as those showing repeated and indiscriminate violence against any animal.
Regarding violence against minors, this category covers videos or photos in which a minor suffers repeated physical or psychological abuse at the hands of adults. Showing a minor smoking tobacco or any other substance in a photo or video is also considered child abuse.
"Revenge porn", strictly prohibited
So-called "revenge porn" is one of the areas where Facebook takes a firm stance. For anyone unfamiliar with the term, it can be defined as the act of distributing a photograph or video of an intimate nature without the permission of the person who appears in it.
According to Facebook's guidelines, content classified as revenge porn must meet these criteria:
- The images were taken in a private setting.
- The person who appears in them is naked, nearly naked or engaging in sexual activity.
- The image has been published without consent, whether in a context of revenge against an ex-partner or because it was shared through media that spread the story.
Nudes are allowed in a historical context
According to what The Guardian published, adult nudes are allowed as long as they serve to illustrate historical events, such as life inside concentration camps or specific wartime episodes recorded in history books.
Thus, images showing the deterioration of a human body in the context of the Holocaust are perfectly publishable. The famous Vietnam War photo of children fleeing a napalm attack, which sparked criticism and debate a while ago, would also be allowed under this exception.
Self-harm images are not censored
As reported in the Facebook Files, the social network will not censor images in which a person harms themselves. Apparently, the intention behind this decision is to avoid silencing people who are going through a difficult situation.
According to Facebook itself, it wants to show people with these kinds of problems that they have support in the real world. What it will try to fight is the live broadcasting of deaths, so that events like the murder of a man that could be watched on Facebook Live do not happen again.
Hate speech is not allowed
In Europe, Germany has already taken steps to eliminate hate speech from Facebook. The German government has tightened the screws on the social giant, which wants to avoid at all costs becoming a forum for hatred.
For that reason, all videos, images or statements that incite hatred will be removed by the social network's moderation team.
In Engadget | Sex, violence and terrorism on Facebook: these are the rules that determine whether or not you see that content