Facebook Receives Over 1 Million User Violation Reports A Day

A smartphone user shows the Facebook application on his phone.

According to Monika Bickert, Facebook’s head of policy management, the company receives more than one million reports of user violations every day. She said, however, that she does not have data on what percentage of those reports are serious or how many result in content being removed from the site.

Bickert addressed the fine (and imperfect) line between free speech and hate speech at SXSW’s first Online Harassment Summit on Saturday.

“You can criticize institutions, religions, and you can engage in robust political conversation. But what you can’t do is cross the line into attacking a person or a group of people based on a particular characteristic,” she said.

Facebook’s criterion for determining whether content counts as hate speech is whether it attacks people based on their actual or perceived race, ethnicity, sex, religion, national origin, disease, disability, or sexual orientation. Content of this kind is not allowed on the site.

The social media giant encourages respectful behavior. Because people from many different cultural backgrounds use the platform, Facebook has to balance their needs, safety, and interests. As a result, the company may remove sensitive content or limit the audience that sees it, but the social network finds this rule hard to enforce.

“When it comes to hate speech, it’s so contextual … We think it’s really important for people to be making that decision.”

She hinted, however, that automation will someday play a bigger role in this area. She also noted that the number of reported violations is “steadily increasing” now that Facebook lets users flag what they consider hateful content directly from their devices.

Here’s a Facebook Reporting Guide.


Facebook Reporting Guide
