Some people are total assholes on Facebook: bullies who post things that are racist, sexist, and otherwise offensive. So you report them. But what happens when you do? Facebook Security has posted an explanation. The guide above (click to enlarge) looks a little like the game Mouse Trap, and it's not all that straightforward, but the company breaks down the process a bit, explaining that four teams (Safety, Hate and Harassment, Access, and Abusive Content), made up of hundreds of employees, monitor complaints 24/7. From Facebook:
If one of these teams determines that a reported piece of content violates our policies or our statement of rights and responsibilities, we will remove it and warn the person who posted it. In addition, we may also revoke a user's ability to share particular types of content or use certain features, disable a user's account, or if need be, refer issues to law enforcement. We also have special teams just to handle user appeals for the instances when we may have made a mistake.
In a community of 900 million users, it's good to know that Facebook is sorting through all the nonsense on the site, even if it doesn't always get it right.