
How Does Facebook Handle Those Inappropriate Content Reports?

June 20, 2012
Image Credit: Photos.com

Michael Harper for redOrbit.com

Facebook on Tuesday released a handy little infographic detailing how inappropriate, rude or spammy Facebook posts are handled.

As the service grew, so too did the volume of inappropriate and generally troubling content (pictures of adolescent girls chunking the deuce and sporting a duck-face notwithstanding). As such, Facebook began offering users a "report" button with which to flag sexually explicit content, hate speech or cries for help. Now, Facebook says they offer even more sophisticated tools for preventing this content and acting on it when appropriate.

Facebook posted an elaborate flowchart on their blog explaining just how this process works.

“There are dedicated teams throughout Facebook working 24 hours a day, seven days a week to handle the reports made to Facebook. Hundreds of Facebook employees are in offices throughout the world to ensure that a team of Facebookers are handling reports at all times,” according to their official blog.

“For instance, when the User Operations team in Menlo Park is finishing up for the day, their counterparts in Hyderabad are just beginning their work keeping our site and users safe.”

The Facebook User Operations team is split four ways to review specific types of reports: the Safety team, the Access team, the Abusive Content team, and the Hate and Harassment team. Depending on the content in question, a user's report will end up with one of these teams.

For instance, a post that wishes ill will or harm on another user will be directed to Facebook's Hate and Harassment team. If a poster mentions wanting to harm themselves, the post can even be escalated to local authorities to keep the user from hurting themselves or others.
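
To picture how that routing might work, here is a minimal Python sketch. The team names follow Facebook's infographic, but the report categories, function names and escalation hook are illustrative assumptions, not Facebook's actual code:

# Hypothetical sketch of how a report might be routed to one of the four
# User Operations teams. Category names and the escalation hook are
# illustrative assumptions, not Facebook's actual code.

TEAM_FOR_CATEGORY = {
    "compromised_account": "Access team",
    "spam_or_scam": "Abusive Content team",
    "hate_speech": "Hate and Harassment team",
    "threat_to_another_user": "Hate and Harassment team",
    "self_harm": "Safety team",
}

def notify_local_authorities(report_id: str) -> None:
    # Placeholder: per the article, credible self-harm posts may also be
    # referred to local authorities.
    print(f"Escalating report {report_id} to local authorities")

def route_report(report_id: str, category: str) -> str:
    """Return the team that reviews this category of report."""
    if category == "self_harm":
        notify_local_authorities(report_id)
    return TEAM_FOR_CATEGORY.get(category, "Abusive Content team")

print(route_report("r-1", "hate_speech"))  # -> Hate and Harassment team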

Facebook also says they have tools that allow concerned friends to report questionable content to a close group of friends who may know what's going on in that person's offline life.

If these teams decide the reported content violates their policies or their Statement of Rights and Responsibilities, they'll remove the content and warn the person who posted it, just as they do in their war against breastfeeding mothers.

“In addition,” continues the Facebook post, “we may also revoke a user's ability to share particular types of content or use certain features, disable a user's account, or if need be, refer issues to law enforcement. We also have special teams just to handle user appeals for the instances when we might have made a mistake.”
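
The escalation ladder described in that quote (remove and warn, revoke features, disable the account, refer to law enforcement, with appeals as a backstop) could be sketched roughly like this; the thresholds and function names are hypothetical:

# Illustrative escalation ladder following the actions named in Facebook's
# post. Thresholds and names are assumptions for the sake of the example.

from enum import Enum
from typing import Optional

class Action(Enum):
    REMOVE_AND_WARN = "remove content and warn the poster"
    REVOKE_FEATURES = "revoke ability to share certain content"
    DISABLE_ACCOUNT = "disable the account"
    REFER_TO_LAW_ENFORCEMENT = "refer to law enforcement"

def choose_action(prior_violations: int, is_criminal: bool) -> Action:
    """Pick a response, escalating with repeat offenses."""
    if is_criminal:
        return Action.REFER_TO_LAW_ENFORCEMENT
    if prior_violations >= 3:
        return Action.DISABLE_ACCOUNT
    if prior_violations >= 1:
        return Action.REVOKE_FEATURES
    return Action.REMOVE_AND_WARN

def handle_appeal(action: Action, upheld: bool) -> Optional[Action]:
    # The post mentions dedicated appeals teams for possible mistakes.
    return action if upheld else None  # None = action reversed on appeal

print(choose_action(prior_violations=0, is_criminal=False).value)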

Facebook's User Operations team also looks after your account in the event you forget your password or have your account compromised. The social networking site uses online checkpoints in these instances to authenticate your identity before restoring your account and prompting you to create a new password.
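
A bare-bones sketch of that recovery flow, with entirely hypothetical function names, might look like:

# Minimal sketch of the recovery flow described above: pass an online
# checkpoint to prove your identity, then regain access and set a new
# password. All names here are hypothetical.

def recover_account(user_id: str, checkpoint_passed: bool) -> str:
    if not checkpoint_passed:
        return "identity not verified; account remains locked"
    restore_access(user_id)
    require_new_password(user_id)
    return "account restored; new password required"

def restore_access(user_id: str) -> None:
    print(f"Access restored for {user_id}")

def require_new_password(user_id: str) -> None:
    print(f"{user_id} prompted to create a new password")

print(recover_account("alice", checkpoint_passed=True))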

Facebook also announced they work with outside groups and organizations to keep their users safe. Partnering with their Safety Advisory Board and the National Cyber Security Alliance, Facebook aims not only to keep everyone safe, but also to teach them how to stay safe online. Facebook also said they work with over 20 suicide prevention agencies worldwide, as well as maintaining a Network of Support for Facebook's LGBT users.




