
Google Relies On ‘Trusted Flaggers’ To Report YouTube Video Abuse

March 18, 2014

Enid Burns for redOrbit.com – Your Universe Online

The amount of video content uploaded to YouTube is overwhelming, and Google is looking to a community of about 200 people and organizations to aid its staffers by flagging videos with inappropriate content. Once volunteers flag videos, YouTube’s team makes a final ruling on which videos can remain on the site.

YouTube has a team of staffers that regularly monitors content in order to flag and take down offending videos. It also asks users to flag inappropriate videos, and posts community guidelines on the video sharing site. To supplement the in-house team, YouTube has enlisted about 200 people and organizations who can each flag up to 20 YouTube videos at a time, The Wall Street Journal reports. The program reportedly began in 2012.

Last week the Financial Times reported that the UK Metropolitan Police’s Counter Terrorism Internet Referral Unit holds “super flagger” authority and is using that status to flag videos for review and removal.

Guidelines for YouTube content are clear. While the video sharing site is careful to monitor for inappropriate content, it also wants to be an open place for content that is interesting to people around the world. “We’re not asking for the kind of respect reserved for nuns, the elderly, and brain surgeons. We mean don’t abuse the site,” it says on the community guidelines page.

When flagged videos are reviewed, YouTube has the option of restricting them to adult viewers or removing them from the site entirely. YouTube seeks to restrict or take down videos in several main areas: pornography or sexually explicit content; nudity, especially of a sexual nature; animal abuse; drug abuse, under-age drinking and smoking, or bomb making; graphic or gratuitous violence; “gross-out” videos of accidents, dead bodies or other shocking content; hate speech or activities; predatory behavior; threats or harassment; and spam.

Initial reaction to the Financial Times report that the UK Metropolitan Police’s Counter Terrorism Internet Referral Unit had received super flagger status was that Google was allowing the UK government to censor videos it doesn’t like. In response, Google disclosed more details about the program, The Wall Street Journal reports.

Google releases its Transparency Report quarterly to reveal government takedown requests. Super flagger status does not allow any government agency to take down content itself; it simply helps YouTube identify and remove content that doesn’t conform to its community guidelines.

“A person familiar with the program said the vast majority of the 200 participants in the super flagger program are individuals who spend a lot of time flagging videos that may violate YouTube’s community guidelines. Fewer than 10 participants are government agencies or non-governmental organizations such as anti-hate and child-safety groups, the person added,” wrote Alistair Barr and Lisa Fleisher of The Wall Street Journal.

While the number of users with super flagger status remains small, the ability to flag inappropriate content is available to any YouTube user who logs into the site to view content.

“Any user can ask for a video to be reviewed. Participants in the super flagger program, begun as a pilot in 2012, can seek reviews of 20 videos at once,” wrote Barr and Fleisher.

