Social Networks Struggling To Target Sexual Predators
Enid Burns for redOrbit.com – Your Universe Online
Teenagers are a coveted audience online, yet building a social networking site or app, or admitting that age group to a larger social network, comes with a responsibility that companies take on to varying degrees: the promise to identify and block sexual predators from grooming and meeting up with kids and teens.
Experts say that some of the best forms of monitoring combine technology and human observation. Reuters reports on a recent takedown stemming from chats on Facebook, and what other social networks do to keep unwanted activity from their sites.
In March, Facebook alerted authorities in South Florida after the social network’s software raised flags and the employee reviewing the flagged activity realized that a man had made arrangements to meet a 13-year-old girl after school. The police took the threat seriously and credited Facebook’s swift action in the arrest of the sexual predator.
Facebook commits a portion of its resources to monitoring suspicious activity to protect itself as much as its users. There is a very fine line between acceptable behavior and activity that needs intervention, and Facebook works hard to monitor only what’s really necessary. “We’ve never wanted to set up an environment where we have employees looking at private communications, so it’s really important that we use technology that has a very low false-positive rate,” said Facebook Chief Security Officer Joe Sullivan, in the Reuters article.
The software Facebook uses peers into personal interactions, such as chats and direct messages, to determine whether a particular user has made several attempts to arrange meetings with his or her contacts; it also flags strings of numbers that could represent a phone number. The behavior the software looks for is referred to as “grooming,” in which a sexual predator develops relationships with several targets and tries to take the interactions to the next level.
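To make the idea concrete, here is a minimal, hypothetical sketch of the kind of pattern-based screening described above: it looks for loosely formatted digit strings that could be a phone number and for meeting-related language, and reports reasons a message might be escalated for human review. This is an invented illustration, not Facebook’s actual system; the phrase list and patterns are assumptions.

```python
import re

# Digit strings of 7+ digits, allowing common separators (spaces, dots,
# dashes, parentheses) -- a rough stand-in for "could be a phone number."
PHONE_RE = re.compile(r"(?:\d[\s().-]?){7,}")

# Invented examples of meeting-related language a screener might watch for.
MEETUP_PHRASES = ("meet up", "meet me", "after school")

def flag_message(text: str) -> list:
    """Return the reasons (if any) a chat message might be escalated
    to a human reviewer. An empty list means no flags were raised."""
    reasons = []
    if PHONE_RE.search(text):
        reasons.append("possible phone number")
    lowered = text.lower()
    if any(phrase in lowered for phrase in MEETUP_PHRASES):
        reasons.append("meeting language")
    return reasons
```

A real system would weigh many more signals (account age, contact patterns, prior reports) precisely to keep the false-positive rate low, as Sullivan emphasizes; a naive keyword match like this would flag far too many innocent conversations on its own.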
Other sites with more concentrated business models and target audiences need to evaluate their stance on monitoring interaction. The article talks about a few other platforms, and how each has dealt with some of this unwanted activity.
A smartphone app focused on flirtation with strangers, called Skout, recently shut down its teen section, where users under 20 were encouraged to send flirtatious messages to one another. The network realized that some unwanted users, who were over 20 and looking for younger individuals, were given “easy access” to that particular audience.
When it’s just a segment of a larger audience, a company like Skout can make a decision to shut down this portion of operations and still maintain a business. Other social networking platforms are built with tweens and teens in mind, and have to make decisions on how they want to monitor interactions.
Habbo Hotel, a virtual world with a heavily tween and teen audience had to temporarily block all chatting in June when a UK television reporter wrote about being targeted with explicit remarks after posing as an 11-year-old girl. Some similar networks, such as Disney’s Club Penguin, offer an option to filter chats to allow only approved words. Parents can choose whether their child has access to filtered or unfiltered chat.
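An approved-words-only filter like the one described above can be sketched very simply: every word in a message must appear on a whitelist, or the message is blocked. This is a hypothetical illustration in the spirit of Club Penguin’s option, not its actual implementation; the word list is invented.

```python
# Invented whitelist of approved chat words for illustration.
APPROVED_WORDS = {"hi", "hello", "lol", "thanks", "bye", "party", "cool"}

def filter_chat(message: str):
    """Return the message if every word is on the approved list,
    otherwise return None to indicate the message was blocked."""
    words = message.lower().split()
    if words and all(word in APPROVED_WORDS for word in words):
        return message
    return None  # blocked: empty, or contains an unapproved word
```

The trade-off the article goes on to describe is visible even here: the stricter the whitelist, the safer the chat, but the more legitimate conversation gets blocked along with the bad.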
As necessary as it is to weed out the naughty users, doing so can have a detrimental effect on a company and its revenues. “You might lose some of your naughty users, and if you lose traffic you might lose some of your revenue,” Clair Quinn, safety chief at WeeWorld, a site aimed at kids and young teens, said in the Reuters article.
Excessive filtering and other actions to monitor and restrict interactions can also turn off the legitimate audience. Teens often go to these sites to express new freedoms, and if that’s curtailed, they may find another place to do just that.