
Facebook Mixes Emotion With Anti-Cyberbullying Effort

July 13, 2012

redOrbit Staff & Wire Reports – Your Universe Online

Facebook has launched new tools that allow members to better communicate their emotions and resolve conflicts, CNN reported.

The new features are the result of a collaboration between the social networking company and Yale, U.C. Berkeley and Columbia University, and are based on months of research with focus groups of children, teachers and psychologists.

One of the changes is specifically targeted at teenagers aged 13 and 14, and allows them to report a cruel or threatening post or image from a classmate by simply clicking the “This post is a problem” button, which replaces the former “Report” button. The teenager is then guided through a series of casually worded questions to determine what specific type of issue is taking place and how serious it is.

Facebook then asks the teen user to rank their emotions on a grid. Once finished, the social network provides a list of suggested actions based on the urgency of the complaint. For instance, if the teen is more annoyed than fearful, they might simply choose to send a pre-written message to the person who wrote the offending post, saying that the remarks made them uncomfortable. If the teen is afraid, they will be prompted to seek help from a trusted friend or adult.
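Conceptually, the questionnaire behaves like a small triage function: the reported emotion and its intensity map to a suggested next step. Below is a minimal sketch of that idea in Python; the emotion labels, intensity scale and action wording are assumptions for illustration, since Facebook has not published its actual decision rules.

    # Hypothetical sketch of the triage flow described above. Emotion
    # labels, the intensity scale, and the suggested actions are all
    # illustrative assumptions, not Facebook's published rules.
    from dataclasses import dataclass

    @dataclass
    class Report:
        emotion: str    # e.g. "annoyed", "embarrassed", "afraid"
        intensity: int  # 1 (mild) to 5 (severe), from the emotion grid

    def suggest_action(report: Report) -> str:
        """Map a reported emotion to a suggested next step."""
        if report.emotion == "afraid" or report.intensity >= 4:
            # Urgent complaints prompt the teen to involve a trusted person.
            return "ask a trusted friend or adult for help"
        if report.emotion in ("annoyed", "embarrassed"):
            # Milder complaints suggest resolving things with the poster.
            return "send a pre-written message to the person who posted it"
        return "review more options for resolving the issue"

    print(suggest_action(Report(emotion="annoyed", intensity=2)))
    # -> send a pre-written message to the person who posted it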

There are also links designed to identify anyone who may be feeling suicidal and direct them to professionals and to Facebook’s suicide chat hotline.

“We feel it is important that Facebook provide encouragement for kids to seek out their own support network,” said Robin Stern, a psychoanalyst from Columbia University who worked on the project, in an interview with CNN.

“The children tell us they are spending hours online… they are living their lives with Facebook on in the background.”

Young Facebook members are not the only ones who may need help communicating their feelings online.

Facebook examined photos reported for removal by members of all ages, including those flagged as offensive for being pornographic, containing hate speech or depicting drug use.

The company noticed that these images were frequently flagged for more personal reasons. For instance, someone may not have liked how they looked in the picture, or feared their manager at work might see them doing something embarrassing.

When a photo is reported to Facebook for violating community standards, it typically goes to a Facebook employee who determines what steps, if any, to take.

Unsurprisingly, this results in a large volume of requests. By increasing the options and directing members to ask the person who posted a photo to take it down, Facebook is putting its members in charge of their own issues while simultaneously lessening its workload.
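Viewed as a pipeline, the change adds a routing step in front of the human review queue. A rough sketch of that routing, under the assumption that reports arrive with a reason code (the codes and destination labels here are hypothetical):

    # Hypothetical routing sketch: policy violations still go to human
    # review, while personal complaints are redirected into a direct
    # request to the poster. Reason codes are illustrative assumptions.
    POLICY_REASONS = {"pornography", "hate_speech", "drug_use"}

    def route_report(reason: str) -> str:
        if reason in POLICY_REASONS:
            return "queue for employee review"
        # Personal complaints (e.g. not liking how you look) bypass
        # the review queue entirely.
        return "prompt member to ask the poster directly"

    print(route_report("unflattering_photo"))
    # -> prompt member to ask the poster directly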

The photo-removal features were the result of work conducted by Dacher Keltner, director of the Social Interaction Laboratory at Berkeley, and his team, who customized Facebook’s stock requests based on the reason for wanting the photo removed and how important it was to the offended party that it be taken down.

The new language was made more polite, and the recipient was given a series of pre-written answers from which to select. The updated approach also serves to open a dialog between the two sides.
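As a rough illustration of how templated requests and canned replies could be paired, here is a short sketch; the wording and data structures are assumptions, not Facebook’s actual copy or code.

    # Hypothetical sketch of the pre-written request/reply exchange.
    # All template text here is an illustrative assumption.
    REQUEST_TEMPLATES = {
        "unflattering": "Hey, I don't like how I look in this photo. "
                        "Would you mind taking it down?",
        "embarrassing": "I'd rather people at work not see this photo. "
                        "Could you please remove it?",
    }

    REPLY_OPTIONS = [
        "Sure, I'll take it down.",
        "I'd rather keep it up, but I can untag you.",
        "Let's talk about it first.",
    ]

    def build_request(reason: str) -> str:
        """Pick a polite, pre-written removal request for a reason."""
        return REQUEST_TEMPLATES.get(
            reason, "Would you mind taking this photo down?")

    print(build_request("unflattering"))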

The latest changes show that Facebook is being more mindful of the ways people communicate their emotions on the social network. Moreover, by researching, changing wording and tracking response rates, the company is also learning how to better engage its members.

Indeed, while the intentions behind the changes are good – conflict resolution and helping members – the method could also have new applications for delivering paid content.

“Language really matters and design really matters for this stuff,” said Jake Brill, a Facebook product manager.

“The smallest change can have a really notable impact,” he told CNN.

Some Facebook users have already seen the new changes, which will be rolled out to all U.S. members this week, the company said.

Early results are positive: the rate at which people complete the questionnaire when untagging an image has risen from 48% to 78%.

Facebook said it hopes to expand the program to other countries after carefully customizing the wording to be appropriate for different languages and cultures.

Facebook’s anti-cyberbullying initiative began shortly after the Tyler Clementi suicide, although Facebook says it was not inspired by any single event.

The results of the research were presented on Wednesday in Menlo Park, California, at Facebook’s Compassion Research Day.

