
News Feed Study Leads Facebook To Issue A Public Apology

July 3, 2014
Image Caption: Sheryl Sandberg, the Chief Operating Officer at Facebook. Credit: Stephen Lam/Getty Images/Thinkstock.com

Alan McStravick for redOrbit.com – Your Universe online

Sheryl Sandberg, the very likeable chief operating officer of Facebook and author of the popular and controversial ‘Lean In: Women, Work and the Will to Lead,’ was selected to be the public face of the social network’s apology after it came to light that the company, working with researchers from the University of California, San Francisco and Cornell University, had manipulated the news feeds of 689,003 of its users. Unfortunately, neither Sandberg nor Facebook seems sorry for the right reason.

Claiming the week-long psychological experiment and its results had been “poorly communicated,” Sandberg explained to The Wall Street Journal, “This was part of ongoing research companies do to test different products, and that was what it was.” Sandberg then apologized for how the company went about publicizing the study, stating, “We never meant to upset you.”

The bigger issue is that such manipulations of user data likely occur at the company on a regular basis. Absent from this mea culpa was any mention that Facebook planned to curtail any future study or “ongoing research.”

This particular study, led by Facebook data scientist Adam Kramer, used an algorithm designed to alter the order in which stories arrived in a user’s News Feed. The algorithm would seek out positively- or negatively-worded posts and then monitor the users’ subsequent status updates to judge whether they were susceptible to the emotional contagion the researchers hypothesized existed. Neither the data scientists nor Facebook altered any of the posts; they merely adjusted whether a user would receive a higher percentage of negative or positive posts.
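
The published paper describes this filtering only at a high level; a minimal sketch of the general technique, in Python, might look like the following. The word lists, function names, and suppression rate here are illustrative assumptions, not Facebook’s actual implementation (the real study relied on word-count software to classify posts):

```python
# Illustrative sketch of sentiment-based feed filtering.
# Word lists, names, and the suppression rate are hypothetical.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def valence(post: str) -> str:
    """Crudely label a post positive, negative, or neutral by word lists."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts: list[str], suppress: str, rate: float = 0.5) -> list[str]:
    """Withhold roughly `rate` of posts carrying the suppressed valence,
    leaving the rest of the feed untouched -- no post text is altered."""
    return [p for p in posts
            if valence(p) != suppress or random.random() > rate]

feed = ["I love this weather", "Traffic was awful today", "Lunch at noon?"]
print(filter_feed(feed, suppress="negative"))
```

The researchers would then compare the emotional wording of a user’s own later posts against that of a control group to measure any contagion effect.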

As the results of the study showed, a Facebook user is influenced by the emotional tenor of the messages that arrive in their News Feed. Those subjected to more positive messages tended to post updates that were themselves more positive. Conversely, those who had negative messages funneled to their Facebook page tended to react negatively in their own later updates.

While Facebook believes the fault was in how the study was conveyed and not in the act itself, researchers are now openly questioning whether Facebook breached the ethical guidelines around informed consent that nearly every study is expected to abide by.

“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” Kramer offered as his own attempt at an apology. “At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused.”

For the time being, we’ll have to wait and see whether these apologies mollify the American public. But since not all of the users targeted by the study are US residents, the Information Commissioner’s Office (ICO), a UK regulator, has opened an investigation into the social network over its conduct of the very broad study. The scope of the investigation is to determine whether Facebook broke data protection laws. According to the BBC, the ICO intends to question Facebook on the matter shortly; the company maintains that “appropriate protections for people’s information” had been in place.

In a statement, Facebook’s Richard Allan said, “We are happy to answer any questions regulators may have.”

The ICO will also reach out to Irish regulators for assistance, as Facebook’s European headquarters is in Dublin, Ireland.

The company continued to shield itself from any perceived wrongdoing by claiming there had been “no unnecessary collection of people’s data.” What many in the public object to, and rightly so, is the willful manipulation of individual users’ emotional states under the guise of the results being important for academic purposes. People who regularly engage on Facebook will now have to decide whether the Facebook experience provides enough value that they will, henceforth, willingly submit themselves and their data to any future forms of study.
