facebook emotions study
June 29, 2014

In The Name Of Science – Manipulating Facebook Users’ Emotions

Alan McStravick for redOrbit.com - Your Universe Online

The social media universe lit up this weekend with news that social scientists from Facebook, Cornell University and the University of California, San Francisco had manipulated news feeds for some 689,000 users without their knowledge or specific prior consent. If you frequent the redOrbit website, you may know that I wrote about the study itself approximately two weeks ago, detailing the exact nature of the study parameters and the results obtained.

Annie-Rose Strasser of ThinkProgress was one of many who provided an inflammatory headline to their readers, backed by admittedly well-researched privacy concerns supported by experts in the field. Strasser contends the interesting results of the study are marred by one particularly disturbing aspect: none of the participants had been explicitly notified of their involvement.

When you consider the nature of the study, this is a fair argument. Facebook and the team of social scientists used an algorithm to manipulate the news feeds of participating users so that some saw an increase in so-called negative posts and news stories while others were presented with more positive stories. The team wanted to know the effect, if any, on a user who sees more positivity or negativity in their social media experience.

Facebook was swift in its reply and defense, stating, and rightly so, that every Facebook user has agreed to the company's terms of service, which state that users' data may be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” Both the university-based social scientists and Facebook claim this study fell within the terms of service agreement because no researcher was shown actual postings by any of the over half million users reviewed for this study. Instead, a Facebook computer program scanned for words considered to be either “positive” or “negative.”
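The automated scanning described above amounts to simple keyword matching against lists of emotion words. As a rough illustration only, here is a minimal sketch of how such a word-list classifier might work; the word lists and function here are hypothetical stand-ins, not the actual dictionaries or code used in the study:

```python
# Hypothetical word lists; the real study used much larger dictionaries
# of emotion-laden terms.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful", "hate"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' based on
    whether it contains any words from the two lists."""
    # Normalize: lowercase each word and strip common punctuation.
    words = {w.strip(".,!?").lower() for w in text.split()}
    has_pos = bool(words & POSITIVE_WORDS)
    has_neg = bool(words & NEGATIVE_WORDS)
    if has_pos and not has_neg:
        return "positive"
    if has_neg and not has_pos:
        return "negative"
    return "neutral"
```

The key point, and the basis of Facebook's defense, is that a program of this kind inspects text mechanically; no human researcher ever reads an individual user's post.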

“As such, it was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research,” claimed the researchers in their study.

As companies on this new technological frontier like Facebook, Amazon, Google, eBay and others continue to grow and plan for the long term, one can be certain that the most valuable asset they have for growing their companies is the personal data of the individuals who interact with their sites.

In fact, in a recent statement, Sheryl Sandberg, Facebook's chief operating officer, explained how the company intends to leverage the massive amounts of data it collects on each and every individual user. “Our goal is that every time you open News Feed, every time you look at Facebook, you see something, whether it's from consumers or whether it's from marketers, that really delights you, that you are genuinely happy to see.”

While diving deep through the collected data to know their users certainly seems a noble and worthwhile cause, a major issue arises when the use of that data goes beyond well-placed news feed items and targeted advertising: data on users can be handed over for intelligence agency investigations or to an insurance company.

We all click “agree” on the TOS document each time it changes and pops up in our browser windows. But not everyone fully understands what they just agreed to. According to a recent Consumer Reports survey, “only 37 percent of Facebook users say they have used the site's privacy tools to customize how much information apps are allowed to see.” It is this unknown territory that can be most detrimental to Facebook users. “Even if you have restricted your information to be seen by friends only, a friend who is using a Facebook app could allow your data to be transferred to a third party without your knowledge.”

Privacy advocates have long urged tech companies to present their existing and updated terms of service agreements in a format that more closely resembles plain English. Speaking with ThinkProgress some time ago, Pamela Rutledge, director of the Media Psychology Research Center, stated, “There's a burden on the individual to get educated, but there's also a burden on the companies. We're not all lawyers and we're not all IT guys.”

The 'caveat emptor' argument only goes so far. It will be up to the public, once made fully aware of this practice, to decide whether they trust Facebook for their day-to-day social media use, if at all. The public at large can become enraged, and legitimately so, over this egregious breach of expectation and trust. I know that I, for one, never expected my agreement to Facebook's terms of service to mean I could become their lab rat, my emotions manipulated and toyed with under the guise of scientific research.