
Facebook’s Research Dilemma: Did They Violate Ethical Or Social Contracts?

By Dr. Pamela Rutledge @pamelarutledge

Facebook is getting serious flak for manipulating member news feeds to measure the emotional impact of positive and negative posts on member moods.  Legal or not, this spells bigger trouble for Facebook because it violates the basic premise upon which their empire is founded—relationships—and the social contract of fairness.


Facebook is in the spotlight…again.  This time, it is for recently published research that manipulated members’ Facebook news feeds based on positive and negative emotional content and then measured the impact by judging the positivity or negativity of members’ subsequent posts.  Research agenda aside, Facebook’s scientists presumed permission based on the legal consent agreement every member must sign, not on any consideration of the potential psychological impact of manipulating emotional content, or of what members understand to be acceptable.

The Facebook data scientist and co-author Adam Kramer said he was sorry–sort of.  He apologized for the “way the paper described the research and any anxiety it caused.”  My guess is that the anxiety it caused was multi-faceted and bounced around like a marble in a pinball machine as people contemplated the implications of Facebook’s ethics in general, given that Facebook has access to a vast amount of data.

Facebook violated its social contract with customers and is finding out just how contagious emotions can be.

Although we don’t think about it much, we all know Facebook does research.  So do Google, Microsoft, and every other company you interact with online.  So what’s the big deal?   Many people ‘accept’ user agreements and give away all kinds of rights all the time.  The catch is that although we click “accept,” we don’t read the fine print.  We wouldn’t understand it if we did, unless we were among the minority of contract lawyers specializing in IT.

We’re judging on a different standard anyway, although we don’t realize it.  We are assuming an agreement based on an entirely different set of principles, a social contract. The handshake.  Trust and fairness. Most humans innately believe in fairness and can tell you whether or not something is fair by the age of five.  Neuroimaging shows that fairness and equity are often as important as personal gain in economic exchange games (Singer et al., 2006).  In fact, the aversion to inequity is so strong that people will sacrifice personal gain to prevent another from receiving an inequitable share (Tabibnia & Lieberman, 2007).

One of the basic rules of fairness is respecting property rights—and not taking other people’s stuff.  We feel especially invaded if it’s personal stuff.  Here, it’s our emotions and our words.  We have become increasingly uncomfortable with the way that ‘big data’ is being used, whether it’s search algorithms or targeted political messages.  Some of the data mining produces things we find useful, such as relevant product recommendations and reviews.  But the data in question is about US.  Corporate assurances of identity-scrubbing don’t do much to quell the sense that you’ve been violated, especially when you reflect on the kind of stuff you’ve posted out there in cyberspace.

Humans fear manipulation and for good reason.  Logically, it means we’re not in control.  Not being in control triggers fundamental survival instincts. If we aren’t in control of our environment, or ourselves, we are vulnerable and in danger.  Fight or flight.

The fact that Facebook was manipulating news feeds based on emotional content with the intent of measuring the impact on OUR emotions makes it all the more discomforting. The idea that these are aggregated or averaged doesn’t matter to our primitive brains.  Reptilian reaction: Who gave you permission to fiddle around with my emotions?

Most people won’t look up the published article in PNAS (the Proceedings of the National Academy of Sciences) to examine the methodology or to see what the magnitude of the impact actually was.  Not many of us—including journalists—know what B (a regression coefficient) means, at least not without Wikipedia, or would want to wade through t scores, p values, or Cohen’s d to figure it out.  So we rely on headlines and press releases that make it sound very ‘Clockwork Orange.’  It may be small comfort (if you’re still reading after the last sentence), but the largest impact came from reducing positive words in news feeds, and that impact was a decrease of .01% in positive posts across the entire population tested.  That is one one-hundredth of one percent.  You wouldn’t give an advertisement offering a 1% discount the time of day, much less one of .01%.  But for most of us, ‘statistically significant’ isn’t something we deal with on a daily basis and, combined with the phrase ‘manipulating emotions,’ it sounds pretty important.  Even if we grasp the small magnitude and the idea that statistics are about probabilities, not certainties, it still leaves us to wonder how even that small an incremental shift in emotion might influence decision making.
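
For a concrete sense of that scale, here is a back-of-the-envelope sketch in Python.  The post volume is a hypothetical number chosen for illustration; only the .01% figure comes from the discussion above.

```python
# Back-of-the-envelope arithmetic on the size of the reported effect.
# The post volume is hypothetical; only the .01% figure is from the study.
total_posts = 1_000_000   # hypothetical number of posts observed
decrease = 0.0001         # .01% expressed as a fraction

fewer_positive_posts = total_posts * decrease
print(fewer_positive_posts)  # 100.0 -> about 100 fewer positive posts per million
```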

In this instance, the level of measured impact—if it in fact measured mood—is unlikely to have made any difference at all to the disposition of individual users.  First of all, the Facebook study did not prove that the ‘emotion-laden’ words identified by algorithms in Facebook posts were actually a reflection of anyone’s mood.  A post was counted as positive or negative if it included at least one positive or negative word, using academically accepted linguistic word-count software (LIWC).  We might also argue that the differences in ‘emotion’ words were due to the fact that we respond in ‘like’ kind as a social norm.
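
To make the method concrete, here is a minimal sketch of that word-count approach.  The word lists are tiny hypothetical stand-ins for the much larger LIWC dictionaries the study actually used; a real analysis would also handle punctuation and word stems.

```python
# Minimal sketch of a LIWC-style word count: a post counts as positive
# (or negative) if it contains at least one word from the matching list.
# These word lists are hypothetical stand-ins for LIWC's dictionaries.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "terrible"}

def label_post(text: str) -> dict:
    words = set(text.lower().split())
    return {
        "positive": bool(words & POSITIVE_WORDS),
        "negative": bool(words & NEGATIVE_WORDS),
    }

print(label_post("what a wonderful surprise"))  # {'positive': True, 'negative': False}
print(label_post("this weather is awful"))      # {'positive': False, 'negative': True}
```

Note that a single matching word flips the label for the whole post, which is exactly why the paragraph above questions whether such counts reflect anyone’s actual mood.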

Emotional contagion is a concept from epidemiology and network theory studies.  The Facebook study in particular follows a study by Fowler and Christakis (2008) that suggested emotions were ‘contagious’ across networks and that happiness seemed to be more ‘infectious’ than unhappiness.  The Facebook scientists in question, Kramer, Guillory, and Hancock (2014), were hoping to eliminate some of the criticism of the Fowler and Christakis study by creating a before-and-after approach and, in part, to test the popular assumption that Facebook makes you feel worse because of the dreaded Fear of Missing Out (FoMO).  Understanding how behaviors travel like information across networks is important and exciting stuff.  But clearly these researchers aren’t chess players and didn’t think more than a few moves ahead.  Even if they didn’t have any ethical issues with their approach, the public response to this should have been an easy call.

But the main point of the reaction is not about the emotions they were trying to measure, although that makes this whole thing more emotion-laden. The big deal is the emotions they triggered when this all came out. Facebook wasn’t just measuring our ‘identity-scrubbed’ behaviors; they were violating our privacy and, in the eyes of many, toying with our emotions without our consent.

To test theories of emotional contagion, Facebook withheld content from the news feeds of approximately 689,000 people to see how seeing more or fewer positive and negative messages would affect recipients’ own posts.  If that were mail, it would probably be a federal offense.  Facebook argues that, legal agreements aside, the messages were available through other means, but that’s not the point.  If I am friends with someone, I am posting, aka ‘sending,’ them messages with the assumption that, based on the way Facebook works, the recipient will see them in their feed.  The recipient has that same expectation. Whether or not my friends actually see them is irrelevant.  I don’t have to open every letter I receive, but if it’s personal mail, it’s still a federal offense for you to steal it or even delay it for a week while you steam it open and count the happy and sad words.   We are just now coming to grips with how social networks like Facebook are redefining privacy based on access—theirs.  It’s our expectations and assumptions that have been violated here.
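
For illustration, here is a hedged sketch of the kind of feed manipulation being described.  The post structure, condition names, and omission probability are all assumptions for this sketch; the study’s actual implementation was internal to Facebook’s ranking system.

```python
import random

# Hedged sketch of the manipulation described above: for users in a
# "reduced positivity" condition, each positive post has some chance of
# being withheld from their news feed. The post representation and the
# omission probability here are hypothetical, for illustration only.
def filter_feed(posts, condition, omit_prob=0.3, seed=42):
    rng = random.Random(seed)
    target = {"reduce_positive": "positive",
              "reduce_negative": "negative"}.get(condition)
    shown = []
    for post in posts:
        if post["sentiment"] == target and rng.random() < omit_prob:
            continue  # withheld: the friend's post simply never appears
        shown.append(post)
    return shown

feed = [{"text": "best day ever", "sentiment": "positive"},
        {"text": "lost my keys again", "sentiment": "negative"},
        {"text": "lunch was fine", "sentiment": "neutral"}]

# With this seed and a 90% omission rate, the positive post is withheld.
print(filter_feed(feed, "reduce_positive", omit_prob=0.9))
```

The sender never knows the post was dropped, which is the expectation gap the mail analogy above is pointing at.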

The really, really, really dumb thing is that Facebook has over 1 billion subscribers (at least they did before all this blew open), and it would have been so incredibly easy to recruit volunteers for the research, and easy enough to run controls and alter timing to avoid empirically compromised data.  The study, in fact, posed an interesting question, and the data manipulation was a fairly innocuous intrusion on one’s account.  But they didn’t ask permission or for volunteers.  They just did it, which is so very Big Brother at a time when people are starting to wonder about this stuff that the damage to consumer trust may be incalculable.   The Facebook scientists say they now have a more rigorous Institutional Review Board (IRB) process.  I hope it’s more than running it by legal.

Social media is about relationships.  There are literally millions of articles warning people, companies, brands, and nonprofits about the need for authenticity, transparency, and ‘humanness.’  It’s both ironic and telling that Facebook blew this fundamental attribute, which is the cornerstone of their success.

The takeaway here for the rest of us is really media literacy-based.  There’s a burden on all of us to get better educated.  We need to pay attention to what we’re giving people access to.  We also need to demand a new level of disclosure.  Companies should have to bear the burden of disclosures that are accessible to the ‘man on the street’ and not just the legal team.

Fowler, J. H., & Christakis, N. A. (2008). Dynamic Spread of Happiness in a Large Social Network: Longitudinal Analysis over 20 Years in the Framingham Heart Study. BMJ, 337, a2338.

Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks. Proceedings of the National Academy of Sciences, 111(24), 8788-8790.

Singer, T., Seymour, B., O’Doherty, J. P., Stephan, K. E., Dolan, R. J., & Frith, C. D. (2006). Empathic Neural Responses Are Modulated by the Perceived Fairness of Others. Nature, 439(7075), 466-469.

Tabibnia, G., & Lieberman, M. D. (2007). Fairness and Cooperation Are Rewarding. Annals of the New York Academy of Sciences, 1118(1), 90-101.

