
Fury at Facebook study which manipulated users' emotions

In January 2012 Facebook performed a study on some 689,000 of its users, in which it deliberately filtered information posted on their home pages in order to manipulate their emotional state – making them feel more positive or negative.

By analysing the activity of its unwitting test subjects (the comments they made, the videos they watched, the pictures and web links they looked at, etc.), Facebook was able to assess their emotional state, which it then set out to manipulate using a process known as ‘emotional contagion’.

In one test, for example, subjects who were feeling positive could be made to feel more negative by reducing the amount of ‘positive emotional content’ they could see in their home page feed, while another test achieved the opposite by reducing the amount of ‘negative emotional content’.

As the study’s authors put it:

We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.
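To give a rough sense of how this kind of feed filtering works, here is a minimal, purely illustrative Python sketch. The toy sentiment check, the word lists, and the 50% omission rate are assumptions made for the example (the published study reportedly classified posts with the LIWC word-counting tool and omitted between 10% and 90% of matching posts); none of this is Facebook’s actual code.

```python
import random

# Purely illustrative sketch: skew a feed by silently withholding a share of
# posts of one emotional polarity. The word lists, the sentiment() stand-in
# and the omission rate are assumptions for this example only.

POSITIVE_WORDS = {"great", "happy", "love", "awesome"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful"}

def sentiment(post):
    """Toy stand-in for a real sentiment classifier."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filtered_feed(posts, suppress="positive", omission_rate=0.5):
    """Return the feed with roughly omission_rate of 'suppress' posts withheld."""
    return [
        post for post in posts
        if sentiment(post) != suppress or random.random() > omission_rate
    ]

feed = [
    "Had an awesome day at the beach!",
    "Feeling sad about the news today.",
    "Just made dinner.",
    "I love this song.",
]

# About half of the positive posts are quietly dropped from what the user sees.
print(filtered_feed(feed, suppress="positive"))
```

The unsettling part, of course, is not the filtering itself but that the people on the receiving end never knew it was happening.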

Facebook, which published the study, performed in partnership with researchers from Cornell and the University of California, apparently sees nothing wrong with its actions, claiming that users provided ‘informed consent’ by agreeing to Facebook’s Terms of Service (via a very small clause hidden in the middle of Facebook’s 9,045-word ToS).


Facebook claims that its manipulation of the news feed is ‘consistent with Facebook’s data use policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.’ In a more recent statement, Facebook said the study was approved through its own internal review process rather than by a university Institutional Review Board.

Internet activists and even politicians have reacted angrily, however. Jacob Silverman, author of Terms of Service: Social Media, Surveillance, and the Price of Constant Connection, said,

The internet is a vast collection of market research studies; we’re the subjects… What’s disturbing about how Facebook went about this, though, is that they essentially manipulated the sentiments of hundreds of thousands of users without asking permission (blame the terms of service agreements we all opt into). This research may tell us something about online behavior, but it’s undoubtedly more useful for, and more revealing of, Facebook’s own practices.

Facebook cares most about two things: engagement and advertising. If Facebook, say, decides that filtering out negative posts helps keep people happy and clicking, there’s little reason to think that they won’t do just that. As long as the platform remains such an important gatekeeper — and their algorithms utterly opaque — we should be wary about the amount of power and trust we delegate to it.

Interestingly, the research found that users responded more to ‘emotional’ posts, so as Forbes observes, ‘prepare to have Facebook curate your feed with the most emotional of your friends’ posts if they feel you’re not posting often enough.’

Much of the criticism focuses on the notion of ‘informed consent’ (or, more specifically, the lack of it), although others have pointed to the dubious ethics of performing psychological manipulation on unsuspecting targets. James Grimmelmann, a professor of law at the University of Maryland, argues that,

The real scandal… is what’s considered “ethical.” The argument that Facebook already advertises, personalizes, and manipulates is at heart a claim that our moral expectations for Facebook are already so debased that they can sink no lower. I beg to differ. This study is a scandal because it brought Facebook’s troubling practices into a realm—academia—where we still have standards of treating people with dignity and serving the common good.

At the very least, users should have to explicitly agree to being included in Facebook’s psychological experiments before being subjected to them!





One response to “Fury at Facebook study which manipulated users’ emotions”

  1. I knew there was a reason I’ve been avoiding Facebook for years. “Don’t trust ’em” simply seemed an inadequate reason. Until now, that is.

    A giant corporation wants to manipulate my mind – that’s fine; it’s called advertising, or maybe cereal box graphic design. But a giant corporation wants to manipulate my mind without my informed consent or voluntary participation? Yeah, “I don’t trust ’em” looks plenty sufficient now.
