This article was published 7/7/2014, so information in it may no longer be current.
Facebook, the social networking company, has shown it can play with the emotions of its millions of users by tweaking the algorithm that selects the content of their news feeds. The report of the company's experiment, published this month in a scientific journal, casts light on the power of social-media managers over their customers. It also raises questions about ethics and accountability in the control of social media.
The experiment, reported in the Proceedings of the National Academy of Sciences, was conducted by Adam D.I. Kramer, a researcher who works for Facebook, together with two Cornell University scientists, Jamie E. Guillory and Jeffrey T. Hancock. In the week of Jan. 11 to 18, 2012, they kept negative messages out of the news feeds of selected Facebook users and kept positive messages out of the feeds of other selected users. Then they monitored the messages from those users to see if their mood was affected by the news feed content they received.
The news feed is a filtering device within Facebook that keeps each user's message volume within manageable limits. Most people's friends put out far more messages than one person could ever read. The news feed notices which kinds of messages a user most often reads and selects messages accordingly. The researchers narrowed that selection to strip out messages containing emotionally negative words for some users and emotionally positive words for others. Then they watched the results.
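The filtering the researchers describe can be pictured with a short sketch. This is purely illustrative: the word lists and function below are hypothetical, and the actual experiment relied on the LIWC word-counting software and Facebook's internal ranking system, not on code like this.

```python
# Hypothetical sketch of emotion-based feed filtering, as described above.
# Word lists are illustrative stand-ins; the real study used LIWC categories.

NEGATIVE_WORDS = {"sad", "angry", "hate", "awful", "terrible"}
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}

def filter_feed(posts, suppress="negative"):
    """Return a user's feed with posts from the suppressed
    emotional category removed."""
    banned = NEGATIVE_WORDS if suppress == "negative" else POSITIVE_WORDS
    kept = []
    for post in posts:
        words = set(post.lower().split())
        if words & banned:
            continue  # hide this emotionally charged post from the user
        kept.append(post)
    return kept

feed = ["What a wonderful day", "I hate this awful traffic", "Meeting at noon"]
print(filter_feed(feed, suppress="negative"))
# → ['What a wonderful day', 'Meeting at noon']
```

The point of the sketch is how invisible the intervention is: the user still sees only messages their friends really wrote, with no trace of what was withheld.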
The experimenters' conclusion, given in the journal article, was: "We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues."
When Facebook users learned of this experiment and complained about the dirty trick played on them, Facebook chief operating officer Sheryl Sandberg apologized for upsetting people and said the company should have communicated better. She gave no clue she saw a problem with an ethically challenged organization holding and using power to sway the emotions of millions of people by secretly tweaking an algorithm.
Politicians, entertainers, publicists and advertisers are trying to affect our emotional states all the time, sometimes for innocent reasons, sometimes to advance their own interests. In a free society, different speakers are tugging us this way and that, and we make our own choices about which voices we will listen to. Social media take this a step further by sending us messages from our friends, whose feelings naturally affect us more directly.
Adam Kramer's experiment, however, takes us into a world where emotional states can be transferred to others through emotional contagion, leading people to experience the same emotions without their awareness. Facebook or any social medium can show us messages from our friends who loved the show and weed out the messages from friends who hated the show. We will then love the show -- and we won't even know what happened.
The implications for marketing and for political decision-making are obvious: Facebook has the power to manipulate the selection of messages in its news feed. It did manipulate that selection in January 2012 and still today sees nothing wrong with that. Marketers and political campaigners will pay handsomely to get their fingers on that kind of persuasive power. Now that Facebook has done that once on a massive scale for a week, why would they or another social medium refuse to do it again? How would we know if they were doing it today?
For social media users, the best defence may be an alert and critical mind: That seems to be your friends talking to you, but it's really a giant corporation sending you a selection of messages from your friends -- and there's no knowing how the choice was made.