That Time Facebook Emotional Contagion Took Over Your News Feed

July 1, 2014
Traffika

If you’re using Facebook, then you’re somewhat aware of the eerie power it has over you. It confronts you with some of your favourite things without asking. It presents you with people you had forgotten even existed. And there’s an ad for that car you once looked at following you around your news feed.

Maybe you’re not that comfortable with a social network knowing so much about you. Maybe you want to keep some things private. Well, you can run and you can hide, but Facebook still knows how you’re feeling. What’s more, it has the ability to manipulate your mood based on what it chooses to show in your news feed.

The Controversial Scientific Experiment

Some Facebook users feel as though the unaware participants of the study were used like lab rats.

For one week in January 2012 (that we know of), Facebook was toying with your emotions. The latest issue of the Proceedings of the National Academy of Sciences of the USA (PNAS) features a paper titled “Experimental Evidence of Massive-scale Emotional Contagion Through Social Media”. Essentially, it says data scientists used algorithms to tamper with the news feeds of 689,003 randomly chosen, unsuspecting Facebook users to see if it affected their emotions. Evidently, it did.

The news feed is where you view most of the content shared by your friends, and because your friends often post more content than any one person can view, Facebook is continually developing algorithms which determine what to show and what to omit. These algorithms are supposed to show you posts that are relevant and engaging to your specific interests, but this particular experiment saw users being exposed to either predominantly positive or predominantly negative posts.
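To make the idea concrete, here’s a minimal sketch of the kind of filtering described above: each post carries a sentiment label, and posts of one targeted sentiment are dropped from the feed with some probability. The function name, parameters, and data shapes are all hypothetical illustrations, not Facebook’s actual ranking code.

```python
import random

def filter_feed(posts, suppress, p_omit, rng=random.random):
    """Return a feed with posts of the `suppress` sentiment omitted
    with probability `p_omit`; all other posts pass through untouched.

    `posts` is a list of (text, sentiment) pairs. This is an
    illustrative sketch of sentiment-biased filtering only.
    """
    return [(text, mood) for text, mood in posts
            if mood != suppress or rng() >= p_omit]

feed = [("Great day!", "positive"),
        ("Feeling down.", "negative"),
        ("Lunch was fine.", "neutral")]

# Suppress every negative post (p_omit=1.0) to mimic the
# "more positive feed" condition deterministically.
print(filter_feed(feed, suppress="negative", p_omit=1.0))
```

With `p_omit` between 0 and 1 the filtering is probabilistic, so two users in the same condition would still see slightly different feeds.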

Emotional Contagion (featuring Matt Damon)

Remember the movie “Contagion”, where Matt Damon and Co. race against time to develop a vaccine to treat an extremely lethal virus? This is really nothing like that, except for the fact that emotions spread on Facebook can be contagious.

We show, via a massive experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.

Put simply, “emotional contagion” means spending time with happy people can make you happy and spending time with sad people can make you sad. The transfer of emotional states isn’t exactly ground-breaking psychology, but this experiment is the first evidence of “emotional contagion” via indirect written communication. Reading someone’s Facebook status is sort of the equivalent of overhearing a conversation on the bus; it wasn’t directed at you but it still affects you in some way.

The study used Linguistic Inquiry and Word Count (LIWC) software to determine the positive or negative emotion of posts, and then conducted two parallel studies: one group had the percentage of positive posts in their feeds increased and negative posts decreased, and the other had the opposite. The study found that people exposed to more positive posts were more inclined to post positive content themselves, and people exposed to more negative posts were inclined to post negative content.
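The approach boils down to counting emotion words against a dictionary. Here’s a toy sketch in that spirit; the word lists below are tiny, made-up stand-ins (the real LIWC dictionaries are proprietary and far larger), and this is not the study’s actual code.

```python
# Made-up miniature emotion dictionaries, standing in for LIWC's.
POSITIVE = {"happy", "great", "love", "awesome", "fun"}
NEGATIVE = {"sad", "angry", "hate", "terrible", "lonely"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by comparing
    counts of dictionary words found in the text."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("What a great, fun day!"))  # prints "positive"
print(classify_post("Feeling sad and lonely"))  # prints "negative"
```

Pure word counting like this is crude: it misses sarcasm, negation (“not happy”), and the posturing mentioned below, which is worth keeping in mind when reading the study’s claims.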

This observation, and the fact that people were more emotionally positive in response to positive emotion updates from their friends, stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively, for example, via social comparison.

The data scientists behind the experiment were pretty chuffed with this result, but perhaps a little too confident in the authenticity of people’s Facebook emotions. After all, status updates and posts are often tweaked to get attention (cue the duckface selfie).

The Shock Factor

Whether you believe people’s statuses or not, the evidence from this study is probably not all that surprising. Smiles can be infectious and so can tears, big shocker there. What is surprising to many Facebook users and members of the tech community is that Facebook can conduct such a large experiment on its users without notifying them before or after their news feeds had been tampered with.

The experiment has sparked a global conversation about the ethics of social platforms toying with users’ emotions. While Facebook argues it was done in an effort to improve its algorithms, this provides little comfort to those who may have actually felt the negative effects of the study, not to mention the idea of using people like lab rats in an experiment conducted without their permission or awareness.

Facebook data scientist Adam D. I. Kramer responded to widespread anger and ethical concerns in a public post on his Facebook page earlier today.

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.

And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.

Kramer wrote and designed the experiment and said all of Facebook’s research was aimed at creating a better user experience. All Facebook users have to agree to Facebook’s Data Use Policy when they sign up, so technically they have given permission to be involved in such studies. However, playing with someone’s emotions is obviously a more serious matter.

Kramer says he and the other researchers at Facebook have come a long way since the 2012 study, and that Facebook’s review practices will take the negative reaction to it into account. Whether or not your news feed was tampered with in January 2012, or at any other time, is likely to remain a mystery… Unlike the cause of that lethal virus in Contagion. That was caused by bats.