In 2012 Facebook tweaked the algorithm to manipulate the emotional content appearing in the newsfeeds of 689,003 randomly selected, unwitting users. Posts were identified as either 'positive' (awesome!) or 'negative' (bummer) based on the words used. In one group, Facebook reduced the positive content of news feeds, and in the other, it reduced the negative content. 'We did this research because we care about the emotional impact of Facebook and the people that use our product,' Kramer says. 'We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.'

Did tinkering with the content change the emotional state of users? Yes, the authors discovered. The exposure led some users to change their own behaviours: the researchers found that people who had positive words removed from their feeds made fewer positive posts and more negative ones, and vice versa. It could have been an online version of monkey see, monkey do, or simply a matter of keeping up with the Joneses. 'The results show emotional contagion', Adam Kramer and his co-authors write in the academic paper.
Excerpt from: Who Can You Trust?: How Technology Brought Us Together – and Why It Might Drive Us Apart by Rachel Botsman