Facebook manipulated users’ moods in secret experiment
Facebook manipulated the emotions of hundreds of thousands of its users and found that they would pass on happy or sad emotions, the company has said. The experiment, for which researchers did not obtain specific consent, has provoked criticism from users with privacy and ethical concerns.
For one week in 2012, Facebook skewed nearly 700,000 users’ news feeds to be either happier or sadder than normal. The researchers found that, once the week was over, users tended to post positive or negative comments in line with the skew that had been applied to their news feeds.
The research has provoked distress because of the manipulation involved.
Studies of real-world networks show that what the researchers call ‘emotional contagion’ can be transferred through networks. But the researchers say this study is the first evidence that the effect can happen without direct interaction or nonverbal cues.
Anyone who used the English version of Facebook automatically qualified for the experiment, the results of which were published earlier this month. Researchers used automated word analysis to decide whether posts were likely to be positive or negative, and then reduced the proportion of one type or the other shown, according to which group each user fell into.
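As a rough illustration of that word-counting approach, the sketch below classifies a post by matching its words against small positive and negative word lists. The word lists, threshold, and function name are hypothetical; the study itself relied on established word-count software rather than this simplified version.

```python
# Hypothetical sketch of word-list sentiment classification. The word lists
# below are invented examples, not the ones used in the study.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by counting matching words."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("So happy and excited about the weekend!"))  # -> positive
```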
It found that emotions spread across the network, and that friends tended to respond more to negative posts. Users who were exposed to fewer emotional posts of either type tended to withdraw from posting themselves.
The research drew criticism from campaigners over the weekend, who said that it could be used by Facebook to encourage users to post more, and by other agencies, such as governments, to manipulate the feelings of users in certain countries.
Even the scientist who edited the study had ethical concerns about its methods, she said. “I think it’s an open question,” Susan Fiske, professor of psychology at Princeton University, told the Atlantic. “It’s ethically okay from the regulations perspective, but ethics are kind of social decisions. There’s not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn’t have been done… I’m still thinking about it and I’m a little creeped out too.”
Facebook’s ‘Data Use Policy’ — part of the Terms Of Service that every user signs up to when creating an account — reserves the right for Facebook to use information “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
The researchers said that this constituted the informed consent required to conduct the research and made it legal. The study does not say that users were told of their participation in the experiment, which researchers said was carried out entirely by computers so that no human saw the posts themselves.
Facebook has said that there are on average 1,500 possible stories that could show up on users’ news feeds at any one time. It uses an algorithm that it says analyses users’ behaviour on the site to determine which of those stories to show.
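Facebook has not published how that selection works, so the following is a purely hypothetical sketch of how a feed algorithm might narrow roughly 1,500 candidate stories down to the ones actually shown; the signals, weights, and names are invented for illustration.

```python
# Purely hypothetical feed-ranking sketch. The scoring signals and weights are
# invented for illustration; Facebook has not published its actual formula.
from dataclasses import dataclass

@dataclass
class Story:
    story_id: int
    friend_affinity: float  # how often the user interacts with the poster
    engagement: float       # likes and comments the story has attracted
    recency: float          # newer stories score higher

def rank_feed(candidates: list[Story], limit: int = 300) -> list[Story]:
    """Score each candidate story and keep only the top `limit` for display."""
    def score(s: Story) -> float:
        return 0.5 * s.friend_affinity + 0.3 * s.engagement + 0.2 * s.recency
    return sorted(candidates, key=score, reverse=True)[:limit]
```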
— source independent.co.uk
Facebook conducted an experiment into how content can affect emotions by manipulating the feeds of over 600,000 unsuspecting users.
The recently published paper, ‘Experimental evidence of massive-scale emotional contagion through social networks’, explained that Facebook changed the tone of users’ feeds by skewing the number of positive and negative terms they saw.
Facebook then monitored the effect it had on users’ emotions.
“The results show emotional contagion,” wrote the team of Facebook scientists in the paper.
“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
Facebook was able to carry out the experiment in January 2012 without informing the 600,000 users of their involvement, as they had ticked a box agreeing to the social network’s terms and conditions, which permit data to be used for “internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
However, Facebook points out that an algorithm picked out positive and negative posts, meaning no personal information or user data was viewed by human researchers.
— source thedrum.com
Facebook modified news feeds of 689k users for a week
A research paper published in the Proceedings of the National Academy of Sciences reports that Facebook altered the news feeds of more than 600,000 users for a week, without their knowledge, to see the impact of positive and negative posts in their feeds.
According to the research “in-person interaction and nonverbal cues are not strictly necessary for emotional contagion.”
To be precise, Facebook targeted 689,003 users for the experiment and manipulated their news feeds by exposing them to positive and negative posts.
The users were divided into two groups: one set was exposed to negative posts and the other to positive posts. Facebook then monitored their posts to see whether this affected their behaviour.
Disturbingly, it did. Strictly speaking, the social network did not change what users could see; rather, it used an algorithm to selectively choose what they saw, and this proved a powerful tool for changing user behaviour.
When the selective algorithm was launched, Facebook claimed it would allow users to see more of the content they would like. However, it did not reveal that the algorithm also gave Facebook control over what users saw from their own feeds.
According to the research paper, when a user loaded their news feed, each post containing emotional content of the relevant valence had between a 10 and 90 per cent chance of being removed from the feed for the purposes of the experiment. The posts, though, were still visible if the user visited a friend’s timeline directly.
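A minimal sketch of that kind of probabilistic filtering is below, assuming posts already carry a valence label and that each user is assigned a fixed omission probability between 10 and 90 per cent. Deriving the probability from the user ID is an illustrative assumption, not the paper’s stated method.

```python
import random

# Sketch of probabilistic feed filtering: each post of the targeted emotional
# valence is dropped with a fixed per-user probability between 10% and 90%.
# Deriving that probability from the user ID is an illustrative assumption.

def omission_probability(user_id: int) -> float:
    """Map a user ID to a stable omission probability in the range 0.10-0.90."""
    return 0.10 + (user_id % 81) / 100.0

def filter_feed(user_id: int, posts: list[tuple[str, str]], target: str) -> list[str]:
    """Drop each (text, valence) post matching `target` with the user's probability."""
    p = omission_probability(user_id)
    return [text for text, valence in posts
            if not (valence == target and random.random() < p)]

feed = [("Had a wonderful day!", "positive"), ("Feeling awful today.", "negative")]
print(filter_feed(user_id=42, posts=feed, target="positive"))
```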
The research paper has pointed out that such manipulation of the news feed is written into Facebook’s terms of use.
However, this is incredibly disturbing, as such manipulation of a user’s feed is almost akin to brainwashing. Facebook is the one website with enough stickiness that users are almost always logged in, since it connects them with their friends and family.
Such news feed manipulations could be used for dastardly purposes. Facebook could use them to bolster its own advertising revenue, since advertising is its primary source of income. Such manipulation is also unacceptable in a post-Edward Snowden, post-NSA-PRISM world. It is hard to fathom how such data manipulation could be used to control select groups of people, minorities, or even the masses.
— source indiatoday.intoday.in