Facebook’s secret experiment on users comes under fire

Jun 30, 2014 | Digital marketing skills, Facebook marketing, Social media


Facebook has come under fire for conducting a secret experiment on some of its users, manipulating their news feed to see how they would react emotionally.


For one week in 2012 Facebook tampered with the algorithm used to place posts into user news feeds to study how this affected their mood.
The study, conducted by researchers affiliated with Facebook, Cornell University, and the University of California at San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.
The experiment was carried out on about 700,000 users without their knowledge, prompting outrage from some people.
News of the study spread, along with anger and disbelief, when the online magazine Slate and The Atlantic website wrote about it on Saturday.
“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the study authors wrote.
“These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
User anger has been growing.
‘#Facebook MANIPULATED USER FEEDS FOR MASSIVE PSYCH EXPERIMENT… Yeah, time to close FB acct!’ one Twitter user said.
Other tweets used words like ‘super disturbing’, ‘creepy’ and ‘evil’ to describe the psychological experiment.
In a statement responding to the outrage, Facebook said that ‘none of the data used was associated with a specific person’s Facebook account’.
‘We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible,’ it added.
Susan Fiske, a Princeton University professor who edited the report for publication, said the researchers assured her the study had been approved by an ethics review board.
In an emailed statement to news agency AFP, she said: “They approved the study as exempt, because it is essentially a pre-existing dataset, part of FB’s ongoing research into filtering users’ news feeds for what they will find most interesting. Many ethical issues are open to debate, and this one seems to have struck a nerve.”
View the study here
