Facebook experimented on 689 000 users, and they didn’t know


Just when you thought that Facebook couldn’t get any creepier, it goes and publishes the results of a study in which it deliberately manipulated the emotions of a large chunk of its users and discovered it could make them happy or sad on demand. Unsurprisingly, Facebook users are not terribly happy to learn that they’ve been unwitting guinea pigs in large-scale experiments on their emotions. Equally unsurprisingly, Facebook doesn’t seem to care.

The study was conducted by Facebook’s data science team in 2012, and the full results have been published here. Over the period of a week, Facebook altered the algorithm that decides which posts show up in the News Feed for 689,003 of its users – about 0.04% of the total. The altered algorithm removed emotionally negative posts for one group and emotionally positive posts for another.

The experiment found that Facebook users were more likely to write emotionally positive posts if they were exposed to positive posts more often, and that the opposite held true as well. In short, it suggests that human emotions can be contagious even without face-to-face interaction between individuals.

Addressing concerns over Facebook’s mass experimentation on and potential manipulation of its users, Adam Kramer, a data scientist at Facebook and co-author of the study, posted a public explanation of the reasoning behind it. “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” Kramer said. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.”

While Facebook’s News Feed has always involved manipulating users’ timelines to surface the posts Facebook thinks are most relevant to them, the experiment has caused quite an outrage online. After all, given the size of Facebook’s audience, it opens the door to all kinds of abuse. Could an advertiser pay to make you emotionally receptive to its ads, or a political party associate itself with feelings of warmth? Targeting banners by keyword and subject matter is one thing; manipulating your state of mind while you’re being marketed to is a whole other level of ethical debate.

Facebook’s explanation is that every user who signs up agrees to the same terms of use, which include a clause stating that data can be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement”. For his part, Kramer says he never meant to upset anyone.

One of the first sites to report on the study, The Atlantic, pointed out that it may not meet academic standards, and that the findings may be a statistical blip. It quotes the editor of the journal Proceedings of the National Academy of Sciences of the United States of America – which published the report – as saying that she found it “creepy”.

Clay Johnson of Blue State Digital, the company that built and managed US president Barack Obama’s online campaign for the presidency in 2008, took to Twitter in the wake of the revelations to offer some rather pointed, and frankly quite disturbing, insight.

[Source – PNAS, Via – The Guardian]

David Greenway


David is a technology enthusiast with an insatiable thirst for information. He tends to get excited over new hardware and will often be the one in the room going "It's got 17 cores, 64GB of RAM and a 5" 4K flexible OLED display, oh it makes phone calls too?" Currently uses: Too many phones. Wants: World peace... and more phones.