Facebook has violated your privacy. Well, okay, that's nothing new. Facebook's on-again, off-again relationship with your personal details is a rocky road that always seems to land just on the wrong side of okay. That is not news. This, though, is something different. And this is HUGE.
Facebook's Covert Experiment
Unless you’ve spent the last few days under a rock (or under a beer and a barbecue, as it was the Canada Day long weekend), you’ve probably heard about Facebook's Emotional Manipulation Study. That’s right! What has long been suspected is now proven beyond a shadow of a doubt: Facebook is manipulating your emotions! It’s all thanks to this man, Adam Kramer.
He’s a data scientist at Facebook. He ran an experiment back in 2012 on 689,003 Facebook users. His goal? To find out whether and how emotions can spread through social networks. The term the researchers used was ‘emotional contagion.’ The study lasted a week and was a collaboration with two academic researchers, one from Cornell University and one from the University of California. How did they conduct it? Well, some participants had positive posts filtered out of their news feeds, while others had negative posts filtered out. And yes, it turns out that moods are contagious and can spread through social media. In other words, people have empathy, and it can spread through social media. From the abstract:
When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that … the observation of others’ positive experiences constitutes a positive experience for people.
Scientifically speaking, the study is unique in some ways and pretty ordinary in others. The sample size of nearly 700,000 people is truly huge, possibly the largest in the history of psychology. The results are interesting in that they show that very small changes in the emotional tone of our environment can affect how we feel and act on our social networks. On the other hand, the effects found in the study are tiny, among the smallest statistically significant results ever published. Here is the disturbing part, though: Facebook didn’t ask for permission before conducting this study! In fact, the only reason anyone knows about it is that Cornell and its collaborators published their findings in a journal last week!
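How can an effect be both tiny and statistically significant? With a sample this large, even a minuscule difference clears the significance bar. Here is a minimal back-of-the-envelope sketch of that reasoning, using a one-sample z statistic and an illustrative effect size of 0.02 (an assumed number for demonstration, not a figure from the study itself):

```python
import math

def z_statistic(effect_size: float, n: int) -> float:
    """One-sample z statistic for a standardized mean effect
    (Cohen's d) at sample size n: z = d * sqrt(n)."""
    return effect_size * math.sqrt(n)

n = 689_003  # the study's reported sample size

# A tiny standardized effect (illustrative, not the study's actual figure).
tiny_d = 0.02

# At the study's scale, this tiny effect is overwhelmingly "significant"
# (|z| > 1.96 corresponds to p < 0.05 in a two-sided z-test).
z_big = z_statistic(tiny_d, n)
print(f"d = {tiny_d}, n = {n:,} -> z = {z_big:.1f}")

# The very same effect in a typical lab-sized sample of 100 people
# would be nowhere near detectable.
z_lab = z_statistic(tiny_d, 100)
print(f"d = {tiny_d}, n = 100 -> z = {z_lab:.2f}")
```

The point: statistical significance at this scale says the effect is real, not that it is large.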
When the study was published, people were rightfully angry to find out that a social media site had, essentially, manipulated users' emotions for science without asking them first! Many critics argued that Facebook should have acquired informed consent before conducting a study like this. Facebook responded by saying that it did have permission, via its Data Use Policy. The researchers took advantage of the fine print in that policy to conduct the experiment without informed consent. Even though the academic researchers worked with Facebook when designing the study, it appears they obtained ethical approval only after data collection was finished. Since the data had already been collected, the ethics committee that granted approval seems to have awarded an “approval lite.” It looks as if the Facebook researchers exploited an ethical loophole. More disturbing still, no age filter was set on the study, so it could very well have included users younger than 18.
Informed consent is a core principle of human research ethics, established in the aftermath of the second world war. In important cases where the question is deemed vital and consent isn’t possible (or would prevent a fair test), it can be legally bypassed. But this is rare, and it is doubtful whether the Facebook study would qualify for such an exemption.
Here’s the kicker, though: it wasn’t until four months after the study, in May 2012, that Facebook made changes to its data use policy. That’s when it introduced the line about how it might use your information for research.
Facebook is facing more than just user backlash following news that the social network intentionally manipulated user emotions during the 2012 experiment. The Information Commissioner’s Office (ICO), the data regulator in the United Kingdom, announced on Wednesday that it is investigating whether Facebook violated data protection laws. At this point, the ICO is unsure whether Facebook actually broke any laws. However, since Facebook’s European headquarters is based in Dublin, the agency plans to contact Ireland’s data protection authority about the matter, according to the Financial Times.
The ICO can “force organizations to change their policies and levy fines of up to 500,000 British pounds [about $857,000],” the FT wrote.
Financially, that fine won’t really hurt Facebook. After all, it brought in $2.5 billion in revenue last quarter. Any monetary fine would instead be symbolic.
“It’s clear that people were upset by this study and we take responsibility for it. We want to do better in the future and are improving our process based on this feedback,” a Facebook spokesperson wrote in a statement given to Mashable. “The study was done with appropriate protections for people’s information and we are happy to answer any questions regulators may have.”
Sheryl Sandberg, Facebook's COO, issued an apology on Wednesday, admitting that Facebook ‘communicated really badly’ about its experiment on users. Despite all this fallout, privacy experts in the U.S. say it is unlikely that Facebook broke any laws.
An official complaint has also been filed with the US Federal Trade Commission by the digital rights group the Electronic Privacy Information Center (EPIC). Facebook said it has no comment to make about the complaint. EPIC wants Facebook to pay damages and to hand over the algorithm underlying the work.