Facebook under fire for conducting secretive emotions study

  • Though not clearly illegal, the study may have breached ethical and moral lines
  • Such studies are common; users should be aware that all Internet services do likewise

NEWS ANALYSIS: The recent move by Facebook to conduct a sociological experiment on a select group of its users has drawn ire and criticism from academia, analysts and social media professionals, who described it as callous, unethical and irresponsible, although perhaps not outright illegal.
 
Last week, the Menlo Park, California-based social media giant published details of a vast experiment in which it manipulated information posted on users’ pages and found it could make people feel more positive or negative, according to The Guardian.
 
The findings were published in the Proceedings of the National Academy of Sciences.
 
In the experiment, conducted in conjunction with researchers from two highly rated US universities, Cornell and the University of California, Facebook filtered users’ news feeds – the flow of comments, videos, pictures and web links posted by others in their social network – in a bid to understand how different people react to various content.
 
The methodology involved Facebook selecting 689,000 users and using Linguistic Inquiry and Word Count (LIWC) software to anonymously analyse about three million posts from these users.
 
The posts contained some 122 million words, four million of which were positive, and 1.8 million of which were negative. None of the words were actually seen by researchers, the social media giant claimed.
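To illustrate how such word-count software operates, below is a minimal Python sketch of dictionary-based sentiment scoring in the spirit of LIWC. The word lists and the classify_post function are illustrative stand-ins invented for this article; LIWC’s actual dictionaries are proprietary and far larger, and the study’s real pipeline is not public.

```python
# Minimal sketch of dictionary-based sentiment scoring, LIWC-style.
# The word lists below are illustrative stand-ins, not LIWC's
# proprietary dictionaries.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify_post(text: str) -> str:
    """Label a post by counting positive vs negative dictionary words."""
    words = text.lower().split()
    pos = sum(1 for w in words if w in POSITIVE_WORDS)
    neg = sum(1 for w in words if w in NEGATIVE_WORDS)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

if __name__ == "__main__":
    posts = [
        "I love this wonderful day",
        "Traffic was terrible and I am so angry",
        "Just had lunch",
    ]
    for post in posts:
        print(f"{classify_post(post):>8}: {post}")
```

Because no human needs to read the text, counting like this can be run over millions of posts automatically, which is how Facebook could claim that none of the words were seen by researchers.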
 
Facebook justified the experiment by saying that users’ agreement with the social network’s terms and conditions when creating their accounts constituted “informed consent on this research.”
 
The Facebook study revealed that when the number of positive posts in a user’s news feed was reduced, the user posted fewer positive posts and more negative posts.
 
Facebook cited this finding, among others in the same research, as evidence that social networks can spread “emotional contagion,” or the wide-scale transfer of positive or negative emotions between users.
 
Put simply, Facebook manipulated users’ news feeds and sought to see how they reacted, whether positively or negatively. But it did so while its users believed they were seeing an unfiltered stream of their friends’ activity, when in actuality their feeds had been tampered with in a bid to measure their reactions.
 
The Facebook study concluded that “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”
 
Critical reception
 
Initial reaction to the Facebook experiment was largely critical, with academics and industry practitioners roundly criticising the way in which the study was undertaken, rather than its objectives per se.
 
Critics argued that while the Facebook study may not have crossed lines of legality, it certainly breached the ethical and moral responsibilities the social network ought to have towards its users.
 
The New York Times pointed out that Facebook is not the only company that manipulates and analyses user data – Google and Yahoo also watch how users interact with search results or news articles to adjust what is shown – but noted that Facebook’s test did not appear to have such directly beneficial purposes.

The Times quoted Brian Blau, an analyst with Gartner, as saying, “Facebook didn’t do anything illegal, but they didn’t do right by their customers. Doing psychological testing on people crosses the line.”
 
James Grimmelmann, a professor of technology and the law at the University of Maryland, was highly critical of the study and took exception to Facebook’s claim that agreeing to its terms constituted “informed consent.”
 
Writing in his blog, Grimmelmann said the standard of consent for Facebook’s terms of service is low.
 
“But that ‘consent’ is a legal fiction, designed to facilitate online interactions,” he stressed. “It’s very different from informed consent, the ethical and legal standard for human subjects research.”
 
Grimmelmann argued that Facebook’s actual Data Use Policy contains no clauses specific to human subjects research, only general statements that are not specific enough to constitute informed consent.
 
The law professor also took issue with Facebook’s claim that no text was seen by human researchers because the study used software algorithms to gain its insights.
 
Grimmelmann stressed that this claim misses the point: in an observational study, automated data processing may be a meaningful way of avoiding privacy harms to research subjects, but that is because “in an observational study, the principal risks to participants come from being observed by the wrong eyes.”
 
“This, however, was not an observational study. It was an experimental study… the unwitting participants were told, seemingly by their friends, for a week either that the world was a dark and cheerless place or that it was a saccharine paradise. That’s psychological manipulation, even when it’s carried out automatically.”

Beyond these arguments, there were also concerns that the study could have affected the emotional health of users, said Pamela Clark-Dickson, senior analyst for consumer services at London-based analyst firm Ovum.
 
She said the way the experiment was conducted could have had tragic consequences for unwitting participants who may already have been suffering from poor mental health, but who were randomly selected to receive a higher proportion of negative posts than usual.
 
“In Britain and in the United States, for example, about one in four people will suffer from a mental illness in any given 12-month period. Given the nature of this particular research, Facebook should have sought explicit consent from participants prior to embarking upon it,” she said.
 
Adam Kramer, Facebook’s lead researcher on the study, said that he understood why users had “concerns” about the research and duly apologised for the debacle, though he also attempted to defend the study.
 
Kramer also did not address the issue of “informed consent” which Grimmelmann raised.
 
“We did this research because we care about the emotional impact of Facebook and the people [who] use our product,” Kramer said in a blog post.
 
“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone.
 
“I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
 
The study’s editor, Susan Fiske, however, defended Facebook in a Bloomberg report, saying the authors had told her the study passed Cornell’s ethical review for human subjects, and that she would not “second-guess the Cornell review board.”
 
“People are relating to Facebook as if it has betrayed their trust,” she said. “The level of reaction is understandable. That doesn’t mean what Facebook or Cornell did is unethical.”
 
All the same, users beware
 
Social media professionals Digital News Asia (DNA) spoke to said the experiment may have an immediate negative impact on the company, but they were not completely convinced that its users would abandon the service altogether.
 
Jeremy Woolf, senior vice president and global social media practice lead at media relations firm Text 100, said social media companies have experimented with data on their sites in the past, but added that such experiments have largely been limited to aggregating profile data and surprising unsuspecting users as a way of demonstrating how much information they are giving away.
 
“In my understanding of Facebook’s user agreement, it didn’t cross a legal line as users have consented to this type of thing,” he said via email to DNA. “The real question is whether they crossed a moral line.”
 
Woolf believes that this debacle could be perceived as an abuse of both power and trust, and that the experiment may further erode people’s confidence in Facebook’s ability to serve them accurate information.
 
That said, he noted that while there may be a potential backlash and some people may opt out of Facebook, this was not the first time the social network giant had tested its user base, adding that the impact would be minimal and short-term.
 
David Lian, general manager at digital marketing firm Zeno Group Malaysia, said users must better understand how easy data collection and organisation have become in today’s Internet-based economy.
 
Lian said every time users click a link on Facebook or other Internet sites, these sites learn the kind of content users like and will show them more of the same.
 
“I believe this should be a wake-up call to people to really understand what they are signing on to when agreeing to terms of service,” he told DNA via email. “A lot of times, we treat user agreements as mere text that we’ll just agree to anyway because we want to use the service.”
 
However, Lian said he disagreed with any direct manipulation of information, as in the case of the Facebook study, without users understanding that they have signed up to be part of a research project.
 
Asked what social media consumers can do about such situations, Lian said it is impossible to totally stop data from being collected in today’s online world.
 
He said he believes that anonymised data collection could be a good thing as more scientists would be able to tap into this data to understand humans and make life better.
 
“Unless you go all Luddite, users [just] need to be aware of what data is outgoing. The nice companies will ask you for your permission to use that data, but you can be sure there will be many others who won’t,” said Lian.
 
Meanwhile, Woolf advised, “Read the terms of service and don’t sign on if you fear this type of experiment. Or, if you’re offended, vote with your feet and opt out of Facebook [or any kind of Internet service].”
 

 