We all have those people in our News Feeds, the ones who lead fabulous lives, who go on long foreign vacations, and are even able to fulfill that damned #100HappyDays challenge without sounding overly lame. Love them or hate them, you do end up following them, until of course jealousy or sheer annoyance compels you to click on that ‘Unfollow’ button next to their names.

The News Feed on Facebook is an intensely personal space. It isn’t like your Twitter timeline with retweets and tweets from strangers. These are people you’ve interacted with at some point in your life.

And this is perhaps why there has been so much outrage over the news that Facebook ran a little experiment in 2012, tweaking the News Feeds of 689,003 users to show them a certain kind of content. You can view the study as it was published here.

Facebook’s study, conducted along with Cornell University and the University of California, San Francisco, was trying to understand ‘emotional contagion’. According to Facebook’s Adam Kramer (one of the co-authors), the idea was to “investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.”

Kramer adds that the experiment showed that when a certain kind of emotion, positive or negative, was expressed in users’ feeds, it encouraged them to share similar things about their own lives. Facebook’s pop psychology claim: if your friends are expressing happiness on the site, eventually so will you.

While that’s good news for Facebook, not everyone is impressed with the manner in which the study was conducted. In fact, as this Atlantic piece points out, even the editor of the study was a little creeped out.

Susan Fiske, a professor of psychology at Princeton University who edited the study for the Proceedings of the National Academy of Sciences, told the Atlantic, “I was concerned until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time.”

According to Forbes, there was no institutional review board approval from the universities concerned; rather, an internal Facebook review served as the go-ahead for the study.

Facebook’s internal review aside, some users have been pointing out on social media that this isn’t the first time a tech company has used user data without explicit permission. In fact, as many news reports have noted, Facebook’s Data Use Policy clearly states that the company can use user data “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

Some argue that when you joined Facebook, you signed up for these wacky experiments, so you shouldn’t whine about it now. But the argument that “you gave your data to Facebook, so put up or shut up” overlooks the larger issue at hand, which is one of ethics.

As Fiske told the Atlantic, “I had not seen before, personally, something in which the researchers had the cooperation of Facebook to manipulate people… Who knows what other research they’re doing.”

And this is what makes the study more worrying: the entire experiment was based on tinkering with News Feeds. It shows that Facebook can, in some ways, influence how you engage with it by surfacing certain kinds of posts. Forget simple status updates like “Having a great day.” It could also show you political content from pages you like in order to draw an emotional response from you. Call it human behaviour, but not knowing that such a response was deliberately sought from you borders on creepy.

More importantly, the question of explicit consent shouldn’t be overlooked, especially in a case where Facebook is literally toying with emotions. As Robinson Meyer points out in another Atlantic piece, what is more worrying is that, unlike standard practice in studies of human behaviour, Facebook never bothered to inform users that their News Feeds had been altered, even after the study was conducted.

And Facebook doesn’t think there’s anything wrong with what it did. Forbes’ Kashmir Hill says Facebook is focusing on data use, “rather than on the ethics of emotionally manipulating users to have a crappy day for science.”

As she points out, “One usable takeaway in the study was that taking all emotional content out of a person’s feed caused a ‘withdrawal effect.’ Thus Facebook now knows it should subject you to emotional steroids to keep you coming back.”

The only good news in all this is that the study might not be entirely accurate. According to Psych Central’s Dr John Grohol, the tool Facebook used to determine what counts as positive or negative content is limited by its very nature. The study notes, “Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by Linguistic Inquiry and Word Count software (LIWC2007) (9) word counting system…”

According to Grohol, “LIWC was created to analyze large bodies of text — like a book, article, scientific paper, an essay written in an experimental condition, blog entries, or a transcript of a therapy session,” and using it to analyse tweets or status updates is a poor fit.

He adds that “it (LIWC) wasn’t designed to differentiate — and in fact, can’t differentiate — a negation word in a sentence.” For instance, if a sentence contained both a positive and a negative emotion (say, ‘happy’ prefaced by ‘not’), LIWC would count both emotions.

As he writes, this makes for “a huge difference if you’re interested in unbiased and accurate data collection and analysis,” and despite whatever ‘benefits’ Facebook claims, the study might simply not be accurate. Read more of his explanation here.
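To make Grohol’s point concrete, here is a minimal, purely illustrative sketch of a word-count classifier of the kind the study describes. The tiny word lists, the classify function, and the decision to count the negation word “not” as a negative hit are all invented for this example; this is not the actual LIWC2007 lexicon or software, only a toy version of the “contains at least one positive or negative word” rule the paper quotes.

```python
# Toy sketch of a "contains at least one positive/negative word" classifier,
# in the spirit of the approach the study describes. The word lists and the
# treatment of "not" as a negative hit are assumptions for illustration;
# this is NOT the actual LIWC2007 lexicon or code.

POSITIVE_WORDS = {"happy", "great", "love"}
NEGATIVE_WORDS = {"sad", "terrible", "not"}  # assumption: negation counted as a negative hit

def classify(post: str) -> dict:
    """Label a post positive and/or negative if it contains at least one matching word."""
    words = {w.strip(".,!?'\"").lower() for w in post.split()}
    return {
        "positive": bool(words & POSITIVE_WORDS),
        "negative": bool(words & NEGATIVE_WORDS),
    }

# A pure word-count approach cannot see that "not" flips the meaning of "happy",
# so the same post gets counted on both sides of the ledger.
print(classify("I am not happy"))       # {'positive': True, 'negative': True}
print(classify("Having a great day!"))  # {'positive': True, 'negative': False}
```

Even in this simplified form, the sketch shows why Grohol argues the method can misread short, conversational posts: a single flagged word is enough to tip the count, regardless of context.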

Accurate or not, the sentiment most commonly voiced after this experiment is that we are all, perhaps, ‘lab rats’ for Facebook, except that at times we don’t even know it.

