I am talking of the huge furore online at this point over Facebook's manipulation of user feeds two years ago as part of a research project. Here's a little background. The hullabaloo started with a recent report in New Scientist, a weekly non-peer-reviewed international science magazine, which said that in January 2012, the social network altered the number of positive and negative posts that almost 700,000 randomly selected users saw in their news feeds. According to the study, Facebook manipulated the feeds to test whether online emotions are contagious - that is, whether online messages influence readers' "experience of emotions", which may, in turn, affect their offline behaviour.
For a week, some users saw feeds from which posts containing positive words were partly filtered out, while for others it was posts with negative sentiments that were reduced. The outcome of the study, published on June 17 in the Proceedings of the National Academy of Sciences, showed that people shown fewer positive words were prone to writing more negative posts, while the opposite happened with users who were exposed to fewer negative terms.
The whole thing seems to have blown up in Facebook's face, with users turning to every other social networking platform, including Twitter, to vent their anger. Almost every post I have read views the research as a breach of privacy. Such is the outrage that a Facebook researcher involved in the project has had to tender a public apology. Adam Kramer, a Facebook data scientist and one of the authors of the study, wrote on his Facebook page on Sunday that the team was "very sorry for the way the paper described the research and any anxiety it caused."
Facebook users' ire is understandable. A New York Times post rightly points out that while "academic protocols generally call for getting people's consent before psychological research is conducted on them", Facebook didn't ask for explicit permission from those it selected for the experiment.
That said, there are two things here. First, this isn't the first time the Menlo Park, California-based company has used the data available with it for a "social experiment". Nor is it the only one tinkering with such data. Google and Yahoo! routinely track how people interact with search results or news articles and tweak what is shown. All of them say this improves the user experience, makes the site more engaging and so on.
And if you are not already aware, Facebook does it too. In 2012, MIT Technology Review reported that Mark Zuckerberg himself had used Facebook user data for some personal experiments. That report suggested that Zuckerberg tapped the social influence of Facebook to increase the number of registrations for organ donation. It was a clever move - users were given the option to click a box on their Timeline pages to indicate whether they were registered donors, and that click would send a notification to people on their "Friend" list. The article goes on to suggest that this feature created a sort of social pressure to register as an organ donor, and enrolment numbers zoomed. But in my opinion, that move was even spookier than the latest research.
Second, as the site has argued in its defence, users' concerns about privacy are misplaced. The company has said none of the data in the study was associated with a specific person's account, that the experiment was undertaken to make the site's content more alluring and relevant, and that, in any case, when users sign up for Facebook and agree to its terms of service they consent to this kind of manipulation. That argument has not gone down well, but that's a different story.
Now come to the research itself and you will see why I call this study lame. Academics have trashed it - as you can read in the many blog posts about it - on several grounds. For instance, Tal Yarkoni, a psychology research associate at the University of Texas at Austin, says that the fact that users in the experimental group ended up producing content that was slightly more positive or slightly more negative doesn't mean those users actually felt differently. "It's entirely possible - and I would argue, even probable - that much of the effect was driven by changes in the expression of ideas or feelings that were already on users' minds," he writes.
Some others have questioned the methodology used in the study. Dr John Grohol, founder of the psychology site Psych Central, says the study doesn't really measure the moods it proposes to capture. If the researchers had wanted to make the study exhaustive, they would have had to go to Facebook users and have them fill out a proper questionnaire. "Instead the authors were making strange judgement calls based on content of status updates to predict a user mood," he says, adding that one needs a different tool or survey to accurately gauge something as complex as emotional state.
In other words, so much for nothing!