
First POST: Contagious

BY Micah L. Sifry | Monday, June 30, 2014


  • For one week in January 2012, researchers at Facebook deliberately skewed the News Feed content of nearly 700,000 users: some were shown content deemed to contain more happy words, others more that was sad. A week later, users were somewhat more likely to post especially positive or negative content themselves, depending on how their feeds had been skewed, according to a new study by Adam Kramer, Jamie Guillory and Jeffrey Hancock, published by the Proceedings of the National Academy of Sciences. The news set off quite a firestorm of commentary over the weekend.

  • The research study was first noticed by the New Scientist late last week and then Sophie Weiner of Animal New York tore into it as "manipulation."

  • Soon, Kashmir Hill of Forbes and Robinson Meyer of The Atlantic piled on, each questioning the ethics of the research, with Hill damning Facebook for abusing its terms of service (which do allow usage of user data for research purposes) and Meyer discovering that the researchers did not have formal Institutional Review Board approval to experiment on human subjects.

  • Slate's Katy Waldman was quick to call the experiment unethical, declaring that "Facebook intentionally made thousands upon thousands of people sad." This appears to be a bit of an exaggeration.

  • On Facebook, the lead researcher Adam Kramer defended his work, noting that the actual manipulation of people's News Feeds was very small, and that the effect it produced was the average of "one fewer emotional word, per thousand words, over the following week." He also wrote that the intention of the research was not to see if Facebook could manipulate its users' emotions, but whether seeing lots of positive content might make users feel more negative about themselves, and whether seeing more negative content might make them visit the site less.

  • Psychoinformatics research professor Tal Yarkoni also penned a defense of Facebook and the research study, pointing out that the News Feed "is, and has always been, a completely contrived environment" and that "Facebook is constantly manipulating its users' experience."

  • In response, technosociologist Zeynep Tufekci zinged Yarkoni for his seeming "resignation to online corporate power" and the "new tools and stealth methods" they have to "quietly model our personality, our vulnerabilities, identify our networks, and effectively nudge and shape our ideas, desires and dreams."

  • Tufekci is right to point to the larger implications of this Facebook controversy. Changing people's emotional experience is a temporary effect--but one that could certainly be abused in all kinds of ways. In 2012, Facebook may well have tilted the election toward Barack Obama by placing "I Voted" buttons on the pages of its 160 million users, boosting their turnout slightly. (Since Facebook's users are disproportionately young, female and urban, a general increase in voting by its users helped Obama more than Romney, as I point out in my book The Big Disconnect.) It's troubling that the research on that experimental manipulation--which built on a 2010 experiment on 61 million users--has yet to be released. Combine Facebook's capacity to change what some users see in their News Feed with its ability to nudge voter turnout upward, and you have the power to skew elections.

In other news around the web:

  • Flying above the NSA's data center in Utah Friday: a blimp labeled "NSA Illegal Spying Below" paid for by the Electronic Frontier Foundation, Greenpeace and the Tenth Amendment Center.

  • Twenty additional groups joined with those three in launching "Stand Against Spying," a new campaign urging Americans to check where their Members of Congress stand on mass surveillance and call them. Members of the coalition include the Sunlight Foundation, FreedomWorks, Free Press Action Fund, Fight for the Future, Demand Progress, Reddit, Upworthy, TechFreedom and the Libertarian Party.

  • The new NSA director Michael Rogers tells David Sanger of the New York Times that the sky isn't falling as a result of Edward Snowden's disclosures.

  • In the New Yorker, Amy Davidson explains how the Supreme Court's decision in last week's Riley v. California case could affect NSA surveillance jurisprudence.

  • In Slate, Selina MacLaren points out that most of the Supreme Court's justices are pretty confused about technology.

  • David Carr devotes his Monday column to the thriving world of email newsletters, those handy missives that collect and organize content for busy readers.

  • Derek Willis points out that the lost IRS emails are just the tip of a bigger problem in government today: federal agencies hampered "both by outdated and expensive computing infrastructure and by regulations that won't require modern storage and retrieval techniques until the end of 2016 at the earliest."