
Why Facebook's 'Voter Megaphone' Is the Real Manipulation to Worry About

BY Micah L. Sifry | Thursday, July 3, 2014

Two years ago, on the morning of the 2012 election in the United States, I got an email with an urgent subject line: "You should write the story of how Facebook blew an opportunity to turn out 300k voters." The sender, a veteran progressive online activist who would prefer to remain anonymous, was upset for good reason. The election was bound to be close, and as of 10am that morning he hadn't yet seen an "I'm Voting" button on his Facebook page, nor had another colleague of his. Nor was there one on my own Facebook page.

When Facebook deployed a similar "I Voted" button in 2010 and added messages in users' News Feeds showing them the names and faces of friends who said they had voted, the cumulative effect boosted turnout by at least 340,000 votes, so these activists had good reason to be concerned. Facebook had announced that it was going to do the same thing in 2012, and this time around its American user base had grown enormously, from 61 million to more than 160 million. A social and visible nudge like an "I'm Voting" button had the potential to measurably increase turnout, even more so because Facebook was including a useful tool to help people find their polling places. And yet on Election Day 2012 its deployment was far from universal. Facebook was conducting research on us.

For example, our reporter Sarah Lai Stirland said that she saw the button on her Facebook page, but after clicking it, "the whole thing disappeared, and there's no record on my account that shows that I promised to vote." My original source said back then that he only saw the "I Voted" button on his Facebook page at 3pm, "way too late to have the most effect." Ben Wikler, another progressive online organizer, told me at the time, "I saw the 'I voted' button thing, but I don't see a single story about it in my news feed, contra my experience in 2010…when it was a lot more aggressively omnipresent." Alan Rosenblatt, then the Center for American Progress' associate director for online advocacy, told Stirland, "it should be on my timeline and I should see folks using it in my newsfeed…but neither is happening."

That day, all we were able to find out from a reliable source inside Facebook was that these discrepancies were supposedly the result of ongoing research by the company's data science team. As Stirland reported for techPresident at the time, "some users will see different buttons, and others might not see the message at all," and people had been "randomly selected and placed in control groups as part of ongoing research by Facebook's data team."

Of course, that's not what Facebook had been advertising. Its Election Day post touting the "I'm Voting" feature said, "Facebook is focused on ensuring that those who are eligible to vote know where they can cast their ballots and, if they wish, share the fact that they voted with their friends."

This past week, Facebook has been battered by questions about newly published research from its data science team showing that a small change in the amount of positive or negative content users see in their News Feeds could, in the span of just a week, measurably change the emotional content of their own postings. The fact that Facebook is constantly testing and tweaking News Feed is actually of little interest. But, as commentators including Zeynep Tufekci, Kate Crawford and danah boyd have all noted, it is good that this controversy has (finally) arisen, because the far more troubling issue is whether Facebook uses its evident media power in transparent and neutral ways.

And the question of how Facebook might influence the outcome of an election--a far more potent effect than making a few thousand random users include one more sad or happy word a week in their updates--is not just an academic one. Indeed, one of the academic researchers involved in the 2012 Facebook "I'm Voting" experiment, Professor Robert Bond of the University of California San Diego, said that "it is possible" that it had the effect of helping increase Barack Obama's voter turnout more than Mitt Romney's.

Engineering the Public
As part of the research for my new book, The Big Disconnect: Why the Internet Hasn't Transformed Politics (Yet), I reached out to several of the authors of the original study of Facebook's 2010 voting experiment, which had the wonderfully memorable title, "A 61-Million Person Study in Social Influence and Political Mobilization." That study found that people who saw on Facebook that their friends said they had voted were about 2 percent more likely to report voting themselves. A two percent boost would be a huge effect, but the study's authors were only able to firmly verify from voter records (matched to people's Facebook names) that 340,000 of those people actually voted. That is, the appearance of the "I Voted" button increased turnout in 2010 by a verified 0.4 percent. People in a control group, who did not have the button on their pages, did not increase their voting participation.
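To make the scale of those numbers concrete, here is a back-of-envelope sketch (not the study's own code) of the gap between the self-reported and the record-verified effects. The group size and percentages are the figures cited above; treating all 61 million users as a single treatment group is a simplification.

```python
# Back-of-envelope arithmetic for the 2010 "I Voted" experiment, using the
# figures cited above. Simplification: the whole 61M user base is treated
# as one treatment group.

treatment_group = 61_000_000   # users in the 2010 experiment
self_report_lift = 0.02        # ~2 percent more likely to *say* they voted
verified_lift = 0.004          # ~0.4 percent lift verified against voter records

print(f"Implied self-reported extra voters: {treatment_group * self_report_lift:,.0f}")
print(f"Implied record-verified extra voters: {treatment_group * verified_lift:,.0f}")
# The published total of ~340,000 extra verified votes also credits indirect
# "contagion" among friends, which this direct arithmetic omits.
```

The gap between the two figures is the difference between what people click and what voter files confirm, which is why the verified 0.4 percent is the number worth trusting.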

Professor Bond was the lead author of that study, and the only researcher who was kind enough to respond to my inquiries. (The general lack of transparency about Facebook-based research is a recurring frustration.) And he confirmed by email what our unnamed company source had told us about the variations in Facebook's 2012 effort being a product of ongoing research, saying "You are right, we did have a similar experiment for the 2012 election. However, we are still working on it and results and details are under embargo until something gets published."

I pressed him to explain how the two experiments might have influenced turnout and whether there were differences in how different demographic groups responded to the appearance of the voting prompt. He replied, “The experiment we conducted in 2012 tests different mechanisms, so we can't say for sure” whether it had the same effect as in 2010. “While there was definitely still an overall message to encourage turnout, the differences between messages that allow us to estimate their effect was different than in 2010.” He added, “I would expect that the effect of the 2010 manipulation would have been different in 2012 simply because it is a much higher salience election.” 

In other words, we can’t directly compare Facebook’s impact on each election because one “manipulation” was in a Presidential year, when more people are naturally engaged, and the other wasn’t. Also, in 2012, the Obama campaign was highly active on Facebook, another new variable.

That said, assuming the “manipulations” used by Facebook in 2012 had a positive effect on its users’ voting behavior, it is quite likely that Facebook actually tilted the 2012 election towards Obama. This is because membership and usage of Facebook are not uniform across demographic groups in the United States. According to the Pew Internet & American Life Project, women are ten points more likely to use it than men; young people are almost twice as likely to be on Facebook as people over 65; and urban dwellers are slightly more likely to use it than rural folks. (It's no coincidence, I think, that socially progressive campaigns like the movement for marriage equality are benefiting enormously from Facebook's reach, and that companies like Upworthy are driving huge levels of engagement through Facebook.)

Assuming that the “contagion effect” is uniform, a nudge that increased voter participation by adult American Facebook users probably pushed more Obama voters than Romney voters to go vote, because Obama did better with women, young people and urban dwellers. Women were 53 percent of the overall vote, and they went for Obama over Romney by 55 to 44 percent. The youngest voters, 18- to 29-year-olds, went for Obama by 60 to 37 percent. Big-city dwellers gave Obama a whopping 69 percent of their votes; mid-sized-city dwellers gave him 58 percent.
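Here is a minimal sketch of that arithmetic. To be clear about what is assumed: the candidate splits for women are the exit-poll numbers just cited, and the men's split (Obama 45, Romney 52) is from the same 2012 exit polls; but the 55/45 female/male mix among nudged voters is my own hypothetical round number reflecting Pew's gender gap in Facebook use, and the 340,000 total is borrowed from 2010 purely for scale.

```python
# Illustrative sketch: a turnout nudge that is uniform per user still tilts
# the net vote if the user base skews toward groups favoring one candidate.
# ASSUMPTIONS: 55/45 gender mix of nudged voters is a hypothetical round
# number; 340,000 extra voters is borrowed from the 2010 result for scale.

extra_voters = 340_000  # hypothetical size of a uniform 2012 nudge

# group -> (share of nudged voters, Obama share, Romney share)
groups = {
    "women": (0.55, 0.55, 0.44),  # candidate split from 2012 exit polls
    "men":   (0.45, 0.45, 0.52),  # candidate split from 2012 exit polls
}

obama  = sum(extra_voters * share * o for share, o, _ in groups.values())
romney = sum(extra_voters * share * r for share, _, r in groups.values())
print(f"Net Obama advantage from a uniform nudge: {obama - romney:,.0f} votes")
# ~10,000 net votes: small nationally, but the same mechanism compounds as
# the platform's demographic skew toward one candidate's voters grows.
```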

When I asked Professor Bond directly if it were possible that the Facebook contagion effect helped increase Obama’s vote more than Romney’s, he answered, “I would say that it is possible, but that we didn't test for this at all and it would be quite difficult to tell for sure." He noted that "Democrats and Republicans seem to have been equally responsive to the [2010] treatment" but also that they observed variations in the contagion effect by demographic group.

Of course, Obama won the 2012 election by a comfortable margin; it's not likely Facebook's friendly nudge changed the results of any state electoral contest. But here's what is so worrisome about the company's "I'm a Voter" initiative. What if Facebook were to place that nudge only on the pages of users who said they were Democrats? Presumably someone would notice and sound the alarm. But what if the nudge were only on the pages of Democrats in one key swing state? Or, what if the button were on everyone’s page, but Facebook changed its News Feed to limit the sharing of that news to a targeted group? Would that kind of manipulation be as easy to uncover? The company already limits the number of users who will see any given post in their feeds; you have to pay extra to “promote” it to everyone else.
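Consider how little machinery such targeting would require. The sketch below is purely hypothetical; none of it is Facebook's code, and the User type and its fields are invented for illustration.

```python
from dataclasses import dataclass

# Purely hypothetical sketch of the scenario above. The point: gating the
# nudge on a one-line predicate would be invisible from the outside, since
# each user sees only their own page.

@dataclass
class User:
    party: str  # hypothetically inferred political affiliation
    state: str

def show_voter_button(user: User) -> bool:
    # A neutral deployment would simply: return True
    # A targeted one needs only a different predicate, e.g. Democrats
    # in a few swing states:
    return user.party == "D" and user.state in {"OH", "FL", "VA"}

print(show_voter_button(User("D", "OH")))  # True
print(show_voter_button(User("R", "OH")))  # False
```

Detecting that difference from the outside would require something like a demographically representative panel of users comparing what each of them actually saw, which is exactly the kind of monitoring that doesn't yet exist.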

How exactly might independent reporters verify that Facebook indeed did what it said it did on a couple hundred million of its users’ pages? One of the least-noticed implications of our new age of data-intensive politics is that one side has nearly all the marbles. Until the media and other observers develop the tools to independently monitor the uses of Big Data by third-party platforms (as well as campaigns), the integrity of the process will rest entirely on the honesty of the data scientists and engineers inside these organizations, for only they will know if they are playing fairly. And we already know that campaigns will do whatever they think they can get away with to win. Will the big publicly traded corporations that are today’s new media platforms behave more ethically?

Companies like Facebook are also political animals. They have political action committees and spend millions on lobbyists in Washington and many state capitals. Occasionally, as in the fight against the SOPA and PIPA bills, they have used the immense convening power of their platforms and home pages to rally users to take a stand. To some extent they are a lot like traditional media companies, though they are a lot less transparent about their internal processes; the average news division at a big TV network has much clearer procedures for keeping its operations separate from its corporate parent’s lobbying efforts than these tech giants do. Of course, if a platform like Facebook hosts a live townhall-style meeting for a presidential candidate, it risks appearing one-sided, and that concern has led the company to offer similar opportunities to the other side. Most corporations avoid taking sides in partisan fights because they don’t want to risk alienating half their customers. But a more subtle kind of favoritism could take place, and we might never know.

Oh, and Facebook has just announced that it is going to start putting an "I'm a Voter" button--which it has rebranded as the "Voter Megaphone"--on the pages of all its users worldwide when their countries are holding national elections. Last month, about 4.1 million Indians clicked on the "I'm a Voter" button and 31 million people saw it, Facebook's Katie Harbath told techPresident. And a few weeks ago, Facebook put it on the pages of its European users, in tandem with the European Parliament elections. It is planning to do the same for Indonesia's elections on July 9, Sweden's on September 14, the Scottish independence referendum on September 18, Brazil's on October 5, and the U.S. midterms on November 4.

"I would still put this in the beta-testing phase," Harbath added. "We are trying to learn from each election how users are using the megaphone." So are we.
