Help Us #FactcheckFacebook's Election Efforts Today
By Micah L. Sifry | Tuesday, November 4, 2014
Today is Election Day in the United States, and along with the many efforts by campaigns and advocacy groups to get out their voters, Facebook is taking a big step to push people to the polls. As I reported last week for Mother Jones, for the first time in six years, Facebook says it is rolling out its "voter megaphone"--a banner across the top of each user's page like the one shown above--to all of its users above the age of 18 in the United States. That's somewhere upwards of 150 million people, if all goes according to plan.
While other tech giants, notably Google and Microsoft, have put resources into making it easier for people to find their polling places and look up their elected officials, and into making that data more usable overall, Facebook is the only major platform actively leaning on its users to go vote. In 2010, internal company research on the voter megaphone showed that the "I'm a Voter" button, when combined with the images of close friends who also said they were voting, could increase turnout by a verified 0.4 percent. And that was a minimum estimate; researchers only counted the votes of people whose names precisely matched those on the voter rolls.
In 2012, Facebook was still experimenting with how tweaks to the user experience could affect political engagement and turnout. As my colleague Sarah Lai Stirland reported here at the time, that meant that not everyone saw the voter megaphone, raising questions that befuddled political activists. As Michael Buckley, a Facebook company spokesman, admitted to me last week, that year a variety of software bugs also kept the megaphone off the pages of potentially millions of users. But he insisted that there was no selective targeting of users by any demographic or political category.
"We've always implemented these tests in a neutral manner, and we've been learning from our experience and are 100% committed to even greater transparency when we encourage civic participation in the future," he said. "That's why in our 2014 efforts in the United States we're going out to as broad a group of users as possible and publicly explaining our efforts to members of the media."
The reason why Buckley says the company is pledging "even greater transparency" is simple. Many people have gotten much more sensitive to experiments on their Facebook news feed since last summer, when the news broke that researchers had subtly altered the emotional content of 700,000 users' feeds, finding that slight shifts in positive or negative language could affect the expressed mood of users. That led to worldwide outrage, even though the degree of "emotional contagion" measured by the company's researchers was truly tiny.
In the wake of that furor, it was understandable that Facebook responded with alacrity when I started asking it about a different test on users' news feeds, done in the months right before the 2012 election, that also affected users' expressed level of political engagement. That research, by data scientist Solomon Messing, involved 1.9 million Facebook users. According to his post-election survey of that group, which garnered more than 12,000 responses, people who received the treatment--more hard news elevated in their feeds--reported paying attention to government more and voting at a much higher rate. Messing's experiment was one of several that helped convince the company that elevating hard news over "listicles" and other linkbait-style content would help, not hurt, how much people engaged with Facebook. And thus, starting in 2013, the company began prioritizing news more; it's an open question how much that may have affected the national political mood in the United States since then.
We live in the age of data-driven media. More and more, the ads you see, the stories that are generated to attract your interest, the products that are placed in your way, the language that is used to get you to respond--all of this has been tested beforehand and is being continuously refined based on your responses. Like other marketers, political campaigns are now deeply into micro-targeting. And it isn't just that someone who watches a cooking show on cable will see different ads than someone who watches sports. Political technology vendors are openly bragging about their ability to deliver messages aimed at individual voters, based on matching their computer's cookies or IP address to the voter file. Until journalism catches up, we have no idea just how much any given politician may be micro-pandering (indeed, one of the top practitioners of data-driven campaigning, Blue State Digital founder Joe Rospars, recently criticized his fellow Democratic online campaigners for churning out thousands of disparate email messages with little regard to offering any overarching theme to unite them).
It's in that context that we have to look critically at Facebook's voter megaphone. There has never been a business with such a broad and intimate reach into the lives of Americans. Only Google, with its hold on how the majority of Americans search for information, comes close. So when Facebook says it is going to use its enormous platform to nudge more of its users to go vote, we shouldn't just applaud the company for being a good corporate citizen. We should also do what we can to keep the company honest.
So, I have a small favor to ask on this Election Day. If you are an adult Facebook user in the United States, go to your page sometime today and take a screenshot of what you see at the top when you first open it. Post that screenshot somewhere--on Twitter, on Facebook, or here in the comments thread. Use #factcheckfacebook as your hashtag, so we can keep track.