
The Filter Bubble and the News You Need To Know

BY Nick Judd | Thursday, June 3 2010

(UPDATE: For reference, see this Wired article — thanks @jakebrewer.)

At Personal Democracy Forum 2010 earlier this morning, Eli Pariser said that the increasing sophistication with which information is filtered for us to consume is reducing our ability to engage with differing points of view.

"You don't choose it," Pariser said of what he called the "filter bubble," the collective result of the ways in which aggregators like Google and social sources like Facebook filter out news from points of view divergent from our own.

It's a "just give people what they want" way of operating, Pariser said, and it's great for consumers.

But, he said, "it's bad for citizens."

In much the same way that the old gatekeepers to information chose what their consumers — the readers of their newspapers, for example — had the opportunity to digest in their daily reading, the new gatekeepers are making similar decisions, just in an automated way and with different reasoning. Rather than following the agenda of a publisher, Google filters your news algorithmically. But it's still a filter. And you don't choose it.
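The dynamic Pariser describes can be sketched in a few lines of Python. This is a toy illustration, not any real aggregator's code; the `personalize` function, the sample stories, and the click history are all invented for the example. It ranks stories by how many words they share with headlines the reader has clicked before, so the unfamiliar point of view quietly drops off the page:

```python
def personalize(stories, click_history, top_n=2):
    """Return the top_n stories most similar to past clicks.

    A deliberately crude 'relevance' score: count the words a story
    shares with headlines the reader clicked before.
    """
    past_words = set()
    for headline in click_history:
        past_words.update(headline.lower().split())

    def score(story):
        return len(set(story.lower().split()) & past_words)

    return sorted(stories, key=score, reverse=True)[:top_n]


stories = [
    "city council raises parking fees",
    "city council debates parking fees again",
    "opposition party proposes rival parking plan",
]
history = ["council raises fees", "city council parking vote"]

# The opposition story scores lowest and is filtered out of the top 2.
print(personalize(stories, history))
```

Nothing in the scoring is malicious; "relevant to you" is simply defined as "similar to what you already read," and the divergent story never makes the cut.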

"It doesn't tell us what we need to hear," Pariser said. "It doesn't expose us to points of view that [challenge] our thinking."

He added that it doesn't expose us to things that might disturb or upset us.

Here was the core of Pariser's argument:

"We really need to get away from that silly idea that code doesn't care about anything."

Programs are the product of programmers, and some of their biases and agendas can make it into the source code. I think there's no better way to explain why that's important than to quote from an intro-level computer science textbook explanation of "semantic error."

From How to Think Like a Computer Scientist, a Python text by Allen Downey, Jeffrey Elkner and Chris Meyers:

"If there is a semantic error in your program, it will run successfully, in the sense that the computer will not generate any error messages, but it will not do the right thing. It will do something else. Specifically, it will do what you told it to do."

Programs do what you tell them, which is not always what you expect them to do. The Google app that you expect to find you the best and most informative news about a given subject is really finding you the best and most informative news about a given subject, as it understands "best" and "informative," and as it understands what is relevant to you.
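The textbook's point is easy to demonstrate. In this hedged example (invented for illustration, not from the book), the programmer intends `average` to compute the mean of a list of scores, but the loop bound is off by one. Python raises no error; the program runs "successfully" and does exactly what it was told:

```python
def average(scores):
    """Intended: the mean of scores. Actual: something else."""
    total = 0
    # Bug: range(len(scores) - 1) silently skips the last score.
    for i in range(len(scores) - 1):
        total += scores[i]
    return total / len(scores)


# Runs cleanly and prints roughly 56.67, not the intended 90.0.
print(average([80, 90, 100]))
```

That is a semantic error in miniature: no crash, no warning, just a confident wrong answer. Scale the same phenomenon up to a ranking algorithm and the "wrong answer" becomes a subtly skewed picture of the news.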

So if you want to create a search tool that finds the news you need to know, rather than the news you want to read, where do you begin?
