
The Filter Bubble and the News You Need To Know

BY Nick Judd | Thursday, June 3, 2010

(UPDATE: For reference, see this Wired article — thanks @jakebrewer.)

At Personal Democracy Forum 2010 earlier this morning, Eli Pariser said that the increasing sophistication with which information is filtered for us to consume is reducing our ability to engage with differing points of view.

"You don't choose it," Pariser said of what he called the "filter bubble," made up collectively of the way in which aggregators like Google and social sources like Facebook filters out news from points of view divergent from our own.

It's a "just give people what they want" way of operating, Pariser said, and it's great for consumers.

But, he said, "it's bad for citizens."

In much the same way that the old gatekeepers to information chose what their consumers (the readers of their newspapers, for example) had the opportunity to digest in their daily reading, the new gatekeepers are making similar decisions, just in an automated way and with different reasoning. Rather than following a publisher's agenda, Google filters your news algorithmically. But it's still a filter. And you don't choose it.

"It doesn't tell us what we need to hear," Pariser said. "It doesn't expose us to points of view that [challenge] our thinking."

He added that it doesn't expose us to things that might disturb or upset us.

Here was the core of Pariser's argument:

"We really need to get away from that silly idea that code doesn't care about anything."

Programs are the product of programmers, and some of their biases and agendas can make it into the source code. I think there's no better way to explain why that's important than to quote an intro-level computer science textbook's explanation of a "semantic error."

From How to Think Like a Computer Scientist, a Python text by Allen Downey, Jeffrey Elkner and Chris Meyers:

"If there is a semantic error in your program, it will run successfully, in the sense that the computer will not generate any error messages, but it will not do the right thing. It will do something else. Specifically, it will do what you told it to do."

Programs do what you tell them, which is not always what you expect them to do. The Google app that you expect to find you the best and most informative news about a given subject is really finding you the best and most informative news about a given subject, as it understands "best" and "informative," and as it understands what is relevant to you.
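To make that concrete, here is a deliberately crude sketch of a personalized ranker. It is not Google's or Facebook's code, and the function names and topic labels are invented for illustration; the point is only that "relevant" means whatever the programmer told it to mean, in this case overlap with what the reader already clicks on.

```python
# A toy personalized news ranker, purely illustrative.
# "Best" here means "most like what you already read,"
# because that is what the code was told "best" means.

def score(article_topics, reader_history):
    """Count how many of the article's topics the reader has clicked on before."""
    return len(set(article_topics) & set(reader_history))

def rank(articles, reader_history):
    """Sort articles by that notion of relevance, highest first."""
    return sorted(articles,
                  key=lambda a: score(a["topics"], reader_history),
                  reverse=True)

reader_history = ["elections", "technology", "privacy"]
articles = [
    {"title": "New phone released",        "topics": ["technology"]},
    {"title": "Local school budget fight", "topics": ["education", "budget"]},
    {"title": "Voter data and privacy",    "topics": ["elections", "privacy"]},
]

for article in rank(articles, reader_history):
    print(article["title"])
```

Nothing in that sketch is malicious. The story most likely to surprise or challenge this particular reader, the school budget fight, simply sorts to the bottom, exactly as instructed.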

So if you want to create a search tool that finds the news you need to know, rather than the news you want to read, where do you begin?
