
The Filter Bubble and the News You Need To Know

BY Nick Judd | Thursday, June 3, 2010

(UPDATE: For reference, see this Wired article — thanks @jakebrewer.)

At Personal Democracy Forum 2010 earlier this morning, Eli Pariser said that the increasing sophistication with which information is filtered for us to consume is reducing our ability to engage with differing points of view.

"You don't choose it," Pariser said of what he called the "filter bubble," made up collectively of the way in which aggregators like Google and social sources like Facebook filters out news from points of view divergent from our own.

It's a "just give people what they want" way of operating, Pariser said, and it's great for consumers.

But, he said, "it's bad for citizens."

In much the same way that the old gatekeepers to information chose what their consumers — the readers of their newspapers, for example — had the opportunity to digest in their daily reading, the new gatekeepers are making similar decisions, just in an automated way and for different reasons. Rather than following the agenda of a publisher, Google filters your news algorithmically. But it's still a filter. And you don't choose it.

"It doesn't tell us what we need to hear," Pariser said. "It doesn't expose us to points of view that [challenge] our thinking."

He added that it doesn't expose us to things that might disturb or upset us.

Here was the core of Pariser's argument:

"We really need to get away from that silly idea that code doesn't care about anything."

Programs are the product of programmers, and some of their biases and agendas can make it into the source code. I think there's no better way to explain why that matters than to quote an intro-level computer science textbook's explanation of a "semantic error."

From How to Think Like a Computer Scientist, a Python text by Allen Downey, Jeffrey Elkner and Chris Meyers:

"If there is a semantic error in your program, it will run successfully, in the sense that the computer will not generate any error messages, but it will not do the right thing. It will do something else. Specifically, it will do what you told it to do."

Programs do what you tell them, which is not always what you expect them to do. The Google app that you expect to find you the best and most informative news about a given subject is really finding you the best and most informative news about a given subject, as it understands "best" and "informative," and as it understands what is relevant to you.
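
As an illustration of how a programmer's definition of "relevant" ends up in the code, here is a hypothetical scoring function in the same spirit; it is a sketch, not Google's actual ranking, and every name and data value in it is invented. The program does exactly what it is told, and what it is told is that "relevant" means "similar to what the user already clicks on."

# Hypothetical relevance scorer -- a sketch, not any real search engine's algorithm.
def relevance(story_topics, click_history):
    """Score a story by how many of its topics the user has clicked on before."""
    return len(set(story_topics) & set(click_history))

stories = {
    "City budget shortfall": ["local", "politics", "budget"],
    "Celebrity gossip roundup": ["celebrity", "entertainment"],
    "Opposing party's policy plan": ["politics", "opposing-view"],
}
history = ["celebrity", "entertainment", "sports"]

# The challenging political story sinks to the bottom and the gossip piece wins --
# not because the code malfunctioned, but because that is what "relevant" was defined to mean.
ranked = sorted(stories, key=lambda title: relevance(stories[title], history), reverse=True)
print(ranked)

Change that one definition and you get a different bubble; the point is that someone made the choice, and the reader didn't.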

So if you want to create a search tool that finds the news you need to know, rather than the news you want to read, where do you begin?
