The Filter Bubble and the News You Need To Know
By Nick Judd | Thursday, June 3, 2010
At Personal Democracy Forum 2010 earlier this morning, Eli Pariser said that the increasing sophistication with which information is filtered for us to consume is reducing our ability to engage with differing points of view.
"You don't choose it," Pariser said of what he called the "filter bubble" — his term for the way aggregators like Google and social sources like Facebook collectively filter out news from points of view divergent from our own.
It's a "just give people what they want" way of operating, Pariser said, and it's great for consumers.
But, he said, "it's bad for citizens."
Much the same way that the old gatekeepers to information chose what their consumers — the readers of their newspapers, for example — had the opportunity to digest in their daily reading, the new gatekeepers are making similar decisions, just in an automated way and with a different reasoning. Rather than the agenda of a publisher, Google filters your news algorithmically. But it's still a filter. And you don't choose it.
"It doesn't tell us what we need to hear," Pariser said. "It doesn't expose us to points of view that [challenge] our thinking."
He added that it doesn't expose us to things that might disturb or upset us.
Here was a core of Pariser's argument:
"We really need to get away from that silly idea that code doesn't care about anything."
Programs are the product of programmers, and some of their biases and agendas can make it into the source code. I think there's no better way to explain why that's important than to quote from an intro-level computer science textbook explanation of "semantic error."
From How to Think Like a Computer Scientist, a Python text by Allen Downey, Jeffrey Elkner and Chris Meyers:
"If there is a semantic error in your program, it will run successfully, in the sense that the computer will not generate any error messages, but it will not do the right thing. It will do something else. Specifically, it will do what you told it to do."
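A minimal illustration of what the textbook means (my own toy example, not one from the book): the function below runs without any error message, yet it does not compute what its author intended.

```python
def average(numbers):
    """Intended: return the arithmetic mean of a list of numbers."""
    total = 0
    for n in numbers:
        total += n
    # Semantic error: dividing by a hard-coded 2 instead of len(numbers).
    # Python raises no complaint -- the program faithfully does what it
    # was told to do, which is not what the author meant.
    return total / 2

print(average([2, 4, 6]))  # intended 4.0, actually prints 6.0
```

The computer sees nothing wrong here; only a human comparing intent against behavior can.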
Programs do what you tell them, which is not always what you expect them to do. The Google app that you expect to find you the best and most informative news about a given subject is really finding you the best and most informative news about a given subject, as it understands "best" and "informative," and as it understands what is relevant to you.
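To make that concrete, here is a deliberately crude sketch of a personalized ranker — my own invention for illustration, not Google's actual algorithm — that scores each story by how many words it shares with a user's stated interests. It does exactly what it was told, and in doing so quietly buries whatever the user hasn't already shown interest in.

```python
def rank_stories(stories, user_interests):
    """Order stories by overlap with the user's interests (toy model)."""
    def score(story):
        words = set(story.lower().split())
        return len(words & user_interests)
    # Stories matching existing interests float to the top; unfamiliar
    # topics sink, regardless of their civic importance.
    return sorted(stories, key=score, reverse=True)

interests = {"sports", "celebrity", "gadgets"}
stories = [
    "city council debates water policy",
    "new gadgets unveiled at sports event",
    "celebrity gadgets gossip roundup",
]
for story in rank_stories(stories, interests):
    print(story)
# The water-policy story -- arguably the news this citizen *needs* --
# lands at the bottom, because the code defines "relevant" as "familiar."
```

Nothing in this program is malicious; its notion of "best" is simply a design decision someone made.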
So if you want to create a search tool that finds the news you need to know, rather than the news you want to read, where do you begin?