This weekend, I met Gabriel Weinberg, an MIT grad living in the Philadelphia suburbs who created a search engine called DuckDuckGo. It’s a simple website that weeds out spam search results and, unlike Google, promises not to save your search requests. Time magazine’s Techland blog featured the site just two weeks ago. What makes the search engine especially intriguing now is that its promoted features dovetail perfectly with a new book warning about the perils of sites like Google filtering the information you receive based on data they have collected about you. The Filter Bubble: What the Internet Is Hiding from You, by Eli Pariser, former executive director of MoveOn.org, went on sale in May and is stirring a growing debate about how dominant Internet companies such as Google and Facebook limit your exposure to a diversity of ideas for the sake of selling ads that target you.

Disclosure: I have not read the book. But I got a taste of the debate as I began reading about DuckDuckGo, and I want to share what I’ve found so far along with some initial reactions. First, the concept of a filter bubble fits neatly with the ongoing phenomenon of people choosing to follow news sources aligned with their worldview. But rather than being ideologically driven, this filtering is automated, driven solely by companies trying to sell you stuff. What’s the problem with that? We get what we want, or at least what a computer thinks we want, based on our own online behavior. In an interview with The Independent newspaper in the UK, Pariser explained the problem:

The technology that was used to target ads is now being used to target content. It’s one thing being shown products you might be interested in, but when that’s shaping what information you see, it has a much broader effect. My main concerns are that it gives individuals a distorted view of the world because they don’t know this filtering is happening or on what basis it’s happening, and therefore they don’t know what they’re not seeing. It’s a problem, more broadly, for democracy because it’s like auto-propaganda. It indoctrinates us with our own beliefs and makes it less likely that we’ll run into opposing viewpoints or ideas.

Pariser calls for more transparency in how we arrive at information. One of his solutions, according to the UK’s Telegraph newspaper, is the return of a human element: ethics. But the Telegraph reporter raised concerns about that approach.

…though his plea for transparency is commendable, is there not something deeply troubling about the notion of ethical algorithms? Whose duty would it be [to] embed civic responsibility in these codes? And exactly whose idea of civic responsibility would be imposed? It shrieks ‘thought police’ to me.

Is this all much ado about nothing? Before the Internet, we had newspapers, TV and radio telling us what they thought we should know. They were filtering. Before that, it was whoever controlled information: governments, kings, churches, tribal chiefs. Now it’s corporations trying to sell us TVs and teeth whitener.

It’s good for people to be aware of how the things they use every day – Google and Facebook – actually work, and Eli Pariser’s book is a welcome addition to the mix. Ultimately, though, it is up to people to make informed choices. And people have a long tradition of being willfully bamboozled. That’s called human nature.
