
The Filter Bubble: Is It Real? Does It Matter?

A representation of the "filter bubble."
When search engines yield different results based on who is doing the searching, we face the algorithmically constructed isolation known as the filter bubble.

When MoveOn.org founder Eli Pariser searched the Web, he noticed something strange: his first-page results differed from those his friends got. Worried that he was only seeing part of the picture, he examined how Google (and other search engines) tracked his online activities to select certain items to promote, reinforcing his established patterns. His 2011 book The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think gave a name to the algorithmically constructed isolation he feared ("filter bubble"), and launched a debate about both its importance and its effects.

Casual experiments have shown that identical queries sometimes do deliver differing results; others have found the opposite. Google itself has remained silent, eschewing the term "filter bubble" in all communications (and declining to provide official comment for this article). Most of those experiments and discussions, however, have been light on scientific rigor, as was Pariser’s book.

Now, with the release of a report on the subject by Facebook, there is increased interest in measuring both the bubble and its consequences.

Not Just Search

Pariser pointed to a specific date for the filter bubble’s appearance on Google: December 4, 2009, the day Google announced "Personalized Search for everyone," which used cookies to track individuals’ behavior across the Web. He wrote that filter bubbles may also occur anywhere algorithms demote or hide information: on social networks, in movie recommendations, or in advertising. Facebook, for example, employs such an algorithm to promote items on the first page every logged-in user sees: the "News Feed."
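
A toy example can make the mechanism concrete. The sketch below is purely hypothetical (Google has never published its ranking code); the personalize function, the weight parameter, and the topic and relevance values are all invented for illustration. It shows how a profile of topics inferred from tracked behavior could reorder otherwise identical results, so that two users issuing the same query see different top hits.

```python
# Hypothetical illustration of personalized re-ranking -- NOT Google's
# actual algorithm. A stored interest profile, built from tracked
# behavior, nudges the ordering of otherwise identical results.

def personalize(results, profile, weight=0.5):
    """Re-rank results by blending base relevance with profile affinity."""
    def score(r):
        # Affinity: fraction of the result's topics the user engaged with before.
        affinity = len(r["topics"] & profile) / len(r["topics"])
        return r["relevance"] + weight * affinity
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Climate policy analysis", "topics": {"politics"}, "relevance": 0.9},
    {"title": "Football highlights",     "topics": {"sports"},   "relevance": 0.8},
]

# Same query, same candidate results -- different histories reorder them.
print(personalize(results, {"sports"})[0]["title"])    # Football highlights
print(personalize(results, {"politics"})[0]["title"])  # Climate policy analysis
```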

According to a recent study by Facebook research scientist Eytan Bakshy and colleagues, other factors matter more than a search engine’s algorithms. "People are exposed to content that’s commensurate to their friends’ diversity," said Bakshy. "If there’s any kind of filtering going on, this is simply because of users’ limited time and attention to read information." Further, a previous study on which Bakshy worked observed that "people primarily get information from weak ties, which tend to be more dissimilar." Pariser called the study "good science" and pointed out that the filter bubble is still present on Facebook (although it is smaller than he expected).

Another study looked at how an algorithmic recommendation system affected movie choice. "Exploring the Filter Bubble: The Effect of Using Recommender Systems on Content Diversity" mined 17 years of data from the University of Minnesota’s MovieLens movie recommendation system and found the system did indeed lessen the diversity of what a user would see in its "Top Picks for You" section. However, the study also compared users who followed those recommendations with users who did not, and surprisingly found that "followers" went on to watch a more diverse range of movies. As lead author Tien T. Nguyen posited, "We don’t want to take a risk to explore a new movie, because it’ll cost us two or three hours. With a recommender system, we have a new data point suggesting how much we like or hate this new movie. So we’re likely to say, ‘That’s a new item I didn’t know about, but the system is confident that I may like it, so I’ll take a risk and consume this movie.’"
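
The "content diversity" such studies measure can be approximated as the average pairwise dissimilarity of the items in a recommendation list. The sketch below is a simplified stand-in, not the paper’s actual metric; the Jaccard distance over invented genre sets merely illustrates the idea.

```python
from itertools import combinations

def jaccard_distance(a, b):
    """1.0 = no shared genres; 0.0 = identical genre sets."""
    return 1 - len(a & b) / len(a | b)

def intra_list_diversity(genre_sets):
    """Average pairwise dissimilarity across a recommendation list."""
    pairs = list(combinations(genre_sets, 2))
    return sum(jaccard_distance(a, b) for a, b in pairs) / len(pairs)

# A homogeneous "Top Picks" list scores low; a varied one scores high.
narrow = [{"action", "thriller"}, {"action", "thriller"}, {"action", "sci-fi"}]
broad  = [{"action"}, {"documentary"}, {"romance", "comedy"}]
print(f"narrow: {intra_list_diversity(narrow):.2f}")  # ~0.44
print(f"broad:  {intra_list_diversity(broad):.2f}")   # 1.00
```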

Magic Keywords, Hidden Effects

Regardless of whether such personalization isolates users, some bemoan its secretive and uncontrolled nature. One such critic is Gabriel Weinberg, who founded the search engine DuckDuckGo (https://duckduckgo.com/) out of privacy concerns, then later promoted its proud lack of personalization through the single-page websites dontbubble.us and PrivateBrowsingMyths.com. Weinberg found that Google uses one’s previous searches as "magic keywords" to bias newer searches, even when browsing "privately" in so-called incognito mode. "People think, ‘Well, I’m in incognito mode, therefore I’m getting an anonymous result,’" he said. "That is completely untrue. All that incognito mode does is delete your cookie information and search history and other information after you leave your browser."

A desire to educate inspired another recent paper, the April 2015 "Detecting and Visualizing Filter Bubbles in Google and Bing." Lead author Tawanna R. Dillahunt of the University of Michigan says she was driven by what she saw as real consequences of the filter bubble. "A sociologist and I asked people who may have financial constraints: what do you think is needed to get ahead, in terms of social mobility?," she said. "The leading answer, by far, was ‘education.’ But when we talked about technology to access education, no one brought up massive open online courses (MOOCs) or other free online education tools. Now, when I search for ‘free online courses’, [the MOOC] Coursera shows up. But I already use it! If you’re accessing from a public library, depending on the location and clients of that library, you might not get that search result."

On the other hand, you might receive that same search result. If there is any unifying theme in the research, it is that you cannot guess how search engines’ personalization algorithms will direct an individual search, whether that guidance is wanted, or whether it will ultimately lead to better decisions. As the mechanisms are generally hidden from the user, it is difficult for individuals to decide how much personalization they want, or how to control it. That may soon change; Dillahunt and her colleagues are developing a web browser extension that will show how much one is being "bubbled" in search results.
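
One plausible way a tool like that could quantify "bubbling" is to compare the results a user receives with a non-personalized baseline, such as the same query issued from a clean session. The measure below is a hypothetical illustration, not a description of Dillahunt’s extension:

```python
def bubble_score(personalized, baseline):
    """Jaccard distance between result sets: 0.0 = identical, 1.0 = disjoint."""
    p, b = set(personalized), set(baseline)
    return 1 - len(p & b) / len(p | b)

# Invented result lists for the same query from two different sessions.
personalized = ["coursera.org", "myfavoriteblog.example", "news.example"]
baseline     = ["coursera.org", "edx.org", "khanacademy.org"]
print(f"bubble score: {bubble_score(personalized, baseline):.2f}")  # 0.80
```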

If browsers ultimately get a dial that allows us to control how "bubbled" we are, Nguyen believes people will vary tremendously in how they twist it. "Some say that recommendations create a filter bubble, and that we need to diversify the top choices. But what if some users say they don’t like that? Maybe we can measure whether the individual user is more open to new things, in which case we’d diversify their recommendations more."
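
Such a dial could plausibly be built on standard diversification techniques. The sketch below uses a maximal-marginal-relevance-style re-ranker in which a hypothetical per-user openness parameter trades relevance against novelty; it illustrates the idea, and is not a method from Nguyen’s work.

```python
def rerank(candidates, openness, k=2):
    """Pick k items; openness in [0, 1] trades relevance against novelty."""
    chosen, pool = [], dict(candidates)
    while pool and len(chosen) < k:
        def mmr(title):
            relevance, genres = pool[title]
            if not chosen:
                return relevance
            # Penalize overlap with the most similar already-chosen item.
            overlap = max(len(genres & candidates[c][1]) / len(genres | candidates[c][1])
                          for c in chosen)
            return (1 - openness) * relevance - openness * overlap
        best = max(pool, key=mmr)
        chosen.append(best)
        del pool[best]
    return chosen

# Invented catalog: (relevance, genre set) per title.
movies = {
    "Action Movie 1": (0.95, {"action"}),
    "Action Movie 2": (0.90, {"action"}),
    "Documentary":    (0.60, {"documentary"}),
}
print(rerank(movies, openness=0.1))  # ['Action Movie 1', 'Action Movie 2']
print(rerank(movies, openness=0.9))  # ['Action Movie 1', 'Documentary']
```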

Tom Geller is an Oberlin, OH-based technology and business writer.

 
