News

Search Engine Agendas

Is Google trying to trick you on the way to the polls?
Figure. 2016 presidential candidates. Research has shown the order in which the results of search engine queries are presented can affect how users vote.

In the novel 1984, George Orwell imagines a society in which powerful but hidden forces subtly shape people’s perceptions of the truth. By changing words, the emphases put on them, and their presentation, the state is able to alter citizens’ beliefs and behaviors in ways of which they are unaware.

Now imagine today’s Internet search engines did just that kind of thing—that subtle biases in search engine results, introduced deliberately or accidentally, could tip elections unfairly toward one candidate or another, all without the knowledge of voters.

That may seem an unlikely scenario, but recent research suggests it is quite possible. Robert Epstein and Ronald E. Robertson, researchers at the American Institute for Behavioral Research and Technology, conducted experiments showing the order of results from politically oriented search queries can affect how users vote, especially undecided voters, and that biased rankings of search results usually go undetected. The experiments suggest the outcomes of close elections could be decided by deliberate tweaking of search algorithms by search engine companies, and that such manipulation would be extremely difficult to detect.

Writing in Proceedings of the National Academy of Sciences, Epstein and Robertson conclude, “Given that search engine companies are currently unregulated, our results … [suggest] that such companies could affect—and perhaps are already affecting—the outcomes of close elections worldwide … Unregulated election-related search engine rankings could pose a significant threat to the democratic system of government.” Epstein says his concerns center on Google because of its dominant position, with two-thirds of the search engine market in the U.S. and 90% in Europe.


A spokeswoman for Google dismissed the notion that the company might attempt to influence elections as a “conspiracy theory.” She cited a statement by Google senior vice president of Search Amit Singhal that “Google has never ever re-ranked search results on any topic (including elections) to manipulate user sentiment. Moreover, we do not make any ranking tweaks that are specific to elections or political candidates. From the beginning, our approach to search has been to provide the most relevant answers and results to our users, and it would undermine people’s trust in our results, and our company, if we were to change course.”


The Experiments

Epstein and Robertson conducted five double-blind experiments to determine whether biased search engine rankings could actually sway elections. In each of the first three experiments, 102 people recruited from the public in San Diego were given brief biographies of both candidates in the 2010 Australian election for prime minister, then asked to state their preferences based on the biographies. The subjects were then shown alternate sets of search engine results bearing on the election, with links to real websites they were encouraged to explore. Some rankings placed one candidate near the top of the results, some ranked the other candidate higher, and some were balanced between the two. The subjects, who were unfamiliar with the Australian election, were then asked how they would vote based on all the information at hand. A statistical analysis showed the subjects came to view the candidates whose search results ranked higher on the page more favorably, and were more likely to vote for them as a result.
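
To make the effect concrete, here is a minimal Python sketch of the kind of analysis such an experiment calls for: comparing the share of subjects preferring the ranking-favored candidate before and after the search session. The group sizes and vote counts below are invented for illustration; they are not the study’s data, and this is not the authors’ analysis code.

```python
# Illustrative sketch (not the study's analysis code): estimate how much
# pre- to post-search preferences shift toward the candidate favored by
# the ranking. All vote counts below are invented.

def preference_shift(pre_votes, post_votes, favored):
    """Fraction preferring `favored` after the search session minus the
    fraction preferring that candidate before it."""
    pre = sum(1 for v in pre_votes if v == favored) / len(pre_votes)
    post = sum(1 for v in post_votes if v == favored) / len(post_votes)
    return post - pre

# Hypothetical groups: rankings biased toward A, and biased toward B.
groups = {
    "bias_A": {"favored": "A",
               "pre":  ["A"] * 48 + ["B"] * 54,
               "post": ["A"] * 67 + ["B"] * 35},
    "bias_B": {"favored": "B",
               "pre":  ["A"] * 50 + ["B"] * 52,
               "post": ["A"] * 33 + ["B"] * 69},
}

for name, g in groups.items():
    shift = preference_shift(g["pre"], g["post"], g["favored"])
    print(f"{name}: {shift:+.1%} shift toward the favored candidate")
```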

In another experiment, Epstein and Robertson recruited 2,150 demographically diverse subjects during the 2014 Lok Sabha elections in India, in which 430 million votes were cast. They found those voters were similarly subject to unconscious manipulation by search engine results. In particular, the larger sample size revealed that subjects who reported low familiarity with the candidates were more likely to be influenced by manipulated rankings, suggesting manipulation attempts might be directed at such voters.

Depending on how the experiments were structured, between zero and 25% of the subjects said afterward they had detected bias in the search engine rankings. However, in a counterintuitive result, those subjects who reported seeing bias were nevertheless more likely to be influenced by the manipulation; they apparently felt there must be a good reason for the bias, and so it tended to validate their choice of candidates.

The researchers found search engine rankings could shift voter preferences by 20% to 80% depending on demographics such as party affiliation and income level, suggesting manipulation could be targeted at certain groups. “This is incredibly important from a practical perspective, especially when we’re talking about companies that maintain massive profiles about people,” Epstein says.

Epstein says the shift stems from the widespread belief that high-ranking search results are somehow better or more “correct” than lower-ranking ones. This view is constantly reinforced as people run queries—such as “What is the capital of Uganda?”—for which the correct answer invariably appears at the top of the results page.


What to Do?

Epstein admits there is no evidence that any search engine company has ever tried to manipulate election-related search rankings, but he says the results of his experiments are cause for concern because they show how easily that could be done, either at the direction of the management of a search company, or by a “rogue employee” with hands-on access to complex search algorithms. Even absent deliberate manipulation, search engine rankings can become self-reinforcing through the “digital bandwagon effect,” in which users see top-ranked candidates as somehow better and more worthy of their respect.

One solution to the problem could be an “equal-time rule” (in the U.S., the equal-time rule mandates radio and television broadcast stations airing content by a political candidate must provide an equivalent opportunity to any opposing political candidates who request it) that requires search companies to mix the results of searches about election-related matters so no candidate has any rank advantage, Epstein says. “Either search engine companies are going to have to do this voluntarily, or they will see standards set by an industry association or a non-profit or by government,” he says, “because if we don’t start moving in that direction, the free and fair election will, for all intents and purposes, be meaningless.”
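
As a rough illustration of what such a rule could mean in practice, the sketch below round-robins per-candidate result lists so neither candidate enjoys a systematic rank advantage, and randomizes which candidate leads. This is one hypothetical mixing scheme, not a method proposed by Epstein or used by any search engine.

```python
# Hypothetical "equal-time" mixing rule: interleave the top results about
# each candidate so neither gets a systematic rank advantage. Purely
# illustrative; no search engine is known to do this.
import random
from itertools import chain, zip_longest

def equal_time_mix(results_by_candidate, rng=random):
    """Round-robin the per-candidate result lists into one ranking,
    shuffling which candidate leads so no one is first on average."""
    lists = list(results_by_candidate.values())
    rng.shuffle(lists)                # randomize the leading slot
    rounds = zip_longest(*lists)      # one result per candidate per round
    return [r for r in chain.from_iterable(rounds) if r is not None]

mixed = equal_time_mix({
    "Candidate A": ["a1.example.org", "a2.example.org", "a3.example.org"],
    "Candidate B": ["b1.example.org", "b2.example.org"],
})
print(mixed)  # e.g. ['a1...', 'b1...', 'a2...', 'b2...', 'a3...']
```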

Another possibility might be to post warnings at the top of political search results—similar to those that now flag advertisements—telling users the order in which results are shown may reflect bias in favor of the candidate(s) ranked near the top. Epstein acknowledges search companies are unlikely to voluntarily adopt an equal-time rule or the warnings, but he says either or both could be built into the browser, acting automatically when search results contain the names of political candidates.
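
A minimal sketch of the client-side check Epstein describes might look like the following. The watch list, result titles, and `annotate_results` helper are all hypothetical; a real browser feature would need a maintained list of candidates for each election.

```python
# Hypothetical client-side check: flag a results page that mentions
# watched political candidates. Names and titles are invented.

CANDIDATES = {"candidate a", "candidate b"}  # hypothetical watch list

WARNING = ("Note: the order of these results may reflect bias in favor "
           "of the candidate(s) ranked near the top.")

def annotate_results(result_titles):
    """Prepend a warning if any result title mentions a watched name."""
    page_text = " ".join(result_titles).lower()
    if any(name in page_text for name in CANDIDATES):
        return [WARNING] + result_titles
    return result_titles

print(annotate_results(["Candidate A on the economy", "Local weather"]))
```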


The idea that government might play a role in regulating search engines is not new, and it is strongly opposed by search companies and by many First Amendment watchdogs. Frank Pasquale, now a professor of law at the University of Maryland, co-wrote a 2008 paper recommending the establishment of a Federal Search Commission. He argues free speech concerns do not apply to search because search engines act more like common carriers, which are subject to regulation, than like media outlets, which enjoy First Amendment protections.

As for why we need federal regulation, Pasquale says, “When a search engine specifically decides to intervene, for whatever reason, to enhance or reduce the visibility of a specific website or a group of websites … [it] imposes its own preferences or the preferences of those who are powerful enough to induce it to act.”


Beyond Elections

Concerns about algorithms that search, select, and present information extend beyond search companies. “I’ve looked at the black box algorithms behind Google, Facebook, Twitter, and the others,” Pasquale says, “and I’m pretty troubled by the fact that it’s so hard to understand the agenda that might be behind them.” He supports the idea of a “trusted advisory committee” of technical, legal, and business experts to advise the Federal Trade Commission on the fairness of the algorithms of those companies.

Not only are the algorithms behind these major services complex and secret, but users often do not know that any selection logic or personalization occurs at all, says Karrie Karahalios, a professor of computer science at the University of Illinois and co-director of the Center for People and Infrastructures. In a study involving 40 of her students, more than half were “surprised and angered” to learn there was a “curation algorithm” behind the Facebook News Feed. She says such “invisible algorithms,” built in the interest of efficiency, can mislead people by acting as secret “gatekeepers” to information.

Karahalios recommends browsers offer graphical cues to users to show how the algorithms work, so users know why they are seeing certain results. For example, when an item ranks high in search results because it has many links to other things, she suggests that might be signaled with a larger type font. She also says users should have some control over how the algorithms work. “I think it is important to have some levers that users can poke and prod to see changes in the algorithmic system,” she says.
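
One way to read that suggestion is as a simple mapping from a result’s inbound-link count to its displayed type size. The sketch below assumes link counts are available to the browser, which real engines do not publicly expose; the numbers and scaling rule are illustrative only.

```python
# Sketch of Karahalios's cue: scale a result's type size with how heavily
# linked it is, so users can see why it ranks high. Link counts here are
# invented; engines do not expose this metadata publicly.

def font_size_px(link_count, min_px=12, max_px=24, cap=1000):
    """Map an inbound-link count onto a bounded font size, linearly."""
    scale = min(link_count, cap) / cap
    return round(min_px + scale * (max_px - min_px))

for title, links in [("Candidate homepage", 950), ("Local news story", 120)]:
    print(f'<span style="font-size:{font_size_px(links)}px">{title}</span>')
```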

In 2014, Karahalios and several colleagues presented five ideas by which algorithms, even secret ones, might be “audited” for bias by outside parties. In one, the Sock Puppet Audit, computer programs would impersonate actual users, generating test data and analyzing the results. Similarly, the testing and evaluation of algorithms could be crowd-sourced, by some mechanism such as Amazon’s Mechanical Turk.
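
In code, a Sock Puppet Audit reduces to issuing the same query under different synthetic profiles and quantifying how much the returned rankings diverge. The sketch below stubs out the query step with canned data (`fetch_results` is a stand-in; a real audit would drive instrumented browsers with profile-shaped histories and cookies) and uses a simple mean rank-displacement measure of my own choosing, not one from the paper.

```python
# Sketch of a "Sock Puppet Audit": run the same query under different
# synthetic user profiles and measure how much the rankings diverge.
# fetch_results is a stand-in stub returning canned, invented data.

def fetch_results(query, profile):
    """Stub; replace with real instrumented queries per profile."""
    canned = {
        "profile_1": ["siteA", "siteB", "siteC", "siteD"],
        "profile_2": ["siteC", "siteA", "siteD", "siteB"],
    }
    return canned[profile]

def rank_divergence(ranking_a, ranking_b):
    """Mean absolute difference in rank position for shared results."""
    pos_b = {item: i for i, item in enumerate(ranking_b)}
    shared = [item for item in ranking_a if item in pos_b]
    if not shared:
        return float("inf")
    return sum(abs(i - pos_b[item])
               for i, item in enumerate(ranking_a)
               if item in pos_b) / len(shared)

a = fetch_results("candidate X policies", "profile_1")
b = fetch_results("candidate X policies", "profile_2")
print(f"divergence: {rank_divergence(a, b):.2f} positions")
```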

The advocates of audits agree these ideas present technical and legal difficulties, but they say some kind of external checking on the fairness of these ubiquitous services is needed. “Putting warnings on search results is not enough,” Karahalios says.

Luciano Floridi, a professor of philosophy and ethics of information at the University of Oxford, says the power and secrecy of Google are worrisome in part because of the company’s near-monopoly position. “Nothing wrong has happened so far, but that’s not a strategy,” he says; “that’s like keeping your fingers crossed.” He says recent revelations that Volkswagen AG manipulated engine software to fool regulators and consumers are not reassuring.

Floridi says the risk of mischief is compounded because Google’s users are not customers in the retail commercial sense. “They are not accountable because users are not paying for searches,” he says. “We don’t have customers’ rights with Google.”

Floridi advises Google on “the right to be forgotten” regulations by the European Union. He says he finds his contacts at the company to be “open-minded and sensible” about ideas for regulating search. “If it makes good sense socially speaking, and if it makes good sense business-wise, then there is a conversation on the table,” he says.


Further Reading

Bracha, O., and Pasquale, F.,
Federal Search Commission? Access, fairness, and accountability in the law of search. Cornell Law Review, September 2008 http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1002453

Epstein, R., and Robertson, R.E.,
The search engine manipulation effect and its possible impact on the outcomes of elections. Proceedings of the National Academy of Sciences, Aug. 18, 2015 http://www.pnas.org/content/112/33/E4512.abstract

Pasquale, F.,
The black box society: the secret algorithms that control money and information. Harvard University Press, 2015 http://www.hup.harvard.edu/catalog.php?isbn=9780674368279

Sandvig, C., Hamilton, K., Karahalios, K., and Langbort, C.,
Auditing algorithms: research methods for detecting discrimination on Internet platforms. 64th Annual Meeting of the International Communication Association, May 22, 2014 http://acawiki.org/Auditing_Algorithms:_Research_Methods_for_Detecting_Discrimination_on_Internet_Platforms

Zittrain, J.,
Engineering an election – digital gerrymandering poses a threat to democracy. Harvard Law Review Forum, June 20, 2014 http://harvardlawreview.org/2014/06/engineering-an-election/

Videos – How Google Works: https://www.youtube.com/watch?v=Md7K90FfJhg and https://www.youtube.com/watch?v=3tNpYpcU5s4
