Viewpoint

Reason-Checking Fake News

Using argument technology to strengthen critical literacy skills for assessing media reports.

While deliberate misinformation and deception are by no means new societal phenomena, the recent rise of fake news5 and information silos2 has become a growing international concern, with politicians, governments, and media organizations regularly lamenting the issue. A remedy, we argue, lies in using technology to strengthen people's ability to critically assess the quality of information, reasoning, and argumentation. Recent empirical findings suggest "false news spreads more than the truth because humans, not robots, are more likely to spread it."10 Thus, rather than continuing to focus on limiting the efficacy of bots, educating human users to better recognize fake news stories could prove more effective in mitigating the potentially devastating social impact of misinformation. While technology certainly contributes to the distribution of fake news and similar attacks on reasonable decision-making and debate, we posit that technology, and argument technology in particular, can equally be employed to counter deliberately misleading or outright false reports made to look like genuine news.


From Fact-Checking to Reason-Checking

The ability to properly assess the quality of premises and reasoning in persuasive or explanatory texts—critical literacy—is a powerful tool in combating the problem posed by fake news. According to a 2017 Knight-Gallup survey, one in five U.S. adults feels “not too confident” or “not confident at all” in distinguishing fact from opinion in news reporting.a Similarly, in the U.K., the National Literacy Trust recently reported that one in five British children cannot properly distinguish between reliable online news sources and fake news, concluding that strengthening critical literacy skills would help in identifying fake news.b

Efforts to combat the effects of fake news too often focus exclusively on the factual correctness of the information provided. To counter factually incorrect, incomplete, or biased news, a whole industry of fact-checkers has developed. While the truth of the information that forms the basis of a news article is clearly of crucial importance, there is another, often overlooked, aspect to fake news. Successfully recognizing fake news depends not only on understanding whether factual statements are true, but also on interpreting and critically assessing the reasoning and arguments provided in support of conclusions. It is, after all, entirely possible to produce fake news by starting from true factual statements and drawing false conclusions through skewed, biased, or otherwise defective reasoning. We therefore argue that fact-checking should be supplemented with reason-checking: evaluating whether the complete argumentative reasoning is acceptable, relevant, and sufficient.3
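
As a minimal illustration of what reason-checking adds on top of fact-checking, consider the following Python sketch; the data model and names are ours, purely for exposition, and are not part of any of the tools described below.

```python
from dataclasses import dataclass, field

# Illustrative data model: a claim, its supporting reasons, and a
# judgment on each of the three classic evaluation criteria from
# informal logic: acceptability, relevance, and sufficiency.
@dataclass
class Argument:
    claim: str
    reasons: list[str]
    verdicts: dict[str, bool] = field(default_factory=dict)

    def fact_check(self) -> bool:
        """Fact-checking asks only whether the premises are acceptable."""
        return self.verdicts.get("acceptable", False)

    def reason_check(self) -> bool:
        """Reason-checking also asks whether the reasons are relevant
        to the claim and jointly sufficient to support it."""
        return all(self.verdicts.get(c, False)
                   for c in ("acceptable", "relevant", "sufficient"))

# A true premise with a defective (post hoc) inference: the article
# passes a fact-check yet fails a reason-check.
arg = Argument(
    claim="The new policy caused the rise in unemployment.",
    reasons=["Unemployment rose after the policy was introduced."],
    verdicts={"acceptable": True, "relevant": True, "sufficient": False})
print(arg.fact_check(), arg.reason_check())  # True False
```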


Argument Technology for Critical Literacy

Seven years ago, we introduced the Argument Web in Communications: an integrated platform of resources and software for visualizing, analyzing, evaluating, or otherwise engaging with reasoned argument and debate.1 Since then, argument technology has matured into an established research field, attracting widespread academic and industrial interest. Recently, for example, IBM presented the results of its ‘grand challenge’ on argument technology in a live debate between the Project Debater system and two human debating champions.c

Commissioned by the BBC, we have developed a suite of argument technologies built on the infrastructure of the Argument Web. The resulting software tools aim to provide insight into argumentative debate and to instill the critical literacy skills needed to appraise reasoned persuasive and explanatory communication. In addition to identifying reasoning patterns and fallacies, our software addresses the problem posed by echo chambers, in which people are less exposed to opinions diverging from their own while already held views are reinforced. Several cognitive biases feed this dynamic, such as confirmation bias,6 which discourages the consideration of alternative positions in a dispute, and the backfire effect,8 which leads to further entrenchment of viewpoints when people are presented with conflicting facts.

The Polemicist applicationd addresses this looming one-sidedness of argumentative positions. The application lets the user take on the role of moderator in a virtual radio debate: selecting topics, controlling the flow of the dialogue, and thus exploring issues from various angles. The textual data is drawn from the Argument Web database of analyzed episodes of BBC Radio 4's Moral Maze.e On this weekly radio program, recurring panelists and invited subject experts debate a morally divisive current affairs topic. The ensuing debate is often lively, combative, and provocative, producing a wealth of intricate argumentative content. Polemicist retrieves responses given by the actual Moral Maze participants from the Argument Web database and assigns them to software agents modeled on those participants. Playing the role of moderator lets the user rearrange the arguments and create wholly novel virtual discussions between participants who did not directly engage each other in the original debate, while still reflecting their stated opinions.
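
The recombination idea behind Polemicist can be sketched in a few lines of Python; the data here is a toy stand-in for the Argument Web database, with invented speakers and remarks, not the application's actual implementation.

```python
import random

# Toy stand-in for the Argument Web database: each panelist's recorded
# contributions, indexed by debate topic (speakers and remarks invented).
contributions = {
    "Panelist A": {"welfare": ["Benefit caps punish the poorest.",
                               "Dignity requires a safety net."]},
    "Panelist B": {"welfare": ["Dependency is the real cruelty.",
                               "Work must always pay more than welfare."]},
}

def virtual_exchange(topic, speakers, turns=4, seed=0):
    """Recombine recorded contributions into a novel virtual discussion:
    speakers alternate, each drawing only from their own remarks on the
    chosen topic, so their stated opinions are preserved."""
    rng = random.Random(seed)
    for i in range(turns):
        speaker = speakers[i % len(speakers)]
        pool = contributions.get(speaker, {}).get(topic, [])
        if pool:
            yield speaker, rng.choice(pool)

for who, remark in virtual_exchange("welfare", ["Panelist A", "Panelist B"]):
    print(f"{who}: {remark}")
```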

Test Your Argumentf aims both to foster critical literacy skills and to prompt users to consider alternative viewpoints. The software challenges users with a series of argumentation puzzles designed to develop an understanding of the core principles of strengthening and critiquing arguments. The examples are again drawn from debates on BBC Radio 4's Moral Maze. Test Your Argument was launched on BBC Taster in December 2017 and has since been visited over 10,000 times, earning a rating of 4 out of 5, with 88% of respondents saying the BBC should do more along these lines.g

Argument Analytics serves as an online second-screen supplement to BBC radio and television broadcasts. Trialed in 2017 on selected episodes of Moral Maze, the data-driven infographics are designed to provide deeper insight into the debate: the interaction between pro and contra arguments is visualized diagrammatically, the alignment between participants' stances is mapped out, and a timeline shows which parts of the debate led to the most conflict. An example of Argument Analytics is available for the October 11, 2017 episode of Moral Maze, dedicated to the 50-year anniversary of the Abortion Act in the U.K.h
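
The conflict timeline, for instance, reduces to a simple aggregation over annotated argumentative relations. The sketch below assumes a simplified annotation format of timestamped support/conflict relations; the Argument Web's actual data schema is richer.

```python
from collections import Counter

# Assumed simplified annotation: (seconds into broadcast, relation type),
# where the relation between two contributions is "support" or "conflict".
relations = [(12, "support"), (95, "conflict"), (110, "conflict"),
             (340, "support"), (355, "conflict"), (410, "conflict")]

def conflict_timeline(relations, bin_seconds=120):
    """Bucket conflict relations into fixed time windows, yielding the
    raw counts behind a 'most heated moments' infographic."""
    counts = Counter(t // bin_seconds
                     for t, kind in relations if kind == "conflict")
    return {b * bin_seconds: n for b, n in sorted(counts.items())}

print(conflict_timeline(relations))  # {0: 2, 240: 1, 360: 1}
```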


Argument Mining for Reason-Checking

The latest addition to the suite of argument technologies developed for the BBC is The Evidence Toolkit.i This online application is designed to encourage users to dissect and critically appraise the internal reasoning structure of news reports. The Evidence Toolkit launched in March 2018j as part of the BBC's Young Reporter initiative (formerly called 'BBC School Report'). BBC Young Reporter is a U.K.-wide opportunity offered to some 60,000 11- to 18-year-old students to develop their media literacy skills by engaging firsthand with journalism and newsmaking.k The 2018 initiative addressed the issue of fake news: to help students develop the means to distinguish real news from fake, the BBC commissioned the iReporter gamel from Aardman Animations, targeted at 11- to 15-year-olds, and The Evidence Toolkit from the Centre for Argument Technologym for 16- to 18-year-olds.

Figure. The Evidence Toolkit interface.

The Evidence Toolkit guides students through a series of steps to identify claims, arguments, counter-arguments, reasoning types, and evaluation criteria, on the basis of scholarship in critical thinking and argumentation theory.3,9 To help students understand the theoretical concepts, examples are given from episodes of BBC Radio 4's Moral Maze. Since BBC Young Reporter is intended primarily for classroom use, and teachers will often not be argumentation experts themselves, The Evidence Toolkit comes with teacher notes, available through the BBC website.n

Once the user has identified the main claim and reasons in a news article, the software helps classify the reasoning based on the type of evidence provided. Reasons can be connected to claims in many different ways. Drawing on theories of argumentation and persuasion,9 the software presents users with a compact set of options that requires no specific theoretical background knowledge. Reasoning is classified as fact-based or opinion-based, each of which can be subdivided further. Opinion-based reasoning, for example, subdivides into evidence drawn from experts (providing authoritative backing), from popular sentiment (of the masses or of a particular community), and from personal experience (whether the author's own or that of a witness).
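
In code, the choice presented to the user amounts to a small two-level taxonomy, sketched below; the opinion-based labels follow the description above, while the fact-based subdivisions are invented here purely for illustration.

```python
# Two-level evidence taxonomy. Opinion-based kinds follow the text;
# the fact-based kinds are hypothetical examples.
EVIDENCE_TYPES = {
    "fact-based": ("statistics", "documented event", "study or report"),
    "opinion-based": ("expert opinion",        # authoritative backing
                      "popular opinion",       # the masses or a community
                      "personal experience"),  # the author's or a witness's
}

def top_level(evidence_kind: str) -> str:
    """Map a chosen evidence kind to its top-level category."""
    for category, kinds in EVIDENCE_TYPES.items():
        if evidence_kind in kinds:
            return category
    raise ValueError(f"unknown evidence kind: {evidence_kind!r}")

print(top_level("expert opinion"))  # opinion-based
```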

Each type of reasoning is associated with a template of critical questions that point to the evaluation criteria for that reasoning.11 By answering these questions, the user builds up a level of confidence in the support for the claim. Identifying counter-considerations is another essential step in judging the impartiality of a news article. Again, the software helps students identify such objections, which are often linguistically marked by indicative phrases such as "on the other hand," "admittedly," or "to some extent."
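
Mechanically, this is a lookup from reasoning type to its question template, with confidence aggregated from the user's answers. The question wording below is ours, loosely in the spirit of argumentation schemes,11 not the toolkit's actual text.

```python
# Hypothetical critical-question templates per reasoning type.
CRITICAL_QUESTIONS = {
    "expert opinion": [
        "Is the cited source a genuine expert in this field?",
        "Do other experts agree?",
        "Is the expert free of bias or vested interest?",
    ],
    "personal experience": [
        "Was the witness in a position to know?",
        "Is the account consistent with other evidence?",
    ],
}

def confidence(reasoning_type: str, answers: list[bool]) -> float:
    """Each affirmative answer to a critical question raises the user's
    confidence in the support the reasoning lends to the claim."""
    questions = CRITICAL_QUESTIONS[reasoning_type]
    assert len(answers) == len(questions), "answer every question"
    return sum(answers) / len(questions)

print(confidence("expert opinion", [True, True, False]))  # ~0.67
```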

In addition to a choice of five articles from news sources across the political spectrum, manually pre-analyzed by a team of experts to identify claims, reasons, and objections, The Evidence Toolkit employs automated methods for argument mining (also called argumentation mining in the literature) to let students select an article of their own choosing from the BBC News archives. The implemented argument mining technology automatically extracts the argumentative content from the news article, provided the chosen article has any explicit argumentative content to begin with. Argument mining builds on the successes of opinion mining and sentiment analysis7 to identify not only what views are being expressed in a text, but also why those views are held:4 the software automatically processes the natural language text to produce the analysis otherwise performed by human experts.
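
At its simplest, the marker-based part of such extraction can be approximated in a few lines. Production argument mining relies on trained statistical models,4 so the indicator-phrase heuristic below is only a caricature of the idea, with marker lists of our own choosing.

```python
import re

# Indicator phrases that often introduce reasons and objections; real
# argument mining systems learn such cues statistically.
REASON_MARKERS = ("because", "since", "after all")
OBJECTION_MARKERS = ("on the other hand", "admittedly", "to some extent")

def mine_arguments(text: str):
    """Tag each sentence as a reason, an objection, or neutral, based
    on the presence of indicative connective phrases."""
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        lowered = sentence.lower()
        if any(m in lowered for m in OBJECTION_MARKERS):
            yield "objection", sentence
        elif any(m in lowered for m in REASON_MARKERS):
            yield "reason", sentence
        else:
            yield "neutral", sentence

sample = ("The policy should be repealed. It has failed, because "
          "unemployment rose. Admittedly, other factors played a part.")
for label, sentence in mine_arguments(sample):
    print(f"{label}: {sentence}")
```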

At the time of this writing, The Evidence Toolkit has accumulated over 22,000 tries. The software has been well received, earning a rating of 4.15 out of 5 (the average for applications on BBC Taster lies around 3.5). User feedback moreover points not only to an accessible user experience (78% found it easy to use) but to a successful one: 84% said the critical thinking tools explained in The Evidence Toolkit help check the reliability of news, and 75% said it made them think more deeply about the topics at issue in the news articles. Putting critical literacy high on the BBC's agenda and applying argument technology to drive it also appears to reflect positively on the organization itself, with 73% stating that The Evidence Toolkit positively changed their view of the BBC.

The suite of argument technologies developed for the BBC aims to address the major societal challenge posed by intentional obfuscation and misinformation in the modern media landscape. In collaboration with the BBC, and with the producers of BBC Radio 4's Moral Maze in particular, we have approached critical literacy from several angles, developing quantitative debate analytics, interactive ways of engaging with argumentative material, and argument mining technology. With a distribution to over 3,000 educational institutions in the U.K., The Evidence Toolkit constitutes, to the best of our knowledge, the first public deployment of argument mining technology at scale. The further development of argument technology for reason-checking could provide a much-needed weapon in combating fake news and reinforcing reasonable social discourse.

    1. Bex, F. et al. Implementing the Argument Web. Commun. ACM 56, 10 (Oct. 2013), 66–73.

    2. Flaxman, S., Goel, S., and Rao, J.M. Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly 80 (2016), 298–320.

    3. Johnson, R.H. and Blair, J.A. Logical Self-Defense. McGraw-Hill Ryerson, 1977.

    4. Lawrence, J. and Reed, C. Argument mining: A survey. Computational Linguistics 45, 4 (2020), 765–818.

    5. Lazer, D.M.J. et al. The science of fake news. Science 359, 6380 (2018), 1094–1096.

    6. Nickerson, R.S. Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology 2, 2 (1998), 175–220.

    7. Pang, B. and Lee, L. Opinion mining and sentiment analysis. Foundations and Trends in Information Retrieval 2, 1–2 (2008), 1–135.

    8. Sethi, R.J., Rangaraju, R., and Shurts, B. Fact checking misinformation using recommendations from emotional pedagogical agents. In A. Coy, Y. Hayashi, and M. Chang, Eds., Intelligent Tutoring Systems (2019), 99–104.

    9. van Eemeren, F.H. et al. Handbook of Argumentation Theory. Springer, Cham, 2014.

    10. Vosoughi, S., Roy, D., and Aral, S. The spread of true and false news online. Science 359, 6380 (2018), 1146–1151.

    11. Walton, D., Reed, C., and Macagno, F. Argumentation Schemes. Cambridge University Press, 2008.
