
Digital Nudging: Guiding Online User Choices through Interface Design

Designers can create interfaces that nudge users toward the most desirable option.

Life is full of choices, often in digital environments. People interact with e-government applications; trade financial products online; buy products in Web shops; book hotel rooms on mobile booking apps; and make decisions based on content presented in organizational information systems. All such choices are influenced by the choice environment, as reflected in this comment: “What is chosen often depends upon how the choice is presented.”16 Why? People have cognitive limitations, so their rationality is bounded,27 and heuristics and biases drive their decision making.34 Designers of choice environments, or “choice architects,”32 can thus use these heuristics and biases to manipulate the choice environment to subtly guide users’ behavior by gently “nudging” them toward certain choices.

Key Insights

  • Heuristics and biases influence offline and online behavior.
  • User-interface design influences choices, even unintentionally.
  • Thorough design and testing can help achieve a designer’s intended behavioral effects.

These observations are more than theory. We are being nudged every day of our lives. Supermarkets position items with the highest markups at eye level to nudge customers into making unplanned purchases. Likewise, supermarkets limit the number of units customers are allowed to buy, thereby influencing their buying decisions; customers subconsciously anchor their decisions on the maximum number and adjust downward from there, resulting in purchases of greater quantities.36 This effect has been demonstrated with everyday items; for example, introducing a quantity limit of 12 cans of soup more than doubled the average quantity purchased, from 3.3 to 7 cans.36 Nudges are not, however, used only by marketers trying to sell more products or services; for example, when asking people to consent to being an organ donor, simply changing defaults can influence people’s choices. Setting the default to “consent,” whereby people must opt out if they do not wish to donate, rather than “dissent,” whereby they must opt in to become donors, can nearly double the percentage of organ donors.15 These examples show that largely imperceptible nudges are effective in a variety of offline contexts.

As in offline environments, online environments offer no neutral way to present choices. Any user interface, from organizational website to mobile app, can thus be viewed as a digital choice environment.37 Digital choice environments nudge people through the way choices are presented and workflows are organized, making digital nudging—“the use of user-interface design elements to guide people’s behavior in digital choice environments”37—a powerful tool in any choice architect’s toolbox. Choosing the most effective nudge involves trade-offs, however, because predicting the consequences of implementing certain nudges is not always possible.

Existing guidelines for implementing nudges have been developed primarily for offline environments, and digital nudging has only recently begun to attract research interest; see, for example, Gregor and Lee-Archer10 and Weinmann et al.37 In addition, guidelines that are effective offline may not always transfer directly to a digital context; for example, online users are more willing to disclose information but are also more cautious about accepting default options.2 This article therefore shows how designers can account for the effects of nudges when designing digital choice environments.

Guiding Choices

As in offline contexts, online decision making is almost always influenced by heuristics and biases; consequently, the concept of digital nudging applies not only to online consumers’ decision making but also to various other contexts, from e-health systems to social media apps to organizational information systems. Whereas such factors as presenting reviews or highlighting markdowns are well known for having a strong effect on user behavior in general, digital nudges influence decisions at the point and moment of decision making.a,22 In particular, digital nudging works by either modifying what is presented—the content of a choice6,35—or how it is presented—the visualization of a choice—as in, say, changing the design of the user interface.16 For example, the mobile payment app Square presents a “tipping” option by default, so customers must select “no tipping” if they prefer not to give a tip; this modification is likely an attempt to nudge people into giving tips, motivating them to tip even where tipping is uncommon.3

To illustrate the effects of digital nudges, we briefly explore the results of a series of experiments in the context of reward-based crowdfunding.28,33,38 In reward-based crowdfunding, project creators collect small amounts of money from a large number of people, or “backers.” Backers pledge money for projects and receive non-financial rewards in return (such as an e-book).1 To test how digital nudges influence backers’ pledges, researchers at the University of Liechtenstein modified the content and/or visualization of a choice environment to nudge backers toward a particular option, exploiting three heuristics and biases known as the “decoy effect,”33 the “scarcity effect,”38 and the “middle-option bias.”28

Decoy effect. The decoy effect increases an option’s attractiveness by presenting the option alongside an unattractive option no one would reasonably choose—the decoy.13 In a study conducted in the context of crowdfunding (N = 96), the researchers showed how decoys can nudge users to select certain rewards;33 when backers were presented with a choice of receiving an e-book in return for a $10 pledge or both an e-book and a hardcover book for a $20 pledge, most backers chose to pledge $10. However, when a third option—a decoy nudge—was included that offered only the hardcover book in return for a $20 pledge (see Figure 1), most backers chose to pledge $20 to receive both the e-book and the hardcover book. Including the decoy option thus led many backers to move from the $10 pledge to the $20 pledge (see Figure 2).

Figure 1. The decoy effect in reward-based crowdfunding; screenshot shows the decoy condition.

Figure 2. The decoy effect in reward-based crowdfunding; adding a decoy option can make another option more attractive.
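What makes an option a decoy is that it is asymmetrically dominated: the target option is at least as good on every attribute and strictly better on at least one, while the competing option does not dominate it.13 As an illustration only (the Reward model and attribute comparison below are our assumptions, not the setup of the cited experiment), such a dominance check over reward options could be sketched as follows:

```typescript
// Hypothetical model of a crowdfunding reward: the pledge paid and the items received.
interface Reward {
  label: string;
  price: number;    // pledge amount in dollars (lower is better)
  items: string[];  // items included in the reward (more is better)
}

// True if `a` dominates `b`: at least as good on every attribute, strictly better on one.
function dominates(a: Reward, b: Reward): boolean {
  const includesAllItems = b.items.every((item) => a.items.includes(item));
  const noWorse = a.price <= b.price && includesAllItems;
  const strictlyBetter = a.price < b.price || a.items.length > b.items.length;
  return noWorse && strictlyBetter;
}

// A decoy is dominated by the target but not by the competitor (asymmetric dominance).
function isDecoyFor(target: Reward, competitor: Reward, candidate: Reward): boolean {
  return dominates(target, candidate) && !dominates(competitor, candidate);
}

const competitor: Reward = { label: "$10: e-book", price: 10, items: ["e-book"] };
const target: Reward = { label: "$20: e-book + hardcover", price: 20, items: ["e-book", "hardcover"] };
const decoy: Reward = { label: "$20: hardcover only", price: 20, items: ["hardcover"] };

console.log(isDecoyFor(target, competitor, decoy)); // true: the hardcover-only option is a decoy
```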

Scarcity effect. People tend to perceive scarce items as more attractive or desirable.9 In the context of crowdfunding (N = 166), the researchers showed that limiting the availability of rewards—a “scarcity nudge”—can lead backers to choose a particular reward.38 For a fictitious movie project, backers were offered a choice between two rewards: pledge $10 to be listed in the screen credits or pledge $50 to receive the movie on a DVD/Blu-ray disc (see Figure 3). When the availability of the low-price reward was limited, 69% of the backers chose that reward, as in Figure 3, left side, whereas when the availability of the high-price reward was limited, 70% chose that reward, as in Figure 3, right side. Merely presenting information about the limited availability of either reward, even the higher-price one, thus caused more backers to choose that reward.

Figure 3. The scarcity effect in reward-based crowdfunding; limiting either reward changes pledging behavior of potential backers.

Middle-option bias. People presented with three or more options (ordered sequentially, as by price) tend to select the middle option.4 Testing the effect of the middle-option bias in the context of crowdfunding (N = 282), the researchers showed that backers can be nudged into choosing the reward presented in the middle.28 They did so by shifting the pledge scale of the offered rewards across three conditions: Condition 1: $5, $10, $15; Condition 2: $10, $15, $20; and Condition 3: $15, $20, $25. The researchers told the participants that their pledge would be doubled as a reward if the project was successful. Irrespective of the scale, however, most backers tended to choose the middle option, and by shifting the scales, the researchers could nudge participants toward selecting rewards associated with higher pledge amounts (see Figure 4).

Figure 4. The middle-option bias in reward-based crowdfunding; even when the investment scale is increased, backers tended to select the middle option.
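A quick back-of-the-envelope sketch shows why shifting the scale matters when most backers keep picking the middle option. The choice shares below are purely illustrative assumptions, not the results reported in the cited study:

```typescript
// Assumed (illustrative) choice shares for the low, middle, and high option.
const shares = [0.2, 0.6, 0.2];

const conditions: number[][] = [
  [5, 10, 15],   // Condition 1
  [10, 15, 20],  // Condition 2
  [15, 20, 25],  // Condition 3
];

// If the distribution of choices stays middle-heavy, shifting the scale upward
// raises the expected pledge per backer.
conditions.forEach((pledges, i) => {
  const expected = pledges.reduce((sum, pledge, j) => sum + pledge * shares[j], 0);
  console.log(`Condition ${i + 1}: expected pledge per backer = $${expected.toFixed(2)}`);
});
// Condition 1: $10.00, Condition 2: $15.00, Condition 3: $20.00
```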

These examples show that designers can create digital nudges on the basis of psychological principles of human decision making to influence people’s online behavior. Unintended effects may arise, however, if designers of digital choice environments are unaware of the principles. For example, in the context of crowdfunding, presenting decoys or limiting the availability of rewards without considering their effect can unintentionally lead backers to select lower-price rewards; that is, as virtually all user-interface design decisions influence user behavior,20,30 designers must understand the effects of their designs so they can choose whether to nudge users or reduce the effects of nudges.

Designing a Digital Nudge

While a number of researchers have suggested guidelines for selecting and implementing nudges in offline contexts,5,6,16,19,21,31 information systems present unique opportunities for harnessing the power of nudging. For example, Web technologies allow real-time tracking and analysis of user behavior, as well as personalization of the user interface, and both can help test and optimize the effectiveness of digital nudges; moreover, mobile apps can provide a wealth of information about the context (such as location and movement) in which a choice is made. Given these advantages, information systems allow rapid modification of a choice’s content and visualization to achieve the desired nudging effect.

Drawing on guidelines for implementing nudges in offline contexts, we now highlight how designers can create digital nudges by exploiting the inherent advantages of information systems. Just as developing an information system follows a cycle, as in, say, the systems development life cycle—planning, analysis, design, and implementation—so does designing choices to nudge users (see Figure 5)—define the goal, understand the users, design the nudge, and test the nudge. We discuss each step in turn, focusing on the decisions designers must make.

Figure 5. Designing digital nudges follows a cycle; based on Datta and Mullainathan5 and Ly et al.19

Step 1: Define the goal. Designers must first understand an organization’s overall goals and keep them in mind when designing particular choice situations. For instance, the goal of an e-commerce platform is to increase sales, the goal of a governmental taxing authority’s platform is to make filing taxes easier and encourage citizens to be honest, and the goal of project creators on crowdfunding platforms is to increase pledges and overall donation amounts. These goals determine how choices are to be designed, particularly the type of choice to be made. For example, subscribing to a newsletter is a binary choice (yes/no, agree/disagree); selecting between items is a discrete choice; and donating a monetary amount is a continuous choice, though it could also be presented as a discrete choice. The type of choice determines the nudge to be used (see the table). The choice architect, however, must consider not only the goals but also the ethical implications of deliberately nudging people into making particular choices, as nudging people toward decisions that are detrimental to them or their well-being is unethical and might thus backfire, leading to long-term negative effects for the organization providing the choice.30 In short, overall organizational goals and ethical considerations drive the design of choice situations, a high-level step that influences all subsequent design decisions.
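The distinction between binary, discrete, and continuous choices can be made explicit when modeling a choice environment. The following sketch is our own illustration; the type names and fields are assumptions, not taken from the article’s table:

```typescript
// Hypothetical model of the three choice types discussed in Step 1.
type Choice =
  | { kind: "binary"; label: string; defaultValue: boolean }                          // e.g., newsletter opt-in
  | { kind: "discrete"; label: string; options: string[]; defaultIndex?: number }     // e.g., reward tiers
  | { kind: "continuous"; label: string; min: number; max: number; anchor?: number }; // e.g., donation amount

const newsletter: Choice = { kind: "binary", label: "Subscribe to newsletter", defaultValue: true };
const reward: Choice = { kind: "discrete", label: "Select a reward", options: ["$5", "$10", "$15"], defaultIndex: 1 };
const donation: Choice = { kind: "continuous", label: "Donation amount", min: 0, max: 500, anchor: 50 };
```

Making the choice type explicit in this way forces the later design steps to state which nudge, if any, is attached to each choice.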

Step 2: Understand the users. People’s decision making is susceptible to heuristics and biases. Heuristics, commonly defined as “rules of thumb,”14 can facilitate human decision making by reducing the amount of information to be processed when addressing simple, recurrent problems. Conversely, heuristics can influence decisions negatively by introducing cognitive biases—systematic errors—when one faces complex judgments or decisions that should require more extensive deliberation.7 Researchers have studied a wide range of psychological effects that subconsciously influence people’s behavior and decision making.b In addition to the middle-option bias, decoy effect, and scarcity effect described earlier, common heuristics like the “anchoring-and-adjustment” heuristic, or people being influenced by an externally provided value, even if unrelated; the “availability” heuristic, or people being influenced by the vividness of events that are more easily remembered; and the “representativeness” heuristic, or people relying on stereotypes when encountering and assessing novel situations,34 influence how alternatives are evaluated and what options are ultimately selected. Other heuristics and biases that can have a strong effect on choices include the “status quo bias,” or people tending to favor the status quo so they are less inclined to change default options;18 the “primacy and recency effect,” or people recalling options presented first or last more vividly, so those options have a stronger influence on choice;24 and “appeals to norms,” or people tending to be influenced by the behavior of others.23 Understanding these heuristics and biases and the potential effects of digital nudges can thus help designers guide people’s online choices and avoid the trap of inadvertently nudging them into decisions that might not align with the organization’s overall goals.

Step 3: Design the nudge. Once the goals are defined (see Step 1: Define the goal) and the heuristics and biases are understood (see Step 2: Understand the users), the designer can select the appropriate nudging mechanism(s) to guide users’ decisions in the designer’s intended direction. Common nudging frameworks a designer could use to select appropriate nudges include the Behavior Change Technique Taxonomy,21 NUDGE,31 MINDSPACE,6 and Tools of a Choice Architecture.16 Which nudge is appropriate, and how it can be implemented through available design elements, or user-interface patterns, depends on both the type of choice to be made—binary, discrete, or continuousc—and the heuristics and biases at play; see the table for examples. For example, a commonly used nudge in binary choices is to preselect the desired option to exploit the status quo bias. In discrete choices, choice architects can choose from a variety of nudges to steer people toward a desired option; for example, in the context of crowdfunding, with the goal of increasing pledge amounts, choice architects could present the desired reward as the default option; add (unattractive) choices as decoys; present the desired option first or last to leverage primacy and recency effects; or arrange the options so as to present the preferred reward as the middle option. In continuous choices (such as when soliciting monetary donations), choice architects could pre-populate input fields (text boxes) with a particular value so as to exploit the “anchoring and adjustment” effect; likewise, when a slider is used to elicit numerical responses, the position of the slider and the slider endpoints serve as implicit anchors. Presenting others’ choices next to rewards (to leverage people’s tendency to conform to norms) or indicating the limited availability of rewards (to exploit the scarcity effect) can nudge people in binary, discrete, or continuous choices alike.
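To make two of these implementation options concrete, here is a minimal, hypothetical sketch that renders a binary choice with a preselected default (status quo bias) and a continuous choice with a pre-populated amount (anchoring and adjustment). The markup and field names are illustrative assumptions, not a prescribed pattern:

```typescript
// Status quo nudge for a binary choice: the desired option is preselected,
// so opting out requires a deliberate action.
function renderNewsletterOptIn(preselected: boolean): string {
  return `<label>
    <input type="checkbox" name="newsletter" ${preselected ? "checked" : ""}>
    Keep me posted about new projects
  </label>`;
}

// Anchoring nudge for a continuous choice: the input field is pre-populated
// with a suggested amount that serves as an implicit anchor.
function renderDonationInput(anchor: number): string {
  return `<label>Donation amount ($)
    <input type="number" name="donation" min="0" value="${anchor}">
  </label>`;
}

console.log(renderNewsletterOptIn(true));
console.log(renderDonationInput(50));
```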

As the same heuristic can be addressed through multiple nudges, in most situations, designers have a variety of “nudge implementations” at their disposal. Unlike in offline environments, implementing nudges in digital environments can be done at relatively low cost, as system designers can easily modify a system’s user interface (such as by setting defaults, displaying/hiding design elements, or providing information on others’ pledges). Likewise, digital environments enable dynamic adjustment of the options presented on the basis of certain attributes or characteristics of the individual user (such as when a crowdfunding platform presents particular rewards depending on backers’ income, gender, or age). Whichever nudges are chosen, designers should follow commonly accepted design guidelines for the respective platforms (such as Apple’s Human Interface Guidelines and Microsoft’s Universal Windows Platform design guidelines) to ensure consistency and usability.


Step 4: Test the nudge. Digital environments allow alternative designs to be generated easily, so their effects can be tested quickly, especially when designing websites. The effectiveness of digital nudges can be tested through online experiments (such as A/B testing, also known as split testing). Testing is particularly important, as the effectiveness of a nudge is likely to depend on the context and goal of the choice environment, as well as on the target audience. For example, a digital nudge that works well in one context (such as a hotel-booking site like https://www.booking.com) may not work as well in a different context (such as a car-hailing service like https://www.uber.com); such differences may be due to different target users, the unique nature of the decision processes, or even different layouts or color schemes on the webpages; a hotel may use colors and shapes that evoke calmness and cleanliness, whereas a car-hailing service may use colors and shapes that evoke speed and efficiency. As choice architects have various nudge implementations at their disposal, thorough testing is thus imperative for finding the nudge that works best for a given context and users.
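A minimal sketch of such an online experiment, assuming a simple two-variant setup (control vs. nudged design) and a conversion-style outcome such as completing a pledge; the deterministic assignment by user ID and the two-proportion z-test are our choices for illustration:

```typescript
// Deterministically assign each user to a variant via a simple hash of the user ID,
// so returning users always see the same design.
function assignVariant(userId: string): "control" | "nudge" {
  let hash = 0;
  for (const ch of userId) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return hash % 2 === 0 ? "control" : "nudge";
}

// Two-proportion z-test: did the nudged design change the conversion rate?
function twoProportionZ(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se;
}

// Illustrative numbers only: 100/1000 conversions in control vs. 150/1000 with the nudge.
const z = twoProportionZ(100, 1000, 150, 1000);
console.log(`z = ${z.toFixed(2)}; |z| > 1.96 suggests a difference at the 5% significance level`);
```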

Especially in light of the increasing focus on integrating user-interface design and agile methodologies, using discount usability techniques (such as heuristic evaluation, as introduced by Nielsen25) is often recommended to support rapid development cycles (see, for example, Jurca et al.17). Likewise, agile methodologies include the quick collection of feedback from real users. However, such feedback from conscious evaluations should be integrated with caution because the effects of nudges are based on subconscious influences on behavior, and experimental evaluations can provide more reliable results. If a particular nudge does not produce the desired effect, a first step for system designers is to evaluate the nudge implementation to determine whether the nudge is, say, too obvious or not obvious enough (see Step 3: Design the nudge). In some instances, though, reexamining the heuristics or biases that influence the decision-making process (see Step 2: Understand the users) or even returning to Step 1: Define the goal and redefining the goals may be necessary (see the sidebar, “Questions Designers Need to Address”).

Conclusion

Understanding digital nudges is important for the overall field of computing because user-interface designers create most of today’s choice environments. With increasing numbers of people making choices through digital devices, user-interface designers become choice architects who knowingly or unknowingly influence people’s decisions. However, user-interface design often focuses primarily on usability and aesthetics, neglecting the potential behavioral effects of alternative designs. Extending the body of knowledge of the computing profession through insights into digital nudging will help choice architects leverage the effects of digital nudges to support organizational goals. Choice architects can use the digital nudging design cycle we have described here to deliberately develop such choice environments.

Figure. Applying the digital nudging design cycle (selected examples).

One final note of caution is that the design of nudges should not follow a “one-size-fits-all” approach, as their effectiveness often depends on a decision maker’s personal characteristics.16 In digital environments, characteristics of users and their environment can be inferred from a large amount of data, allowing nudges to be tailored. System designers might design the choice environment to be adaptive on the basis of, say, users’ past decisions or demographic characteristics. Likewise, big-data analytics can be used to analyze behavioral patterns observed in real time to infer users’ personalities, cognitive styles, or even emotional states.12 For example, Bayesian updating can be used to infer cognitive styles from readily available clickstream data and automatically match customers’ cognitive styles to the characteristics of the website (such as through “morphing”11). Designers of digital choice environments can attempt to “morph” digital nudges on the basis of not only the organizational goals but also users’ personal characteristics.
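As a loose illustration of this idea, and not the morphing algorithm of Hauser et al.,11 Bayesian updating of a belief over two hypothetical user “styles” from observed click events might look like the sketch below; the styles, events, and likelihoods are entirely made-up assumptions that would in practice be estimated from data:

```typescript
// Hypothetical Bayesian updating of a user's style from clickstream events.
type Style = "deliberative" | "impulsive";
type ClickEvent = "opened_details" | "quick_select";

// Assumed likelihoods P(event | style); in practice, estimated from historical data.
const likelihood: Record<Style, Record<ClickEvent, number>> = {
  deliberative: { opened_details: 0.7, quick_select: 0.3 },
  impulsive: { opened_details: 0.2, quick_select: 0.8 },
};

// One step of Bayes' rule: multiply the prior by the likelihood, then normalize.
function updateBelief(prior: Record<Style, number>, event: ClickEvent): Record<Style, number> {
  const d = prior.deliberative * likelihood.deliberative[event];
  const i = prior.impulsive * likelihood.impulsive[event];
  const total = d + i;
  return { deliberative: d / total, impulsive: i / total };
}

// Start from a uniform prior and update after each observed event.
let belief: Record<Style, number> = { deliberative: 0.5, impulsive: 0.5 };
for (const event of ["quick_select", "quick_select", "opened_details"] as ClickEvent[]) {
  belief = updateBelief(belief, event);
}
console.log(belief); // the posterior could then drive which nudge variant is shown ("morphing")
```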

Any designer of a digital choice environment must be aware of its effects on users’ choices. In particular, when developing a choice environment, designers should carefully define the goals, understand the users, design the nudges, and test those nudges. Following the digital-nudging design cycle we have laid out here can help choice architects achieve their organizational goals by understanding both the users and the potential nudging effects so intended effects can be maximized and/or unintended effects minimized.

Acknowledgments

This work was partially supported by research grants from the University of Liechtenstein (Project No. wi-2-14), City University of Hong Kong (Project No. 7004563), and City University of Hong Kong’s Digital Innovation Laboratory in the Department of Information Systems. We wish to thank Joseph S. Valacich for valuable comments on earlier versions, as well as the anonymous reviewers for their insightful comments.

Figure. Watch the authors discuss their work in this exclusive Communications video. https://cacm.acm.org/videos/digital-nudging

References

    1. Belleflamme, P., Lambert, T., and Schwienbacher, A. Individual crowdfunding practices. Venture Capital 15, 4 (May 2013), 313–333.

    2. Benartzi, S. and Lehrer, J. The Smarter Screen: Surprising Ways to Influence and Improve Online Behavior. Penguin Books, New York, 2015.

    3. Carr, A. How Square Register's UI Guilts You Into Leaving Tips. Fast Company, New York, 2013; http://www.fastcodesign.com/3022182/innovation-by-design/how-square-registers-ui-guilts-you-into-leaving-tips

    4. Christenfeld, N. Choices from identical options. Psychological Science 6, 1 (Jan. 1995), 50–55.

    5. Datta, S. and Mullainathan, S. Behavioral design: A new approach to development policy. Review of Income and Wealth 60, 1 (Feb. 2014), 7–35.

    6. Dolan, P., Hallsworth, M., Halpern, D., King, D., Metcalfe, R., and Vlaev, I. Influencing behaviour: The Mindspace way. Journal of Economic Psychology 33, 1 (Feb. 2012), 264–277.

    7. Evans, J.S.B.T. Dual-processing accounts of reasoning, judgment, and social cognition. Annual Review of Psychology 59, 1 (Jan. 2008), 255–278.

    8. Fogg, B.J. Persuasive Technology: Using Computers to Change What We Think and Do. Elsevier, Oxford, U.K., 2003.

    9. Fromkin, H.L. and Snyder, C.R. The search for uniqueness and valuation of scarcity. Chapter 3 in Social Exchange, K.J. Gergen, M.S. Greenberg, and R.H. Willis, Eds. Springer U.S., Boston, MA, 1980, 57–75.

    10. Gregor, S. and Lee-Archer, B. The digital nudge in the Social Security Administration. International Social Security Review 69, 3–4 (July–Dec. 2016), 63–83.

    11. Hauser, J.R., Urban, G.L., Liberali, G., and Braun, M. Website morphing. Marketing Science 28, 2 (Mar 2009), 202–223.

    12. Hibbeln, M., Jenkins, J.L., Schneider, C., Valacich, J.S., and Weinmann, M. How is your user feeling? Inferring emotion through human-computer interaction devices. MIS Quarterly 41, 1 (Mar. 2017), 1–21.

    13. Huber, J., Payne, J.W., and Puto, C. Adding asymmetrically dominated alternatives: Violations of regularity and the similarity hypothesis. Journal of Consumer Research 9, 1 (June 1982), 90.

    14. Hutchinson, J.M.C. and Gigerenzer, G. Simple heuristics and rules of thumb: Where psychologists and behavioural biologists might meet. Behavioural Processes 69, 2 (May 2005), 97–124.

    15. Johnson, E.J. and Goldstein, D.G. Do defaults save lives? Science 302, 5649 (Nov. 2003), 1338–1339.

    16. Johnson, E.J., Shu, S.B., Dellaert, B.G.C. et al. Beyond nudges: Tools of a choice architecture. Marketing Letters 23, 2 (June 2012), 487–504.

    17. Jurca, G., Hellmann, T.D., and Maurer, F. Integrating agile and user-centered design: A systematic mapping and review of evaluation and validation studies of agile-UX. In Proceedings of the 2014 Agile Conference (Kissimmee, FL, July 28-Aug. 1). IEEE Computer Society, Piscataway, NJ, 2014.

    18. Kahneman, D., Knetsch, J.L., and Thaler, R.H. Anomalies: The endowment effect, loss aversion, and status quo bias. The Journal of Economic Perspectives 5, 1 (Winter 1991), 193–206.

    19. Ly, K., Mazar, N., Zhao, M., and Soman, D. A Practitioner's Guide to Nudging. Rotman School of Management Working Paper No. 2609347, May 2013, 1–28; https://ssrn.com/abstract=2609347

    20. Mandel, N. and Johnson, E.J. When webpages influence choice: Effects of visual primes on experts and novices. Journal of Consumer Research 29, 2 (Sept. 2002), 235–245.

    21. Michie, S., Richardson, M., Johnston, M. et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine 46, 1 (Aug. 2013), 81–95.

    22. Miesler, L., Scherrer, C., Seiler, R., and Bearth, A. Informational nudges as an effective approach in raising awareness among young adults about the risk of future disability. Journal of Consumer Behaviour 16, 1 (Jan./Feb. 2017), 15–22.

    23. Muchnik, L., Aral, S., and Taylor, S.J. Social influence bias: A randomized experiment. Science 341, 6146 (Aug. 2013), 647–651.

    24. Murdock, B.B. The serial position effect of free recall. Journal of Experimental Psychology 64, 2 (Nov. 1962), 482–488.

    25. Nielsen, J. Usability engineering at a discount. In Proceedings of the Third International Conference on Human-Computer Interaction on Designing and Using Human-Computer Interfaces and Knowledge-Based Systems (Boston, MA). Elsevier, New York, 1989, 394–401.

    26. Oinas-Kukkonen, H. and Harjumaa, M. Persuasive systems design: Key issues, process model, and system features. Communications of the Association for Information Systems 24, Article 28 (Mar. 2009), 485–500.

    27. Simon, H.A. A behavioral model of rational choice. The Quarterly Journal of Economics 69, 1 (Feb. 1955), 99–118.

    28. Simons, A., Weinmann, M., Tietz, M., and vom Brocke, J. Which reward should I choose? Preliminary evidence for the middle-option bias in reward-based crowdfunding. In Proceedings of the Hawaii International Conference on System Sciences (Hilton Waikoloa Village, HI, Jan. 4–7). University of Hawaii, Manoa, HI, 2017, 4344–4353.

    29. Stanovich, K.E. Rationality and the Reflective Mind. Oxford University Press, New York, 2011.

    30. Sunstein, C.R. Nudging and choice architecture: Ethical considerations. Yale Journal on Regulation 32, 2 (2015), 413–450.

    31. Thaler, R.H. and Sunstein, C.R. Nudge: Improving Decisions About Health, Wealth, and Happiness. Yale University Press, New Haven, CT, and London, U.K., 2008.

    32. Thaler, R.H., Sunstein, C.R., and Balz, J.P. Choice Architecture. SSRN Electronic Journal (2010), 1–18; https://ssrn.com/abstract=1583509

    33. Tietz, M., Simons, A., Weinmann, M., and vom Brocke, J. The decoy effect in reward-based crowdfunding: Preliminary results from an online experiment. In Proceedings of the International Conference on Information Systems (Dublin, Ireland, Dec. 11–14). Association for Information Systems, Atlanta, GA, 2016, 1–11.

    34. Tversky, A. and Kahneman, D. Judgment under uncertainty: Heuristics and biases. Science 185, 4157 (Sept. 1974), 1124–1131.

    35. Tversky, A. and Kahneman, D. The framing of decisions and the psychology of choice. Science 211, 4481 (Jan. 1981), 453–458.

    36. Wansink, B., Kent, R.J., and Hoch, S. An anchoring and adjustment model of purchase quantity decisions. Journal of Marketing Research 35, 1 (Feb. 1998), 71–81.

    37. Weinmann, M., Schneider, C., and vom Brocke, J. Digital nudging. Business & Information Systems Engineering 58, 6 (Dec. 2016), 433–436.

    38. Weinmann, M., Simons, A., Tietz, M., and vom Brocke, J. Get it before it's gone? How limited rewards influence backers' choices in reward-based crowdfunding. In Proceedings of the International Conference on Information Systems (Seoul, South Korea, Dec. 10–13). Association for Information Systems, Atlanta, GA, 2017, 1–10.

    a. Digital nudging, with its focus on the design of digital choice environments, can be viewed as a subset of persuasive computing/technology, which is generally defined as technology designed to change attitudes or behaviors and includes aspects of human-computer interaction beyond interface design.8,26

    b. See Stanovich29 for a taxonomy of rational thinking errors and biases; see also Wikipedia for an extensive list of cognitive biases that influence people's online and offline behavior (https://en.wikipedia.org/wiki/List_of_cognitive_biases).

    c. In most cases, the type of decision is given externally, and many decisions allow for only one type; for example, consenting to something (whether organ donation or signing up for a newsletter) would normally always be a binary choice—yes/no.
