
Understanding Scam Victims: Seven Principles For Systems Security

Effective countermeasures depend on first understanding how users naturally fall victim to fraudsters.
  1. Introduction
  2. Distraction Principle
  3. Social Compliance Principle
  4. Herd Principle
  5. Dishonesty Principle
  6. Kindness Principle
  7. Need and Greed Principle
  8. Time Principle
  9. Related Work
  10. Conclusion
  11. Acknowledgments
  12. References
  13. Authors
  14. Footnotes
  15. Tables
  16. Sidebar: Key insights
  17. Sidebar: Representative Scams
Figure. Three-card monte.

From a holistic security-engineering point of view, real-world systems are often vulnerable to attack despite being protected by elaborate technical safeguards. The weakest point in any hardened system is usually its human element: attacks succeed because the designers thought only about their own strategy for responding to threats, without anticipating how real users would react.

We need to understand how users behave and what traits of that behavior make them vulnerable, then design systems security around them. To gain this knowledge, we examine a variety of scams, distilling some general principles of human behavior that explain why the scams work; we then show how they also apply to broader attacks on computer systems insofar as they involve humans. Awareness of the aspects of human psychology exploited by con artists helps not only the public avoid these particular scams but also security engineers build more robust systems.

Over nine series of the BBC TV documentary The Real Hustle (http://www.bbc.co.uk/realhustle/), Paul Wilson and Alexis Conran researched the scams most commonly carried out in Britain and, with Jessica-Jane Clement, replicated hundreds of them on unsuspecting victims while filming the action with hidden cameras. The victims were later debriefed, given their money back, and asked for their consent to publish the footage so others would learn not to fall for the same scams (see the sidebar “Representative Scams,” to which we refer throughout the main text).

The objective of the TV show was to help viewers avoid being ripped off by similar scams. Can security researchers do more? By carefully dissecting dozens of scams, we extracted seven recurring behavioral patterns, and corresponding principles, exhibited by victims and exploited by hustlers. These patterns are not merely the levers of small-scale opportunistic scams (known as “short cons”) but inherent security vulnerabilities of the human element in any complex system. The security engineer must understand them thoroughly and consider their implications for computer and systems security.

Distraction Principle

While we are distracted by what grabs our interest, hustlers can do anything to us and we won’t notice.

The young lady who falls prey to the recruitment scam is so engrossed in her job-finding task that she never even suspects the whole agency might be a fraud.

Distraction is at the heart of innumerable fraud scenarios. It is also a fundamental ingredient of most magic performances,5 which is not surprising if we see such performances as a “benign fraud” for entertainment purposes. Distraction is used in all cases involving sleight of hand, including pickpocketing and the special “throw” found in the Monte.

The very presence of “sexy swindler” Jess among the hustlers owes to Distraction, as well as to Need and Greed (discussed later), since sex is such a fundamental human drive. The 2000 computer worm “ILOVEYOU,” which reportedly caused $5 billion to $8 billion in damage worldwide, exploited these two principles.

In computing, the well-known tension between security and usability is also related to Distraction. Users care only about what they want to access and are essentially blind to the fact that “the annoying security gobbledygook” is there to protect them. Smart crooks exploit this mismatch to their advantage; a lock that is inconvenient to use is often left open.

Distraction also plays a role in the “419,” or Nigerian, scam. The hustler, posing as a Nigerian government officer with access to tens of millions of dollars of dodgy money, wants the mark to help transfer the money out of the country in exchange for a slice of it. When the mark accepts the deal, the hustler demands some advance money to cover expenses. New, unexpected expenses come up repeatedly, always with the promise that the money is just about to be transferred. These “convincers” keep the mark focused solely on the huge sum he has been promised.

Are only unsophisticated 419 victims gullible? Abagnale1 showed the Distraction Principle works equally well on highly educated CTOs and CIOs. In 1999, he visited a company full of programmers frantically fixing code to avert the Y2K bug. He asked the executives how they had found all the programmers and was told “these guys from India” knew computers well and were inexpensive. But, Abagnale thought, any dishonest programmer from an offshore firm fixing Y2K problems could also easily implant a backdoor…

People focused on what they want to do are distracted from the task of protecting themselves. Security engineers who don’t understand this principle have already lost the battle.

Social Compliance Principle

Society trains people to not question authority. Hustlers exploit this “suspension of suspiciousness” to make us do what they want.

The jeweler in a jewelry-shop scam gratefully hands over necklace and cash when “policeman” Alex says they are needed as evidence, believing his assurance that they will be returned later.

Access control to sensitive databases may involve an exploitable human element. For example, social-engineering expert Mitnick7 recounts impersonating a policeman to, of all targets, a law-enforcement agency. He builds up credibility and trust by exhibiting knowledge of the lingo, procedures, and phone numbers. He gets the clerk to consult the National Crime Information Center database and acquires confidential information about a chosen victim. His insightful observation is that the police and military, far from being tougher targets, are inherently more vulnerable to social engineering as a consequence of their strongly ingrained respect for rank.

Social Compliance is the foundation of phishing. For example, our banks, which hold all our money, order us to type in our password and, naturally, we do. It is difficult to fault nontechnical users if they fail to notice that the site was only a lookalike. Note the conflict between a bank’s security department telling customers “never click on email links” and the marketing department of the same bank sending them clickable email advertisements for new financial products, putting the customers in double jeopardy.
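
Since expecting users to spot a lookalike domain themselves is unrealistic, tools can perform that check on their behalf. Below is a minimal sketch of our own (not a scheme from the article or from any real bank) of the kind of link check a mail client could run before letting a user follow a “bank” link; the domain example-bank.com and the allowlist are hypothetical.

```python
from urllib.parse import urlsplit

# Hypothetical allowlist of domains the user actually banks with.
TRUSTED_BANK_DOMAINS = {"example-bank.com"}

def looks_like_bank_link(url: str) -> bool:
    """Accept the URL only if its host is a trusted bank domain or a
    subdomain of one. A real check would also need to handle punycode
    homographs and the public-suffix list; this sketch does not."""
    host = (urlsplit(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d)
               for d in TRUSTED_BANK_DOMAINS)

# The lookalike survives a casual glance but fails the mechanical check:
print(looks_like_bank_link("https://www.example-bank.com/login"))       # True
print(looks_like_bank_link("https://example-bank.com.evil.test/login")) # False
```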

System architects must coherently align incentives and liabilities with overall system goals. If users are expected to perform sanity checks rather than blindly follow orders, then social protocols must allow “challenging the authority”; if, on the contrary, users are expected to obey authority unquestioningly, those with authority must relieve them of liability if they obey a fraudster. The fight against phishing and all other forms of social engineering can never be won unless this principle is understood.

Herd Principle

Even suspicious marks let their guard down when everyone around them appears to share the same risks. Safety in numbers? Not if they’re all conspiring against us.

In the Monte, most participants are shills. The whole game is set up to give the mark confidence and make him think: “Yes, the game looks dodgy, but other people are winning money,” and “Yes, the game looks difficult, but I did guess where the winning disc was, even if that guy lost.” Shills are a key ingredient.

In online auctions, a variety of frauds are possible if bidders are in cahoots with the auctioneer. eBay pioneered a reputation system in which bidders and auctioneers rate each other through public feedback. But fraudsters can boost their reputations through successful transactions with shills; basic reputation systems are largely ineffective against them.

In online communities and social networks, multiple aliases created by a participant to give the impression that others share his opinions are known as “sock puppets.” In political elections, introducing fake identities to simulate grass-roots support for a candidate is called “astroturfing.” In reputation systems in peer-to-peer networks, as opposed to reputation systems in human communities, multiple entities controlled by the same attacker are called “Sybils.” The variety of terms coined in different contexts testifies to the wide applicability of the Herd Principle to many kinds of multiuser systems.
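
A toy calculation (illustrative numbers only, not eBay data) shows why naive reputation systems fail here: with a plain average of ratings, a fraudster can bury the feedback of his real victims under cheap five-star transactions with his own sock puppets or Sybils.

```python
# Naive reputation: the plain average of all feedback ratings received.
def average_rating(ratings: list[int]) -> float:
    return sum(ratings) / len(ratings)

honest_history = [5, 5, 4, 5]   # genuine positive feedback for an honest seller
fraud_history = [1, 1]          # two real victims rated the fraudster

# Ten cheap transactions, each rated 5 by the fraudster's own sock puppets:
fraud_history += [5] * 10

print(average_rating(honest_history))  # 4.75
print(average_rating(fraud_history))   # about 4.33: shills drown out the victims
```

Defenses therefore weight feedback by the cost or verifiability of the underlying transaction, or by the standing of the rater, rather than counting all ratings equally.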

Dishonesty Principle

Our own inner larceny is what hooks us initially. Thereafter, anything illegal we do will be used against us by fraudsters.

In the Monte, the shills encourage the mark to cheat the operator and even help him do it. Then, having fleeced the mark, the operator pretends to notice the mark’s attempt at cheating, using it as a reason for closing the game without giving him a chance to argue.

When hustlers sell stolen goods, the implied message is “It’s illegal; that’s why you’re getting such a good deal,” so marks won’t go to the police once they discover they’ve been had. The Dishonesty Principle is at the core of the 419; once a mark realizes it’s a scam, calling the police is scary because the mark’s part of the deal (essentially money laundering) was in itself illegal and punishable. Several victims have gone bankrupt, and some have even committed suicide, seeing no way out of this tunnel.

The security engineer must be aware of the Dishonesty Principle. A number of attacks on the system go unreported because the victims won’t confess to their “evil” part in the process. When corporate users fall prey to a Trojan horse program purporting to offer, say, free access to porn, they have strong incentives not to cooperate with the forensic investigations of system administrators to avoid the associated stigma, even if the incident affected the security of the whole corporate network. Executives for whom righteousness is not as important as the security of their enterprise might consider reflecting such priorities in the corporate security policy, perhaps by guaranteeing discretion and immunity from “internal prosecution” for victims who cooperate with forensic investigations.

Kindness Principle

People are fundamentally nice and willing to help. Hustlers shamelessly take advantage of it.

This principle is, in some sense, the dual of the Dishonesty Principle, as perfectly demonstrated by the Good Samaritan scam, in which marks are hustled precisely because they volunteer to help. It is loosely related to Cialdini’s Reciprocation Principle (people return favors)2 but applies even in the absence of a “first move” from the hustler. A variety of scams that propagate through email or social networks involve tear-jerking personal stories or follow disaster news (tsunami, earthquake, hurricane), taking advantage of generous but naïve recipients who act on their spontaneous kindness before suspecting anything. Many “social engineering” penetrations of computer systems7 also rely on victims’ innate helpfulness.

Need and Greed Principle

Our needs and desires make us vulnerable. Once hustlers know what we want, they can easily manipulate us.

Loewenstein4 speaks of “visceral factors such as the cravings associated with drug addiction, drive states (such as hunger, thirst, and sexual desire), moods and emotions, and physical pain.” We say “Need and Greed” to refer to this spectrum of human needs and desires—all the stuff we really want, regardless of moral judgement. In the 419 scam, what matters most is not necessarily the mark’s greed but his or her personal situation; if the mark is on the verge of bankruptcy, needs major surgery, or is otherwise in dire straits, then questioning the offer of a solution is very difficult. In such cases the mark is not greedy, just depressed and hopeful. If someone prays every day for an answer, an email message from a Nigerian Prince might seem like the heaven-sent solution.

The inclusion of sexual appetite as a fundamental human need justifies, through this principle, the presence of a “sexy swindler” in most scams enacted by “the trio.” As noted, the Need and Greed Principle and the Distraction Principle are often connected; victims are distracted by (and toward) that which they desire. This drive is exploited by a vast proportion of fraudulent email messages (such as those involving length enhancers, dates with attractive prospects, viruses, and Trojans, including ILOVEYOU).

An enlightened system administrator once unofficially provided a few gigabytes of soft porn on an intranet server in order to make it unnecessary for local users to go looking for such material on dodgy sites outside the corporate firewall, thereby reducing at the same time connection charges and exposure to malware.

If we want to con someone, all we need to know is what they want, even if it doesn’t exist. Security engineers who do not understand what users want, and that they want it so badly they’ll go to any lengths to get it, cannot understand what drives users and cannot predict their behavior. Engineers always lose against fraudsters who do understand how to lead their marks. This brings us back to the security/usability trade-off: lecturing users about disabling ActiveX, Flash, or JavaScript from untrusted sites is pointless if these software components are required to access what users want or need (such as their online social network, online banking, or online tax-return site). Fraudsters merely have to promise some enticing content to enroll users as unwitting accomplices who unlock the doors from inside.

The defense strategy should also include user education; as the Real Hustle TV show often says, “If it sounds too good to be true, it probably is.”

Time Principle

When under time pressure to make an important choice, we use a different decision strategy, and hustlers steer us toward one involving less reasoning.

In the ring-reward rip-off, the mark is made to believe he must act quickly or lose the opportunity. When caught in such a trap, it’s very difficult for people to stop and assess the situation properly.

In contrast to the theory of rational choice, according to which humans make decisions by seeking the optimal solution based on all the available information, Simon8 suggested that “organisms adapt well enough to ‘satisfice’; they do not, in general, ‘optimize’.”

They may “satisfice,” or reach a “good-enough” solution, through simplifying heuristics rather than the complex, reasoned strategies needed for finding the best solution, despite heuristics occasionally failing, as studied by Tversky and Kahneman.10
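
A toy contrast between the two strategies (the payoffs and the aspiration level are illustrative values of our own choosing, not from the cited studies):

```python
offers = [42, 55, 61, 90, 73]   # payoffs of successive offers as they arrive
ASPIRATION = 60                 # the "good enough" level

def satisfice(options, aspiration):
    """Heuristic: take the first option that clears the aspiration level."""
    for x in options:
        if x >= aspiration:
            return x
    return options[-1]          # nothing cleared the bar: settle for the last

def optimize(options):
    """Rational choice: examine every option and take the best."""
    return max(options)

print(satisfice(offers, ASPIRATION))  # 61: good enough, decided early
print(optimize(offers))               # 90: the best, but needs a full search
```

Under time pressure, the quick satisficing path is often the only one available, and hustlers arrange for their offer to be the first that looks good enough.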

Though hustlers may never have formally studied the psychology of decision making, they intuitively understand the shift. They know that, when forced to make a decision quickly, a mark will not think clearly, acting on impulse according to predictable patterns. So they make their marks an offer they can’t refuse, making it clear that this is their only chance to accept it. This pattern is evident in the 419 scam and in phishing (“You’ll lose access to your bank account if you don’t confirm your credentials immediately”) but also in various email offers and limited-time discounts in the gray area between acceptable marketing techniques and outright swindle. As modern computerized marketing relies more and more on profiling individual consumers to figure out how to press their buttons, we may periodically have to revise our opinions about which sales methods, while not yet illegal, are ethically acceptable.
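
One defensive application is to treat manufactured urgency itself as a warning sign. Here is a minimal sketch of such a heuristic; the phrase list is illustrative and of our own devising, whereas a real filter would use many trained features.

```python
import re

# Illustrative cues of manufactured urgency, typical of phishing messages.
URGENCY_PATTERNS = [
    r"\bimmediately\b",
    r"\bwithin 24 hours\b",
    r"\bact now\b",
    r"\baccount will be (suspended|closed)\b",
    r"\blast chance\b",
]

def urgency_score(message: str) -> int:
    """Count distinct urgency cues; a high score warrants extra scrutiny."""
    text = message.lower()
    return sum(1 for pattern in URGENCY_PATTERNS if re.search(pattern, text))

msg = ("You'll lose access to your bank account if you don't confirm "
       "your credentials immediately. Act now!")
print(urgency_score(msg))  # 2
```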

From a systems point of view, the Time Principle is particularly important, highlighting that, due to the human element, the system’s response to the same stimulus may be radically different depending on the urgency with which it is requested. In military contexts this is taken into account by wrapping dangerous situations that require rapid response (such as challenging strangers at a checkpoint or being ordered to launch a nuclear missile) in special “human protocols” meant to enforce, even under time pressure, some of the step-by-step rational checks the heuristic strategy would otherwise omit.

The security architect must identify the situations in which the humans in the system may suddenly be put under time pressure by an attacker and whether the resulting switch in decision strategy might open a vulnerability. This directive applies to anything from retail situations to stock trading and online auctions and from admitting visitors into buildings to handling medical emergencies. Devising a human protocol to guide and pace the response of the potential victim toward the desired goal may be an adequate safeguard and also relieve the victim from stressful responsibility.
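
As one possible shape for such a human protocol, here is a minimal sketch (the risk threshold and the delay are illustrative parameters, not drawn from any deployed system): requests above a risk threshold are queued behind a mandatory cooling-off period and must be reconfirmed once the attacker-imposed urgency has passed.

```python
import time

COOLING_OFF_SECONDS = 3600      # illustrative: one hour to reconsider
HIGH_RISK_THRESHOLD = 10_000    # illustrative: transfers above this are high risk

# request id -> (time the request was queued, amount)
pending: dict[str, tuple[float, int]] = {}

def request_transfer(request_id: str, amount: int) -> str:
    """Low-risk requests go through at once; high-risk ones must wait."""
    if amount < HIGH_RISK_THRESHOLD:
        return "executed"
    pending[request_id] = (time.time(), amount)
    return "queued: reconfirm after the cooling-off period"

def confirm_transfer(request_id: str) -> str:
    """Second confirmation, accepted only after the enforced delay."""
    queued_at, amount = pending[request_id]
    if time.time() - queued_at < COOLING_OFF_SECONDS:
        return "too early: still in the cooling-off period"
    del pending[request_id]
    return "executed"
```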

Related Work

While a few narrative accounts of scams and frauds are available, from Maurer’s study of the criminal world6 that inspired the 1973 movie The Sting to the autobiographical works of notable fraudsters,1,7 the literature contains few systematic studies of fraudsters’ psychological techniques. We found two notable exceptions. Cialdini’s outstanding book Influence: Science and Practice,2 based on undercover field research, revealed how salespeople’s “weapons of influence” are remarkably similar to those of fraudsters; indeed, all of his principles apply to our scenario and vice versa. Meanwhile, Lea et al.3 examined postal scams, drawing on a wealth of experimental data, including interviews with victims and lexical analysis of fraudulent letters. Even though our approaches were quite different, our findings are in substantial agreement. The table here summarizes and compares the principles identified in each of these works.

Conclusion

We supported our thesis—that systems involving people can be made secure only if designers understand and acknowledge the inherent vulnerabilities of the “human factor”—with three main contributions:

First is a vast body of original research on scams, initially put together by Wilson and Conran. It started as a TV show, not as a controlled scientific experiment, but our representative write-up9 still offers valuable firsthand data not otherwise available in the literature;

Second, from these hundreds of scams, we abstracted seven principles. The particular principles are not that important, and others have found slightly different ones. What matters is recognizing the existence of a small set of behavioral patterns that ordinary people exhibit and that hustlers have been exploiting forever; and

Third, perhaps most significant, we applied the principles to a more general systems point of view. The behavioral patterns are not just opportunities for small-scale hustles but also vulnerabilities of the human component of any complex system.

Our message for the system-security architect is that it is naïve to lay blame on users and whine, “The system I designed would be secure, if only users were less gullible.” The wise security designer seeking a robust solution will acknowledge the existence of these vulnerabilities as an unavoidable consequence of human nature and actively build safeguards that prevent their exploitation.

Acknowledgments

Special thanks to Alex Conran for co-writing the TV series and to Alex and Jess Clement for co-starring in it. Thanks to Joe Bonneau, danah boyd, Omar Choudary, Saar Drimer, Jeff Hancock, David Livingstone Smith, Ford-Long Wong, Ross Anderson, Stuart Wray, and especially Roberto Viviani for useful comments on previous drafts. This article is updated and abridged from the 2009 technical report9 by the same authors.

Tables

Table. Principles to which victims respond, as identified by three sets of researchers.

Figure. From right to left: Paul, with Alex as a shill, scams two marks at the three-shells game (one of several variants of the Monte).

Figure. From right to left: Paul and Alex haggle with the mark over the reward in the Ring Reward Rip-off.

Figure. Alex, flashing a fake police badge, pretends to arrest Jess in the Jewelry Shop Scam.

Figure. A mark, debriefed by the accompanying TV crew, is dismayed to learn the hustlers just got hold of all her sensitive personal details in the Recruitment Scam.

Figure. From right to left: Jess gets two marks to change her tire before tricking them into handing over their own car keys in the Good Samaritan Scam.

    1. Abagnale, F.W. The Art of the Steal: How to Protect Yourself and Your Business from Fraud. Broadway Books, New York, 2001.

    2. Cialdini, R.B. Influence: Science and Practice, Fifth Edition. Pearson, Boston, MA, 2009 (first edition 1985).

    3. Lea et al. The Psychology of Scams: Provoking and Committing Errors of Judgement. Technical Report OFT1070. University of Exeter School of Psychology. Office of Fair Trading, London, U.K., May 2009.

    4. Loewenstein, G. Out of control: Visceral influences on behavior. Organizational Behavior and Human Decision Processes 65, 3 (Mar. 1996), 272–292.

    5. Macknik, S.L., King, M., Randi, J., Robbins, A., Teller, Thompson, J., and Martinez-Conde, S. Attention and awareness in stage magic: Turning tricks into research. Nature Reviews Neuroscience 9, 11 (Nov. 2008), 871–879.

    6. Maurer, D.W. The Big Con: The Story of the Confidence Man. Bobbs-Merrill, New York, 1940.

    7. Mitnick, K.D. The Art of Deception: Controlling the Human Element of Security. John Wiley & Sons, Inc., New York, 2002.

    8. Simon, H.A. Rational choice and the structure of the environment. Psychological Review 63, 2 (Mar. 1956), 129–138.

    9. Stajano, F. and Wilson, P. Understanding Scam Victims: Seven Principles for Systems Security. Technical Report UCAM-CL-TR-754. University of Cambridge Computer Laboratory, Cambridge, U.K., 2009.

    10. Tversky, A. and Kahneman, D. Judgment under uncertainty: Heuristics and biases. Science 185, 4157 (Sept. 1974), 1124–1131.
