Cybersecurity and Psychology: The Role of Cognitive Biases


Cybercriminals are masters of human psychology. 

Because they understand how people think, they can reverse engineer decision-making patterns and turn them into devastating hacking schemes.

In other words, cybercriminals don’t just breach computer systems and networks—they hack the human mind.

To counter the hacker threat, cybersecurity protocols demand a thorough understanding of cognitive biases.

In this article, we will explore the most common behavioral patterns hackers exploit, and provide an overview of cybersecurity employee training (and other recommendations) to help protect your team from exposure.

Understanding Cognitive Biases

It’s no secret that “cognitive bias” has negative connotations. 

After all, it’s a systematic deviation from rational thought. Or to put it another way, it’s a mental error that arises from subjectivity.

Within a social context, “bias” itself is commonly regarded as a form of prejudice. However, it actually has more innocent roots—and apparent usefulness—than its reputation concedes. 

As human beings, we make roughly 35,000 choices a day. To streamline our thinking—and to limit the demands of conscious thought—we leverage bias to filter information.

However, while such psychological “shortcuts” are convenient, they can also get us into trouble. 

By simplifying the facts, cognitive bias undermines rational judgment. While it shines a spotlight on select information, it plunges the bigger picture into darkness.

According to Chris Voss, former FBI hostage negotiator, cognitive bias involves the “unconscious—and irrational—brain processes that literally distort the way we see the world.”

How can we avoid such short-sighted thinking? 

After all, cognitive biases offer a broad attack surface for cybercriminals. If the average bank employee or consumer makes 35,000 decisions in a day, it wouldn’t take much for a brief lapse in judgment (e.g., from “decision fatigue”) to enable a breach.

Avoidance starts with both awareness and inhibition. 

As Nobel Prize-winning psychologist Daniel Kahneman advises, “the goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.”

When it comes to cybersecurity hygiene, we must learn to second-guess our first impressions. 

That’s especially true when dealing with strange emails, text messages, and phone calls. 

Ten Cognitive Biases That Impact Cybersecurity

Cognitive biases can manifest in unexpected ways.

Generally speaking, most systematic errors are caused by an unconscious attempt to take a mental shortcut—to reach a conclusion without first analyzing all of the available information.

Below are ten ways in which that can happen.

1. Availability Heuristic 

Definition

A common mental error where people judge the likelihood of an event by how easily they can recall similar situations. With the availability heuristic (a type of mental shortcut), people often prioritize their most recent experience over comprehensive, historical analysis.

Example

After observing the rise in distributed denial-of-service (DDoS) attacks on financial institutions, some banks choose to allocate the majority of their security budget to DDoS prevention.

Meanwhile, other common attack vectors, like insider threats and phishing, receive only a fraction of the available resources. Though these banks successfully fend off DDoS attacks, they remain exposed to many other forms of cyberattack.

The availability heuristic blinds them to the bigger picture.

2. Present Bias Discounting 

Definition

A cognitive bias that prizes immediate gratification over better, long-term rewards. 

Example

A new bank employee is thrilled to receive a personalized email from the CEO. Though the instructions appear rushed (and request a $50,000 wire transfer to an outside account), the employee ignores his initial misgivings.

While he briefly considers asking his manager to review the request, he ultimately sends the money and enjoys the dopamine rush of feeling useful so early in his new career.

Unfortunately, the CEO never sent that email. In fact, the wire transfer request came from a Russian hacking syndicate. Had the employee waited for approval, he would have prevented a cyberattack and been hailed as a hero—rather than becoming a liability.

3. WYSIATI

Definition

The acronym for “what you see is all there is,” WYSIATI is the bias of jumping to conclusions. According to Daniel Kahneman (who coined the term), WYSIATI is a cognitive trap of constructing convenient stories from incomplete evidence.

Example

A bank customer receives an upbeat email: 

“Your Credit Score Has Increased! Explore Great Financing Options Now!” 

At first glance, she notices her bank’s logo, font, and what appears to be the bank’s domain name in the sender’s address. Without hesitation, driven by the exciting prospect of favorable loan terms, she opens the email and clicks the link to learn more.

Unfortunately, the customer just fell victim to a sophisticated spoofing attack.

In her eagerness, she overlooked a crucial detail: the sender’s address, though cleverly disguised with the bank’s name, actually originated from an unrelated account—a telltale sign of display name spoofing. Moreover, the call to action, promising exclusive access to “great financing options,” preyed on her desire to leverage her improved credit score, pushing her to act impulsively.

In reality, every detail she thought to be familiar—the bank logo, font, and the seemingly legitimate domain name—was meticulously impersonated to bypass her better judgment. In a split second, she had woven herself a “convenient story” that the email was authentic, based solely on the manipulated evidence before her.

What you see (isn’t) all there is.
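The display name spoofing described above is easy to see in code. As a minimal sketch (the addresses below are hypothetical), Python’s standard email.utils module can separate the friendly display name, which is what most inboxes show, from the actual sending address:

    from email.utils import parseaddr

    # Hypothetical spoofed header: the display name mimics a bank,
    # but the real address belongs to an unrelated domain.
    raw_from = '"Acme Bank Alerts" <alerts@acme-bank-rewards.com>'

    display_name, address = parseaddr(raw_from)
    domain = address.rsplit("@", 1)[-1]

    print(display_name)  # Acme Bank Alerts (what the customer sees)
    print(domain)        # acme-bank-rewards.com (not the bank's real domain)

Teaching employees and customers to check the domain behind the display name, rather than the name itself, directly counters this trick.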

4. Dunning-Kruger Effect

Definition

A cognitive bias where people dramatically overestimate their knowledge or competence in a specific arena. 

Example

An IT technician at a regional bank becomes increasingly concerned about the recent surge in email phishing. 

Eager to prove their competence and initiative to the Chief Technology Officer (CTO), they decide to take the lead on implementing the bank’s email cybersecurity updates, despite having only a basic understanding of the sophisticated security measures required.

The technician, driven by a desire to demonstrate their worth and capability, confidently assures their colleagues and the CTO, “Don’t worry, I’ve got this under control.” However, their overconfidence and lack of deep technical expertise lead them to overlook or misunderstand critical protocols and established best practices (e.g., the SPF, DKIM, and DMARC email authentication standards).

This misstep exposes the institution—and its numerous customers—to significant vulnerabilities, potentially paving the way for a catastrophic cyberattack. 

The technician’s failure to recognize the limits of their knowledge and the complexities of the task at hand is a clear manifestation of the Dunning-Kruger Effect. By overestimating their ability to manage the bank’s cybersecurity needs single-handedly, they inadvertently jeopardize the very security they aimed to bolster. 
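For context on the standards mentioned above: SPF, DKIM, and DMARC are DNS-based email authentication protocols, and their policies are published as TXT records that anyone can query. As a minimal sketch, assuming the third-party dnspython package is installed and using example.com as a placeholder domain, here is how the SPF and DMARC records can be retrieved:

    import dns.resolver  # third-party "dnspython" package (pip install dnspython)

    def fetch_email_auth_records(domain: str) -> dict:
        """Fetch the DNS TXT records publishing a domain's SPF and DMARC policies."""
        records = {"spf": [], "dmarc": []}
        try:
            # SPF: a TXT record on the domain itself, beginning with "v=spf1".
            for answer in dns.resolver.resolve(domain, "TXT"):
                text = answer.to_text().strip('"')
                if text.startswith("v=spf1"):
                    records["spf"].append(text)
        except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
            pass
        try:
            # DMARC: a TXT record published at the _dmarc subdomain.
            for answer in dns.resolver.resolve(f"_dmarc.{domain}", "TXT"):
                records["dmarc"].append(answer.to_text().strip('"'))
        except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
            pass
        return records

    print(fetch_email_auth_records("example.com"))

(DKIM records live at a selector-specific subdomain and require knowing the sender’s selector, so they are omitted here.) The takeaway isn’t that every employee should run DNS queries; it’s that configuring these standards correctly requires genuine expertise, which is exactly what the overconfident technician lacked.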

5. Halo Effect

Definition

The tendency to extend unquestioning trust to specific companies and services. Because individuals see these entities as veritable angels (hence the halo), they fail to recognize when cybercriminals are impersonating them.

Example

A bank manager is excited to have the office HVAC systems updated. He has known the mechanical engineer for over 20 years and has built a great friendship with him. 

When the HVAC vendor emails a brief update, he includes a file: “Revised Invoice.pdf.” 

At first, the manager finds it a bit unusual to receive an invoice before the date of service. However, he shrugs his shoulders and downloads the document.

Unfortunately, his good intentions just installed ransomware on the company network. Hackers had impersonated the trusted vendor and preyed on the manager’s goodwill.

6. FOMO

Definition

The “fear of missing out,” FOMO is a cognitive bias rooted in the anxiety of being left behind.

Example

An asset manager at a bank is about to clock out for the day. Before she leaves, she gets an email notification from the system administrator with a desperate subject line: “Final Notice – PASSWORD EXPIRATION.”

She opens the email and learns that her password expires at midnight—later that same day. 

If she doesn’t update her account now, she will “lose access to the portal” by tomorrow morning. Though she has dinner plans, she quickly sits back down, clicks the link, and enters her credentials.

Unfortunately, her admin help desk never sent that email. Hackers have now obtained her username and password and can access the company servers with impunity.

7. Loss Aversion

Definition

A cognitive bias where people prefer avoiding losses over achieving equivalent gains. Psychologically speaking, the pain of losing is roughly twice as powerful as the joy of winning.

Example

A fast-growing bank is enjoying a surge in customers, but their revenue isn’t quite where they’d like it to be. While they know their cybersecurity infrastructure is outdated, they choose to limit technology spending “until next year.”

In other words, they forgo investing in common-sense cybersecurity protocols in favor of stopgap solutions. While they save money in the short term, they roll the dice on their security.

Worse, they miss out on the many benefits cybersecurity measures can provide, including reducing the lifecycle of a breach by 74 days and enjoying average cost savings of $2.66 million.  

8. Anchoring

Definition

The psychological tendency to lean too heavily on the first piece of information provided. This is especially common in financial negotiations, where the initial number presented immediately “frames” the item’s perceived value. 

Example

A bank’s IT team is diligently investigating new cybersecurity solutions, but they’re most excited about one piece of technology. 

Why? Because it was the first tool they found, and they feel a strong connection to it. 

Unfortunately, the IT team has been anchored to one solution. They are now unable to survey the entire landscape and identify potentially more effective tools. 

9. Decision Fatigue

Definition

A behavioral phenomenon induced by mental exhaustion that impairs one’s ability to make rational decisions. Decision fatigue is closely related to “ego depletion,” psychologist Roy Baumeister’s theory (built on Freud’s energy model of the self) that mental fortitude and self-control deplete over time.

Example

It’s tax season, and a bank’s frontline staff has been dealing with an influx of customers. They’re running on fumes and simply trying to make it to the end of the day.

Some of the staff are too tired to notice a series of “Suspicious Email” alerts popping up on their devices. By ignoring the security warnings, they leave the door wide open to a major cyberattack—simply because they’re exhausted. 

Decision fatigue can endanger even the most advanced cybersecurity protocols. 

10. Optimism Bias

Definition

The tendency of individuals to assume they are immune to negative events. Optimism bias assumes bad things happen to “other people, not to me.”

Example

A community bank is proud that they have never experienced a cyberattack. While they’re confident in their security protocols, they’re equally sure that no hacker would bother to breach their systems.

After all, they’re just a local bank, and criminals go after major firms, right? This kind of optimism bias sets up the bank for a rude awakening. 

Indeed, banks of all sizes are routinely hacked, as the long list of victims from the 2023 MOVEit hack revealed. 

Cyberattacks are no longer a matter of “if,” but “when.”

Strategies to Overcome Cognitive Biases

Cognitive bias is a sensitive topic in a corporate setting. 

On the surface, any discussion of an employee’s bias can seem like a personal judgment. Therefore, it’s essential that such conversations are distanced from moral condemnation.

More importantly, any mention of cognitive bias should be blamed on cybercriminals—not on your staff. After all, cognitive bias involves unconscious errors exploited by devious hackers.

Generally speaking, cognitive bias should be addressed as a blind spot in a rear-view mirror, rather than a defect in someone’s character.

A few focal points can help foster a meaningful discussion.

To the extent that you can, always frame any conversation about bias within a group context. When you discuss cybersecurity as an organizational protocol, you will gain more buy-in from individual employees.

Over time, you will build a team-oriented environment equipped with the checks and balances you need. After all, it’s a known law of the universe: others can always see our errors more easily than we can.

To create this transparent environment, you need a workforce honest enough to ask questions, vulnerable enough to share diverse perspectives, and bold enough to flag questionable messages and/or behavior.

However, if your staff pushes back against your efforts, don’t hesitate to use hard evidence and statistics to illustrate the known dangers of cognitive bias. In the last few decades, behavioral scientists have conducted countless experiments to illustrate the irrationality of the human mind. 

The facts are on your side. Leverage those findings as much as possible.

To that end, it may also be helpful to promote books like Thinking, Fast and Slow (the seminal work on cognitive bias) to articulate your position and foster organic discussion at the office. 

Another book, Social Engineering: The Science of Human Hacking, may also be of interest. 

Ultimately, the most reliable cybersecurity frameworks are built on a strong foundation. Without exception, that foundation is laid with a robust and recurring emphasis on cybersecurity employee training. 

The Role of Training and Awareness

Human error is an existential threat to banking cybersecurity. 

In fact, 95% of cyberattacks involve human error, whether deliberate or accidental. 

Though tools like data loss prevention (DLP) and multi-factor authentication (MFA) are helpful, your employees deserve a more hands-on approach. 


Education is key to helping your team overcome cognitive biases.

In fact, it’s the only way to properly equip your employees for the digital battlefield.

With ongoing cybersecurity employee training, you can create a culture of safety that reduces threat exposure and enhances incident response. While forging unity amongst staff, in-depth training will improve your organizational compliance and strengthen customer confidence. 

It’s not rocket science—it’s repetition.

Through interactive sessions and real-world simulations, your employees can identify, resist, and overcome the dangers of cognitive biases. 

.Bank: The Cognitive Bias Firewall

Cyberattacks thrive on human error. 

That’s why hackers love to spoof websites. They steal trusted reputations to mask their fraudulence, hoping your staff will fall for their trap. 

Here’s some good news: .Bank shuts down spoofing attacks. 

Like a cognitive bias firewall, .Bank helps ensure your employees (and customers) always know what’s real—and what’s not.

After all, if it doesn’t say .Bank, it’s not your bank.

And while public domains can be counterfeited, a .Bank domain can never be faked. 

Why? Because other industries can’t use it. We built it exclusively for banks, and we verify that every .Bank domain is registered only by a legitimate, eligible bank.
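To make that concrete, here is a minimal sketch (with hypothetical URLs) of the kind of check a link filter, or a careful reader, can apply: trust a link only when .bank is the actual top-level domain, not just a string buried elsewhere in the hostname:

    from urllib.parse import urlparse

    def is_dot_bank(url: str) -> bool:
        """Return True only when the link's hostname sits in the .bank TLD."""
        hostname = urlparse(url).hostname or ""
        return hostname.lower().endswith(".bank")

    print(is_dot_bank("https://www.yourbank.bank/login"))  # True
    print(is_dot_bank("https://yourbank.bank.evil.com"))   # False: ".bank" is not the TLD
    print(is_dot_bank("https://yourbank.com/login"))       # False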

Schedule a meeting to find out how we can help protect your bank today.
