Four Ways Cognitive Bias Hurts Your Cybersecurity And How To Address It

Your employees can make or break the success of your cybersecurity program. Social engineering attacks, carelessness, and mistakes are constant challenges. When you’re an IT security professional, you’re constantly thinking about risk exposure and the best ways to defend the company. Other employees have different concerns, such as shipping orders, making sales, and managing staff. To protect the organization effectively, you need to understand human cognitive bias to implement successful security.

What Is Cognitive Bias?

According to Behavioral Economics, “A cognitive bias is a systematic (non-random) error in thinking.” Cognitive bias affects everyone to a degree. These errors in thinking impact memory, decision-making, our understanding of probability, and daily life. You probably can’t eliminate cognitive bias entirely. However, through awareness, training, and technology, you can reduce these errors.

Many of these tendencies have a significant impact on your organization’s cybersecurity program. Let’s consider four cognitive biases that impact managers and front-line employees.

1. Decision Fatigue Hurts Manager Security Oversight

Imagine you’re a busy manager at a Fortune 500 company. You have a team of 10 employees reporting to you. Most of the day, you’re thinking about customer satisfaction and ways to hit your goals. As you solve problems throughout the day, your decision-making capacity gradually wears down. In the late afternoon, two employees submit IT security requests. Do you have the energy to review those requests thoroughly before approving them? If not, you may defer them until tomorrow. That deferral, however, leaves your employees less productive in the meantime.

Decision fatigue is just one cognitive bias that hurts your IT security.

2. Halo Effect Skews Management Judgment

Like it or not, every manager has favorite employees: star players who always deliver exceptional results. Because those employees excel in their primary responsibilities, you may unconsciously assume they are proficient in everything they do. In essence, that’s the halo effect in action. What’s the risk? Suppose a star employee moves to a different department and you forget to remove their user accounts. Those unnecessary accounts expose the organization to an increased risk of employee misconduct.

This cognitive bias also harms IT security departments. You might assume that the company’s star executives can do no wrong, so you don’t challenge them when they make security mistakes. That’s why some social engineering attacks rely upon impersonating an executive to request access.

3. Optimism Bias Affects Password Behavior

In the first two cases, we looked at managers; this cognitive bias affects everybody. In brief, you make overly optimistic assumptions about the future. For example, you write your passwords on a Post-it note stuck to the back of your smartphone case. You assume you’re responsible and would never misplace your phone. In reality, thousands of people have their smartphones stolen each year.

Tip: One way to reduce optimism bias in terms of cybersecurity lies in offering training. Read our short guide to delivering employee password training.

4. Curse of Knowledge Bias Undermines Training Effectiveness

This cognitive bias undermines cybersecurity professionals directly. As a cybersecurity professional, you might hold ISACA certifications and have years of experience. That depth of expertise benefits the organization. However, there’s a dark side to that expertise: the curse of knowledge! You struggle to put yourself in the shoes of non-experts (i.e., the vast majority of employees). Because of this cognitive bias, you’re less likely to develop effective, user-friendly IT security processes, because you make too many assumptions about what users already know.

One Way to Reduce the Risk of Social Engineering Attacks in Less than 30 Days

Overcoming cognitive bias is a tall order. Thankfully, you don’t have to eliminate these patterns outright; you just need to design systems and processes built with these tendencies in mind. For example, you can use Apollo, an IT security chatbot, to overcome the halo effect and the curse of knowledge bias. Let’s take a closer look at how you can cut your cognitive bias problems down to size.

Apollo doesn’t have opinions on any of your employees, good or bad. That means Apollo is immune to the halo effect. A password reset request from a star performer and an average performer will both be processed methodically.
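To make that idea concrete, here is a minimal sketch (hypothetical code, not Apollo’s actual API) of what "processed methodically" means: an automated handler runs the same fixed verification checklist for every password reset request, so there is no path for a reviewer to skip steps for a favorite employee.

```python
def handle_reset_request(username: str, mfa_verified: bool) -> str:
    """Apply one fixed checklist to every request -- no halo effect.

    A human reviewer might wave through a request from a star
    performer; an automated flow runs identical checks for everyone.
    """
    checks = [
        ("identity verified via MFA", mfa_verified),
        # Additional fixed checks (device trust, account status, etc.)
        # would be appended here and evaluated for every requester alike.
    ]
    if all(passed for _, passed in checks):
        return f"reset approved for {username}"
    failed = ", ".join(name for name, passed in checks if not passed)
    return f"reset denied for {username}: {failed}"

# Star performer and average performer get identical treatment.
print(handle_reset_request("star_performer", True))
print(handle_reset_request("average_performer", True))
print(handle_reset_request("anyone", False))
```

The design point is simply that the policy lives in code rather than in a manager’s judgment, which is where the halo effect creeps in.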

Apollo doesn’t make assumptions about how much a user knows. Unlike people, Apollo never tires of repeating cybersecurity rules. That means the curse of knowledge doesn’t harm Apollo’s effectiveness. You just need to put in the upfront work to set up your requirements.

Training your staff to eliminate their cognitive bias at work is tough. That’s why we suggest simply admitting reality: everybody has cognitive biases. It’s faster and easier to leverage technology to reduce your exposure to cybersecurity risk. After all, hackers are already taking advantage of these cognitive biases to break into your organization.

Written by Nelson Cicchitto