Transforming Cybersecurity’s Weakest Link (Part I)

Harnessing the Human Factor as a Strategic Asset 

This is Part I in a series focused on how to transform your workforce into a cybersecurity asset rather than a liability.

In cybersecurity, the only constant is change. Today we’re assessing threat levels from AI-driven attacks, while tomorrow we may be facing quantum computing power strong enough to blow a hole through our toughest firewalls. 

The dynamic certainly keeps us on our toes. As dedicated cybersecurity stewards, we use every technological tool at our disposal to protect our clients’ data and business operations. There’s just one cybersecurity challenge we can never tech our way out of: humans.

Don’t get us wrong: humans are great – in fact, our company is full of them! But one thing humans have trouble with is assessing risk. With every new email account we set up, the inescapable human factor magnifies the chances that a company will be hacked or otherwise have its data compromised.

But what if we could turn that human risk on its head? What if instead of the weakest link, a company’s employees could be a defensive asset?   

In this three-part series, we’ll address how to turn your employees into one of the strongest lines of defense through a combination of psychology, policy, and culture. 

Part I: Psychology 

Understanding why we make bad decisions  

Recent surveys and industry reports find that up to 95% of data breaches involve human error. This includes falling for phishing scams, using weak passwords, misdirecting emails, mishandling credentials, and more. Most of us know better than to fall for some of these things, but it’s also understandable why we might mess up.

Every day each of us is responsible for an average of 35,000 decisions. And it’s exhausting! Out of necessity, our brains come up with ways to simplify decision-making and reduce the cognitive load. And while some of our mental shortcuts are perfectly fine, others may open us up to hackers, scammers and other ne’er-do-wells simply because of the way our brains work.   

Overconfidence, cognitive bias and the illusion of security 

Studies show many of us think we’re savvier about cyber-scams than we actually are. A recent report found that 86% of respondents said they could confidently identify phishing emails, yet nearly half admitted to falling for these types of scams. The mental shortcuts we use to save our brains time can open us up to something called cognitive bias: a systematic error in thinking that can affect our judgment.

According to Perry Carpenter, chief human risk management strategist for KnowBe4, a cybersecurity platform we’ve used with our clients to address the human side of risk management, there are five common cognitive biases that can affect your cybersecurity.

  1. Decision Fatigue: We’ve touched on this already. The thousands of decisions each of us makes every day can wear us down mentally. When employees feel drained, they may forget their security training and take a quick route to get things done, even if it leaves them open to risk.
  2. Choice Overload is a related cognitive bias where we may find that making the “best decision” involves sifting through a seemingly endless array of options. Feeling overwhelmed, the tendency is to simply pick whatever is expedient or “seems right” at the time.  
  3. Anchoring is a mental shortcut where we rely too heavily on the first piece of information we acquired on a subject. By focusing on the first cyber threats we learned about, or even the most recent one we heard about this week, we may overlook other dangerous risks.
  4. Affect refers to our tendency to let emotion guide our decisions. If we feel an affinity for someone, we’re much more likely to follow their lead, even against our gut instincts. This often comes into play in what’s called social engineering, where an attacker cultivates a relationship with an employee and gets them to reveal their passwords or click on a dangerous link they might otherwise think twice about.
  5. Herd Mentality is another cognitive bias that can get us in trouble. This is the classic fallacy where we think, “Everyone seems to be doing it, so it must be okay.” When we follow the crowd, even if it goes against our training, we can get into some very unfortunate situations, where the risks are obvious in hindsight but seemed negligible at the time.  

Which brings us to an intriguing idea from cognitive psychologist Gary Klein: the Pre-Mortem. This hypothetical exercise asks you to imagine that things have gone terribly wrong because of a decision you made, then work backward to consider exactly how that happened. Walking through this mental exercise can help individuals and teams think ahead and identify factors that could prevent what might otherwise have been a disaster.

Cybercriminals use numerous approaches to leverage our biases against us, whether it’s creating a false sense of urgency to short-circuit our vigilance or using trusted branding or impersonation to trigger affect and trick us into acting without thinking things through.  

Thanks to Artificial Intelligence (and the information that we share about ourselves and our employers online), these attacks will only become more sophisticated. Given these and other challenges, our hope at Enetics is that businesses will take up the call to invest in their employees as one of their most valuable cybersecurity assets.

In our next two posts, we’ll lay out some of the most useful policies companies can have in place to serve as guardrails, helping employees make good choices, as well as organizational culture changes that can encourage careful, skeptical, and methodical approaches to cybersecurity.  

Stay tuned and stay safe!  
