What behavioral experts can teach us about improving security

The “castle and moat” approach to protecting one’s domain was effective for centuries. At the start of the internet age, when a company’s greatest assets were physically located on-premises and when employees were accessing them in a predictable way (using company-issued computers, from familiar locations and at expected times), it still largely worked.


But that era has passed. Employees – and their devices – are no longer restricted within a “perimeter.” Employees work from home, in cafes, temporary offices and even from abroad. Granting them (and their various devices) unlimited access, as we once did when everyone sat under one roof, means trusting that they fully understand the heightened risks of remote work and will adjust their behavior to account for them.

As organizations tackle the new reality of a distributed workforce, there is much to be learned from the behavioral economics discipline. Behavioral economists study the psychological, cognitive, emotional, cultural and social factors that impact the decisions that people make. Typically, their expertise is applied to financial markets.

Companies in finance and banking commonly hire behavioral economists to identify the signs of human impulse in a market trend, to explain why it occurred and to show how to take advantage of it. The same expertise can be used to temper reflexive behavior that hurts trader performance, and to help traders recognize and resist the market’s mind games.

In this new world where home is the new office, behavioral economists are the first to point out that companies are putting themselves at risk. But the mechanisms that drive human choices also suggest ways that organizations can help employees act rationally and minimize the danger when they inevitably don’t.

Rejecting reason is our default

Behavioral economics teaches us to reject the notion that people are rational, as we have proven ourselves to be the opposite. Irrationality comes through in every aspect of our behavior: investment decisions, consumer choices, health, relationships, and cybersecurity, too.

A recent study showed that over 50% of remote users who brought their work devices home also allowed their family and friends to use them freely. Talk about a bad decision! What if the employee’s son used his parent’s laptop to browse unsafe sites or download suspicious files? Exposure is very possible, especially when those devices are connected simultaneously to home Wi-Fi networks outside IT’s control and to multiple valuable company resources.

Not surprisingly, breaches originating from within the network are increasing in frequency and impact. Insider breaches likely occur because most security solutions aren’t equipped to defend a porous and scattered network. But that still doesn’t explain why insiders take these risks, even unintentionally, or how to account for them when designing security.

We’re all bad actors, but why do we do it?

Our problem is, ultimately, that we’re lazy.

“A general ‘law of least effort’ applies to cognitive as well as physical exertion”, says Daniel Kahneman, author of Thinking, Fast and Slow. “The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action. In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs. Laziness is built deep into our nature.”

Laziness manifests itself in cybersecurity when users choose poor passwords, simply because they don’t want to take the time to remember or type out a complex and long password. This is one of the most critical behaviors to correct when it comes to proper security hygiene.

The overconfidence effect is equally concerning. People’s subjective confidence in their own judgments is routinely greater than the objective accuracy of those judgments, so someone who is overconfident about their company’s security posture, for instance, may knowingly browse risky content without concern for negative consequences. This ties into optimism bias, whereby the same user always assumes the best outcome in any circumstance.

Someone suffering from optimism bias would probably not think twice before opening an official-looking email, even with some signs present that it’s a phishing attempt. Similarly, remote employees often exhibit the illusion of control, a subjective feeling that makes them overestimate their ability to control what will happen. They might believe that they have the knowledge to handle the company’s most sensitive data without consequence.

A cybercriminal’s job is to exploit human flaws

Personal data and information are the new global currency and cybercriminals are the new bank robbers. On most corporate networks today, too many people have keys to the vault and are unaware of it. Or worse: they don’t care that indulging their natural security impulses is the equivalent of leaving those keys on a park bench. Unfortunately, cybercriminals are acutely aware of all the biases described above, which makes it all the more important to study how users interact with the network and to design security strategies around that behavior.

These strategies already exist because the cybersecurity industry reads cybercriminals just as the criminals read their targets. Tools and processes have been designed to combat behavioral biases from within the network.

Laziness: Password fatigue and laziness are easily fought with a single sign-on solution, which condenses each employee’s many passwords into a single set of credentials, stored securely and able to grant them streamlined access to network resources.
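To make the idea concrete, here is a minimal sketch of the principle behind single sign-on, not any particular vendor’s product: the user authenticates once with a central identity provider, which issues a signed token that every internal application verifies instead of maintaining its own password database. The shared key, app names and PyJWT usage are illustrative assumptions.

```python
# Minimal single sign-on sketch: one identity provider issues a signed token,
# and every application trusts that token instead of its own password store.
# Illustrative only; real SSO uses a protocol such as SAML or OpenID Connect.
import time
import jwt  # PyJWT (pip install pyjwt)

IDP_SIGNING_KEY = "demo-shared-secret"  # assumption: one trusted signing key

def idp_login(username: str) -> str:
    """The identity provider authenticates the user once and issues a token."""
    claims = {"sub": username, "exp": int(time.time()) + 3600}
    return jwt.encode(claims, IDP_SIGNING_KEY, algorithm="HS256")

def app_accepts(token: str, app_name: str) -> bool:
    """Any internal app verifies the same token; no extra password to remember."""
    try:
        claims = jwt.decode(token, IDP_SIGNING_KEY, algorithms=["HS256"])
        print(f"{app_name}: access granted to {claims['sub']}")
        return True
    except jwt.InvalidTokenError:
        print(f"{app_name}: access denied")
        return False

token = idp_login("alice")   # one login...
app_accepts(token, "wiki")   # ...many applications
app_accepts(token, "crm")
```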

Overconfidence: People will, unfortunately, always be confident that they’re making smart and safe choices, even if they aren’t. Constant monitoring to alert users to potential threats when this confidence leads to risky behavior is, therefore, crucial.
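As a rough illustration of what alerting on confidence-driven risky behavior can look like, the toy rule below flags a sign-in from a location the user has never used before. The event fields and the rule itself are assumptions, not any specific product’s logic.

```python
# Toy behavioral-monitoring rule: warn when an action falls outside a user's
# established pattern (here, a sign-in from a never-before-seen country).
from collections import defaultdict

seen_countries = defaultdict(set)  # username -> countries previously observed

def check_signin(user: str, country: str) -> None:
    if seen_countries[user] and country not in seen_countries[user]:
        # In a real deployment this would raise an alert or require step-up auth.
        print(f"ALERT: {user} signed in from an unusual location: {country}")
    seen_countries[user].add(country)

check_signin("bob", "US")
check_signin("bob", "US")
check_signin("bob", "RO")  # triggers the alert
```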

Optimism: Optimism is so natural that it’s hard to fight, but we can allow employees to be confident in their browsing and email habits within the confines of the web they’re allowed to surf. Website filtering blocks potentially malicious addresses from sending or receiving traffic to connected users, so poor judgment won’t result in a dangerous IP getting network access.
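A minimal sketch of the filtering idea, assuming a hypothetical denylist of known-bad domains; production web filters rely on continuously updated threat feeds and category databases rather than a hard-coded set.

```python
# Toy website filter: block requests whose destination host is on a denylist.
# The denylist and helper are hypothetical; real filters use live threat feeds.
from urllib.parse import urlparse

DENYLIST = {"malicious.example", "phish.example"}  # assumed known-bad domains

def is_allowed(url: str) -> bool:
    host = urlparse(url).hostname or ""
    # Block the listed domain itself and any of its subdomains.
    return not any(host == bad or host.endswith("." + bad) for bad in DENYLIST)

print(is_allowed("https://intranet.example.com/reports"))  # True
print(is_allowed("http://login.phish.example/verify"))     # False
```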

Control: The effortless access to cloud-based resources has tricked employees into thinking they are in total control. Most of us don’t worry about security issues, even when we should. We must ensure that employees are given access only to the network resources and devices necessary for their jobs, thereby eliminating unnecessary risks.
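A minimal sketch of the least-privilege idea, with hypothetical roles and resources: access is denied by default and granted only where the role explicitly needs it.

```python
# Toy least-privilege check: access is denied unless a role explicitly grants it.
# Roles, resources and the mapping are illustrative assumptions.
ROLE_ACCESS = {
    "finance":  {"erp", "payroll"},
    "engineer": {"git", "ci", "staging-db"},
    "support":  {"crm"},
}

def can_access(role: str, resource: str) -> bool:
    """Default deny: only resources explicitly granted to the role are allowed."""
    return resource in ROLE_ACCESS.get(role, set())

print(can_access("support", "crm"))      # True
print(can_access("support", "payroll"))  # False: not needed for the job
```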

Add awareness

A security approach that focuses on how users behave and makes use of multiple layers of security will do most of the heavy lifting when it comes to protecting the organization against our dangerous habits. Even when trust is removed with a zero-trust solution, it should be complemented with awareness training. Educating employees on how significant their individual impact on network security is can have immense ROI, even though a behavioral economist will say that awareness of biases doesn’t mean they’ll disappear.

There will always be a devil on our shoulder, daring us to simplify a tricky password. Thankfully, the security industry knows that this is happening and can help companies change the security behavior of their flock. Employees will remain irrational, but being aware of this and understanding how to deal with it intelligently can help us pivot – slowly but surely – towards reason.

Contributing author: Maria Blekher, PhD, Behavioral Scientist, Founding Director @YU Innovation Lab, Clinical Associate Professor of Digital Strategies, Yeshiva University.
