
Why (Cybersecurity) Change Is So Hard

We create different forms of technology to solve problems and make our lives easier. That’s why it perplexes me when people and companies resist technological change in their personal lives and in their businesses. The leaders of companies I’ve advised often tell me, “We do it this way because we’ve always done it this way,” entirely neglecting the benefits that a suggested change would bring. “The new way may be better,” they reason, “but the way we’re doing it now isn’t harming us.” It would be easy for me to be upset if I didn’t understand that their resistance to change largely stems from the cognitive vulnerabilities we have as humans.

For instance, even in the face of obvious drawbacks to our current behavior, we have a status quo bias: a tendency to stick with the way things are done right now. This bias can lead us to perceive any change to the status quo as a net loss. We have a host of other biases, too, based on mental shortcuts that help us make decisions under conditions of uncertainty, or when dealing with highly complex situations. Take, for instance, the question of how we decide whether to trust someone. We might think that we’re weighing lots of different data points to make that decision, but in reality we might decide to trust them simply because they’re from the same city as us, an example of the familiarity heuristic.

Cybersecurity checks the complexity (and sometimes the uncertainty) box, which makes it fertile ground for mental shortcuts. Gokce Sargut and Rita McGrath (professors at Governors State University and Columbia Business School, respectively) noted in an article for the Harvard Business Review that many people believe they “can take in more information than research suggests they actually can.” This belief leads to two common situations: someone views the problem too broadly, or someone focuses too intently on only one part of it. Neither situation is ideal, and in both, the individual runs the risk of not fully understanding the consequences of his or her decision.

Software often doesn’t help, especially because much of it “isn’t designed with human psychology in mind,” according to Alex Blau, a behavioral scientist at ideas42. I routinely see the disastrous results when humans and software clash. Take, for instance, the update feature of many operating systems. When a device or app needs to be updated, the user is invariably presented with an annoying visual alert. These alerts tend to arrive at the worst possible moment, such as when the user needs to get something done right now. The notice is either ignored or deferred until “later.” Later rarely comes.

What does come, as a result, is malware running wild on a corporate network. Cybersecurity professionals will later wonder, “How can so many machines have gone unpatched?” A similar situation plays out across mobile phones and other devices among the general public. In fact, with Windows 10, Microsoft made the consumer version download updates automatically, precisely to prevent systems from going without critical security and system updates for vulnerabilities that have already been patched.

Beyond the obvious disconnect between software and humans, another big reason many don’t want to change their digital habits is that they’re given few tangible incentives to do so. Unfortunately, when a change is made (or forced on a person), it is often executed poorly. The end result is a slower, more cumbersome solution that “nobody even asked for.” A great example is when the DoD implemented BitLocker, a full-disk encryption solution, across its workstations. Many said it made their systems slower and their workflows more cumbersome. Their claims, however, didn’t reflect the reality of the situation. Any slowdown attributable to BitLocker would occur only when a system is first powered on (as files are decrypted) and when it is powered off (as files are encrypted). Systems are usually powered on or off outside of normal duty hours, when soldiers are away from their machines. Moreover, BitLocker plays no part in how a soldier actually gets his or her work done, meaning that many of the soldiers’ complaints were unfounded.

When users compare their workflow under the new way versus the old way, many will claim that the old way took them less time to do the same work. These users typically gloss over the trouble they had learning the old way, and how long it initially took them to complete a task, when comparing it to the new way. With any task, doing it a new way is often slower than doing it a way one has mastered. To end users (whether they’re employees or grandparents), however, a slower and more cumbersome experience is a familiar rallying cry for keeping things the way they are.

But when it comes to security, being behind is a very big deal.

If a particular system doesn’t get patched through an update (whether it’s a business workstation, your personal cell phone, or your kid’s tablet), it remains vulnerable to exploitation. Depending on the attacker, once the system is compromised, it may be used as a platform to compromise other systems within the same company or on the same network. Imagine an attacker getting into your router at home, then moving from there to your laptop, then to your iPhone. All it takes is one opening for an entire network of systems to be compromised.

Some of the best moments in my career have come when I’ve been able to break through individuals’ biases, to help them understand that making a cybersecurity change will ultimately help them lead a better life. After all, that’s what technology is made for.