Updated: Mar 10, 2020
When it comes to decision making, humans are far from rational. Nobel laureates Richard Thaler and Daniel Kahneman have done intensive work in this field and have demonstrated it time and again with their brilliant field experiments. Several of these biases also affect our decision making, especially in cybersecurity:
Herding – the wisdom of the crowds assumes that "the majority must be correct". Humans tend to conform to others and seek their validation. This tendency shows up as "hype cycles", in which a peak of inflated expectations is generally followed by a trough of disillusionment. Cybersecurity practitioners tend to herd behind thought leaders, reasoning that it is better to be in the herd and make a mistake than to be the only person in the industry making that mistake. For example, if experts are saying "shift left", we also ask our developers to learn and implement security even though developers want to focus on creating features. If experts are talking about "AI and ML in security", we also look to these as the first line of defense.
Status quo bias - preferring a choice we are used to. Most people show a strong tendency to stick with familiar, known approaches even when they are demonstrably inadequate and ineffective. For example, in today's cloud world, enterprises implement network-style firewalls in AWS, or protect cloud deployments the same way they protected on-prem infrastructure (VPN, proxy, firewall, ACLs, virtual appliances, KMS, NAT, etc.), simply because they are used to that approach and architecture.
Sunk cost fallacy - placing too much weight on sunk costs, i.e., costs that cannot be recovered. For example, an enterprise might have spent millions of dollars on hardware firewalls for its on-prem deployment. It is aware that (a) a hardware firewall is an irrational architecture for the cloud and (b) it is very expensive to keep updating IP-based rules for ephemeral workloads. The rational action is to cut the losses. But enterprises become attached to their past decisions and try to make them work – even when doing so only creates a bigger problem.
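A minimal sketch of why IP-based rules are so expensive to maintain for ephemeral workloads: when autoscaling replaces an instance, a rule keyed on IP addresses goes stale, while a rule keyed on workload identity (a tag, or a security-group reference) keeps matching. All names and data below are invented for illustration.

```python
# Hypothetical illustration: static IP allow-lists vs. identity-based rules
# for ephemeral workloads. Sample instances and tags are made up.

def ip_rule_matches(allowed_ips, instance):
    """An IP-based rule matches only the exact addresses it was written for."""
    return instance["ip"] in allowed_ips

def tag_rule_matches(role_tag, instance):
    """An identity-based rule matches any instance carrying the role tag."""
    return instance["tags"].get("role") == role_tag

# Day 1: the rule is written against today's IPs.
allowed_ips = {"10.0.1.5", "10.0.1.6"}
web = {"ip": "10.0.1.5", "tags": {"role": "web"}}

# Day 2: autoscaling replaced the instance - same role, new IP.
web_replacement = {"ip": "10.0.3.9", "tags": {"role": "web"}}

assert ip_rule_matches(allowed_ips, web)                   # works on day 1
assert not ip_rule_matches(allowed_ips, web_replacement)   # stale on day 2
assert tag_rule_matches("web", web_replacement)            # identity rule still holds
```

The IP rule has to be re-edited every time the fleet churns; the identity-based rule does not, which is the point the sunk-cost example above is making.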
Confirmation bias - seeking opinions and facts that support our own beliefs. For example, some security professionals believe that security can only be done in hardware because hardware is much harder to tamper with than software.
Optimism bias - people tend to be overly optimistic about their own forecasts and abilities. For example, in a self-assessment study, 74% of people reported being above average in performance. Similarly, businesses are willing to take on bad asymmetric risk (limited upside, large downside) on the assumption that they will never see the downside. Confirmation bias and over-optimism can be a deadly combination, leading to bad outcomes.
As security champions, we should be aware of these biases and not be blindsided by them. The mantra is simple:
Talk to new companies.
Seek to learn about new technologies and trends.
Finally, the security journey never ends. Strive to understand and improve your security posture – from gaining visibility into your assets, to knowing what they are doing, to reducing risk by putting the right checks and balances in place. If you want visibility into your AWS security groups, try out our tool or contact us at email@example.com to schedule a time for us to run it for you. May the force be with you!
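As a taste of what security-group visibility can look like, here is a small sketch that flags ingress rules open to the world. The input mirrors the shape of the response from AWS's DescribeSecurityGroups API, but the sample groups and IDs below are invented; this is an illustration, not our tool.

```python
# Hypothetical sketch: flag security-group ingress rules open to 0.0.0.0/0.
# Input follows the DescribeSecurityGroups response shape; sample data is invented.

def find_open_rules(security_groups):
    """Return (group id, from-port) pairs whose ingress allows 0.0.0.0/0."""
    findings = []
    for sg in security_groups:
        for perm in sg.get("IpPermissions", []):
            for ip_range in perm.get("IpRanges", []):
                if ip_range.get("CidrIp") == "0.0.0.0/0":
                    findings.append((sg["GroupId"], perm.get("FromPort")))
    return findings

sample = [
    {"GroupId": "sg-1111", "IpPermissions": [
        {"FromPort": 22, "IpRanges": [{"CidrIp": "0.0.0.0/0"}]}]},
    {"GroupId": "sg-2222", "IpPermissions": [
        {"FromPort": 443, "IpRanges": [{"CidrIp": "10.0.0.0/8"}]}]},
]

print(find_open_rules(sample))  # [('sg-1111', 22)]
```

Even a check this simple surfaces the most common misconfiguration – SSH open to the internet – which is exactly the kind of visibility the paragraph above is urging you to build.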
Image source: Flickr, Arbutus