Cognitive Biases & Penetration Testing

by Jeremy “Harbinger” Miller

This post first appeared on November 30, 2021 and is republished with permission from the author.

Disclaimer: The ideas below are my own and may not reflect those of OffSec.


Our minds are adapted to maximize human gene replication in an environment extremely different from the one anyone reading this blog post lives in today. Over many thousands of years of evolution, our brains developed heuristics to help us make decisions that served that evolutionary purpose.

Heuristics are not always bad. For example, they let us make snap decisions in stressful situations without spending too much time weighing every conceivable option. However, heuristics are often maladapted to our modern circumstances. They can lead to cognitive biases that impair our reasoning and reliably produce incorrect results.

Hacking involves thinking, so as security professionals we have an interest in improving the way that our minds work. In this blog post, I will discuss two cognitive biases I have experienced in myself and observed in students: the sunk cost fallacy and confirmation bias.

A Heap of Salt

The purpose of this article is not to help readers self-correct these biases, because doing so often isn't realistic. Daniel Kahneman, one of the two fathers of the study of cognitive biases and heuristics, claims that he hasn't gotten measurably better at correcting his own biases despite decades of study on the subject. In fact, he wrote his popular book, Thinking, Fast and Slow, partly as a guide for spotting biases in others, because noticing them in oneself is so hard.

Instead, I hope this post will help readers understand a few ways their minds can in principle get stuck during a pentest, even if it won't necessarily help them get unstuck in the moment. While we may not be good at noticing our own biases, we can sometimes set up systems in advance that help offset their effects. And if it's easier to notice bias in others, then I hope this post will help readers assist fellow students, friends, and community members.

Sunk Cost Fallacy

What is it? The Sunk Cost Fallacy is the systematic tendency to continue investing resources toward an outcome even in the face of evidence that the outcome is unlikely or not worth the investment. It is most often discussed in terms of financial investments, but it applies just as well to investments of time, emotion, or energy.

Why does it happen? We allow our past decisions about resource allocation to emotionally hijack our present decisions, even when there is no reason to continue investing. We might feel a sense of guilt or loss if we "give up" on an investment rather than see it through. Sometimes this stubbornness pays off, but in pentesting it often results in frustration and an increased fear of failure.

How can it trap pentesters? The Sunk Cost Fallacy is so prevalent in information security and pentesting that we even have our own informal term for it: the dreaded Rabbit Hole. As far as I know, Lewis Carroll's evocative phrase was first applied to InfoSec by analogy in The Matrix.

Pentesting students often use the term to describe the frustrating experience of attacking a target that simply isn't vulnerable in the way the attacker believes it is. Rabbit holes can occur at many levels of abstraction: we could be attacking the wrong machine, targeting the wrong service, exploiting the wrong vulnerability, or using the wrong exploit. Due to the Sunk Cost Fallacy, it is often emotionally easier to continue down a rabbit hole than to move on to a different attack vector, even when that causes us more pain and suffering than the alternative.

We can think of our relationship with a given attack vector as a pendulum swinging between two potential failure modes. In the first, we abandon a truly vulnerable path too early: the thing we are attacking is actually vulnerable to our attack, but we move on out of fear that we're wasting our time. In the second, we keep investing effort into an attack against a vector that is not actually vulnerable. This latter failure mode is where the Sunk Cost Fallacy comes in, and it is the one that (I claim) is harder for pentesting students to avoid.

What can we do about it? The following method works for me at many levels of abstraction; we can apply it to machines on a network, to services on a machine, or to directories on a web application.
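One concrete shape such a pre-commitment can take is a time box per attack vector: decide how long a vector deserves before sinking any effort into it, and let a timer, rather than sunk cost, force the decision to rotate. Below is a minimal Python sketch of that idea; the 30-minute default and the target string are illustrative assumptions, not part of any real engagement.

import threading

def start_time_box(vector: str, minutes: int = 30) -> threading.Timer:
    """Arm a background alarm for a single attack vector.

    The budget is chosen before any effort is sunk, so when the alarm
    fires, moving on is a pre-made plan rather than an admission of defeat.
    """
    def alarm() -> None:
        print(f"\n[!] {minutes} min budget spent on {vector}. "
              "Rotate to the next vector; re-queue it only if new "
              "evidence justifies another budgeted pass.")

    timer = threading.Timer(minutes * 60, alarm)
    timer.daemon = True  # don't keep the process alive just for the alarm
    timer.start()
    return timer

# Hypothetical usage: enumerate candidate vectors up front, then work each
# one under its fixed budget. The target below is a placeholder.
box = start_time_box("10.11.1.5:445 (SMB)")
# ... manual enumeration and exploitation attempts happen here ...
box.cancel()  # cancel early if the vector pans out or is ruled out

The point isn't the timer code itself; it's that the stopping rule is decided by past-you, before any cost has been sunk, which makes rotating away from a rabbit hole feel like executing a plan instead of admitting failure.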

