The State of Incident Response by Bruce Schneier 3: Effects of Prospect Theory

In this part, Bruce Schneier looks at the psychological side of IT security, in particular the behavioral patterns behind loss aversion.

Daniel Kahneman, Nobel Prize winner for prospect theory research

Now my one piece of psychology. I am going to try to explain security in terms of one psychological theory, and that theory is “prospect theory”. You’ll also hear it called “loss aversion” or “framing effects”. Basically, it is a way that we as humans look at risk. The quintessential experiment in prospect theory is to take a room full of subjects (in the beginning it was usually college undergrads, because that’s who you’ve got), divide the room in half, and ask one side of the room to make a choice. The choice is between $1000 – here’s the cash – or a coin flip at $2000. Kind of an appropriate experiment for Las Vegas. And if you survey a room full of people, you will find that about ¾ will take the sure thing. Despite everything Las Vegas says, most people would rather have $1000 than a coin flip chance at $2000.

The second half of the experiment is that you take the other half of the room and give them a very similar but importantly different choice: I can either take $1000 from you right now – take it from your bank account – or I will give you a coin flip chance at me taking $2000 or nothing. And it turns out, if you ask a room full of people to make that choice, about ¾ of them will take the chance. Now, this is actually really interesting. The people who came up with this theory won a Nobel Prize in economics, even though they were psychologists, which freaked everybody out, because economists said this was impossible; yet it has been proven again and again. And it’s a very robust result: across ages, across cultures, done with small amounts of money, with real money. This experiment has been done a lot.
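A line of arithmetic shows why economists found this so surprising. The following sketch (my illustration, not part of the talk) checks that the two options in each frame have identical expected value, so the ¾ / ¾ split cannot be explained by payoff alone:

```python
def expected_value(outcomes):
    """Average payoff over equally likely outcomes."""
    return sum(outcomes) / len(outcomes)

# Gains frame: a sure $1000 vs. a coin flip at $2000 or nothing.
gain_sure = expected_value([1000])         # 1000.0
gain_gamble = expected_value([2000, 0])    # 1000.0

# Losses frame: a sure -$1000 vs. a coin flip at -$2000 or nothing.
loss_sure = expected_value([-1000])        # -1000.0
loss_gamble = expected_value([-2000, 0])   # -1000.0

print(gain_sure == gain_gamble)   # True
print(loss_sure == loss_gamble)   # True
```

A classically rational chooser would be indifferent in both frames; the observed preference reversal is exactly what prospect theory describes.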

As a species, we are risk-averse when it comes to gains and risk-seeking when it comes to losses.

But basically, as a species, we are risk-averse when it comes to gains and risk-seeking when it comes to losses. And it’s not just us: someone figured out how to do this experiment with other primates, and they are too. There have been a bunch of explanations for this; I think the best one comes from evolutionary psychology. This is the basic story: imagine yourself as an individual living at the edge of survival, where even a small win means you’ll live to see tomorrow. So, for this half of the room, if they take coin flips at $2000 or nothing, half of them will get nothing and die, but half will get $2000 and live. If they take the sure thing of $1000, they’ll all live. But for the other half of the room, also living at the edge of survival, a sure loss of $1000 means you’re all dead, while a coin flip loss of $2000 means half of you lose everything, but half of you lose nothing and survive. So, our brains are primed to have this bias. And the really interesting part of this experiment is that you can take the exact same choice, frame it in the language of gains or the language of losses – and you still see the same reversal. Even a purely semantic difference causes the change.
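The evolutionary story can be made concrete with a toy model (my illustration, with a made-up survival rule, not Schneier’s own model): assume an individual survives only if they end up with positive wealth.

```python
def survival_rate(start_wealth, outcomes):
    """Fraction of equally likely outcomes that leave wealth above
    zero, i.e. survival probability at the 'edge of survival'."""
    survivors = sum(1 for delta in outcomes if start_wealth + delta > 0)
    return survivors / len(outcomes)

# Gains frame: starting from nothing, you need a win to see tomorrow.
sure_gain = survival_rate(0, [1000])         # 1.0 -> everyone lives
coin_gain = survival_rate(0, [2000, 0])      # 0.5 -> half die

# Losses frame: starting with $1000, a $1000 loss wipes you out.
sure_loss = survival_rate(1000, [-1000])     # 0.0 -> everyone dies
coin_loss = survival_rate(1000, [-2000, 0])  # 0.5 -> half survive

# Risk aversion maximizes survival for gains; risk seeking for losses.
print(sure_gain > coin_gain)   # True
print(coin_loss > sure_loss)   # True
```

Under this toy rule, the "irrational" human bias is exactly the survival-maximizing strategy in both frames.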

What this means is that security is hard to sell, because security is always a small sure loss (“buy my product”) versus a risk of a larger loss (what will happen if you don’t have the product). And you have probably experienced this when you went to your boss and said: “Hey, we need to buy this security thing because we are at risk of this bad thing.” And your boss looks at you and says: “We didn’t have that product last month, and the bad thing didn’t happen last month; maybe we should take the chance.” Betting on losses is how we work.
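To see why the boss’s gamble can be irrational even on plain expected value, here is a toy calculation; the product cost, breach probability and breach cost are entirely made-up numbers, not figures from the talk:

```python
# Hypothetical figures: a $10k security product protecting against a
# breach with a 20% chance of occurring and a $100k cost if it does.
product_cost = 10_000
breach_probability = 0.20
breach_cost = 100_000

expected_breach_loss = breach_probability * breach_cost  # 20000.0

# On expected value alone, buying is the better deal...
print(product_cost < expected_breach_loss)  # True
# ...but prospect theory predicts a preference for gambling on the
# larger loss over accepting the small sure loss of buying.
```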

Unity of protection, detection and response

Okay, so how does this affect incident response? I told you I’d get to this eventually… We all know that security is a combination of protection, detection and response – three steps. And we need response because protection isn’t perfect. We need it more than ever today, because: 1) we’ve lost control over our computing environment, so there’s a lot of protection we can’t do; 2) attacks are becoming more sophisticated, so we need more response; 3) we’re increasingly affected by other people’s fights; and 4) we’re living in a world where companies will naturally under-invest in protection and detection.

In the 1990s, I used to say “security is a process, not a product.” And by that I meant something very strategic: you can’t buy a bunch of stuff and be done; you have to continually evaluate your security stance and continually re-evaluate and adjust it. Tactically, security becomes a product and a process. Really, it’s people, process, and technology. What’s changing is the ratios.

'Usability guru' Lorrie Faith Cranor

The conventional wisdom in IT security is that people don’t generally help: people are a liability, and people need to be removed from the system. I have a quote from usability guru Lorrie Faith Cranor; she writes: “Whenever possible, secure system designers should find ways of keeping humans out of the loop.” And we all know this: people are a problem, people are the biggest security problem. And, you know, we’ve been doing pretty well at this: entirely automated prevention systems – antivirus, patching – and lots of automated and semi-automated detection systems. We’re pulling people out of the loop left and right, and we’re doing pretty well.

The problem with response is that you cannot fully automate it; you can’t remove people from the loop, because by definition it’s response. And if you think about it, as you move from protection to detection to response, the people-to-technology ratio goes up: you need more people and less technology, for a whole bunch of reasons. All attacks are different; all networks are different; all security environments are different; all organizations are different; each organization’s regulatory environment is different; the political and economic considerations in organizations are different.

Those differences are often more important than the technical considerations, and this affects the economics of IT security. The products and services for response are different: there are fewer network effects, there are much higher marginal costs, there are lower switching costs, and there’s less of a lemons market. This will be interesting for us, because it means that, unlike in a lot of other areas of IT security, better products, better services and better companies will do better. There’s less of a first-mover advantage, and there are far fewer natural monopolies. Again, this is a new thing for us in the industry; it’s going to be a surprise. And I think it’s a good thing, something we are all going to benefit from.

Read previous: The State of Incident Response by Bruce Schneier 2: Security-Related IT Economics

Read next: The State of Incident Response by Bruce Schneier 4: OODA Loops in Cybersecurity
