
Don’t Fuck It Up 2: The 7 Deadly Sins

The things Zoz focuses on in this part are the notions of tradecraft and OPSEC, as well as the 7 critical don’ts that can get you busted if ignored.

Tradecraft – evaluating biases

From people who are trained to do sketchy shit and not fuck it up, including organized crime and the feds – two groups between which there’s a not insignificant overlap – you’ll hear terms like “tradecraft” and “OPSEC”. Tradecraft means techniques and methods. And I’m going to throw up a few things here from my friends at the CIA. Even though I’ll make fun of them later, they spend a lot of time thinking about ways they cannot fuck it up. And the best place to go when you’re looking at CIA stuff, by the way, is analysis, not operations, because operations is where they really fuck it up. Analysis – they spend a lot of time thinking about this stuff.

One thing, if you go to the CIA Tradecraft Manual, which this is from (you can download it and read it) – there’s a big section about evaluating biases in analysis (see image above). And this stuff is also really useful for operations. I’ll just go through a couple of these. Perceptual biases – seeing only what we want to see; I think you can come up with some CIA examples of that. Biases in evaluating evidence – consistency: small samples are more consistent but contain less information; relying only on available models when estimating probability; and then problems with causality, for example attributing events to a fixed background context. All of these things transfer over to analyzing our own operations when we’re doing something bad.

Counteracting biases

There are also a number of activities you can do to counteract biases, and this is where the interesting stuff happens. This (see right-hand image) is just a quick selection that crosses over well from analysis to operations. Checking key assumptions at the beginning of the project or when the project changes. Checking the quality of information. Doing contrarian techniques like devil’s advocacy; high impact / low probability analysis; and “what if?” analysis – assuming an event has happened and working out how it could have. And then, of course, things we’re all familiar with from pen-testing – red team analysis, opposing force or adversary analysis. Do these things on your operations and look for where they’re applicable.

OPSEC prevents data leakage

The other side of that is OPSEC. People say that a lot in this community; it stands for “operational security”, and it basically means preventing the leakage of information that could lead to discovery or give the other side an advantage. This World War II poster sums it up. And, incidentally, on the topic of being old-school, I showed this picture to someone under the age of 25 and they said: “Why is Gandhi the enemy?” I can’t wait till all education comes from Wikipedia or IMDb…

Nuances of operational security

The government uses your tax dollars to produce literature to help you with OPSEC. You can go and check this stuff out. You need to understand what information is relevant: likely threats and vulnerabilities, risk assessment, and then applying countermeasures. And the point of this poster (see right-hand image) is that OPSEC doesn’t start and end with the operation itself; it covers all of your initial exploration and preparation, and then everything afterwards. The mindset you really want to get into is: you don’t even know you’re going to do it yet, and you forget about it afterwards. OPSEC is a 24/7 job.
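The process described in that literature – identify critical information, analyze threats and vulnerabilities, assess risk, apply countermeasures – can be sketched as a toy risk worksheet. This is a minimal illustration, not anything from the talk; the items and scores below are entirely made up:

```python
# Sketch of the classic five-step OPSEC process as a risk worksheet.
# All critical-information items and scores are hypothetical examples.

def risk(threat: int, vulnerability: int) -> int:
    """Toy risk score: how likely an adversary is to exploit a leak."""
    return threat * vulnerability

# Steps 1-4: identify critical information, analyze threats and
# vulnerabilities, and assess the resulting risk for each item.
assessment = {
    "real IP address": risk(threat=3, vulnerability=3),
    "home Wi-Fi SSID": risk(threat=1, vulnerability=2),
    "travel plans":    risk(threat=2, vulnerability=3),
}

# Step 5: apply countermeasures, highest risk first.
for item, score in sorted(assessment.items(), key=lambda kv: -kv[1]):
    verdict = "countermeasure needed" if score >= 6 else "acceptable"
    print(f"{item}: risk {score} -> {verdict}")
```

The point of the exercise is the ordering: countermeasures go where threat and vulnerability intersect, not wherever you happen to be looking.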

The 7 deadly sins / fuckups

So, here is my variant of the 7 deadly sins, the 7 deadly fuckups (see left-hand image). What makes you a candidate for getting busted? Overconfidence – thinking: “Oh, they’ll never find me, I’m using an anonymization tool,” i.e. depending on a single tool or point of failure. Excessive trust – in surveillance states, for example East Germany, 1 out of every 66 individuals was a government informant. What do you think that ratio is like in the hacking community? Emmanuel Goldstein’s estimate is 1 in 5. That’s probably a high bound, but talk to Chelsea Manning, for example – I bet she’s regretting the trust model in the community. Conviction that your guilt is minor, that no one’s going to care: “Oh, no one’s going to care about what I’m doing, I’m just defacing a website,” for example. It’s all going in your permanent record.


Guilt by association – visiting the wrong chat room, coming to the wrong conference, being associated with the wrong people. Like the real estate people say: “Location, location, location,” – exposing where you’re coming from is always likely to fuck you up. It can expose you to many things besides just reverse exploitation, which the government has been doing. Of course, sending anything in the clear, not just personally identifiable information, but browser fingerprints, unique device IDs, locations you are or might be at in the future.
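To illustrate the point about sending things in the clear: even without a name or an IP address, the combination of ordinary headers visible on any unencrypted request can distinguish one “anonymous” user from another. A minimal sketch (all header values here are invented):

```python
import hashlib

def fingerprint(headers: dict) -> str:
    """Hash the header values an eavesdropper can read on any
    cleartext (plain HTTP) request into a short identifier."""
    blob = "|".join(f"{k}={v}" for k, v in sorted(headers.items()))
    return hashlib.sha256(blob.encode()).hexdigest()[:16]

# Two users whose cleartext headers differ in only one value
alice = {
    "User-Agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/115.0",
    "Accept-Language": "en-US,en;q=0.9",
    "DNT": "1",
}
bob = dict(alice, **{"Accept-Language": "de-DE,de;q=0.8"})

# The hashes differ: header combinations alone can tell users apart,
# no cookies or personally identifiable information required.
print(fingerprint(alice))
print(fingerprint(bob))
```

Real browser fingerprinting uses many more signals (fonts, canvas rendering, plugins), but the mechanism is the same: lots of individually innocent values combine into something close to unique.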

Keeping too much documentation about what’s going on – people who are really fighting the state and doing serious business know about this stuff. This is a quote from a Ukrainian separatist: “Home computers and personal cell phones should never be used for operational purposes; identifying documents should never be carried; details of military operations should never be discussed on phones or in front of family members.” You may even need to do things that you don’t like to do, like abstaining from alcohol.

Tools can help you fuck it up

Like sins, you are going to commit one of these – you are going to fuck one of them up. So use your tradecraft analysis to figure out how you can recover from making mistakes. One of the things you can use to stop fucking things up is tools. But tools can also help you fuck it up. A computer is just a tool that helps you fuck things up a billion times faster than you could by yourself. The increased confidence – the sin of overconfidence – increases the likelihood of fucking it up. Using a tool badly or stupidly can be worse than not using it at all. This is one of my favorite tool injury pictures (see right-hand image above). It’s from a water jet cutter. It puts out a stream of compressed water at 15,000 psi that breaks the sound barrier and cuts through inch-thick steel, and everyone who walks into the room is like: “What would happen if I stuck my hand in that thing?”
 
