In this section, Paul Asadoorian and John Strand elaborate on the annoyance technique of making an attacker's tools loop endlessly through directories on a targeted system.

John Strand: Infinitely recursive directories are another one of the areas where you can mess with attackers' lives. What we're going to do is show you how to create a directory that references itself. If an attacker is recursively searching the file system on one of your computers, looking for your sensitive documents, their tool is probably going to get bogged down in this particular directory. We got a lot of help from another member of PaulDotCom Security Weekly, Mark Baggett. We couldn't find any pictures of Mark Baggett, and it just turns out that he looks a lot like David Hasselhoff, so we just dropped that in there (see image). He's a very handsome man…

Now, how do you do this? In Windows, what we do first is create a directory (see image). As you can see in the slides, we created a directory called Goaway; we changed into that directory and ran 'mklink /D dir1' with the Goaway directory itself as the link target – the directory that you're in. Then we did the exact same thing a second time with dir2. So we created two links that both point back to the Goaway directory. I'll show you why that's important on the next slide. Whenever you go through and start doing a recursive lookup using something like 'dir /S' – as Ed Skoudis likes to say, the S is for recurSSSSSive – as soon as it gets to, like, 128 characters, it says that the directory or filename is too long. But if you have two directories, as soon as it hits that 128 character limit in the first one, it alternates to the second directory, hits the limit again, goes back to the first directory, and keeps alternating back and forth (see image).
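The loop described above can be sketched cross-platform. The talk uses the Windows `mklink /D` command to create directory symbolic links; this minimal Python sketch does the same thing with `os.symlink` (on Windows, creating symlinks may require elevated privileges, and the directory names here are just the ones from the slides). A naive recursive search then descends `goaway/dir1/dir2/dir1/…` without end:

```python
import os
import tempfile

# Sketch of the self-referencing directory trick from the talk, using
# os.symlink instead of the Windows `mklink /D` command shown on the slides.
base = tempfile.mkdtemp()
goaway = os.path.join(base, "goaway")
os.mkdir(goaway)

# Two links inside Goaway that both point back at Goaway itself,
# mirroring `mklink /D dir1 <path-to-goaway>` and the same for dir2.
for name in ("dir1", "dir2"):
    os.symlink(goaway, os.path.join(goaway, name), target_is_directory=True)

# A naive recursive search now descends forever. We cap the depth here
# just to show the path growing without end while always resolving to
# a real directory.
path = goaway
for depth in range(6):
    path = os.path.join(path, "dir1" if depth % 2 == 0 else "dir2")
    print(path, os.path.isdir(path))
```

Every printed path is a valid directory, because each `dir1`/`dir2` component resolves right back to `goaway` – which is exactly why a recursive tool keeps going until it hits the path-length limit and bounces between the two links.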
So, if somebody was attacking your network – let's say with Metasploit – and using Meterpreter to try to identify directories with sensitive documents, Meterpreter would recursively look through the file system, hit this particular directory, and get stuck: the search would just continue forever and ever. Your CPU would jump, like, 60-70%, which serves two purposes. First, it shuts down the bad guy's backdoor – believe it or not, even if the bad guy kills the session, the recursive search continues to run. The second added bonus is that it becomes very easy for you to identify the malware.

Paul Asadoorian: Now we'll move on with annoyance. This part is about setting traps. This was a really great concept that I was happy to apply to computer security, because I always thought people viewed honeypots as deliberately vulnerable systems – systems with all these vulnerabilities on purpose – that you put out on the network and let people hack into.
And we kind of thought of honeypots as something more like a trap: almost expected behavior that's not necessarily a vulnerability, but that will draw the attacker in, for example when they're spidering your website.

This is very close to recursively searching a directory on a file system, except now, when the attacker goes to your website and starts looking for different directories – maybe hunting for more web applications, or for places where directory listings expose files – you trap them with tools called SpiderTrap and WebLabyrinth.
SpiderTrap was the original Python script, written by someone who worked for us, that basically just listens as a web server. When you make a connection to it, it presents a web page, and that page links to a bunch of different directories. When you click one of those links, whether with an automated tool or with your web browser, it takes you to yet another page, which displays yet another set of random links; clicking one of those displays another page, and so on, so the tools get stuck in this trap.
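The behavior described above is easy to sketch. This is not the actual SpiderTrap code, just a minimal illustrative stand-in: an HTTP server that answers every path with a page of freshly generated random links, so any spider that follows links never runs out of "new" pages to crawl.

```python
import random
import string
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def random_links(n=8):
    """Generate n links to random, never-before-seen paths."""
    paths = ("".join(random.choices(string.ascii_lowercase, k=10)) for _ in range(n))
    return "".join(f'<a href="/{p}">{p}</a><br>\n' for p in paths)

class TrapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every request, at any path, gets a page full of fresh random links.
        body = f"<html><body>\n{random_links()}</body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind to an ephemeral port and serve in the background for the demo.
server = HTTPServer(("127.0.0.1", 0), TrapHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/anything"
page = urllib.request.urlopen(url).read().decode()
print(page.count('<a href="'))  # 8 links, different on every fetch
server.shutdown()
```

A spider that queues every link it sees will follow these into an endless chain of generated pages, which is exactly the trap behavior the tools fall into.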
Ben Jackson converted this into PHP and added all kinds of neat functionality – for example, you can tell the Google bots not to get stuck inside your trap, so Google can still index your site.

When you run something like Wget against it, it falls right into the trap (see image). When you recursively go to a site and start following all those links – doing that dynamic search we call spidering – tools will fall into the trap. And we've actually caught some attackers using highly specialized tools looking for specific vulnerabilities; we've caught them inside the traps, hopefully slowing them down so that they're not finding anything on our website, and slowing them down from attacking other people's websites. w3af is another tool that gets fooled by this trap, and it's kind of funny: the progress bar in w3af goes about ¾ of the way through and says: "Yep, I'm doing good, I'm spidering your website for you." Then it backs up a bit, goes forward a bit, and just sits there forever trying to recursively search your web directories.
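The "let Google through" feature mentioned above could be implemented with a simple User-Agent allowlist. This is a hedged sketch, not the actual PHP implementation: the function name and the crawler substrings are assumptions for illustration.

```python
# Hypothetical allowlist check in the spirit of the feature described above;
# the names and the crawler list are illustrative, not the tool's real code.
KNOWN_CRAWLERS = ("googlebot", "bingbot", "slurp", "duckduckbot")

def is_search_engine(user_agent: str) -> bool:
    """Return True if the User-Agent looks like a legitimate search bot."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_CRAWLERS)

# A trap handler would serve normal content (or a 404) to these bots and
# the endless random links to everyone else:
print(is_search_engine("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
print(is_search_engine("Wget/1.21.2"))  # False
```

Note that a User-Agent string is trivially spoofed, so a check like this is a convenience for well-behaved crawlers, not a security control.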