Chris Wysopal comments on vulnerability distribution trends within a specified time span and analyzes web applications’ compliance with security standards.
So, then we looked at trends over time. Are any of these vulnerability distribution percentages changing over time? Because it would be nice to think that with more technology available for testing and more training available for developers – and we’ve known about these issues for a long time – we might actually be writing better code over time. You’d think it could happen, right? So we looked back 2 years at cross-site scripting. We looked at our data for 2 years, and we found that cross-site scripting is essentially flat (see graph). It looks like the trend is slightly up, but the p-value has to be below 0.05 for the slope to be statistically significant, so statistically it’s a flat line. In 2 years, the code we’ve seen at Veracode has not improved at all in terms of cross-site scripting; it’s almost exactly the same. But for SQL injection, the trend is going down (see graph). Somehow, after dozens of high-profile break-ins every year with SQL injection being the cause, it’s starting to make a difference, and the trend is down over 2 years. About 38% of applications were affected 2 years ago, and today we’re seeing about 32% of our applications affected. Over a 2-year period it has decreased by 6 percentage points, which seems slow, but it’s pretty good news, right? Actually, there’s some good news in my report: it’s getting slightly better. I guess the bad news is that unless the decline accelerates, it’s going to take 6 or 7 more years for that figure to get into single digits, based on that trend. So for a vulnerability class discovered in 1998, by 2018, perhaps, it will be eradicated.
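The kind of trend analysis described above can be sketched in a few lines of Python. The quarterly prevalence numbers below are illustrative, not Veracode’s actual data – they simply interpolate the stated ~38% to ~32% decline over 2 years – and the slope is fit with a plain least-squares regression (the talk’s p-value test is omitted for brevity):

```python
# Hypothetical quarterly SQL-injection prevalence (% of apps affected),
# illustrating the ~38% -> ~32% decline over 2 years described in the talk.
quarters = list(range(8))
sqli_rate = [38.0, 37.5, 36.8, 36.0, 35.1, 34.2, 33.0, 32.0]

def ols_slope(x, y):
    """Least-squares slope of y on x (pure Python, no dependencies)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

slope_per_quarter = ols_slope(quarters, sqli_rate)  # negative: trend is down
slope_per_year = slope_per_quarter * 4

# Extrapolate from 32% down into single digits (<10%) at the fitted rate.
years_to_single_digits = (32.0 - 10.0) / abs(slope_per_year)
print(f"slope: {slope_per_year:.2f} points/year")
print(f"~{years_to_single_digits:.1f} more years to reach single digits")
```

On this made-up series the fitted decline is roughly 3.5 points per year, putting single digits 6-plus years out – consistent with the speaker’s rough 6-to-7-year estimate.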
And then we said: “Let’s do a deep dive on one sector.” And we picked on the government this time, because that looks like fun. We asked: “How is government doing compared with all industries, all code?” So, again, with cross-site scripting it’s flat (see graph). The grey line here is the line from before for all industries, and the orange line is just for the government. Statistically, it’s flat.
And then if we look at SQL injection, it’s flat again (see image to the bottom right). It looks like it’s going up instead of down – the grey line from before is going down, and this one looks like it’s going up – but statistically it’s flat. So everyone but government seems to be doing a good job at lowering SQL injection; our government customers don’t seem to be able to do that. I think part of the explanation is that there’s not enough training, for one thing. And if you look at the US government standards, there’s a standard from NIST called 800-53, which is a security control standard that all government agencies in the United States have to abide by.
There’s a law called FISMA – the Federal Information Security Management Act. Every agency in the US government is graded on the controls in 800-53, and there’s nothing in there that specifies application security. Web apps are not mentioned, SQL injection is not mentioned – it’s things like ‘you must have encryption’.
So, if there’s no compliance reason, no risk-based reason, and no cost reason to write secure code – you’re not going to. It doesn’t just happen magically; that’s why we have to have some standards, in the US government at least, to improve this situation. There’s a new version of 800-53 coming out later this year which specifically mentions doing things like static and dynamic analysis while you’re building applications, and for apps that you purchase. I think this is going to change things, and it will be interesting to see over the next 2 years whether my hypothesis is right and the trend for the US government starts to go down.
Now, what percentage of web applications do you think fail the OWASP Top 10? The answer is 86%. Open source actually came out the worst, and my theory is that open source web applications really aren’t written that well, and they typically don’t handle financial data or credit card data, so they probably aren’t held to any kind of compliance standard. Commercial and internally developed code are doing the best – even though the results are still really poor – because a percentage of those applications are held to a standard like PCI compliance, or perhaps they were just well written. Then we looked at the CWE/SANS Top 25, where passing means we found no issues at all from the whole Top 25 – which you’d think is a stricter standard, because there are 25 things to look for in there. Generally, software did better against it. I think that’s just because things like cross-site scripting are so prevalent in web applications that it’s rare not to find any. And it turned out that internally developed code actually did the best, followed by commercial and open source code.
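A pass/fail policy of the kind behind that 86% figure is easy to sketch: an application passes only if it has zero findings in any policy category. The category names and sample findings below are illustrative placeholders, not Veracode’s taxonomy, and the policy set is abbreviated:

```python
# Abbreviated, illustrative stand-in for an OWASP Top 10 policy set.
OWASP_TOP10 = {"injection", "xss", "broken_auth", "csrf"}

def passes_policy(findings, policy=OWASP_TOP10):
    """True only if no finding falls into a policy category."""
    return not any(f in policy for f in findings)

# Hypothetical scan results: app name -> flaw categories found.
apps = {
    "app_a": ["xss", "info_leak"],      # fails: xss is in the policy
    "app_b": ["info_leak"],             # passes: no policy category hit
    "app_c": ["injection", "xss"],      # fails
}
fail_rate = sum(not passes_policy(f) for f in apps.values()) / len(apps)
print(f"{fail_rate:.0%} of sample apps fail the policy")
```

The strictness comes from the all-or-nothing rule: a single finding in any policy category fails the whole application, which is why prevalent flaw classes like cross-site scripting drive failure rates so high.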
Read previous: Data Mining a Mountain of Zero Day Vulnerabilities 3: Vulnerabilities by Language, Supplier and Industry
Read next: Data Mining a Mountain of Zero Day Vulnerabilities 5: Code Security Assessment