Blogger: Diana Kelley
Over two years ago, in the Market analysis section of a Burton Report on application security, I wrote, “As the technology and market matures, Burton Group expects that large, established vendors who supply complementary technologies will either develop their own tools or add one of the startups to their portfolio.” Based on my assessment of the tools and the market, I genuinely believed we’d see that happen sometime in 2006. I was wrong. But only by a few months.
This month we saw two titans purchase web application testing tools. IBM was first out of the gate, with the acquisition of Watchfire. And HP followed suit this week with the announcement that they’d scooped up SPI Dynamics. These are powerful data points proving that “large, established vendors” are taking the security of applications seriously. Both acquisitions make sense: IBM has a strong history in software development and owns the Rational line, and HP put out a clear message about application testing last year when it purchased Mercury Interactive.
From an application security perspective, this is a really exciting shift in the market – but it surprised me that both companies picked web application testing as the horse to back. My first questions were: Why didn’t either go for a static source code analysis vendor? And what about WAFs (web application firewalls)?
IBM had a strong Rational Unified Process (RUP) relationship with static source code analysis vendor Secure Software, the original owner of the Comprehensive Lightweight Application Security Process (CLASP), which has since moved to OWASP. But Secure Software was acquired by competing static source code analysis vendor Fortify in January of this year, not by IBM. And WAFs (like those from F5, Citrix, Breach, NetContinuum, and Imperva) dynamically learn where and how an application may be failing while it’s in production. WAFs can be configured to protect the application against those failures, but wouldn’t it be sweet if they could consume information from the penetration testing tools, like SPI and Watchfire, and not only provide stronger protection against known vulnerabilities but also communicate their knowledge back to static source code analysis tools (Fortify, Klocwork, Ounce) – the very tools that can point a developer to the exact line of code where the problems may have originated?
Security guys – we know about defense in depth – and I think it’s time to apply that to software, both in the SDLC and in production. Specifically, the company that really gets this right is going to take the software security tool trifecta; the “Shadrach, Meshach, and Abednego” (gotta imagine Marlon Brando saying that in his best Sky Masterson voice) of software security. This means static source code analysis (both in the IDE and stand-alone), pen testing tools, and WAFs – integrated and working together.
IBM – you’re first out of the gate – are you willing to make the acquisitions and do the integration work required to cross the finish line? CA and HP – you’re well positioned; is either of you willing to take the big win? And Symantec and McAfee – take note. Focusing on risk is a great direction – but let’s not forget that the software running our systems, our transactions, our core business processes directly informs what we have to “secure” after the fact. Making that software stronger is imperative.
I’m not a betting man (person?), but if I were, I’d also bet that IBM is going to figure this out first.
Short disclaimer: If you’ve read previous writings of mine on software security, you’ll know I don’t think this is a tools-only problem. If you haven’t: it’s not a tools-only problem. Robust software means a robust SDLC, and there are a lot of people and processes in there, stuff a tool can’t always catch, that must be security aware.