If everyone is thinking the same, someone isn't thinking

Lori MacVittie




The Bigger the Base the Larger the Blast Radius By @LMacVittie

Information security decision making is a constant balancing act of risk versus reward, disruption versus dependability

At the end of the year, WhiteHat Security posted an interesting blog titled "The Parabola of Reported WebAppSec Vulnerabilities," which noted a downward trend in web application vulnerabilities (as collected by the folks at Risk Based Security's VulnDB) beginning in 2008, after an upward trend that began in 1994 (hence the use of "parabola" to describe the histogram of reported vulnerabilities; see the hastily composed diagram below: a very angular-looking parabola, but a parabolic shape nonetheless).

[Figure: web app sec trend, 2014 to 1994]

The author further postulates some possible explanations for this trend, including a "more homogeneous Internet," which is explained as:

"It could be that people are using fewer and fewer new pieces of code. As code matures, people who use it are less likely to switch in favor of something new, which means there are fewer threats to the incumbent code to be replaced, and it's therefore more likely that new frameworks won't get adopted. Software like WordPress, Joomla, or Drupal will likely take over more and more consumer publishing needs moving forward. All of the major Content Management Systems (CMS) have been heavily tested, and most have developed formal security response teams to address vulnerabilities. Even as they get tested more in the future, such platforms are likely a much safer alternative than anything else, therefore obviating the need for new players."

That seems a logical (and likely) explanation. The continued refinement of highly leveraged code ultimately reaps the benefit of being more secure. It's like realizing the benefits of a thousand code reviews instead of reviews by the usual three to five colleagues. Certainly the additional scrutiny has rewards in the form of greater stability (fewer kinks need to be worked out because, well, they've already been worked out) and improved security (almost all the holes have been found and patched).

The code base is also likely well-documented and supported by an active community, making it an attractive choice for developers looking for a framework or system with a robust, extensible repository of "extras" and "options".

It's a win-win-win situation.

I know what you're thinking: there's one too many "wins" in that equation. There are the developers of the framework or system, and there are the consumers. It should be win-win, shouldn't it?

Au contraire, mes amis!

The third win is, in fact, for the bad guys. The attackers. The would-be infiltrators and destroyers of your web application.

Consider that two of the most disruptive (and effective) vulnerabilities of 2014 were discovered in just such code bases: long-lived, well-supported, widely distributed. Both were perpetrated against platforms that, like application frameworks, are highly disruptive (and costly) to patch when such vulnerabilities are discovered.

[ Heartbleed and Shellshock were highly disruptive and perpetrated against industry de facto standard technology ]

Attackers, on the other hand, have a field day, because homogeneous and entrenched technology means many weeks (perhaps months) of systems ripe for exploit. It's harvest time for the bad guys.

Now, I'm not saying we shouldn't be using well-established, "seasoned" platforms. I am saying that (potentially) diminishing vulnerabilities does not mitigate all the risk associated with a platform or technology. Certainly, with every year the platform is in use, the risk of undiscovered vulnerabilities decreases. But this also means the damage from the exploitation of a heretofore undiscovered vulnerability increases, because continued adoption and use of the technology expands its blast radius.

In other words, there is a direct relationship between the distribution size of an established (and assumed mostly-secure) technology platform and the potential damage incurred by the discovery of a vulnerability in it.
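That relationship can be sketched with a toy expected-loss model. Every number below is an illustrative assumption (the decay and growth rates are invented, not measured); the point is simply that even as the annual likelihood of a vulnerability falls, a faster-growing installed base can push the expected damage up.

```python
# Toy model of the maturity vs. blast-radius trade-off.
# All rates and costs below are illustrative assumptions, not measured data.

def expected_annual_damage(years_mature: int) -> float:
    """Expected damage = P(serious vuln this year) x installed base x cost per install."""
    p_vuln = 0.30 * (0.85 ** years_mature)           # likelihood decays as the code matures
    installed_base = 1_000 * (1.25 ** years_mature)  # adoption (blast radius) keeps growing
    cost_per_install = 50.0                          # per-deployment remediation/breach cost
    return p_vuln * installed_base * cost_per_install

for year in (0, 5, 10):
    print(f"year {year}: expected damage = {expected_annual_damage(year):,.0f}")
```

Because adoption in this sketch grows faster than the vulnerability likelihood shrinks (1.25 x 0.85 > 1), the expected damage rises with maturity; under the opposite assumption it would fall, which is exactly the balancing act at issue.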

It is important to note the author states, "Even as they get tested more in the future, such platforms are likely a much safer alternative than anything else". Not safest. Not risk-free. Safer.

What that means is you need to consider not only the likelihood of an attack on established platforms and software (which is admittedly shrinking by the day) but also the blast radius should a vulnerability be discovered. If all your sites are running on platform X and a vulnerability is discovered, what's the impact to your business?

Conversely, that risk must be weighed against the risk of custom or less well-established platforms being riddled with vulnerabilities and consuming valuable time and effort in constant evaluation and remediation cycles.
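One way to weigh those two risks against each other is a back-of-the-envelope expected-loss comparison. The probabilities and costs below are hypothetical, chosen only to show that neither option automatically wins:

```python
# Hypothetical expected-loss comparison; all figures are invented for illustration.
options = {
    # name: (annual chance of a serious vuln, cost to the business if it hits)
    "established platform": (0.05, 2_000_000),  # rare, but exploits spread fast and wide
    "custom platform":      (0.40,   300_000),  # more vulns, but a far smaller target
}

for name, (p, impact) in options.items():
    print(f"{name}: expected annual loss = {p * impact:,.0f}")
```

With these particular numbers the custom platform's expected loss comes out higher, but shifting any single assumption can flip the result, which is exactly why the decision is a constant balancing act.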

Information security decision making is a constant balancing act of risk versus reward, disruption versus dependability. What 2014 taught us is we should expect that a vulnerability will be discovered in even the most well-established platform, and plan accordingly.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.