Web Application Security

The State of Web Security

After months of hard work, today we are releasing the 2013 WhiteHat Website Security Statistics Report. Collectively represented are more than 650 organizations and tens of thousands of real-world websites continually monitored by WhiteHat Sentinel Services. This is the largest data set of its kind, and we’re eager to share all the new things we’ve learned.

This year, our 6th, we’ve done things differently. We wanted to try something truly ambitious, something that advances our collective understanding of application security, and something that to our knowledge has never been done before!

So, in addition to releasing the detailed website vulnerability metrics that the community has come to rely upon, we sought to measure the impact of today’s so-called “best practices”: to find out whether activities such as software security training for developers, pre-production testing, static code analysis, web application firewalls, and so on really do lead to better security metrics and fewer breaches, and to answer the fundamental question of which aspects of an SDLC program actually make a difference – and by how much. Of course, every “expert” has an opinion on the matter, but the best most anyone has had to offer is personal anecdote. That is, until now.

To get there we asked all Sentinel customers to privately share details about their SDLC and application security program in a survey format – we received 76 total responses. We then aggregated and correlated their answers to their website vulnerability outcomes and reported breaches. The results of this data combination are nothing less than stunning, enlightening, and often confusing.
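To make the methodology concrete, here is a minimal sketch in Python with pandas, using entirely hypothetical data and column names, of the kind of comparison involved in correlating one survey answer with vulnerability outcomes:

```python
# A minimal sketch with hypothetical data and column names: compare
# average open-vulnerability counts between organizations that do and
# do not employ a given control (here, developer security training).
import pandas as pd

orgs = pd.DataFrame({
    "has_training": [True, True, True, False, False, False],
    "open_vulns":   [12,   30,   25,   45,    60,    40],
})

by_training = orgs.groupby("has_training")["open_vulns"].mean()
delta = (by_training.loc[True] - by_training.loc[False]) / by_training.loc[False]
print(f"Orgs with training average {delta:+.0%} open vulnerabilities vs. those without")
```

The real analysis spans many controls and outcome metrics, but each correlation boils down to a grouped comparison along these lines.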

To give you a taste of the full report, let’s start with the high-level basics:

The average number of serious* vulnerabilities per website continues to decline, going from 79 in 2011 down to 56 in 2012. This was not wholly unexpected. Despite this, 86% of all websites tested were found to have at least one serious vulnerability during 2012. Of the serious vulnerabilities found, on average 61% were resolved, and resolution took an average of 193 days from the date of notification.


As for the Top Ten most prevalent vulnerability classes of 2012, the list is relatively close to last year’s, though Information Leakage surpassed Cross-Site Scripting yet again (a minimal XSS sketch follows the list):

  1. Information Leakage – 55% of websites
  2. Cross-Site Scripting – 53% of websites
  3. Content Spoofing – 33% of websites
  4. Cross-Site Request Forgery – 26% of websites
  5. Brute Force – 26% of websites
  6. Fingerprinting – 23% of websites
  7. Insufficient Transport Layer Protection – 22% of websites
  8. Session Fixation – 14% of websites
  9. URL Redirector Abuse – 13% of websites
  10. Insufficient Authorization – 11% of websites
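To ground the top of the list, here is the promised minimal Cross-Site Scripting sketch; the function and strings are illustrative only, with Python’s standard-library html.escape standing in for context-appropriate output encoding:

```python
# A minimal XSS sketch: reflecting user input into HTML without
# encoding lets attacker-supplied markup execute in the browser.
from html import escape

def greet(name: str) -> str:
    # Vulnerable version (commented out): raw interpolation.
    # return f"<p>Hello, {name}!</p>"
    # Safer: encode for the HTML context before interpolating.
    return f"<p>Hello, {escape(name)}!</p>"

print(greet("<script>alert(1)</script>"))
# Prints: <p>Hello, &lt;script&gt;alert(1)&lt;/script&gt;!</p>
```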

Conspicuously absent is SQL Injection, which fell from #8 in 2011 to #14 in 2012 and is now identified in only 7% of websites. Obviously, vulnerability prevalence alone does not equate to exploitation.
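For SQL Injection itself, the vulnerable pattern and its standard fix are worth seeing side by side; a minimal sketch using Python’s standard-library sqlite3 and a hypothetical schema:

```python
# A minimal SQL Injection sketch: string concatenation puts attacker
# input into the SQL grammar; a parameterized query keeps it as data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user(name: str):
    # Vulnerable version (commented out): name = "' OR '1'='1"
    # would return every row.
    # return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()
    # Safer: placeholder binding, so the input is never parsed as SQL.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(find_user("' OR '1'='1"))  # [] -- the injection string is just data
```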

When we took a closer look at some of the correlations between the vulnerability and survey data, we found some counterintuitive statistics implying that software security controls, or “best practices,” do not necessarily lead to better security, at least not at all times in all cases:

  • 57% of organizations surveyed provide some amount of instructor-led or computer-based software security training for their programmers. These organizations experienced 40% fewer vulnerabilities, resolved them 59% faster, but exhibited a 12% lower remediation rate.
  • 39% of organizations said they perform some amount of Static Code Analysis on their websites’ underlying applications. These organizations experienced 15% more vulnerabilities, resolved them 26% slower, and had a 4% lower remediation rate.
  • 55% of organizations said they have a Web Application Firewall (WAF) in some state of deployment. These organizations experienced 11% more vulnerabilities, resolved them 8% slower, and had a 7% lower remediation rate.

Two questions we posed in our survey illustrated that compliance is the number one driver for fixing web vulnerabilities… while it was also the number one driver for NOT fixing web vulnerabilities. Proponents of compliance often suggest that mandatory regulatory controls be treated as a “security baseline,” a platform that raises the floor rather than represents the ceiling. While this is a nice concept in casual conversation, it is typically not the reality we see.

The last point I want to bring up for now focuses on accountability in the event of a data breach. Should an organization experience a website or system breach, 27% of respondents said the Board of Directors would be held accountable. Additionally, 24% said Software Development, 19% said the Security Department, and 18% said Executive Management. Here’s where things get really interesting, though: by analyzing the data in this report, we see evidence of a direct correlation between increased accountability and decreased breaches, and of the efficacy of “best practices” and security controls.


We stopped short of coming to any strong conclusions based upon this data alone. However, we now have something solid to work from in establishing new theories and avenues of research to explore. Please, have a look at the report and let us know what stands out to you. What are your theories for why things are the way they are? If you’d like different slices of the data, we’re all ears.

Tweet your feedback to @jeremiahg and @whitehatsec using #WebsiteVulnStats.

Personal side note: I would like to thank all of our customers who responded to our survey earlier this year as well as to a select group of respected individuals in the security space (they know who they are) that got a sneak peek of our findings last week and whose feedback was invaluable. Also thank you to my colleagues Gabriel Gumbs, Sevak Tsaturyan, Siri De Licori, Bill Coffman, Matt Johansen, Johannes Hoech, Kylie Heintz, and Michele Cox, whose teamwork helped bring everything together.


  • http://www.client9.com/ Nick Galbreath

    Great report! I’ll be quoting heavily from this in my next talk 😉

    It’s nice to see SQLi drop, but given the severity, 7% of sites is still way too high. And 196 days to fix. Ouch.

    Hmm, those are counterintuitive results! Here’s my take on them:

    1) Security training implies fewer problems

    I suspect that security training helps; however, the real reason for fewer problems is organizational maturity. I assume that if your organization even bothers to have security training, it has a more mature security organization than those that do not.

    2) Use of WAF implies more problems

    Likewise, those with WAFs might use them to cover large amounts of legacy applications, or might have little or no security department. On the other hand, organizations with strong security departments frequently do not, or cannot, use WAFs.

    3) Use of static analysis implies more problems

    This was a real head-scratcher, as I am a big fan of static analysis. I suspect the problem here is that many organizations that buy static analysis tools actually don’t use them, or don’t use them correctly. For large C/C++ codebases, the tools frequently require a full-time person (as I have learned the hard way). Or the investment in static analysis tools is made as a quick solution instead of developing a security department.

    Unfortunately, all my theories are a bit hard to test.

    Thanks again for the report!

    • http://www.whitehatsec.com/ Jeremiah Grossman

      Nick, as you know oh so well, what to fix and when is a painful trade-off: coding features or fixing vulnerabilities. Often the answer isn’t so straightforward, since many of these organizations have literally hundreds of vulnerabilities per year.

      1) That would be a reasonable theory. I’ve been trying to figure out how to capture and measure “maturity” in some way. Ideas?

      2) Most likely. I’ve got to find dates for when the WAF went in and the metrics of the site before and after its deployment. That’d help give some perspective.

      3) Yep. That’s probably right as well. Could also be that SCA just doesn’t find the same things DAST does — like Information Leakage for example.

  • http://www.avyaan.com aahna jain

    Hackers love to attack websites these days. Over 80% of all websites have serious security vulnerabilities, so this website security statistics report is an excellent resource for helping protect websites from attacks.