The CWE/SANS Top 25 Most Dangerous Programming Errors

This week saw the release of the “Top 25 Most Dangerous Programming Errors” list from MITRE and SANS. At first skim, I nearly discarded it as just an effort to pad resumes; after all, do we really need another “top X” list (every group with a barely pronounceable acronym has its own)? Weighing heavily on that opinion is the fact that there is some serious overlap within the Top 25 which, I think, won’t really help developers.

For example, CWE-20 is “Improper Input Validation” and CWE-116 is “Improper Encoding or Escaping of Output.” If developers fully understand these two and the associated risks of not doing them, they will almost certainly resolve several of the other 25, including:

- CWE-89: Failure to Preserve SQL Query Structure (aka 'SQL Injection')
- CWE-79: Failure to Preserve Web Page Structure (aka 'Cross-site Scripting')
- CWE-78: Failure to Preserve OS Command Structure (aka 'OS Command Injection')
- CWE-119: Failure to Constrain Operations within the Bounds of a Memory Buffer
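
To make the overlap concrete, here is a minimal sketch of the cross-site scripting case in Java. The escapeHtml helper and class name are my own illustration, not from any particular library; the point is simply that encoding tainted data for its output context is the habit CWE-116 asks for, and it removes the “failure to preserve web page structure” along the way.

```java
// A minimal, self-contained sketch (class and helper names are mine,
// not from any particular library).
public final class HtmlEscapeSketch {

    // CWE-116 in practice: replace the characters that can change
    // the structure of an HTML document.
    static String escapeHtml(String untrusted) {
        StringBuilder out = new StringBuilder(untrusted.length());
        for (char c : untrusted.toCharArray()) {
            switch (c) {
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '&':  out.append("&amp;");  break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#x27;"); break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String comment = "<script>alert('xss')</script>";

        // CWE-79 in one line: tainted data dropped straight into markup.
        String vulnerable = "<p>" + comment + "</p>";

        // The same data, encoded for the HTML context it lands in.
        String safe = "<p>" + escapeHtml(comment) + "</p>";

        System.out.println(vulnerable); // renders as a script tag
        System.out.println(safe);       // renders as harmless text
    }
}
```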

Isn’t “Failure to Preserve SQL Query Structure” just another way of saying that you didn’t properly filter and encode tainted data? MITRE’s Steve Christey, in discussing the CWE Top 25, says the following about the mixed reviews on including both input validation and output encoding:

Part of it seems to come down to different ways of looking at the same problem. For example, is SQL injection strictly an input validation vulnerability, or output sanitization/validation/encoding or whatever you want to call it? In a database, the name "O'Reilly" may pass your input validation step, but you need to properly quote it before sending messages to the database. And the actual database platform itself has no domain model to "validate" whether the incoming query is consistent with business logic.


I’m not disagreeing with Steve that both should be included; rather, I’d argue they should be included in place of the CWE numbers listed above. After all, the programming error is failing to properly filter and/or encode tainted data; the resulting vulnerability is SQL injection, Cross-Site Scripting, and so on.
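
Steve’s “O’Reilly” example is easy to show in code. Here is a minimal Java/JDBC sketch; the table, column, and method names are invented for illustration. The input passes a reasonable validation check, but only parameterization keeps the apostrophe (or an attacker’s payload) from changing the structure of the query.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// A minimal JDBC sketch; the table, column, and method names are invented.
public final class QueryStructureSketch {

    // "O'Reilly" passes a perfectly reasonable name check, yet the
    // apostrophe would still break a concatenated query. Validation
    // alone is not enough.
    static boolean looksLikeAName(String s) {
        return s.matches("[A-Za-z' -]{1,64}");
    }

    static ResultSet findCustomer(Connection conn, String name) throws SQLException {
        if (!looksLikeAName(name)) {
            throw new IllegalArgumentException("rejected input");
        }

        // CWE-89: concatenation lets the quote in "O'Reilly" (or an
        // attacker's payload) alter the structure of the query itself.
        // String sql = "SELECT * FROM customers WHERE last_name = '" + name + "'";

        // Parameterization preserves the query structure: the driver
        // handles the quoting, so data can never be interpreted as SQL.
        PreparedStatement stmt =
            conn.prepareStatement("SELECT * FROM customers WHERE last_name = ?");
        stmt.setString(1, name);
        return stmt.executeQuery();
    }
}
```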

One other thing struck me as odd: the “Remediation Cost.” This seems like sheer nonsense; it’s not something a document can define across the board without regard to the underlying application or technology, not to mention the (corporate) environment one has to work in. Is input validation’s remediation cost really “Low” when you might have to first educate your developers, make sure they understand the context and application logic (to make smart decisions about proper changes), and finally go through testing to ensure nothing is broken as a result? Likely not.

All that said (and those are things that may be changed in future versions of the list), my opinion has softened to where I’m now in “mixed bag” mode. This is a 1.0 release of a document, and while the authors certainly hope it is perfect, the first release of anything almost never is.

The truth is that any effort which gets more security knowledge into the heads of developers is a Good Thing and, hopefully, successful at making applications more secure than they were. Also, any release that makes headlines outside of the security industry, where it may trigger a positive reaction in a programmer or developer, is definitely a win for security. New York State certainly took notice; now let’s hope their software suppliers do as well.

These are just a few issues that came to mind when reading the list. Gary McGraw wrote “Top 11 Reasons Why Top 10 (or Top 25) Lists Don’t Work,” and while I don’t agree with everything he says, a few of his points make his list worth reading. I’ll point out #8:

Automated tools can find bugs — let them. Teaching all developers the 700+ bad things in the Common Weakness Enumeration (or the even larger set of permutations allowed by languages like C++) is a futile exercise. We can start by using static analysis tools to remember all of the potential bugs and alert us to their presence.


I wholeheartedly agree with the second part of this one, and not just because I work for a company that sells automated security tools. With the complexity and sheer volume of code in existence these days, automated tools used throughout the dev lifecycle are a cost-effective way to catch these issues.

I disagree with Gary that educating developers is a futile exercise; in fact, it should be a requirement in every shop and school. Even if developers are still not great at avoiding the problems in the first place, when the automated tool spits out a list of potential issues, an educated programmer will need to determine if/how/where to make a code change to fix each one. Improper fixes have led to dozens of incomplete bug remediations and, in some cases, additional vulnerabilities.
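
As a small (and entirely hypothetical) example of the kind of improper fix I mean, consider a developer who responds to an XSS finding by stripping the literal script tag. The payloads below are illustrative; the point is that an educated programmer would reach for contextual output encoding instead of a blacklist like this.

```java
// A hypothetical "fix" for a reported XSS issue; the payloads are illustrative.
public final class IncompleteFixSketch {

    // The improper fix: strip the literal tag and call it done.
    static String naiveFix(String input) {
        return input.replace("<script>", "");
    }

    public static void main(String[] args) {
        // All three payloads survive the "fix":
        System.out.println(naiveFix("<SCRIPT>alert(1)</SCRIPT>"));          // case change: untouched
        System.out.println(naiveFix("<scr<script>ipt>alert(1)</script>"));  // stripped tag re-assembles
        System.out.println(naiveFix("<img src=x onerror=alert(1)>"));       // no script tag to strip
    }
}
```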

Anyway, to summarize (since I rambled on), I think the CWE Top 25 Programming Errors list is a little flawed, but given the attention it has received this week I can only applaud the efforts of MITRE and SANS for drawing more attention to security. After all, we’re all (ok, mostly) working toward the same goal: secure applications.


Comments
(anon) | ‎01-15-2009 08:53 PM

I think what McGraw meant for developer training is that they learn the right things and not the wrong things to do.  Instead of OWASP T10, WASC TC, or MITRE CAPEC/CWE, developers should be using the OWASP Guide, or the new OWASP Top Ten Coding Standard.


While I do believe that CWE belongs in the hands/guts of security reviewers (both the manual and the tool kind), I also feel that we need better tools for these purposes.  While I think that CAT.NET/SRE/AntiXSS/AntiCSRF, GDS Security SPF, Fortify PTA, HP DevInspect, and IBM AppScan DE are among the best current tools out there (for web application security) - I think that development shops can and should be using other tools.  For example, tools like Compuware DevPartner are closer to the unit/component or module testing level.  In addition, developers need custom framework integration of secure libraries such as OWASP ESAPI, and better base-class libraries such as ASP.NET 3.5 (although this requires continual improvement).


C/C++ is a completely different ball game.  The best way to handle this is not only through secure libraries such as Coverity Extend, but also to utilize Hoare logics, with tool support found in products such as Coverity Prevent and Klocwork K7.  I find that the SANS/CWE Top 25 is confusing because it mixes C/C++ with JEE/ASP.NET/PHP secure programming errors.


(anon) | ‎01-16-2009 10:57 PM

Chris,

Your critique of the inclusion of SQL injection/XSS alongside input validation and output encoding is a fair one.  We decided to include these, despite the overlap, because much of the audience may think in terms of the attacks - SQLi/XSS - and not the underlying weaknesses.

The estimates for Remediation Cost would vary widely, as you pointed out.  We thought it was important to at least capture this, since it's a deciding factor for many organizations.  Note that the terms are explained in Appendix B, along with a disclaimer that specifically states that education and testing costs are not included.  While remediation cost and the other supporting fields are by no means perfect, they are something to be considered.

@Andre - one thing that McGraw didn't explicitly state was that while tools are good at finding bugs, they're not as good at finding design flaws.  The Top 25 isn't just about bugs, so the educated tool consumer will know that they need to employ other methods to ensure that they have the Top 25 covered.

Chris Sullo | ‎01-20-2009 02:30 PM

Steve, thanks for responding. You state "much of the audience may think in terms of the attacks," but I find myself wondering if it is people in the *security community* who think in terms of the attacks, or people in the *development community*? I'm not a full-time developer, so I can only hypothesize that this is true, but if you're aiming to educate developers, wouldn't you be better off using programming-speak rather than security-speak?

And I did miss the caveat to Remediation Cost in the Appendix--thanks.

