Software security - Why aren't the enterprise developers listening?

Here's a news flash - it's 2013 and enterprises are still struggling with software security.

 

Plenty of enterprises out there have figured out a formula for making software security work for them, but for every organization that 'gets it' there are many more that struggle with it year over year, quarter over quarter, day after day. Why?

 

There are plenty of things we can blame these widespread failures on ... immature tools, cookie-cutter processes, lukewarm support from enterprise leadership ... blah blah blah ... the bottom line is that it's 2013 and companies big and small are still struggling with poor code quality, a negative dynamic between developers and security people, and other assorted issues.

 

I bring this up today because I've spent the day with a monstrous enterprise in the financial sector that is going through a transformation at the hands of a fantastic InfoSec leader, their new CISO. The new CISO "gets it" and understands that software security isn't something you can force on developers, even with the CEO's support. Threatening to fire developers, holding up projects, and levying compliance fines just don't work ... which seems counter-intuitive to those who feel that all a CISO needs is "power".


Case in point ... take an enterprise, a "big bank" with a heavy online presence, billions in assets and a healthy growth curve ... which also paints a big bulls-eye on its front page. From a software security perspective, the executive ranks from the CEO down are aligned on making security a priority ... but somewhere between senior management and the line-of-business owners the message gets lost. Understandable, if you think about it from their various perspectives - none of which is 'security'.

 

So here we go. The CISO, in order to spark interest in security, invites developers to take a voluntary self-assessment on software security. The challenge comes with a chance to win a prize ... a coveted gadget with a street value of $400, no less. Now, for the sake of round numbers, let's assume there are 1,000 developers in the organization. Given all that, how many developers do you suppose took the challenge? Try less than 3%.

 

Hearing that jolted me in my seat a little today. I would have bet the number was at least closer to 30%, but no, it was a tenth of that! That's madness!

 

If a chance to win a $400 gadget isn't enough - what is? How do we properly incentivize developers, and those around them, to start thinking security? We already know the carrot versus the stick, and why the punishment approach simply doesn't work - but shouldn't the incentive approach work, or at least get better uptake than 3%?

 

The answer is obvious, yet extremely difficult to execute. We must tailor incentives to the culture, the task at hand, and the group being incented. That's easy to say, but hard in practice, as evidenced by how few organizations have genuinely voluntary developer involvement. I don't see this being solved any time soon, unless we finally figure out a way to incentivize the whole organization to treat 'security' as an integral part of everyone's success, and collapse the three pillars of quality into one unit:

 

  • Does it function? - quality organization
  • Does it perform? - quality organization
  • Is it secure? - security organization

 

Those must be collapsed into a single reporting structure and a single 'responsibility' structure; otherwise software security will continue to be the security team's problem alone...

 

It starts with Total Quality Management, where quality is a single, non-fragmented concern built on all three critical pillars ... and nobody gets to say "well, that's someone else's problem."

Comments
Valued Contributor | ‎01-09-2013 04:25 AM
I'm not surprised that a roughly 1-in-1,000 chance to win a $400 gadget does not motivate a highly paid, overworked, deadline-driven programmer to invest 10 minutes in a self-assessment. The question I hoped to find answered here is: what gets a programmer to invest more of his limited time in software testing, security (and documentation)? I assume the answer will be different for a young programmer and for an experienced one. Maybe the answer lies in the question itself: allow more time to program security, define security in as much detail as you define the required features of the software, and avoid the all-too-many requirement changes shortly before the next project deadline. I had hoped to find a hint about what the companies that "get it" are doing better or differently than all the others. That would be a point to start with. Bye Ralf
James Jardine(anon) | ‎01-09-2013 10:32 AM

I am also not surprised that so few of the developers responded to the survey. When thinking about incentives, you need to really consider your audience. Is this gadget really something a developer wants, or something he couldn't just go out and get on his own? Does the gadget even line up with the developers? For example, if it's a .NET shop and you are giving away an Android tablet, there might be little interest there. You have to know your audience and understand what really motivates them. Unfortunately, that is difficult to do because different people value different things. Some may prefer a monetary reward, but honestly, that's probably unlikely in this situation. What about time off, or some dedicated time to work on a project they are interested in?

 

While there needs to be some recognition for developers doing security right, there also have to be consequences for doing it wrong. I know we discussed this on a secbiz Twitter fest a while back and there were lots of differing opinions on how to approach it. I don't know the answer, but there needs to be some thought about it. If we just continue to let bad code get deployed, it will never stop. Everyone is OK with blocking a deployment over minor mistakes - links in the wrong location, or links that don't work; cosmetic-type issues. Why is it wrong to hold things up over real issues like security? These are real problems. It doesn't have to be a "do we terminate the employee" type of deal, because we need to understand it is about much more than just the developer causing security issues. We do, however, have to start enforcing the standards that we have set forth.

 

That company needs to identify the 3% that took the assessment and get their story. Are they interested in application security? If even a small percentage of them are, then get them involved, and quickly. Use them to start pushing security into the development groups from the inside. Find out what drives their interest in security. Find out if they have ideas for getting others involved. Maybe they want to do some lunch-and-learns, or use some other technique to nudge other developers. These people are going to be your key drivers for ingraining security into the development teams.

 

 

Matt Presson(anon) | ‎01-09-2013 02:09 PM

Raf,

 

Nice article on the struggles of almost every company that has in-house developers, but I have to say that your premise, "Why aren't the enterprise developers listening?", is simply asking the wrong question.

 

First off, the question itself is hostile and fosters an "Us vs. Them" mentality which absolutely should not exist. The development and security organizations both work for the same company and ultimately have the same goal - enable the business to do what needs to be done in order to be profitable. The development org does this by delivering tools and services that the business units use to do their jobs. Security should be there to help ensure that those tools and services have the necessary controls in place to protect the data accordingly. Both are working towards the same goal, and an attitude of "if they would just listen to me, then everything would be great" is NOT the frame of reference you want to start from.

 

Secondly, you wonder why developers don't "care" about security. To this, I offer you two reasons: incentive and choice of development framework.

 

Most development organizations within a company are driven by three overarching factors: 1) deliver what the business wants, 2) do it on time, and 3) do it under budget. When these three things are done, the business is happy, no one complains, the company makes money, and (hopefully) employees get bonuses or salary increases. The catch is that when all of it is done AND it is done securely ... nothing changes. When it is done, but NOT done securely ... again, nothing changes. In essence, there is no incentive for developers to take the extra time and effort to make a feature secure if making it secure does not result in any appreciable outcome. How many developers do you know who tout how secure their code was during their annual performance reviews? How many do you know who would say "I didn't get feature X developed on time because I was too busy making sure that feature Y was secure"?

If you want to change this culture, we as security professionals are going to have to change those three factors that drive the development organization's work. We can start by working OUTSIDE of the IT and development orgs: integrating with our key business units, demonstrating the need for secure software, and showing how it benefits the business. When we can do that, security requirements will become part of the business requirements. In some orgs this may even lead to the creation of SLAs between the business and IT stating that the business refuses to accept software that contains security defects. The great thing is that the security team doesn't even have to be a party to that SLA. An added bonus is that the development orgs simply keep on doing what they have always done - deliver on time and under budget. They don't have to change, because "deliver what the business wants" now includes security.

 

Lastly, you allude to the notion that developers are just not aware of how to code things securely. To this I offer a simple solution: how about we as security professionals start making sure that the frameworks we urge them to use incorporate security by default? As a direct example, I point to the modern Microsoft ASP.NET frameworks, and especially the MVC framework. Simply by using the DEFAULT RECOMMENDED PRACTICES OF THE FRAMEWORK, a developer can easily prevent 7 of the OWASP Top 10 - by DEFAULT, without having to think one bit about security. Instead, they just "do their job" of delivering what the business wants on time and under budget. Catering for the other three would only require a minimal amount of work. To me, this is a MUCH cleaner and better solution than anything else, as it actually requires less work on both the developers' part AND the security team's part. All in all, this seems like the best scenario.
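
To make the "secure by default" idea concrete, here is a minimal sketch - in Python with SQLite rather than the ASP.NET MVC stack above, and with a purely hypothetical table and input - of how a framework- or driver-level default such as parameterized queries removes an entire vulnerability class without the developer thinking about security at all:

    import sqlite3

    # Purely illustrative: an in-memory database with a made-up "users" table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice')")

    user_input = "alice' OR '1'='1"  # a classic SQL injection attempt

    # Unsafe pattern: building the query by string concatenation lets the
    # input rewrite the SQL, e.g.
    #   "SELECT id, name FROM users WHERE name = '" + user_input + "'"
    #
    # Default/safe pattern: the driver binds the parameter as data, never as
    # SQL, so the injection attempt simply matches no rows.
    rows = conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (user_input,)
    ).fetchall()
    print(rows)  # -> []

The point is not the snippet itself but the pattern: when the default path is the secure path, "doing the job" and "doing it securely" are the same activity.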

 

Overall though, I say we have to move away from this "Us vs. Them" mentality. It gets the organization nowhere, and usually ends up in the typical "my VP is more influential than your VP" battle that only drives things further apart and makes them harder to accomplish. There are simple solutions to most of our technical issues. We just have to look in the right places and remember that we are all here for the same reason - to help our company succeed. So go work with your business leaders and learn what concerns they have. Get out of the IT department and your comfort zone. Demonstrate the benefits of the services that you offer, and show how you can help the business solve THEIR problems. That is how you win at being a security professional.

secolive(anon) | ‎01-11-2013 10:31 AM

While I agree with all the insightful comments above, I must also confess I don't believe we can ever have developers write secure code. Sure, with greater awareness, more training, better education in school (where more emphasis must be put on security), and strong incentives, you can have developers write somewhat better code - but it will never be free of security bugs.

 

Functionality is usually the primary thing projects want to deliver - the most important in most people's eyes. Yet we still produce lots of functional defects that need to be addressed. If you can't prevent functional defects from occurring, why would you be able to prevent security (non-functional) defects?

 

And anyway, are there only top developers in your organization? Or do you concede that some developers are not as good as others? Do you plan to get rid of all the lower-performing ones (until you have no developers left), or, more realistically, will you accept that some developers will produce annoying defects and that you have to live with it?

 

Ultimately, it means you will always have to deal with security defects (and I believe that is normal, and you cannot blame developers for it). Hence, you must detect them (SAST, code review, testing...), but also work to reduce their likelihood by providing safer frameworks, coding patterns, better designs... Even with security-aware developers you will still need those activities, so why not start with them directly, instead of complaining about developers not caring about security?

 

Let me tell you: if your plan for more secure software is demanding that developers care about security, you will be disappointed by the outcome. Instead, you must _drive_ security _improvements_. Like including security requirements. Or mandating code reviews. Or reviewing the designs. Read McGraw. Read BSIMM. Read OpenSAMM. Read the Rugged Handbook. There's no shortage of good ideas.

 

Raf, you're right about TQM: make software security an explicit and important aspect of quality. Sit it within the QA team and address it with the QA processes and tools. But also make sure you understand the specifics of security versus usual quality.

Roy Lyons(anon) | ‎01-11-2013 02:57 PM

Here is my thought: you have to start reporting security issues the same way QA does. Put in a ticket. Give it a priority. Have an arm of QA that performs security testing. Dev organizations don't like being dinged with defects, and they take care of them accordingly.

 

It has to be communicated that if an exception isn't handled correctly, it is a defect.

 

Right now security audit teams are outside of QA testing, and that's what should change.  
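
As a rough sketch of that idea - field names here are purely illustrative, not any particular tracker's API - a security finding can be filed in exactly the same shape as any other defect, so it rides the same triage, prioritization, and release gates:

    from dataclasses import dataclass, field

    # Hypothetical defect record: the fields mirror what a QA tracker already
    # captures, with "security" treated as just another defect category.
    @dataclass
    class Defect:
        title: str
        severity: str         # the same scale QA already uses, e.g. "P1".."P4"
        component: str
        category: str         # "functional", "performance", or "security"
        steps_to_reproduce: list = field(default_factory=list)

    # An unhandled-exception finding, filed exactly like any other defect.
    finding = Defect(
        title="Unhandled exception returns a stack trace on /login",
        severity="P2",
        component="auth-service",
        category="security",
        steps_to_reproduce=[
            "POST malformed JSON to /login",
            "Observe an HTTP 500 response containing a stack trace",
        ],
    )
    print(finding)

Once the record looks like every other defect, the existing QA workflow does the enforcement for free.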

Rohit Sethi(anon) | ‎01-11-2013 05:13 PM

Hey Raf,

 

I think I may have some insight into the specific case you outlined above. Developers were asked to fill out a survey, voluntarily, in order to win a prize. One of the issues was that it wasn't clear to the contestants why they were being asked to fill out the survey, how the results would be used, and, most importantly, how it linked to their jobs.

 

In fact, I would submit that many more than 3% of the IT staff in this organization have some interest in security. The problem is that they were being paid to do a specific job - usually building or testing features for their lines of business - and the contest was not part of their duties. Even if they wanted to win the prize, they had to weigh that against the cost of not performing the work they were paid to do.

 

And that, I would submit, is the crux of the problem in this case. While some organizations, particularly agile software startups, allow developers to work on whatever they want, most large enterprises have specific results that they expect from their developers with defined due dates. Working on anything outside of the defined workload increases the risk that their mandatory work won't get done.

 

I can tell you from experience that if the application owners / product owners / project managers added security requirements to the original project plans, developers would be much more incented to try a new security process - even with no prize attached. It's an interesting and (sadly) common phenomenon: individual developers, senior development leadership, and security all agree in principle that focusing more on security is a good thing, only to watch everyone focus on features under the heat of product deadlines. Without *specific* buy-in from the people who assign workloads to developers, high-level security mandates get lost in the pressure of building production software.

 

Robin Jackson(anon) | ‎01-12-2013 12:10 PM

Great points ... it is a matter of education. The ideal paradigm is that security is built into the entire development process, not bolted on after the fact. Most programmers DO NOT understand buffer overflows, etc.

 

I don't know if anyone has mentioned it, but another huge issue is the outsourcing of coding to other countries. Coding is considered "fungible," but the coding, testing, and security standards of different overseas coders vary greatly.

Net Force - Josh(anon) | ‎01-13-2013 03:37 AM

Two issues:

 

  1. Short-sightedness and tunnel vision.

    Current management and Generation X see a lack of ROI in spending on security NOW rather than later. Despite our best efforts to explain that skipping it costs significantly more time, money, and reputation later, they always fall back on point 2.


  2. It will never happen to us.

    "We're too small."
    "We don't have enemies."
    "We have nothing valuable"
    "Why would anyone want to target us?"


    I hear that way too often.