Relatively recently, at a well-respected information security conference, a colleague in the industry said during a BYOD discussion, "security people don't give a **** about ease of use..." at which point the audience, made up almost entirely of fellow security professionals of varying experience, exploded. Yep, let that soak in for a minute. You have a security leader at an organization out there saying openly that they believe security should have zero regard for ease of use.
Now, we in the information security industry already have what I believe is a self-inflicted perception problem when it comes to the rest of the world... and I think people who say half-cocked things like this do more damage than they realize. That said, it was said because this person - and many others in the industry like them - believes this is an acceptable modus operandi. Let me tell you, from the years and years I've spent either directly involved with or consulting for organizations of all sizes: this is laughably false.
Here's the deal - security doesn't exist in a bubble, but you already know that. If you make something so secure that it's not usable... well, no one will use it. A perfect example is Brian Katz's term "crapplications," which he liberally applies to applications that are difficult to use and notoriously pushed and enforced by the enterprise. What Brian points out is that when consumers or enterprise users find an alternative in a public app store somewhere, they take it, adopt it, and use it for corporate work, thereby bypassing corporate mandates and policy. Does this shock you?
Look down at the device you're reading this blog post on... is it your corporate laptop? Or have you, like many others, gotten so sick of how locked down and over-loaded with 'security things' your corporate laptop is that you prefer to do your work - most of which doesn't require any special tools - on a personal device without all the restrictions?
If you think you don't care about ease-of-use in Information Security, you're missing the point.
It sounds odd to say, but in the enterprise if you don't realize there is a Devil's bargain to be made between usability and security while taking on some level of risk - the user will make the choice for you. You won't like their risk tolerance either, I can virtually guarantee it.
So why isn't security "easy to use"?
I believe there is a trifecta of blame going on here; let me explain:
- Overcomplication - Occam's Razor says that the solution to a problem making the fewest assumptions is the right one. I'd like to introduce the security corollary to Occam's Razor: "The solution making the fewest assumptions about its use is probably the most secure one." I'm willing to bet this holds - the more complicated a 'security solution' is, the less likely it is to actually be more secure in real-life use scenarios. That last part of the statement is the rub, though, because we've all designed really good security into a system in our lab only to have the system consumer blow our best designs completely out of the water with their 'modified behavior' in real-world use. For example, an authentication system may be very secure - forcing complex passwords, rotating passwords at regular intervals, and storing passwords as salted hashes... but when real-world users start to share passwords, your security goes right out the window. This carries into everything else in business and technology - overcomplication is a guarantee that the system consumer or user will find a more productive, albeit less secure, way.
- Misunderstanding - I've said it before, Brian's said it before, but people still don't get it... you need to understand how the business and the consumer are going to use something before you can even begin to design security for it. Recently, in an application review, I asked the security analyst assigned to the project why he took no precautions with the local data store. His answer was that all laptops have full-disk encryption and anti-malware, and the network does tight filtering of traffic, so it isn't a risk... then I alerted him to the fact that the actual use-cases required the application to be installed on consumer devices well outside corporate control. I got a blank stare back. Step 1 of any security design: make sure you're confident you understand the use-cases!
- Workload - This unfortunately applies broadly to the security engineer, developer, and end-user alike. We're all so busy trying to keep up with workload that we 'just need to get it done,' and that creates problems. From the developer who doesn't have time for all these security features (security what?!), to the security analyst who doesn't have time to read through all 75 pages of the design and use-cases document, to the user who simply doesn't care what the 'right way' or 'secure way' to do something is - they just know they have a task that needs to be done yesterday. While I can lay the blame for the first two of these issues at the feet of security people, this one applies to everyone equally, and the blame falls to the enterprise. You can't expect 10 elves to work overtime and produce a million quality toys over the weekend any more than you can ask a security analyst to review 10 web applications in a week ('full review') while also doing the other 75% of their job... there simply isn't enough time to do a quality job.
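To make the authentication example above concrete, here is a minimal sketch of the salted-hash storage I mentioned, using Python's standard library PBKDF2. The function names and the 16-byte salt / 100,000-iteration parameters are my own illustrative choices, not a prescription - and note that no hashing scheme, however well tuned, fixes the password-sharing behavior that undermines it in the real world:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None):
    """Derive a salted hash for storage. A fresh random salt per user
    means identical passwords produce different stored hashes."""
    if salt is None:
        salt = os.urandom(16)  # 16 random bytes; illustrative size
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive with the stored salt and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored)

# Store (salt, digest) per user; never the plaintext password.
salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))  # False
```

The design point is the usability one: all of this is invisible to the user - which is exactly why it survives contact with them, unlike rotation policies that push people toward sharing and sticky notes.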
There you are - three big reasons why security is not easy to use. So what the heck do we do about it, and how can we move the bar up even just a tiny bit? Here are three suggestions for you security architects, designers, and analysts...
- K.I.S.S. - You know what this stands for: keep it simple, stupid... the simpler the security control, the more likely it is to be effective. This means fewer steps, fewer things to install, and more clean, easy-to-understand workflows that have security as their objective.
- Be the ball - Do you remember how in little league they would teach you to hit a pitch by oddly yelling "be the ball!"... I do. I never understood that, but maybe I'm starting to get it in my late 30s? Put yourself in the shoes of the person who will be using the thing you're designing. Now take the extra step of putting yourself into their mindset. Are these consumers as advanced as you? Doubtful. Can they click the right icon? Hopefully. So you should be thinking how they think, how they work, and how they will try to bypass all the cool security controls you put in place - and make it easier for them to do it your way than to find an alternative.
- Test - The best way to figure out whether your security controls are worth a darn is to take them out on the road and give them to the end-user without saying that you're testing security. "Hey there, pal, here's a shiny new device - I'd like to see if you can use our new x-ray diagnostics app on it"... then just sit back and take notes. Do they struggle to figure out how to log in, or fumble for that post-it that has their password? You get the idea.
It makes me batty when I see really smart security controls implemented in terrible ways so that no one uses them. Yes, yes, you do give a **** about ease of use, I promise you that. You can keep telling yourself you don't, and keep shouting it at the business people around you, but what you may find one day in the near future is that they don't really care for you either.
Have you struggled with the balance of usability and security? Let's discuss it on Twitter, using the hashtag #SecBiz ... post your question or comment below, leave your Twitter handle and let's discuss this. Ultimately we need to share our successes and failures so we can collectively raise the bar, and serve our business and consumers better.