More Difficult Means More Secure, Right?

Rafal Los
3 min read · Feb 9, 2024

I’ve been involved in an unnatural amount of conversation lately about the failings of the “cyber security” industry. So much so that it has me thinking that there may be something going on out here — maybe it’s dawning on people that the utter madness of repeating the same failing strategies over and over isn’t going to suddenly turn into winning tomorrow.

Enter LinkedIn. My former colleague and swell guy, Bill Bernard, posts these little things on Fridays where he relates something real-world to security. You should be following him. And today’s was a doozy. Take a minute and go read his super-short post, here. Then take a minute to soak that in, including the image below.

borrowed from Bill’s original post

I looked at that image for a minute because there was another lesson in it for me. Then it hit me.

The last 25 years or so that I’ve been associated with the field of information, network, enterprise, or cyber security have been an escalation in making critical tasks more difficult, to the point of absurdity.

I don’t know if anyone’s come up with a name for this phenomenon yet, so I’ll outline it here in case it’s actually original-ish. Maybe we call this Rafal’s theory of cyber equilibrium …or something.

This whole thing revolves around the idea that you can’t just make something more difficult to use and assume that this means better security. It doesn’t. Your user base will work against you to find ways to use the system and do their jobs, bypassing all those glorious security measures. And what you’re left with is an insecure system, with a ton of security in it, that is out of balance. Angry users. Poor security. I think I just described the last 25 years in security…didn’t I?

Theory: Users of a system are willing to tolerate a certain amount of additional complexity or difficulty of use for the sake of security. When measures are added that push beyond this limit, user behavior adapts to restore balance in favor of usability, thereby reducing or eliminating the security benefits of the added complexity.

Cybersecurity professionals have largely mistaken making something more difficult to use for making that thing more secure.

I don’t think we do this on purpose, to hurt usability and productivity — but I think it’s a side effect of the way we’ve been thinking. Bill mentions password + PIN as one example. I’ve seen it, and it’s been a head scratcher. On the surface, it’s “two things you need to know”, but in reality it’s not actually increasing the difficulty of breaking into that account, because those two things (password, PIN) will likely be written down or stored in the same place so…
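The password + PIN problem can be sketched with some back-of-the-envelope math. This is a rough illustration, not anything from Bill’s post — the password length, alphabet, and PIN size are illustrative assumptions:

```python
import math

def guess_entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of guessing entropy for a uniformly random secret."""
    return length * math.log2(alphabet_size)

# Illustrative assumptions: a 10-character alphanumeric password, 4-digit PIN.
password_bits = guess_entropy_bits(62, 10)
pin_bits = guess_entropy_bits(10, 4)

# On paper, requiring both secrets looks like roughly 73 bits instead of 60.
combined_bits = password_bits + pin_bits

# But if both secrets end up on the same sticky note (or in the same
# password-manager entry), an attacker who compromises that one location
# gets both. Against that attack, the PIN adds zero effective bits.
effective_bits_when_co_stored = password_bits

print(f"password alone:         {password_bits:.1f} bits")
print(f"password + PIN (ideal): {combined_bits:.1f} bits")
print(f"co-stored in practice:  {effective_bits_when_co_stored:.1f} bits")
```

The added “factor” only helps if it fails independently of the first one — two secrets that live and die together are, for practical purposes, one secret with extra friction attached.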

I’ve believed for a long time that the most secure systems are the easiest and simplest — because adding complexity increases the chances of getting something wrong. It is in that simplicity that cybersecurity professionals need to work. But that requires us to show up when the system is being designed, and work our mechanisms that increase security right into the system. We need to think “How do I minimally disrupt the intended usage of this system, while making it more secure?” That’s not trivial — but it can’t be that difficult.

Having worked in a large enterprise as a security architect, I know how difficult it can be to shimmy security components and ideas into an application, system, or product. It takes compromise, and system-level thinking. It means you have to consider the entirety of the system and its intended use, not just the security measures you’ve been taught to jam in.

So the next time you’re looking at adding security into something — ask yourself the big question.

How can I add security without increasing complexity and reducing usability, so users don’t work against the security measures meant to protect them?

THAT, my dear reader, is the $64,000 question. Start there.

--

Rafal Los

I’m Rafal, and I’m a 20+ year veteran of the Cyber Security and technology space. I tend to think with a wide-angle lens, and am unapologetically no-bullsh*t.