There is little doubt that it’s difficult to develop secure software. First, you need to be aware of the need for security, accepting it as an important element of software quality. This is generally not something we learn in school. Not that it matters much, given how many developers skip formal education and dive straight into building software.
Programming has always been something people can pick up, for better or worse. This is especially true today, with the ridiculous pace at which the Internet is growing and the seemingly permanent skills shortage. Because security awareness is not the norm, chances are that newcomers are going to miss it.
Those few who are aware of the pressing need for security will find that awareness alone isn’t enough. Most of the technologies we use today are insecure by default, or can be made insecure by the slightest mistake, and the documentation is sorely lacking; it takes great skill to navigate the development landscape and avoid the security pitfalls. In essence, we have to be both security experts and software developers. How likely is that?
Both of these problems can be overcome, and I discussed how in my previous columns. In the first installment I discussed how we need to revisit our development tools and libraries, making them secure by default. We need to have security “built in”. In the second installment I discussed the Broken Windows theory, which can help us raise awareness and establish cultural norms that include secure software. Today I want to discuss a third problem: the cost of developing secure software. Clearly, it costs more to develop secure software; the question many are asking is whether it’s worth doing.
It’s a perfectly reasonable question, because security is not a matter of black or white but many shades of grey. What’s secure for one person might not be secure for someone else. If a business is to survive, it needs paying customers. To get them, you need to build a product that’s a good fit for the market and spread the word. And the problem is this: people buy products for their primary function, not for their security (except for security products, but let’s assume that’s not the case for the sake of argument). Your budget is limited—how much of it do you spend on security, which is invisible if you do it right? If you build a super-secure product you might not have enough money left to build features and you might end up failing to attract a critical mass of customers.
It gets more interesting when you take competition into account. As if it isn’t hard enough to build a good product, now you also need to worry about your competitors luring your prospects away from you. If they build a better product, good for them! But what if they build a “better” product by sacrificing the rarely seen bits such as security?
The problem is two-fold. First, people are generally not aware of the importance of software being secure. If you think getting developers to understand is difficult, try convincing those who don’t know anything about software development. Second, your customers are not going to be good judges of security. Our entire industry struggles with such assessments, and that’s with all the experts trying to do a good job.
To understand how this affects software security, we can look at the 1970 paper The Market for “Lemons”: Quality Uncertainty and the Market Mechanism, which won its author, George A. Akerlof, the Nobel Prize in Economics in 2001. Akerlof looked at the market for second-hand cars in the US. He discovered a significant information asymmetry: although the sellers knew a lot about their cars, the buyers knew little and couldn’t find out more until after they had bought a car and used it for some time. For this reason they weren’t willing to pay a premium, because, more often than not, they would get a bad car (called a “lemon” in the US). This, of course, skews the entire market, because it leads the owners of good cars to refuse to participate. As a result, you can buy only bad second-hand cars.
The situation is similar with secure software. Because it’s difficult to tell which software is secure and which isn’t, we find out only when it’s too late. Not only have we already paid for the software, but we have also probably invested a great deal of time into learning how to use it. As a result, as on the second-hand car market, there is little motivation for software developers to focus on security. The budget is spent on marketable features instead.
Now that we understand the scope of the problem, in my next installment I will discuss whether there’s anything we can do to tip the scale toward security.