The Danger of Invisible Software Dangers
When dangers are invisible, people take crazy risks. We just do. People simply aren't very good at evaluating risk -- particularly not the risks associated with modern IT. Our instincts evolved for hazards that are far more obvious: think sharp and hot things. We've been trying to get companies to produce more secure code for decades, and I think we can agree that no form of "disclose-and-pray" is going to make much of a difference.
In 1986, California adopted the Safe Drinking Water and Toxic Enforcement Act, commonly known as Prop. 65. Basically, it requires companies to warn the public about any exposure to hundreds of chemicals known to cause cancer, birth defects, or reproductive harm. Notice that it doesn't prevent companies from using those chemicals. In fact, it doesn't require companies to do anything except warn consumers.
Nevertheless, programs like Nutrition Facts labels, Monroney stickers (the window stickers on new cars), cigarette warnings, EnergyGuide labels, Drug Facts, MPAA movie ratings, ESRB video game ratings, parental advisory labels on music, and many others have been very effective at changing corporate behavior. Importantly, that effectiveness has little to do with whether the labels change consumer behavior. Consumers largely ignored Nutrition Facts labels for decades, but companies immediately started to clean up their act.
Specifically, Prop. 65 has caused many companies to change their behavior. For example, companies have removed:
- lead and other heavy metals from children's products
- arsenic from playground equipment
- formaldehyde from classrooms
- cancer-causing fire retardants from furniture
- chemicals associated with developmental disorders from hospitals
- and many more...
The information technology industry has struggled with vulnerabilities for decades, with no end in sight. This isn't about hacker geniuses. It's about basic security blocking and tackling. And we are failing dismally. The underlying problem is that consumers cannot make informed choices about the software they use.
What do you really know about the security of your online banking website? Do you know what security defenses it has in place? Do you know who wrote the code, whether they are employees of the bank, or how they were trained in security? Do you know what libraries the software uses, or whether those libraries have known security holes? Do you know whether it was tested for security? Do you know what tools are used to verify its security and protect it against attacks? You are trusting your finances (and much more) to technology that you have no reason to trust.
We can learn from Prop. 65 and its brethren. The best way to influence change is to simply require disclosure of some of the details above. Even if consumers never read these labels, companies simply will not allow their products to ship with a label saying that they did absolutely no security testing. We can argue about what ought to go on the label, but the exact design doesn't make that much difference. It's the existence of the label that changes corporate behavior.
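To make the idea concrete, here is a minimal sketch of what a machine-readable "security facts" label could look like. The field names, values, and format are illustrative assumptions only, not a standard or a proposal; the point is simply that these disclosures are easy to enumerate and publish.

```python
# A rough, hypothetical sketch of a machine-readable "security facts" label.
# Field names and values are illustrative only, not a prescribed format.
from dataclasses import dataclass, field
from typing import List
import json


@dataclass
class SecurityFactsLabel:
    product: str
    version: str
    developer_security_training: str        # e.g. "annual secure-coding training"
    third_party_libraries_disclosed: bool   # is a dependency list published?
    known_vulnerable_dependencies: int      # count at time of release
    security_testing_performed: List[str] = field(default_factory=list)
    defenses_in_place: List[str] = field(default_factory=list)

    def to_json(self) -> str:
        """Render the label so it could be published alongside the product."""
        return json.dumps(self.__dict__, indent=2)


if __name__ == "__main__":
    # Example label for a fictional online banking application.
    label = SecurityFactsLabel(
        product="ExampleBank Web",
        version="4.2.0",
        developer_security_training="annual secure-coding training",
        third_party_libraries_disclosed=True,
        known_vulnerable_dependencies=0,
        security_testing_performed=["static analysis", "dependency scanning", "penetration test"],
        defenses_in_place=["TLS everywhere", "multi-factor authentication", "runtime protection"],
    )
    print(label.to_json())
```

Even a toy label like this answers most of the questions about the banking site above, and a label full of blanks and zeros is exactly the kind of disclosure no vendor would want to ship.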
Car manufacturers won't sell cars that fail safety tests. Food producers won't sell foods that contain poisons or trans fats. Movie producers go to great lengths to avoid an "X" rating. And software vendors won't sell products that have glaring security problems if they are forced to disclose what they've done.
It's time to fix the asymmetric information problem in the software market. We're trusting more and more critical information and functions to software every day. At the same time, software is shipping faster and growing more complex. We can't afford to keep security invisible.
Finally, if you're concerned about (or responsible for) the security future of your own company, you can adopt this approach yourself. Make security visible to everyone. Enact your own "Security in Sunshine" program. There's no better way to establish the "culture of security" that everyone wants but so few achieve.