Start Requiring Secure Development Skills In Your Programmer Job Ad Requirements

Every job description for a developer should include the skills requirement of secure coding.

This week, I had the pleasure of attending the 2024 MIT CAMS Cybersecurity Innovation Symposium and being part of a panel session entitled Next Generation Secure By Design, along with the US Cybersecurity and Infrastructure Security Agency’s (CISA’s) Lauren Zabierek, MIT’s Sander Zeijlemaker, and BNP Paribas’ Bernard Gavgani.

It was especially great to hear from CISA. Lauren, along with Bob Lord and Jack Cable, created and heads CISA’s new Secure by Design (https://www.cisa.gov/securebydesign) program. The panel discussed both long-time Security Development Lifecycle (SDL) recommendations and CISA’s latest take on them.

Note: Secure development has been known by many names over the years, including Security Development Lifecycle (SDL), secure design, and secure development. I’m historically partial to SDL, but I’m getting used to SBD (Secure by Design) these days.

If you care about software being designed more securely, whatever the name, you should check out CISA’s Secure by Design (SBD) initiative. It contains many elements SDL proponents have been pushing for decades, including more secure coding, fewer vulnerabilities, promoting memory-safe languages, secure defaults, aggressive patching, and application hardening. It also includes some new ideas I hadn’t seen before, such as possibly providing a partial legal safe harbor for companies that meet certain SBD criteria, vendor ownership of problems, increased transparency, and improved logging.
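To make one of those elements concrete, here is a minimal sketch of what "secure defaults" can look like in code. Everything here is hypothetical (the class and setting names are mine, not CISA’s); the point is that the safe configuration is what you get without doing anything, and weakening it requires an explicit, reviewable opt-out.

```python
from dataclasses import dataclass

@dataclass
class ServiceConfig:
    # Hypothetical settings: the safe choice is the default, and any
    # opt-out is an explicit, greppable line of code.
    require_tls: bool = True           # encrypted transport unless deliberately disabled
    require_auth: bool = True          # no anonymous access out of the box
    min_tls_version: str = "1.2"       # refuse legacy protocol versions
    verbose_errors: bool = False       # don't leak stack traces to clients
    session_timeout_minutes: int = 15  # short-lived sessions by default

# A default instance is already the hardened configuration; no
# post-install checklist is needed to make it safe.
config = ServiceConfig()

# Weakening security now requires a visible opt-out that stands out in code review:
debug_config = ServiceConfig(verbose_errors=True)
```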

I’m a big fan of CISA’s SBD program and have nothing but kudos for it. There are always ways to improve any program, but this is a GREAT first step. If it succeeds in its goals, it can only help make computing safer.

CISA Director Jen Easterly, of whom I’m also a tremendous fan, recently announced that at least 68 technology companies, such as Amazon AWS, Cisco, Google, Microsoft, and Veracode, have pledged support (https://www.cisa.gov/news-events/news/cisa-announces-secure-design-commitments-leading-technology-providers) to CISA’s SBD program.

The only outstanding question is whether CISA’s SBD program will succeed significantly better than all the previous decades of SDL programs before it. CISA isn’t a regulatory body. It can’t require or enforce anything. It can only recommend and try to encourage actions and behaviors among government and private companies that volunteer to participate. Requirements and enforcement are up to other government agencies. Still, I think CISA’s SBD has a good chance of broadly advancing its objectives because it has strong private vendor support. Most of the previous SDL programs were either vendor-specific (Microsoft has a great one) or pushed only by a security consortium without such broad support from multiple major vendors. CISA’s SBD could significantly move the needle.

You can learn more about CISA’s SBD program here: https://www.cisa.gov/securebydesign and you can read more about CISA’s Secure By Design program initiatives here: https://www.cisa.gov/sites/default/files/2023-10/SecureByDesign_1025_508c.pdf.

My Big SDL Two Questions and Recommendations

I’ve been in cybersecurity for over 35 years. I’ve been reading and promoting SDL objectives for over two and a half decades, at least since Gary McGraw’s multiple books on securing software in the late 1990s and Michael Howard and David LeBlanc’s 2001 Writing Secure Code book (https://www.amazon.com/Writing-Secure-Code-Michael-Howard/dp/0735615888). Even today, I have weekly conversations with Loren Kohnfelder, creator of digital certificates and author of 2021’s Designing Secure Software: A Guide for Developers (https://www.amazon.com/Designing-Secure-Software-Guide-Developers/dp/1718501927/). All this is to say that it’s a long-term subject I’m intimately interested in. Who doesn’t want to make software and firmware more secure?

Last year, we had over 29,000 different publicly declared vulnerabilities (https://www.cvedetails.com/browse-by-date.php). Although only 3.7% of all publicly announced vulnerabilities will ever be exploited by any real-world criminal against any real-world company (https://www.cisa.gov/news-events/directives/bod-22-01-reducing-significant-risk-known-exploited-vulnerabilities), that still works out to roughly 1,000 exploited vulnerabilities a year (3.7% of 29,000), and they account for 33% of all successful data breaches (https://www.action1.com/patching-insights-from-kevin-mandia-of-mandiant/).

Note: Only social engineering is involved in more successful attacks than vulnerability exploitation.

For a variety of reasons, sadly, some non-minor percentage of users, usually between 10% and 20%, will never patch vulnerable software or firmware in a timely manner, if ever at all. Every discovered vulnerability becomes a nearly permanent way for attackers to break into some places. We need to decrease the number of vulnerabilities that are put into our software and firmware and better patch the vulnerabilities that do end up occurring. You can see the number of publicly known vulnerabilities by year here: https://www.cvedetails.com/browse-by-date.php.

Every vulnerability found in software and firmware after its public release becomes expensive not only to potential victims, who must patch it, but also to the vendor, who must respond to it: reviewing, analyzing, creating patches, releasing patches, and helping customers troubleshoot patching problems. Certainly, every compromise becomes expensive to the victim. We must decrease the number of bugs that developers are putting into our software and firmware.

Yet thousands to tens of thousands of developers are, today…right now…putting in new bugs and vulnerabilities. Whenever you read about a newly found vulnerability…say, a hard-coded password in some application appliance…know that not only does that particular vendor have the problem, but there are likely thousands of other software and firmware programs with the same problem that haven’t been found and publicly announced. Worse, there are hundreds of developers, right now, putting hard-coded passwords into the software and firmware they are writing today. It’s not as if the one vendor you are hearing about with a hard-coded password problem is the only one with the problem. Nope, every vulnerability type you can think of, including hard-coded passwords, is being put into today’s and tomorrow’s software and firmware! This is guaranteed, for sure.
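To show what that specific mistake looks like, here’s a minimal, hypothetical sketch (the variable names and credential are made up) of the hard-coded password anti-pattern next to the simplest common alternative: pulling the secret from the environment at runtime.

```python
import os

# ANTI-PATTERN: a hard-coded credential. It ships in source control and
# in every build, and rotating it requires releasing a new version that
# every customer must then patch.
DB_PASSWORD = "SuperSecret123!"

# Safer sketch: read the secret from the environment at runtime, where a
# deployment tool or secrets manager can inject and rotate it without a
# code change.
def get_db_password() -> str:
    password = os.environ.get("DB_PASSWORD")
    if password is None:
        # Fail closed: refuse to start rather than fall back to a default.
        raise RuntimeError("DB_PASSWORD is not set; refusing to start")
    return password
```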

Why?

There are two main reasons.

First, almost no developer is being trained in secure development. Almost no programming curriculum in the world has a REQUIRED course (or a significant number of hours) dedicated to training new developers in how to code securely. How can we expect our developers not to put hard-coded passwords into the software and firmware they are coding if they aren’t being taught that it’s very dangerous and wrong to do so?

Today, we just hope our developers have heard about the risk and cared enough when they read about it not to do it themselves. Obviously, with over 29,000 vulnerabilities publicly disclosed last year alone, hoping developers get the message is not working.

I’m aghast that in this day and age, with absolutely everyone apparently aware that vulnerabilities are numerous, bad, and essentially out of control, almost no programming school (e.g., university, college, technical school, programming course, etc.) includes significant required training in secure coding. How is that possible?

The university or school you went to, if you went to school, very likely did not require secure coding skills as part of its curriculum. Most don’t cover the topic at all or only spend a few hours on it. If a school does offer a secure development course, it’s not required.

I spent a few months last year trying to rectify the problem. I reached out to multiple big universities and computer security groups, trying to get more universities and colleges to put secure coding into their required curriculums. At every university and college I contacted, you would have thought I wanted to put poison in their water. It was hard to find a school or computer security group where my idea…which seems so merited…wasn’t met with confusion, apathy, derision, or active disregard.

I did find two groups that cared: IEEE and CISA. IEEE is working to promote a secure development curriculum, but it’s lacking some instrumental topics (for example, it doesn’t mention post-quantum cryptography, which seems strange to exclude with that deadline hanging over us), and it seems to be slow-tracking its own initiative. It’s been years in the making and years in review, the group really doesn’t want any updates made to it (such as post-quantum cryptography), and it appears to be on track to take years more to push out. I wonder if the IEEE’s initiative will ever see the light of day in a college curriculum. Still, it’s the best shot we have at the moment. The IEEE is trying, and that’s far more than most. I applaud them.

CISA is obviously interested in programming curriculums containing SBD training, but it isn’t part of their formal program and isn’t mentioned in their current materials. They might feel that they already have too much on their plate. I get it. They do have a lot on their plate. They are already pushing dozens of great recommendations. Who am I to come along and ask them to put another on their already long list of recommendations?

When I did talk to universities and groups that determine college curriculums about adding SBD requirements, most said their curriculums were completely full, with other important topics (such as containers and microservices) waiting to get in first, so there just wasn’t room for SBD training. So, yes, you’re reading this right: programming curriculums are too full to add SBD training. I’m not an instructor or curriculum developer, so who am I to just point fingers and complain? But I can’t imagine a more important topic to add…there’s got to be something less important that can be dropped.

Then I came to the conclusion that the reason universities, colleges, technical schools, and remote training programs aren’t teaching SBD ideas is that employers aren’t demanding it! I’ve yet to see a programmer job ad where SBD training was required. Maybe there are some here and there, but I’ve never seen them, and they certainly aren’t common.

So, we had over 29,000 publicly known vulnerabilities in 2023. We’ve got over 15,500 so far as of May of this year, so we are on pace to break last year’s record. Could part of the problem be that we are not training programmers in secure development, and could part of that problem be that employers aren’t requiring it?

Yeah…that’s probably it.

I can’t get any university, college, or programmer training program to add secure design to their curriculum. No single person alone can do it. It will take everyone. If the demand is there, the schools will quickly follow.

So, think globally and act locally.

If you’ve got input in hiring a programmer or can influence your company’s programmer job ads, add a secure development skills requirement. We can only start to fix the problem by forcing our supply chain (of programmers) to have training in the first place. And apparently colleges, universities, and schools will only do it when employers start demanding it.

If employers started requiring that their programmers have formal secure by design training, one of the biggest contributors to our vulnerability problem would be addressed within a few years. And that would benefit the world. It would increase global productivity. It would decrease patching. It would decrease successful compromises.

What could be better?
