Beyond Blame: A Nuanced Approach to Software Security and Developer Responsibility
Nathan Byrd
Principal Application Security Architect | Expert in securing the SDLC with SAST & SCA tools | Advocating for transparent, easy-to-implement security solutions | Open-source enthusiast & retro computing hobbyist
Introduction
I want to thank Cassie Crossley for her engaging and thought-provoking YouTube presentation and Escape's LinkedIn post, which highlight an important topic in software development and security. Although I do not fully agree with every point, her material is well thought out, and her contributions to the security community are deeply appreciated.
Distinguishing Developers and Engineers
In summary, Cassie’s talk distinguishes between software developers and software engineers, calling for responsibilities and accountability akin to those in other engineering disciplines. It suggests that developers, including open-source contributors, should observe a “duty of care.” It also endorses a commercial certification, the Systems Software Integrator (SSI), as one potential solution. The video further discusses some important considerations around the costs of software development, including ongoing maintenance, and the challenge of ensuring appropriate security across the industry. I recommend watching the video for the complete discussion and point of view. It is an area that is not talked about nearly enough, and I wanted to add some of my own thoughts to the conversation.
Accountability Without Oversimplification
I believe there is a risk, however, in oversimplifying the problem by concluding that “developers lack responsibility and should be held accountable.” This perspective can be hazardous, as it overlooks the root causes of software security gaps. If developers are expected to bear personal liability for their code, shouldn’t security professionals, their leadership, and security tool vendors be held even more accountable for validating software resilience? After all, security is their primary focus, while developers must balance performance, functionality, maintainability, and security.
When considering accountability, we might compare it to the responsibility of a civil engineer if a bridge collapses under deliberate attack. If a bridge must be bomb-proof, that requirement must be spelled out and budgeted from the start; otherwise, blaming the engineer for failing to anticipate sabotage in every scenario is unfair. Likewise, we wouldn’t design every office door with the same high-security measures as a data center vestibule. Controls must match each project’s needs, weighing cost against risk, especially for deliberate attacks rather than conditions encountered in the normal course of operations.
Embedding Security Early
A constructive, forward-looking approach is to embed security requirements, costs, and responsibilities into project planning from the beginning. Higher accountability may call for more rigorous training, licensing, and professional liability insurance for developers, security professionals, and other stakeholders. But these measures would significantly drive up costs and slow innovation. NASA, for example, often relies on older components that have undergone extensive certification. Similarly, licensing in software would slow the adoption of new frameworks and tools, since they too would need formal approval and updated training before use.
In a competitive environment, such requirements could widen the gap between large corporations that can afford them and startups that cannot. Without uniform legal mandates, organizations bearing added security costs are at a disadvantage; with strict regulations, costs would skyrocket and innovation might relocate to countries lacking such requirements. Nevertheless, there’s an opportunity to strengthen global security practices by balancing innovation with accountability, potentially through collaborative initiatives or industry standards that guide best practices.
Implications for Open Source and Gen-AI
As for the assertion that open-source developers should certify their work: it simply isn’t realistic. These contributors share their work for free and cannot take on legal liability. Universal accountability for software would effectively eliminate open source. All open-source licenses disclaim liability for good reason: contributors receive no compensation. As an open-source developer myself, I can say with certainty that I could not continue to create open-source software without this limitation of liability. If companies dislike these terms, they can simply avoid using open-source software. Organizations that do rely on open-source projects should treat them as their own—through funding, code contributions, or both. This approach helps improve quality and security while also fostering a supportive community that drives innovation forward.
Similarly, universal accountability would constrain the use of Gen-AI, because companies would hesitate to adopt tools that expose them to liability for every potential software flaw. If software developers were universally held accountable for vulnerabilities, the same would apply to Gen-AI tools and the companies behind them, making the technology prohibitively expensive for most use cases. Indeed, much of the current Gen-AI boom depends on the relative absence of liability. However, with thoughtful safeguards and guidelines, organizations can still harness Gen-AI responsibly, benefiting from its potential while mitigating risks.
Constructive Paths Forward
Still, the inability to enforce personal liability doesn’t doom software security. On the contrary, it provides an opportunity for security professionals to show leadership in cultivating secure development practices. Cybersecurity experts can create and promote tools that simplify secure coding, advance DevSecOps best practices, and strengthen detection and response to mitigate inevitable breaches. We can collaborate with developers to provide training, foster security champions, and help management understand the benefits and costs of enhanced security measures. Such partnerships often result in more resilient software, stronger teams, and a healthier organizational culture overall.
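To make “tools that simplify secure coding” concrete, here is a minimal sketch of the kind of helper a security team might hand to developers: a small script that runs a SAST scan and an SCA audit locally before code ever reaches review. The specific scanners shown (semgrep for SAST, pip-audit for SCA) are my illustrative assumptions, not the only options; substitute whatever your organization has standardized on.

```python
#!/usr/bin/env python3
"""Minimal local security gate -- a sketch, not a definitive implementation.

Assumes semgrep and pip-audit are installed (e.g. `pip install semgrep pip-audit`);
any equivalent SAST/SCA scanners could be swapped in.
"""
import subprocess
import sys

CHECKS = [
    # SAST: scan the working tree with semgrep's community rulesets;
    # --error makes semgrep exit non-zero when findings exist.
    ["semgrep", "--config", "auto", "--error", "."],
    # SCA: audit installed Python dependencies for known vulnerabilities.
    ["pip-audit"],
]

def main() -> int:
    failed = False
    for cmd in CHECKS:
        print(f"--> running: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            failed = True
    return 1 if failed else 0

if __name__ == "__main__":
    sys.exit(main())
```

Wiring the same commands into a CI job gives the whole team a consistent, low-friction gate, which supports secure development without making any one developer personally liable for what the scanners inevitably miss.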
In limited, high-security contexts, licensing or added accountability may be appropriate, but only when supported by clear requirements, sufficient resources, and a full understanding of the tradeoffs. By carefully defining the scope and intent of such requirements, organizations can maintain both security and agility where it’s truly needed. Most software projects benefit from early, thoughtful attention to security—from the functional requirements stage onward—along with a culture of collaboration between security and engineering. This careful approach can significantly improve security while creating an environment that encourages innovation and creativity.
Ultimately, fostering secure software requires shared responsibility across developers, security professionals, managers, and the broader community. By embedding security early, aligning risk and requirements, and nurturing a culture of collaboration and continual learning, we can make meaningful progress without stifling innovation. I encourage organizations, teams, and individual contributors to come together—support open-source projects, share security research, and exchange effective practices. Let’s collaborate to build a safer, more resilient software ecosystem that benefits everyone.
Great article! My favorite sentence: “Most software projects benefit from early, thoughtful attention to security—from the functional requirements stage onward—along with a culture of collaboration between security and engineering.”
Book Author: "Software Supply Chain Security" | VP, Supply Chain Security at Schneider Electric | Software Supply Chain Security thought leader | Product Security Officer | Speaker | Board Member
I completely agree with every point you made, Nathan Byrd. When Alexandra asked me to present on a polarizing topic, I knew developer accountability was perfect for the application security community to consider. With only a short time to present, I took the typical straight line one would find in a debate topic. As a product security professional, I fully agree with a risk-based approach to designing and building software and hardware. I often use the example that car tires are designed for normal wear and tear, but not to defend against nails, bombs, lasers, or any other type of attack that is discovered each day. The code I wrote in the 80s/90s had no security considerations because it wasn't even a topic for discussion or learning (guilty of admin-admin). As security leaders, we must provide technologies, training, and support to our developers, including time for learning and for performing the activities that increase security posture. For open source, I agree that liability is limited, but transparency from the developer or project will help; still, we should assume that most open source has not been run through SCA and SAST tools, and, as mentioned in my book, the consumer of open source is responsible. Well done!!!