The Cybersecurity "Glass Ceiling"

Adopting a Secure By Design Approach to Protect Critical Systems and Assets

There is an emerging and troubling reality in the world of threat actors and cyber-attacks that many are hesitant to discuss or are slow to recognize. That reality is the "glass ceiling" in cybersecurity. During the past several decades, we have developed cybersecurity programs and technologies largely in stovepipes that attempt to protect systems from the outside in. In simple terms, this means we have employed security measures at the perimeters of our systems and networks to provide sufficient penetration resistance to stop adversary intrusions. Such intrusions can result in data theft, data manipulation, the loss of a critical capability, the prepositioning of malicious code to be triggered at a time of the adversary’s choosing, or the propagation of misinformation.

Security measures (sometimes referred to as cyber hygiene) can be highly effective when they are implemented correctly and operating as intended. However, they are not effective in every situation. While it is unrealistic to expect bulletproof systems and networks, there is a "glass ceiling" in the effectiveness of the current security measures employed by organizations. No one has a precise number, but an 80-90% effectiveness rate in stopping cyber-attacks would not be unreasonable, although this percentage may be significantly lower for cyber-attacks launched by near-peer nation-state adversaries. What causes the glass ceiling, and why is this issue important?

To understand the glass ceiling problem, it is instructive to look at the two perspectives of cybersecurity—the "above the waterline" view and the "below the waterline" view. Organizations live above the waterline for the most part and implement security measures through people, processes, and technologies. They develop security programs, implement security policies and procedures, purchase and deploy security technologies, and do everything they can to protect their organizations from threats, attacks, vulnerabilities, hazards, disruptions, and exposures. That is good. And it would be even better if that’s where the story ended. But unfortunately, this story has another chapter.

Chapter Two: Going Below the Waterline

The "below the waterline" cybersecurity perspective involves the world of complex systems made up of hardware, software, and firmware components—components that end up in everything from supercomputers to cyber-physical systems to IOT devices such as smartphones and tablets. These systems represent trillions of lines of code, billions of endpoints and devices, and millions of connections over powerful networks worldwide. The complexity is growing rapidly as industry continues to innovate and the appetite for new technology continues unabated.

From the adversary’s perspective, complexity equals "attack surface" or those parts of a system that contain vulnerabilities that can be exploited. It’s a numbers game, plain and simple. The larger the attack surface, the more opportunities the adversaries have to inflict damage, leaving organizations susceptible to ongoing destructive cyber-attacks. Stopping 80-90% of the cyber-attacks launched by adversaries is outstanding, but what about the other 10-20% of attacks that might be successful? What if those attacks occur in systems that are critical, mission-essential, or where human lives are at risk?
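To make the numbers game concrete, here is a back-of-the-envelope sketch with illustrative figures of my own (not from the article), assuming independent attack attempts and a notional 90% block rate drawn from the range cited above:

\[
P(\text{at least one successful attack in } n \text{ attempts}) = 1 - p^{n}, \qquad 1 - 0.9^{20} \approx 0.88
\]

Under those assumptions, even a 90% block rate leaves roughly an 88% chance that at least one of twenty attempts gets through. The larger the attack surface, the more attempts an adversary gets, and the closer that probability climbs toward certainty.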

Principled Systems Engineering

There is only one way to address the "below the waterline" problem in cybersecurity. We must reduce and manage system complexity and build systems that are secure by design. Significant attention must be given to the time-tested principles for trustworthy secure design that, when employed in a disciplined and structured engineering process, produce systems that are well understood and assured—systems that exhibit only the authorized and intended behaviors and outcomes necessary to carry out mission-essential functions and operations.

Too often, we are consumed by cybersecurity silver bullets, shiny objects, and quick fixes that provide short-term benefits but institutionalize long-term, systemic vulnerabilities. We continue to treat cybersecurity as a stovepipe activity, trying to add security measures to systems after the fact, and trusting that everything will be OK. However, trust in systems is not a given—it must be based on the "trustworthiness" of systems. The determination that a system is trustworthy is based on assurance. Assurance is the grounds for justified confidence that a claim or set of claims about the system has been or will be achieved. Justified confidence is derived from objective evidence that reduces uncertainty to an acceptable level, and in doing so, also reduces risk.

"The trust we place in our digital infrastructure should be proportional to how trustworthy and transparent that infrastructure is, and to the consequences [losses] we will incur if that trust is misplaced." Executive Order (EO) on Improving the Nation’s Cybersecurity

Conclusion

It is time to recognize that traditional cybersecurity programs, activities, and technologies are largely "tactical" in nature with an upper bound on their effectiveness. In contrast, systems security engineering provides "strategic systems thinking" that can guide and inform the proper deployment and use of those tactical security measures to create systems that are trustworthy secure by design [1].

We continue to work on cutting-edge systems security engineering guidance in the upcoming release of NIST Special Publication 800-160, Volume 1, Revision 1 [2]. A key objective for the revised publication is to showcase the design principles for trustworthy secure systems that have been developed during the past four decades and proven effective in mitigating destructive cyber-attacks, including subversion.

There is a lot at stake, and we are at a watershed "fork in the road" in our cybersecurity journey. How is your organization dealing with the cybersecurity glass ceiling? Is your organization spending as much time below the waterline as above the waterline? Is your organization taking proactive steps to implement a systems engineering approach that results in trustworthy secure systems and systems that can control the loss of critical assets? Let’s keep this issue on the front burner to ensure that we do what’s right—for our organizations, for our communities, and for our country.

[1] R. Ross, "The Need for Systems Thinking in Cybersecurity," ISMG’s CyberEd.io interview.

[2] R. Ross, J. Oren, M. McEvilley, NIST SP 800-160, Volume 1, "Systems Security Engineering: Considerations for a Multidisciplinary Approach in the Engineering of Trustworthy Secure Systems."

A special note of thanks to Mark Winstead and Tony Cole, long-time cybersecurity and SSE colleagues, who graciously reviewed and provided sage advice for this article.

Daniel Krawczyk

M.S. Computer Science, BSEE / Cybersecurity, CISSP, RMF, GSLC, CCNA, ICS410, PML3, SE L3, IAM L3, CNSS 4011-4016

2y

Totally agree. Not enough effort goes into design, mainly due to a lack of expertise and skill sets among chief engineers and program directors.


Thank you Ron Ross, and well said. This resonates very well with the work we are extending here for INL (https://www.observer.solutions/services/idaho-national-labs-cce/), which they have coined CCE, or Consequence-driven Cyber-informed Engineering. The consequences are being felt most acutely in the insurance sector, where claims for consequential loss have become a serious issue. To quote Andy Bochman of INL: “Every year while we may be improving slightly, the gap between attacker and defender capabilities is widening. The folly of continuing down the same well-trodden incremental improvement path we’ve constructed over the past few decades is now plain for all to see. More and more money spent on new cybersecurity products and services, with a hard-to-measure but low percentage of risk ‘transferred’ via the emerging cybersecurity insurance market. Sadly, insurance isn’t the escape hatch it seemed it might become. Those who’ve been in the fight for a few years will find CCE a compelling resource to share with their mentees. But for the far too many who still turn to hope and hygiene to address these challenges, the perspective will serve as a cold dash of reality.”

James De Rienzo, CISSP, ICP

Information Security Risk Specialist | Data Analytics and Reporting

3y

The evolution of digital Information Processing (IP) will continue to increase its pace. That growth coincides with the commercialization of the Internet, beginning in 1994. https://www.ucsusa.org/resources/each-countrys-share-co2-emissions. Complexity is relative: what appears complex to the human brain is trivial to a computer. This quote is very revealing: "Our brain does not process information, retrieve knowledge or store memories. In short: your brain is not a computer." https://aeon.co/essays/your-brain-does-not-process-information-and-it-is-not-a-computer. IP at its best augments human decision making. Systems Engineering / Systems Security Engineering provide methodologies for managing complexity, which translates to "work". So expect more stakeholder involvement, collaboration, project planning, validation, documentation, and diagrams. IP can help with the workload to an extent. When systems become complex, automation is inevitable, and then we're talking about Model-Based Systems Engineering (MBSE) and beyond, but the principles remain constant.

Dave Fairburn Jr., CISSP, "Father of FedRAMP"

Helping you achieve FedRAMP authorization on the FIRST attempt! Creator of FedRAMP

3y

This should be mandatory reading for security SMEs and network engineers alike. While many senior-level security SMEs have been preaching this issue for more than a decade, no one (myself included) has managed to communicate it so clearly. Thank you Ron Ross!! #cmmc #fedramp

Ryan B.

CUI Safeguarding Strategy

3y

Ron Ross, thanks for taking the time to differentiate between "above the waterline" security activities (traditionally non-technical) and "below the waterline" activities (complex system engineering implementations). In an ideal world, organizations complete their "mile-deep" systems implementations as part of a bottom-up engineering effort. Top-down, "above the waterline" controls implementations simply review and validate "below the waterline" decisions as an in-line activity. Finally, it's worth acknowledging that most non-federal organizations have performed none of these activities. Therefore, organizations benefit greatly from prescriptive reference architectures where much of the systems security engineering is already documented. This frees them up to launch their ISO-style information security programs, closing gaps in the assumptions laid out in SP 800-171 Appendix E.
