Assurance Approaches
Unless otherwise stated, all views expressed are mine and don’t necessarily reflect those of my employer or MITRE sponsors.
Continuing the assurance thread: the last newsletter article traced the transition from secure by design to the need for assurance if secure by design is to be meaningful. This one covers the approaches to assurance that Michael McEvilley, Ron Ross, and I wrote about in Appendix F of NIST SP 800-160 Volume 1 Revision 1. The approaches were something Michael and I brought to the table in a section delegated to us to write, and the source we cite in Appendix F, a Defense Science Board report, was brought to our attention by now Deputy Assistant Secretary of Defense Melinda Reed (I don't know if Ron knows that part of the story).
The approaches are axiomatic, analytic, and synthetic. In that order they run, generally, from weakest to strongest, though they overlap in strength. They also parallel some of the principles in Appendix E of Volume 1, such as commensurate trustworthiness: the more severe the consequence of loss, the more trustworthiness is needed, and thus the more evidence (in quality and quantity) assurance must supply, pushing us toward the synthetic end. Where consequences are less severe, an axiomatic check-the-box approach may suffice.
Going into a bit of detail:
Axiomatic (Assurance by Assertion)
Axiomatic assurance is based on beliefs accepted on faith in an artifact or process. The beliefs are often accepted because they are not contradicted by experiment or demonstration; more commonly, and sadly, the contradictions are simply ignored. Axiomatic assurance is not suited to complex and novel scenarios. A minimal sketch of what this looks like in practice follows the examples below.
Examples:
- Case history successes
- “Best practice” application
- Demonstration of compliance or conformance to criteria expressed in a standard, checklist, or other basis without analytical validation of the correctness and effectiveness of results
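To make the third example concrete, here is a minimal sketch, in Python, of what “assurance by assertion” often reduces to in practice: every box on a checklist is marked complete and the system is declared assured, with no test or analysis of whether the controls actually work in this system's context. The checklist entries and function are hypothetical, my own illustration rather than anything from Appendix F.

```python
# Hypothetical illustration of axiomatic (check-the-box) assurance.
# The control names below are made up for the example.

CHECKLIST = {
    "Account management process documented": True,
    "Encryption enabled on data at rest": True,
    "Audit log review scheduled": True,
}

def axiomatic_assurance(checklist: dict) -> bool:
    """Declare the system 'assured' when every box is checked.

    Note what is missing: no testing, no analysis, no evidence that the
    controls are correct or effective for this system and its threats.
    """
    return all(checklist.values())

if __name__ == "__main__":
    # Prints True, yet tells us very little about actual trustworthiness.
    print("Compliant:", axiomatic_assurance(CHECKLIST))
```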
Analytic (Assurance by Test and Analysis)
Analytic assurance derives from testing or reasoning to justify conclusions about properties of interest. Belief is relocated from the artifact or process itself to trust in some method of analysis (commonly using formal verification[1]). The feasibility of establishing an analytic basis depends on the amount of work involved in performing the analysis and on the soundness of any assumptions underlying that analysis. Analytic methods are most relevant when the model analyzed spans all relevant uses and all interfaces to the environment; that is, the model must not ignore too many details.
Testing demonstrates the presence, but not the absence, of errors and vulnerabilities. Testing and analyses carry uncertainty that cannot be ignored, especially when they lack comprehensiveness, and that uncertainty contributes to risk.
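To illustrate the point about testing, here is a small, contrived Python sketch of my own (not an example from the publication) in which the tests pass and yet an error remains: the sanitizer handles the inputs the test happens to exercise while leaving another unsafe input untouched.

```python
# Contrived illustration: passing tests show the presence of handled cases,
# not the absence of errors or vulnerabilities.

def sanitize_username(name: str) -> str:
    """Strip characters assumed unsafe for a hypothetical downstream query."""
    return name.replace("'", "").replace(";", "")

def test_sanitize_username():
    # Both assertions pass, so the test suite is green...
    assert sanitize_username("alice'") == "alice"
    assert sanitize_username("bob;x") == "bobx"

if __name__ == "__main__":
    test_sanitize_username()
    print("tests passed")
    # ...but an input the tests never exercised ("--", a comment sequence in
    # some query languages) passes through unchanged. The uncertainty remains.
    print(sanitize_username("mallory--"))
```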
Analytic assurance, while stronger than axiomatic assurance, is not on its own well suited to complex and novel scenarios.
Synthetic (Assurance by Structured Reasoning)
Synthetic assurance derives from the method of composition of the “components of assurance” (i.e., the assurance derives from the manner of synthesis of the constituent parts). Assurance is necessarily considered at every step of the system life cycle, including concept exploration, requirements, realization, use, and retirement.
Structured reasoning serves to fill the gaps associated with the axiomatic and analytic assurance approaches. Since synthetic assurance is based on expert judgment of available evidence, it is not complete. However, synthetic assurance does further reduce uncertainty and thus reduces risk.
Assurance cases are the best known (only?) form of synthetic assurance: claims are shown to hold by arguments over evidence. An assurance case may argue that an axiomatic approach has a contribution to make, and similarly for tests and analyses. It will also point to gaps, leaving the engineer to find new ways to demonstrate assurance.
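As a rough illustration of that claim/argument/evidence structure (my own sketch, not a NIST or GSN notation, and the claims are hypothetical), the following Python fragment shows a top-level claim decomposed into subclaims backed by evidence from the different approaches, and shows how an unevidenced subclaim surfaces as a gap the engineer has to close.

```python
# Hypothetical sketch of an assurance case as claims, arguments, and evidence.

from dataclasses import dataclass, field

@dataclass
class Evidence:
    description: str   # e.g., a test report, an analysis result, a case history
    approach: str      # which approach contributed it: "axiomatic", "analytic", ...

@dataclass
class Claim:
    statement: str
    argument: str = ""                               # why the evidence supports the claim
    evidence: list = field(default_factory=list)     # list of Evidence
    subclaims: list = field(default_factory=list)    # list of Claim

    def gaps(self):
        """Claims with neither evidence nor subclaims are the gaps to close."""
        found = [] if (self.evidence or self.subclaims) else [self]
        for sub in self.subclaims:
            found.extend(sub.gaps())
        return found

# Usage: a top-level claim decomposed into subclaims with mixed evidence.
top = Claim(
    "Access control is commensurate with the consequence of loss",
    argument="Decompose by design, implementation, and operation",
    subclaims=[
        Claim("Design enforces least privilege",
              argument="Model checked over all interfaces to the environment",
              evidence=[Evidence("Formal verification of the access model", "analytic")]),
        Claim("Operators are trained to administer access correctly"),  # no evidence yet: a gap
    ],
)
print([c.statement for c in top.gaps()])
```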
Complexity and novelty
You may have noted the words "complexity" and "novelty" in describing the limitations of the assurance approaches weaker than synthetic. D. J. Rinehart, J. C. Knight, and J. Rowanhill, in NASA/CR-2017-219582, "Understanding What It Means for Assurance Cases to Work" (2017), discuss this in some detail. They point to a third characteristic beyond complexity (especially design complexity) and novelty (especially of designs and new technologies): high-risk efforts. All three are reasons that prescriptive means such as axiomatic and analytic approaches do not suffice.
Easy buttons - checking boxes, selecting and deploying security controls on an axiomatic basis or in response to simple analyses - are too popular for comfort for anything beyond relatively simple enterprise IT systems.
Final thoughts
Engineering secure systems is not just a matter of doing; it is a quality activity as well. It is not just putting access control in place, but knowing, with evidence and argument, that access control of just the right strength and approach is there; then evidence and argument that it is implemented and maintained just right; and training those who use it to use it right.
Assurance is a necessity.
[1] Designs are often amenable to mathematical analysis, e.g., proving some theorem about a model of the design or of an element of the design.
Interesting and makes good sense. I would wonder, however: if the security environment isn't static (which it probably isn't), does that mean the approach (or approaches) have to be continuously applied to ensure that the "right strength and approach" is correct at any one time? Or am I confused as usual?
OK. Can't argue with the concept, but does creating assurance cases aligned with ISO 15026 address these concerns? Assurance is not just a security issue; it is an engineering issue. So in the same way we look to systems engineering processes to conduct systems security engineering, can we look to systems engineering standards for assurance? As well, don't forget the link with verification and validation as described in IEEE 1012.