The Need for a New Cybersecurity Metric
Chip Block
Vice President and Chief Solutions Architect at Evolver, a Converged Security Solutions Company and CEO/CTO of Kiwi Futures, LLC
Early in my career, I worked on adaptive tracking algorithms for military aircraft radar systems. In adaptive tracking, a radar takes an initial broad radar hit and predicts a window where the target will be next. More energy can then be focused on that window to get a more accurate prediction. With each prediction and verification, the window gets smaller and smaller until the location prediction is highly precise. This is the famous “lock on” you see in movies. Logically, the defense against this type of radar “lock on” is unpredictable behavior by the target.
Maybe because of this work in adaptive tracking, I have long been a believer that the best cyber defense is agile and unpredictable behavior. A cyber campaign is very similar to a military radar in that initial broad attacks are cast to acquire targets, reconnaissance is conducted to determine systems and vulnerabilities, predictions are made as to behaviors, and then a specific vulnerability or human action is exploited. In other words, attackers start with broad nets and eventually “lock on” to attack targets.
Over the last six months, major cyber attacks including SolarWinds, Microsoft Exchange and Colonial Pipeline have convinced me more than ever that agility and unpredictability need to be the new direction of cyber defense. I also believe that the current state of technology enables us to achieve this type of defense without the need for groundbreaking new research. As one who believes that nothing can be achieved if it can’t be measured, I also believe we need a new metric in the cybersecurity world that measures agility and unpredictability.
The variance in agility and unpredictability in cybersecurity became apparent to me as I compared two clients of ours. One of our clients has a cloud infrastructure built on Docker containers. They destroy and rebuild their entire environment every 14 days in a matter of minutes. Another client, who was affected by the Microsoft Exchange vulnerability, took three weeks just to find and change the passwords on their privileged accounts. Obviously, the first client has the ability to be much more agile and unpredictable than the second.
What To Call the Metric
I struggled to come up with a name for what the metric actually measures. Agile is already overused and does not necessarily mean that next actions can’t be predicted. A fighter jet is agile, but it can still be tracked. Unpredictable implies randomness, which has its own issues. So I settled on elusive. I am not wedded to this word if someone else wants to come up with something better.
We need to begin to develop, implement, test and, most importantly, measure the elusiveness of our infrastructures. I am not just referring to the network but to every element of the technology stack, from the IP address to the operating system to the applications to the data. The more critical the system, the more agile and elusive the environment must become. And there should be a way to measure and validate this elusiveness.
The Technology Is Here
Elusiveness in cyber is not a novel idea; it has been around for decades. Address space layout randomization (ASLR) was developed in the early 2000s. Almost every element of the current technology stack has some form of agile and elusive capability. We just haven’t put all of the pieces together and measured them in a logical and repeatable fashion. Additionally, as I will discuss later, I believe elusiveness can be the core of a Zero Trust implementation. Examples of technologies that enable elusive environments follow.
Containerization/Kubernetes
I believe the move toward containerization may be the greatest enabler of elusive environments we have seen up to this point. The reason for this optimism is that there is motivation for containerization in both the software development and the security communities. Unlike many security actions that increase burdens on development and operations staff, containerization improves speed and agility for both developers and security staff. As mentioned earlier, reconstituting containers every few weeks (or days, if desired) gives attackers a very short window to land and attack. If changes are made to the containers with each deployment, then even more elusiveness is injected into the environment. Great strides have been made in this area in the past year. The Air Force Platform One program demonstrates that containerization is possible for very large-scale, highly mission-critical environments.
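To make the cadence concrete, here is a minimal sketch of forcing a rolling rebuild on a fixed schedule. It assumes kubectl is configured on the host and a deployment named "webapp" exists in a "prod" namespace (both hypothetical names); a real pipeline would rebuild images from source and vary the deployment with each cycle rather than simply restarting pods.

# Minimal sketch: force a rolling rebuild of a Kubernetes deployment on a fixed cadence.
# Assumes kubectl is configured on the host and a deployment named "webapp" exists in "prod".
# A production pipeline would rebuild container images from source, not just restart pods.
import subprocess
import time

REBUILD_INTERVAL_DAYS = 14          # shrink this window to increase elusiveness
DEPLOYMENT = "deployment/webapp"    # hypothetical workload name
NAMESPACE = "prod"                  # hypothetical namespace

def rebuild_environment():
    # "kubectl rollout restart" recreates every pod from the current spec,
    # discarding anything an attacker may have planted in the running containers.
    subprocess.run(
        ["kubectl", "rollout", "restart", DEPLOYMENT, "-n", NAMESPACE],
        check=True,
    )

if __name__ == "__main__":
    while True:
        rebuild_environment()
        time.sleep(REBUILD_INTERVAL_DAYS * 24 * 60 * 60)

The point is less the mechanism than the measurable number it produces: the rebuild interval itself is a candidate input to an elusiveness metric.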
Deceptive Networks
Deceptive network technologies have been around for a while, going all the way back to my military aircraft days, when we practiced frequency-hopping methodologies. In IP-based networks, a number of technologies exist today, including TrapX and Illusive Networks, that inject deceptive capabilities into the network element of the stack. It does seem to me that having a fixed IP address assigned to a specific asset that holds really important information is not a great idea. At least making the attacker regularly go find a new address and verify the target seems like a fairly low bar to meet.
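For a sense of how low that bar can be, the sketch below stands up a single decoy listener and logs every connection attempt. This is only a conceptual illustration of the decoy idea, not how products such as TrapX or Illusive Networks actually work; the port and log file are arbitrary choices.

# Minimal sketch of a network decoy: a fake service that accepts connections
# and records who touched it. Commercial deception platforms go much further
# (fake hosts, fake credentials, rotating addresses); this only shows the idea.
import logging
import socket

DECOY_PORT = 3389  # arbitrary choice: a port attackers commonly probe

logging.basicConfig(filename="decoy_hits.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

def run_decoy():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", DECOY_PORT))
        server.listen()
        while True:
            conn, addr = server.accept()
            # Any touch of this port is suspicious by definition: nothing real lives here.
            logging.info("decoy connection from %s:%s", addr[0], addr[1])
            conn.close()

if __name__ == "__main__":
    run_decoy()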
Polymorphic Operating Systems
This is one area I believe needs more concentration. We achieved ASLR on memory two decades ago, but only in the past few years has the use of polymorphic operating systems come to fruition. If we can make memory locations polymorphic, why not the rest of the processing environment? In particular, Polyverse Corporation has developed this capability and it is getting good market adoption. There are other technologies being developed in this space as well. Imagine that a Kubernetes container is rebuilt every 14 days and that the operating system is polymorphic. First, any malware injected by an attacker would have to execute within that 14-day window, and second, it would have to operate in a varied operating system environment.
Credential Agility
There are very few attack vectors that do not eventually result in the compromise and escalation of privileges. As far as we have progressed in other areas, most environments today still have a fixed user with a credential as a basic element. This credential, whether it belongs to a person, server or application, determines what can be accessed and viewed. Technologies such as privileged access management (CyberArk, CA PAM, Centrify) exist to make this process more agile, but many organizations have not employed them. If credential lifetimes were measured on a timeframe similar to the container life mentioned earlier, then attackers would have very short attack windows. There is a direct connection between credential agility and the move toward Zero Trust (discussed later).
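A minimal sketch of the idea follows: generate a fresh secret on a short, fixed cadence and push it into whatever PAM or vault product is in use. The pam_client object and its set_password call are hypothetical placeholders, not the API of any specific product; tools such as CyberArk or Centrify expose their own rotation interfaces.

# Minimal sketch of credential agility: rotate a privileged account's secret
# on a short, fixed cadence so a stolen credential has a bounded useful life.
# "pam_client" is a hypothetical placeholder for whatever PAM/vault API is in use.
import secrets
import time

ROTATION_INTERVAL_HOURS = 24                          # shorter lifetime = smaller attack window
PRIVILEGED_ACCOUNTS = ["svc-backup", "svc-deploy"]    # hypothetical account names

def new_secret() -> str:
    # Cryptographically strong random value from the standard library.
    return secrets.token_urlsafe(32)

def rotate(account: str, pam_client) -> None:
    # Push the new value to the PAM/vault product; the old value stops working.
    pam_client.set_password(account, new_secret())

def rotation_loop(pam_client) -> None:
    while True:
        for account in PRIVILEGED_ACCOUNTS:
            rotate(account, pam_client)
        time.sleep(ROTATION_INTERVAL_HOURS * 3600)

As with container rebuilds, the rotation interval is itself a number that can be reported and measured.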
Dynamic Data
OK, so this is the hard one, but the one that I believe has the greatest impact. Michael Conlin, former DoD Chief Data Officer, and I wrote a paper on this concept a few months ago (https://www.dhirubhai.net/feed/update/urn:li:activity:6736667979016085505/). The objective of this approach is essentially to containerize data, similar to what has been achieved with applications. Data would only be able to go to specific locations or users and would have a set lifetime, minimizing the likelihood of large-scale data loss. From an elusive point of view, the data could be aligned with the container and polymorphic operating system so that any compromise is limited in time and scope. I realize this is the area that is farthest from being widely available, but I also believe it has the greatest impact in terms of reducing the attacker’s return on investment.
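As a toy illustration of the concept, the sketch below wraps each piece of data with its own expiry time and an allow-list of destinations, and refuses access outside those bounds. The field names and checks are hypothetical; they only sketch the idea from the paper, not an actual implementation.

# Toy sketch of "containerized" data: each record carries its own lifetime and
# an allow-list of destinations, and refuses to be read outside those bounds.
# Field names and policy checks are hypothetical illustrations of the concept.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class DataContainer:
    payload: bytes
    allowed_destinations: set = field(default_factory=set)  # e.g. {"analytics-svc"}
    expires_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc) + timedelta(days=14)
    )

    def read(self, requester: str) -> bytes:
        # Data is released only to an approved destination and only while "alive".
        if datetime.now(timezone.utc) >= self.expires_at:
            raise PermissionError("data container has expired")
        if requester not in self.allowed_destinations:
            raise PermissionError(f"{requester} is not an allowed destination")
        return self.payload

# Example: a record that only "analytics-svc" may read, and only for 14 days.
record = DataContainer(b"sensitive bytes", allowed_destinations={"analytics-svc"})
print(record.read("analytics-svc"))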
Measuring Elusiveness
So, I stated that nothing can be achieved if it can’t be measured. As far as I know, there is no current “elusive metric” that measures how agile or adaptive an environment might be. There are ways of testing this, however, if such a metric were created. Automated attack technologies such as Scythe can simulate attacks based on the MITRE ATT&CK framework, and the success, or lack of success, of those attacks could be captured. A measurement based on the areas listed above would be a start. Points or levels for dynamic containerization, network, operating system and data could be captured. I will admit this article is more of a call for “what we need” than a “how to do it” article.
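Still, as a hypothetical starting point, each layer could be scored by how often it changes and the scores rolled up into a single number. The layers, weights and thresholds below are arbitrary illustrations of that idea, not a proposed standard.

# Hypothetical "elusiveness score": rate each layer of the stack by how often it
# changes, weight the layers, and roll the result up into a single 0-100 number.
# The layers, weights and thresholds here are arbitrary illustrations.

# Days between changes for each layer (e.g. container rebuilds, IP reassignment,
# OS image refresh, credential rotation, data lifetime).
CHANGE_INTERVAL_DAYS = {
    "containers": 14,
    "network": 90,
    "operating_system": 365,
    "credentials": 1,
    "data": 30,
}

WEIGHTS = {
    "containers": 0.25,
    "network": 0.20,
    "operating_system": 0.20,
    "credentials": 0.20,
    "data": 0.15,
}

def layer_score(interval_days: float) -> float:
    # Faster change = higher score; anything slower than a year scores near zero.
    return max(0.0, 100.0 * (1.0 - min(interval_days, 365.0) / 365.0))

def elusiveness_score() -> float:
    return sum(WEIGHTS[layer] * layer_score(days)
               for layer, days in CHANGE_INTERVAL_DAYS.items())

if __name__ == "__main__":
    print(f"Elusiveness score: {elusiveness_score():.1f} / 100")

A scheme like this could then be validated against simulated attack results: the score should correlate with how often the automated attacks fail.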
The Certification and Audit Conflict
There is one topic that I have to bring up because it could cause some conflict with the current method of doing cybersecurity. We are in a highly compliance-based mode today. The move toward certified environments, whether CMMC or ISO, has as a basic construct the measurement of a fixed environment with logs, users, operating system versions, etc. There is no inherent conflict between an elusive environment and compliance audits, but many of the current approaches would have to change. If an assessor evaluates an environment and two weeks later every IP, user and data source changes, then the authority of the assessment could be questioned. What this requires is that the assessment be done at a much earlier stage of the development and production stack. Assessments have to evaluate the processes and technologies that build the dynamic environment.
Elusiveness and Zero Trust
The President’s Executive Order specifically calls for the employment of a Zero Trust strategy for federal systems. Without a doubt, Zero Trust will improve overall cyber defense. The basic concept of Zero Trust is to not trust anything until it is verified. Of course, the objective of the attacker will be to gain validated trust through some mechanism such as credential theft or creation, compromise of the validation software, or injection into trusted applications. This is where having credential and privilege agility becomes essential. If an attacker is successful in gaining a trusted credential, limiting the time and the impact of holding that credential becomes critical. This also means that new administrative tools that constantly change usernames, passwords and associated credentials need to be developed.
An elusive environment is, ultimately, the desired Zero Trust implementation. Not only is trust not assumed with any connection until verified, but the trust has to align with the right time, structure and data to be accepted. The use of dynamic identity management and segmentation through containerization aligns directly with the Zero Trust concepts laid out in NIST 800-207.
Technologies
I have mentioned a number of products and technologies in this article. Though I am familiar with these products, this is not an endorsement or promotion of any of them. They are examples, and there are a number of other great products on the market that perform similar functions. The products mentioned are there to demonstrate the current state of technology.
Elusiveness Metric Next Steps
As I said earlier, this is more an article on “what we need” than a “how to do it” paper. Having said that, I do believe there are some recommended actions going forward. Obviously, moving toward an elusive cybersecurity strategy is at the top of the list. Second, we should develop some method of measuring elusiveness and rewarding this design and behavior. That would include injecting elusiveness into assessment processes. The initial steps can be small, such as measuring the rate of credential updates. We need to begin to measure and report this in some fashion. As we move toward Zero Trust, the agile and elusive nature of the implementation should be measured. I believe that bringing together the advances in containerization, network deception, agile credentials, polymorphism, dynamic data and Zero Trust will give attackers a much more difficult target to “lock on” to in the future.