The Cloud is the most dangerous place for small and medium-sized businesses to keep data - unless you keep it on premises.
There is a commonly held belief, especially among small and medium-sized businesses (SMBs), that data is safer on premises. Of course, this is a natural response. As humans, we are drawn to the idea of setting up a perimeter around our treasures. Everything outside the castle is dangerous and everything inside is safe. It is why we used to build large walls around cities for protection; the city of Jericho is a famous example. There were two problems with the walls. First, if the enemy got inside them, e.g. via a Trojan Horse, they could easily defeat the city. Second, the walls made trading with the outside world more difficult and ultimately reduced the wealth of the city. Eventually, the world moved beyond the perimeter strategy. Today, the earth is littered with empty castles and the ruins of walled cities like Jericho, Troy, and King's Landing. OK, the last one isn't real, but it suffered the same fate.
Erecting a perimeter was the security model of the 1990s. In 2003, a group of forward-thinking computer professionals, led by David Lacey of the Royal Mail, realized that the nascent Internet would force companies to live in a de-perimeterized world. They knew that electronic walls, like their physical analog, would be easily defeated and would inhibit commerce. They cleverly named their group the Jericho Forum, and it produced a set of principles for de-perimeterization. The Jericho Forum Commandments encouraged a vision of security without a perimeter: firewalls became necessary but not sufficient, and they should not be the focal point of a modern Zero Trust security architecture. The Jericho Forum merged with The Open Group in 2014.
Zero Trust Commandments
On a recent episode of the Azure Security Podcast, Mark Simos (Microsoft, and co-chair of the Zero Trust Architecture Working Group) announced The Open Group's release of the Zero Trust Commandments - a refinement of the original commandments. One could dedicate many articles to these commandments, and I highly recommend reading them, but here is a brief summary:
Practice Deliberate Security
Support Business Objectives
Enable Sustainable Security
Deploy Agile and Adaptive Security
One of the important takeaways from this list is that we should protect assets (data) no matter where they reside. Notice the focus on a security culture rather than security solutions. This is a very different mindset from the one many SMBs hold today.
The State Of Small-Medium Business Security
Most SMBs are still using the perimeter (firewall plus antivirus) mindset of the 1990s. This is partly because SMBs do not have the resources of larger organizations. Rick Doten described this earlier this year on a Google security podcast as the problem of the security haves and have-nots. SMBs generally do not have a Chief Information Security Officer, an Incident Response Team, Threat Hunters, etc. SMB IT departments are historically slim. Because of this security chasm, SMBs are still operating under many dated assumptions and practices.
"We are too small to attract the attention of the threat actors"
SMBs are getting more attention from threat actors. The 2023 Verizon Data Breach Investigations Report states:
“…SMBs and large organizations have increasingly become similar to each other. This phenomenon began several years ago, and by now there is so little difference based on organizational size that we were hard-pressed to make any distinctions whatsoever.”
Since larger companies have improved their security posture, it is now easier to attack smaller companies with less mature security postures and use them to leverage access to the larger companies and/or the government agencies they serve. The "we're-too-small" assumption also ignores how threat actors are organized today. Rarely does a single person handle the entire attack sequence; actors have specialized, much like legitimate businesses. The first group is called Initial Access Brokers. Their one job is to break into company networks and set up persistent access to be sold to the highest bidder. They have neither the time nor the appetite for extracting money from companies themselves. Other groups purchase the access from these brokers, exfiltrate the data, optionally encrypt it, and finally sell it after getting all the money they can from the ransomed company. Supporting this industry are groups, like LockBit, who write and sell the tooling that enables the whole process. They do not care how big or small you are. Some are motivated by money and others by a political cause or state interest.
"We have a firewall and antivirus"
Let us put aside the fact that probably every single company that experienced a cyber incident this year also had a firewall and antivirus software. The underlying assumption is that the firewall will block the majority of threat activities and antivirus will catch the few that make it through. How many ways can an attacker gain access to a local network despite these protections?
Quite a few; CISA maintains a list of common intrusion points.
"We don't have to patch that quickly since those systems are not exposed to the Internet"
As mentioned earlier, if an Initial Access Broker has compromised the network, there will be a service (shell) just waiting to execute commands from the threat actor - including commands that take advantage of unpatched systems not directly exposed to the Internet.
Time For A New Mindset
Hopefully by now, one accepts that the perimeter around the local network no longer really exists. Yet the security posture of many SMBs remains passive. Why?
Many SMBs spend a lot of time and effort building one-of-a-kind solutions. These solutions require much care from their owners. Administrators even give the servers running the custom solutions cute and meaningful names. They are like pets. Those of us who have or have had pets know how painful it is to lose one. No wonder we build walls around them.
For cloud providers, the situation is very different. They add and remove servers at a fast pace; it would be impossible to give each one a clever name. Each of these physical servers also hosts many virtual machines that customers spin up and down daily. On top of those virtual machines, there could be many applications running in encapsulated containers. If a container becomes unresponsive or picks up malware, it can be killed and a new one started in its place. This is much closer to how we treat cattle. If an individual cow gets infected, we simply remove it from the herd to protect the others, and another takes its place.
At the RSA Conference, Sounil Yu introduced a new security paradigm: D.I.E. He argues that we should strive to keep data Distributed, Immutable, and Ephemeral. You can watch his presentation for an excellent description of the new paradigm. He discusses the Pets vs. Cattle metaphor and how moving to a cattle approach can make companies more resilient.
Implementing The New Mindset
Here are some steps for SMBs to follow to get to the new mindset.
Follow The Commandments
The ideas behind Zero Trust Architecture are two decades old. The marketing hype cycle has moved on to AI, so it is time for Zero Trust to continue on to the Plateau of Productivity. The Zero Trust Commandments are an updated vision, so look for more articles and videos to appear shortly.
Know What You Don't Know
A company cannot protect what it doesn't know it has. The point of taking an inventory is not to "check a box" but to understand the environment - not just once a quarter or year but at least weekly. What new devices have been added? What new services are running? Which devices have not been seen in a while? SMBs do not have the bandwidth to take a continuous inventory and manually update spreadsheets. Fortunately, there are SMB-friendly services like runZero and others that make this possible. Once a company knows what it has, it can make informed decisions and give limited IT resources time to remediate the risks. A known device has a slightly lower risk component than a new one, which should be treated with more suspicion. We should use any attribute that helps determine the validity of a device or individual requesting access to a company resource.
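To make this concrete, here is a minimal Python sketch of the kind of weekly diff that turns an inventory into decisions. The CSV file names and columns are hypothetical, not any particular product's export format.

```python
# Minimal sketch: compare this week's device export against last week's.
# Assumes two CSV files with "mac" and "hostname" columns exported from
# whatever inventory tool you use; the file names and columns are
# illustrative, not any specific product's format.
import csv

def load_devices(path):
    """Return {mac: hostname} from a CSV export."""
    with open(path, newline="") as f:
        return {row["mac"]: row["hostname"] for row in csv.DictReader(f)}

last_week = load_devices("inventory_last_week.csv")
this_week = load_devices("inventory_this_week.csv")

new_devices = set(this_week) - set(last_week)      # treat with more suspicion
missing_devices = set(last_week) - set(this_week)  # retired, stolen, or hiding?

for mac in sorted(new_devices):
    print(f"NEW     {mac}  {this_week[mac]}")
for mac in sorted(missing_devices):
    print(f"MISSING {mac}  {last_week[mac]}")
```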
It is also very important to have a data inventory. Frankly, this is a more difficult task. Data may be on network drives, cloud storage, email boxes, SaaS applications (ERP, CRM, etc.), communication apps (Slack and Teams), laptops, removable storage, IoT devices, mobile devices, etc. Finding the data is the first problem, but once found, prioritizing is required for Zero Trust. While we want to protect all data, personally identifiable information (PII), healthcare data (HIPAA), and intellectual property (IP) should be secured first. Secure assets by risk.
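As a rough illustration of securing assets by risk, the sketch below flags files that appear to contain PII so they can be reviewed first. The regex patterns and the scanned path are placeholders; real data discovery tools are far more thorough.

```python
# Minimal sketch: flag files that look like they contain PII.
# Patterns and the scanned folder are illustrative only.
import re
from pathlib import Path

PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan(folder):
    """Print files that match any PII pattern so they can be reviewed first."""
    for path in Path(folder).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        hits = [name for name, rx in PATTERNS.items() if rx.search(text)]
        if hits:
            print(f"REVIEW FIRST: {path} ({', '.join(hits)})")

scan("/mnt/shared")  # hypothetical network share mount point
```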
Be Proactive
Now that we are assuming breach, set up traps to alert us when there is inappropriate snooping around our valuable assets - a canary in the coal mine, as it were. There is a free tool from Thinkst Canary that will alert you if somebody opens a particular Word, Excel, or PDF document, browses to a Windows folder, accesses a SQL database, or uses AWS keys, among many other useful canaries. Place them in the email boxes of your executive team, in cloud storage locations, on file servers, etc. These "trip wires" will let you know when someone is where they don't belong.
If your company writes its own APIs, you could also create canary endpoints like "/api/adminsvc" that should never be called by anyone. If someone does call one, redirect them to a fake login page, and you can discern who may have accidentally exposed their credentials. If you feel especially clever, activate a playbook that blocks the external IP address of the requesting device until you have time to investigate.
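Here is a minimal sketch of such a canary endpoint, assuming a small Flask service. The route name comes from the example above; the logging, the decoy page, and any blocking playbook are placeholders to wire into your own tooling.

```python
# Minimal sketch of a canary endpoint using Flask. The route name comes from
# the example above; the log destination, decoy page, and any IP-blocking
# playbook are placeholders to wire into your own tooling.
import logging
from flask import Flask, redirect, request

app = Flask(__name__)
logging.basicConfig(level=logging.WARNING)

@app.route("/api/adminsvc", methods=["GET", "POST"])
def canary():
    # No legitimate client ever calls this route, so any hit is worth an alert.
    logging.warning("CANARY HIT from %s UA=%s",
                    request.remote_addr,
                    request.headers.get("User-Agent", "-"))
    # Optional: add request.remote_addr to a block list here.
    return redirect("/login")  # send the caller to a decoy login page

if __name__ == "__main__":
    app.run(port=8080)
```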
By the way, canaries can catch internal threats as well. Internal threats can be either accidental or malicious. Either way, it is good to know when these events occur.
Break The Data Addiction
Admitting you have a problem is not the first step of data recovery - immutable backups are. Just checking if you are still with me. However, what data we choose to keep, where we keep it, and for how long is a worthwhile discussion.
I blame Gmail for setting the expectation of never deleting email. The largest problem with storing mail and documents in mailboxes is Business Email Compromise. Email has always been a struggle to secure. When a threat actor gains access to an email account, it gives them the ability to create very powerful social engineering campaigns, since they can step right into the middle of a conversation and sound authoritative. With the advent of large language models, it will be even easier to summarize emails and run personalized phishing scams at scale.
Of course, Google is not the villain here. The real culprit is ever-cheaper storage. In the early 80s, a 10MB (no, not GB) drive was well over $1,000 USD. No wonder we didn't store the century in dates prior to Y2K! Today, storage is nearly free by comparison. This has led to an explosion in storage utilization. It has never been cheaper to just save everything on network storage devices. But like email, keeping data means one has to protect it. This "gray data" can be more of a liability than an asset.
Segment, Segment, Segment
In the early 2000s, we were very proud when we could reach any resource in the network. Today, pervasive connectivity is a massive liability. Unfortunately, rewinding all this work is not easy. Again, start with the business objectives and prioritize areas to segment. Separating the Operational Technology (shop floor, POS, etc.), Information Technology (IT), and Industrial Process Control (IPC) networks will help reduce risk and improve business continuity.
Segmentation is not just physical networking. Here is a place where the cloud can make SMBs safer than keeping all of their eggs in one basket. Instead of direct integrations with your internal systems, spin up dedicated, isolated services in the cloud. These services, in turn, are the only ones allowed to speak to particular on-prem resources, and only with limited permissions. If the cloud service is compromised, the threat actor won't be able to pivot to other resources on your local network.
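As a rough illustration of the pattern (not a prescription for any particular cloud), here is a sketch of such a dedicated relay service. The endpoint, the internal URL, and the API-key check are hypothetical placeholders.

```python
# Illustrative sketch of a narrow cloud relay: it exposes one read-only
# operation against one internal system instead of a direct integration.
# The endpoint, internal URL, and API-key scheme are hypothetical.
import os

import requests
from flask import Flask, abort, jsonify, request

app = Flask(__name__)
INTERNAL_API = "https://erp.internal.example:8443"  # reachable only from this service
ALLOWED_KEYS = {os.environ["PARTNER_API_KEY"]}      # per-partner keys, rotated regularly

@app.route("/orders/<order_id>/status", methods=["GET"])
def order_status(order_id):
    if request.headers.get("X-Api-Key") not in ALLOWED_KEYS:
        abort(401)
    # Only this single read-only call is permitted; nothing else on the ERP is exposed.
    resp = requests.get(f"{INTERNAL_API}/orders/{order_id}", timeout=5)
    resp.raise_for_status()
    return jsonify({"order_id": order_id, "status": resp.json().get("status")})
```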
Source code for an SMB's custom apps requires the same protection as data. If the local network and backups are compromised, the company can lose hundreds, even thousands, of hours of development time.
Even within the cloud, try to keep workloads separate. For example, in Azure, subscriptions are a security boundary. One can use a different subscription for development, staging, and production - even separate subscriptions for individual applications if necessary. This isolates internal developers and third-party consultants from your production data. Moreover, test systems should use test data and not production data. While this is more work for you, it also makes breaches in those areas less harmful. Segmenting implements least privilege access as well as defense in depth.
Flip the Script
When it comes to automation, there is a popular question: "Why should one spend hours to automate a task that one can perform in a few minutes?" There's even an XKCD comic that calculates the time saved by automation.
I'm not the kind of person who would tell Randall Munroe he's wrong. He's not. However, there is an assumption in this calculation: the person who knows how to do the task will always be there to perform it. If that person is sleeping, sick, on vacation, or leaves the company, how much time did we really save by not automating? The task only took a few minutes a month but will now take days or even weeks because the task runner is unavailable. Worse, what if we have to hire someone to recreate the task's purpose from scratch?
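For what it's worth, the break-even arithmetic behind that comic is simple enough to sketch; the numbers below are made up.

```python
# The break-even arithmetic, with made-up numbers:
minutes_per_run = 5     # how long the manual task takes
runs_per_month = 4      # how often it happens
horizon_months = 36     # how long you expect to keep doing it

break_even_hours = minutes_per_run * runs_per_month * horizon_months / 60
print(f"Worth spending up to {break_even_hours:.0f} hours automating")  # 12 hours
# ...which says nothing about the days lost when the only person who knows
# the manual steps is unavailable.
```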
Automation is documentation.
Sure, we could write documentation. But if we are willing to take the time to write documentation, which will get out of date over time, why not write the script at the same time? As mentioned earlier, SMBs have small staffs and become heavily reliant on a few people, or even one person. The on-prem setup tends to reflect the personality of the administrators, which can make it difficult to recreate when they are not available. Great (free) tools that combine scripting and documentation are Jupyter and Polyglot Notebooks, which let the author mix markdown cells with cells that can be executed in various languages.
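Here is a minimal sketch of the idea, written so the comments double as the runbook; the share path and retention period are invented for illustration. In a notebook, the comment block would simply become a markdown cell above the code.

```python
# Minimal "automation is documentation" sketch: the comments double as the
# runbook, so anyone can see what the monthly task does even when the usual
# task runner is away. The share path and retention period are made up.
import shutil
import time
from pathlib import Path

SHARE = Path("/mnt/finance-share/exports")   # hypothetical network share
ARCHIVE = Path("/mnt/finance-share/archive")
RETENTION_DAYS = 90                          # retention agreed with the finance team

def archive_old_exports():
    """Move CSV exports older than RETENTION_DAYS into the archive folder."""
    ARCHIVE.mkdir(exist_ok=True)
    cutoff = time.time() - RETENTION_DAYS * 86400
    for f in SHARE.glob("*.csv"):
        if f.stat().st_mtime < cutoff:
            shutil.move(str(f), str(ARCHIVE / f.name))
            print(f"archived {f.name}")

if __name__ == "__main__":
    archive_old_exports()
```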
Having scripts that can rebuild infrastructure also enables resiliency. If there is a fire, flood, or other catastrophe in the on-prem data center, well-designed scripts can get the company back in business in a remote data center or even the cloud if physical servers are difficult to procure, as was the case during the pandemic. Treat scripts like source code: version them and keep them segmented away from the local network so they are available when you need them.
A quick word about scripting languages. If there is anything that SMBs might be more afraid of than the cloud, it is PowerShell. That is understandable, as many adversaries use it effectively. However, the National Security Agency (NSA), the Cybersecurity and Infrastructure Security Agency (CISA), the New Zealand National Cyber Security Centre (NZ NCSC), and the United Kingdom National Cyber Security Centre (NCSC-UK) all recommend using PowerShell and not blocking it.
To learn more about how attackers and defenders can use PowerShell, readers may want to consider Miriam C. Wiesner's book, PowerShell Automation and Scripting for Cybersecurity: Hacking and defense for red and blue teamers. There is also a very good technical video by Lee Holmes called "Defending against PowerShell attacks" that explains why PowerShell can be much safer than other scripting languages.
Don't Operate Machinery While Intoxicated
...with power. Performing daily tasks (browsing, email, etc.) as a Local Admin, a Domain Admin, or (gasp) a Global Site Admin is asking for trouble. It is more difficult for a threat actor to gain access to a device or network when starting without admin privileges. Utilize tools that temporarily elevate privileges, log the work performed, and even require another person's approval. Use a dedicated workstation without email access to perform sensitive tasks, and use conditional access policies to prevent those high-impact tasks from being performed from other workstations. This was a tough lesson learned in the LastPass breach, where an admin was running a vulnerable version of Plex on a machine that was also performing admin tasks.
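In the same spirit, here is a small, hypothetical guard you could paste at the top of routine scripts so day-to-day automation refuses to run elevated. It is a simplified sketch, not a substitute for a real privileged access management tool.

```python
# A small, low-tech guard for routine scripts: refuse to run day-to-day tasks
# with admin rights. Simplified sketch; real privileged access management
# tooling goes much further (elevation, logging, approvals).
import ctypes
import os
import sys

def running_as_admin() -> bool:
    if os.name == "nt":  # Windows
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    return os.geteuid() == 0  # Linux/macOS: root check

if running_as_admin():
    sys.exit("Refusing to run routine tasks with admin privileges.")
print("Running with normal user rights - continuing.")
```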
Epic failures require epic power.
Ditch Passwords
The computer password was first implemented in 1961. It worked well for separating individual users on a single machine in an on-prem world, but in the Internet era, passwords are not up to the task. A good password is long, difficult to predict, and never reused. These characteristics make passwords hard for humans to use properly. Like the old speakeasy (ask your grandparents), anyone who knows the password can enter. A security researcher once said his mentor had told him the holy grail of information security and sworn him never to reveal it. When asked what it was, the researcher replied, "People can't keep secrets."
Seriously, the best password is no password. Apple, Google, and Microsoft have started supporting the FIDO Alliance's passwordless authentication standard, also known as passkeys, which avoids the problems we have with passwords.
It will take some time to eliminate passwords. In the meantime, use a password manager. Yes, there are some risks involved, but the rewards far outweigh them. In addition to colleagues having better-quality passwords, the company will actually have a complete list of all of the logins its employees use for various customer, supplier, and governmental portals.
Measure Progress
While compliance does not equal security, using a framework like the CIS Controls to measure your progress will assure your management and your customers that you are improving and evolving your security controls. Use the controls to demonstrate to your partners (customers, suppliers, and government) that you can secure their data, whether on-prem or in the cloud. Doing so can be a competitive advantage as well as lower your legal liability.
Conclusion
Just as the walls came tumbling down in Jericho, they are currently crumbling around the on-premises world. In order to close the security chasm, SMBs will need to adapt to the current threat landscape by adopting a different mindset that secures data, not just devices.