Protecting Data in the Age of Disasters
Welcome to the latest edition of the Komprise Intelligent Data Management newsletter! We cover new ways for IT managers to be more productive, from managing enterprise data and storage and dealing with ever-changing compliance issues to working with departments on data strategies and understanding the new requirements for data management and AI. Learn more about Komprise, a SaaS solution for unstructured data management and mobility, here and follow us on LinkedIn.
This month’s newsletter covers the changing landscape for disaster recovery and data protection.
Disasters of epic proportions are on the rise. Climate events are becoming more frequent and devastating. The year 2023 had more billion-dollar-plus climate disasters than any other year, costing the U.S. alone an estimated $92 billion, according to NOAA. These disasters not only threaten human lives through physical infrastructure damage and impacts on clean water and air quality—they can take down data centers too.
According to the New Relic 2023 Observability Forecast report, the median annual cost of an IT outage has now reached $7.75 million. Ransomware and other cybersecurity threats are increasingly sophisticated and nefarious, adding to the challenge. A recent report on the expansive Chinese hackers-for-hire market is troubling, to say the least. Generative AI innovations have also introduced new threats and risks to corporate data.
In this environment, no stone should go unturned when it comes to protecting an organization’s greatest asset: its data.
Unstructured data, which today constitutes at least 80% of all data created and stored, has become difficult to protect with traditional backup and storage methods because of its sheer size. Most enterprises have petabytes of data under management.
Here are a few tactics from our COO Krishna Subramanian, as covered in this recent article on TDWI.
1. Know your data. Although it may sound obvious, you need a holistic understanding of all data in storage. Gaps in visibility, hidden applications, and obscure data silos in branch offices contribute to higher risk if the data is not managed properly. Protected data routinely ends up in places where it shouldn’t, such as on forgotten or underutilized file servers and shadow IT cloud services. Employees unwittingly copy sensitive data to noncompliant locations more often than you’d think. You’ll need a way to see all your data in storage and search across it to find the files to segment for security and compliance needs. All data needs some level of protection, but some data sets need the highest safeguards. Understand your data environment and spend wisely.
2. Use AI/automation to tag and find sensitive data. A real struggle with massive volumes of unstructured data spread across enterprise data silos is that it can be painstaking work to find the data sets that need a higher level of protection. Start by enriching file metadata with custom tags that indicate regulated or sensitive information such as PII and IP. This classification is also useful in the case of a regulatory audit or for use cases such as legal discovery. AI tools such as Amazon Macie can help by analyzing the content of millions of files for sensitive data (such as customer contact information or credit card numbers) and then tagging them. IT can use the output of such AI scans to segregate those data sets, move them to the most secure storage location, or delete them altogether if corporate rules require it.
3. Create policies for automated data movement across vendors. Such policies could dictate, for example, that files containing financial data move to encrypted cold storage after one year, that customer files move to immutable cloud object storage for a set period once an account is closed or inactive, or that ex-employee data be deleted 30 days after an employee’s last day. Automated policy features in storage and data management technologies make this easier to execute for small IT teams. The idea is to lower the risk of data being in the wrong place at the wrong time, where it creates security loopholes that a bad actor can easily exploit. Getting rid of unnecessary data and moving the rest to archival storage is also a great way to save money on expensive primary storage.
4. Leverage monitoring and alerting features in IT systems. IT and data management applications today provide alerts and notifications that can help you proactively identify threats. Use these tools to monitor storage and backup systems for anomalies, such as excessive file retrievals from one user account or excessive writes to a storage location, which can indicate a security incident. Monitoring features can also surface orphaned or duplicate data that unnecessarily increases liability, as well as metrics pointing to potential performance problems, such as a file server or NAS device reaching capacity. Ensure you have a process in place to review alerts and monitored data so you can escalate and fix issues; in newer technologies, AI is already doing this automatically.
5. Incorporate data auditing and tracking for generative AI. There is much to consider when it comes to safely and ethically adopting generative AI solutions in the workplace. The role of data storage and data governance specialists is multifaceted. Strategies may include developing employee guidelines for which data is sanctioned to send to generative AI tools and for what kinds of research and use cases. Conversely, IT must lock down sensitive data (such as software code, proprietary information, customer information, and HR data) that individuals should not access for use in AI.
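To make the tagging step in tactic #2 concrete, here is a minimal sketch in Python of content-based classification. The regex patterns below are deliberately simplistic stand-ins; a real classifier such as Amazon Macie uses far more robust detection, so treat this purely as an illustration of the tag-then-act workflow:

```python
import re

# Toy sensitivity patterns -- illustrative only, not production detectors.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),   # 16 digits, optional separators
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_text(text):
    """Return the set of sensitivity tags whose pattern matches the text."""
    return {tag for tag, pattern in PATTERNS.items() if pattern.search(text)}
```

Once every file carries tags like these in its metadata, segregating, moving, or deleting the flagged data sets becomes a simple query rather than a manual hunt.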
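Tactic #3 can be sketched as a small policy engine: each policy pairs a tag and a minimum age with an action. The policy tuples and destination path below are hypothetical examples, not a real product configuration:

```python
import shutil
import time
from pathlib import Path

def apply_policies(files, tags, policies, now=None):
    """files: iterable of Path; tags: {path: set of tags};
    policies: list of (tag, min_age_days, action, dest_dir) tuples.
    Applies the first matching policy to each file; returns an action log."""
    now = now if now is not None else time.time()
    log = []
    for f in files:
        age_days = (now - f.stat().st_mtime) / 86400
        for tag, min_age, action, dest in policies:
            if tag in tags.get(f, set()) and age_days >= min_age:
                if action == "move":
                    shutil.move(str(f), str(Path(dest) / f.name))
                elif action == "delete":
                    f.unlink()
                log.append((f.name, action))
                break  # one action per file
    return log

# Hypothetical policy table: move year-old financial data to encrypted
# cold storage; delete ex-employee data 30 days after the last day.
EXAMPLE_POLICIES = [
    ("financial", 365, "move", "/mnt/encrypted-cold"),
    ("ex_employee", 30, "delete", None),
]
```

Run on a schedule, a loop like this keeps aging or orphaned data out of expensive, exposed primary storage without manual intervention.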
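The anomaly alerting in tactic #4 often boils down to per-account counting over a sliding time window. A toy sketch of that idea (the threshold and window values are illustrative assumptions, not recommendations):

```python
import time
from collections import defaultdict, deque

class RetrievalMonitor:
    """Flag when one account retrieves more than `threshold` files
    within `window_seconds` -- a crude excessive-retrieval signal."""

    def __init__(self, threshold=1000, window_seconds=300):
        self.threshold = threshold
        self.window = window_seconds
        self.events = defaultdict(deque)  # user -> retrieval timestamps

    def record(self, user, ts=None):
        """Record one retrieval; return True if an alert should fire."""
        ts = ts if ts is not None else time.time()
        q = self.events[user]
        q.append(ts)
        # Drop events that have fallen out of the sliding window.
        while q and q[0] <= ts - self.window:
            q.popleft()
        return len(q) > self.threshold
```

Real storage and backup platforms expose this kind of signal out of the box; the point is simply that someone (or some automation) must be watching it and escalating.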
The role of the cloud
As cloud infrastructure matures, IT leaders are viewing the cloud as a sound strategy for disaster recovery and ransomware protection. Moreover, using the cloud for backups and DR is often much cheaper than a colocation site.
As noted in this CSO Online article:
“Off-site cloud-based storage is an excellent option for ensuring data cybersecurity and access in natural disaster situations. If your primary data repository has been knocked out of service but its data has been backed up in an unaffected region, it is relatively easy to restore data services to users without compromising cybersecurity.”
For ransomware protection, the cloud also offers a unique proposition. Backups might not be the best strategy because increasingly, backups are being targeted by ransomware actors. Security Intelligence reports that in 93% of ransomware incidents, threat actors actively target backup repositories. This results in 75% of victims losing at least some of their backups during the attack, and more than one-third (39%) of backup repositories are completely lost.
You can create an affordable, logically isolated recovery copy of all data in an object-locked destination such as Amazon S3 IA, so data is protected even if the backups and primary storage are attacked. This blog post explains how Komprise can help you create a ransomware plan.
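To illustrate why an object-locked copy survives an attack, here is a toy in-memory model of WORM (write once, read many) semantics. Real object-locked storage such as Amazon S3 Object Lock enforces this server-side; this is only a conceptual sketch of the behavior:

```python
import time

class WormStore:
    """Toy model of object-locked (WORM) storage: once written, an
    object cannot be overwritten until its retention period expires."""

    def __init__(self):
        self._objects = {}  # key -> (data, retain_until_epoch_seconds)

    def put(self, key, data, retention_days):
        now = time.time()
        if key in self._objects and now < self._objects[key][1]:
            # A ransomware actor attempting to encrypt the copy hits this.
            raise PermissionError(f"{key} is locked until retention expires")
        self._objects[key] = (data, now + retention_days * 86400)

    def get(self, key):
        return self._objects[key][0]
```

Because even a fully compromised admin account cannot rewrite a locked object, the recovery copy remains intact while primary storage and conventional backups are being encrypted.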
Earlier this month, Komprise announced a new product enhancement, Komprise Elastic Replication, to make DR more affordable than using traditional Network Attached Storage (NAS) mirroring on all your data.
Watch the short video below!
Read the blog here for more detail.
Last Words…
Security threats are constantly evolving, becoming harder to detect and attacking more services and surfaces in the enterprise. Using last year’s security and DR plan won’t work. With an analytical, data-centric approach, you can understand and classify your data surgically and create a right-sized security and DR strategy that leaves no stone unturned in your overall data protection plan.
Subscribe to the Komprise blog to receive new posts in your inbox. Comment on the post or send a note to: