January 20, 2024
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
In many instances, CISOs who want clear risk guidance from their board don't get it. Barely more than one-third (36%) described their board as offering clear enough insight into their organization's risk tolerance levels for them to act upon. "The evolution of the CISO role over the past few years has accelerated dramatically," says Nick Kakolowski, research director at IANS. With organizations digitizing more of their operations, CISOs are taking on more responsibilities and have become de facto owners of digital risk, he says. "[But] organizations haven't figured out how to support and empower them as the scope of the role grows." Concerns have been growing within the CISO community in recent years about the escalating expectations around the role, even as CISOs' ability to meet those expectations has remained largely unchanged. Incidents such as the SEC charging SolarWinds CISO Tim Brown last October with fraud and internal control failures over the company's 2020 breach, and a judge sentencing former Uber CISO Joe Sullivan to three years of probation over a 2016 breach, have fueled those concerns.
“Satisfaction has been rising consistently for the past few years, but last year, it dipped,” says IANS Research Director Nick Kakolowski. “Last year, the pressure on CISOs ratcheted up big time with the new SEC rules and CISOs being held personally liable for breaches.” ... “The environment surrounding CISOs is extremely turbulent right now, and their individual exposure to lawsuits is at an all-time high. CISOs face a real danger of being indicted or sued for things outside of their control,” adds Patrick “Pat” Arvidson, chief strategy officer for Interpres, a maker of a threat-informed defense surface management platform. ... Another finding in the report is that CISOs aren’t getting the face time with boards that they need. Eighty-five percent of CISOs in the survey indicated their board should offer clear guidance on their organization’s risk tolerance for the CISO to act on, but only 36% found that to be the case. “We are seeing some boards figuring this out and being effective there, but across the board, there’s either a lack of visibility at the board level (CISOs aren’t consistently reporting to the board) or CISOs and boards haven’t figured out how to speak each other’s language,” Kakolowski says.
The Domain Working Group meetings were instrumental in helping both business stakeholders and technology developers walk through examples and requirements for merging sometimes incomplete, inaccurate, and inconsistent data from three sources into a single complete, accurate, and consistent golden record. As business stakeholders began to understand the time saved by no longer querying three data sources, reconciling and explaining differences between them, and deciding which data is most trusted, and began to see the benefits of a single authoritative view of their domain data, enthusiasm for the Data Vault initiative grew. Embedding data governance practices and tools by creating a data governance workstream within a business or technology project is one of many approaches an organization can take to expand or accelerate engagement with, adoption of, and implementation of a data governance program. The success of this Data Vault project was partly attributable to the established data governance framework and team, but the biggest benefit was the adoption of data governance by dozens of previously unaware employees, who were exposed to the program and witnessed the real-life benefits of active end-to-end data governance becoming part of their everyday job responsibilities.
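To make the golden-record idea concrete, here is a minimal sketch of a merge across three sources using a simple source-priority survivorship rule. It is not from the original article; the source names, fields, and rule are hypothetical, and real Data Vault or MDM implementations are considerably more involved.

```python
# A minimal sketch of building a "golden record" from three source systems.
# Source names, fields, and the survivorship rule are hypothetical.

SOURCE_PRIORITY = ["crm", "billing", "support"]  # most trusted first

def golden_record(records: dict[str, dict]) -> dict:
    """Merge per-source records for one entity into a single record.

    For each field, take the value from the highest-priority source
    that has a non-empty value (a simple survivorship rule).
    """
    merged = {}
    fields = sorted({f for rec in records.values() for f in rec})
    for field in fields:
        for source in SOURCE_PRIORITY:
            value = records.get(source, {}).get(field)
            if value not in (None, ""):
                merged[field] = value
                break
    return merged

records = {
    "crm":     {"name": "Acme Corp", "phone": "", "city": "Austin"},
    "billing": {"name": "ACME Corporation", "phone": "555-0100"},
    "support": {"name": "acme", "city": "Austin, TX"},
}
print(golden_record(records))
# {'city': 'Austin', 'name': 'Acme Corp', 'phone': '555-0100'}
```

In practice the survivorship rule is often per-field (e.g., most-recent-wins for phone numbers, source-priority for legal names), which is exactly the kind of decision the Domain Working Group meetings described above are meant to settle.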
Several quantifiable metrics can serve as a starting point for evaluating the cost of bad data, including the rate of occurrence or number of incidents per year, time to detection, and time to resolution. ... Number and frequency of incidents: While some companies may experience data incidents daily, others may go days, if not weeks, without one. The criticality of incidents can vary from something “minor,” such as stale data linked to a dashboard that nobody has used in ages, to a data duplication problem overloading a server until it goes down. ... Mean time to resolution (MTTR): What happens once an incident is reported? MTTR is the average time between becoming aware of a data incident and resolving it. Resolution time is greatly influenced by the criticality of the incident and the complexity of the data platform, which is why we use the average for the purposes of this framework. ... Mean time to production (MTTP) is the average time it takes to ship new data products or, in other words, the average time to market for data products. This could be the time an analyst spends “cleaning” the data for a data science model.
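As a rough illustration of how these metrics fall out of an incident log, here is a small sketch. It is not from the article; the incident records, timestamps, and field layout are hypothetical.

```python
# A rough sketch of computing incident count, mean time to detection,
# and mean time to resolution from an incident log.
# The incident records and their layout are hypothetical.
from datetime import datetime, timedelta

incidents = [
    # (occurred, detected, resolved)
    (datetime(2024, 1, 2, 9, 0), datetime(2024, 1, 2, 11, 30), datetime(2024, 1, 2, 15, 0)),
    (datetime(2024, 1, 9, 8, 0), datetime(2024, 1, 10, 8, 0),  datetime(2024, 1, 10, 20, 0)),
]

def mean(deltas: list[timedelta]) -> timedelta:
    return sum(deltas, timedelta()) / len(deltas)

incidents_per_year = len(incidents)  # assuming a one-year log
mean_time_to_detection = mean([d - o for o, d, _ in incidents])
mean_time_to_resolution = mean([r - d for _, d, r in incidents])  # aware -> resolved

print(f"Incidents per year:      {incidents_per_year}")
print(f"Mean time to detection:  {mean_time_to_detection}")
print(f"Mean time to resolution: {mean_time_to_resolution}")
```

MTTP would follow the same averaging pattern, with the interval running from when work on a data product starts to when it ships.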
Despite the apparent advantages, there are several challenges that I think are important to highlight. Notably, they are all avoidable when considered and planned for upfront. A common reason teams end up sticking with a traditional monolithic approach is that microservices bring increased complexity. This complexity comes in the form of teams needing to understand how to design, build, and manage distributed systems. More specifically, not knowing how to implement reliable communication between microservices is a recurring pain point that degrades system performance and, in turn, has teams switching back to their monolithic system. Another challenge arising from the increased number of interactions is system testing and debugging. Aside from these difficulties, another major concern with microservices is security: implementing robust authentication, authorization, and encryption across each and every service is crucial.
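To give one concrete flavor of the communication problem, here is a minimal sketch of a common mitigation: timeouts plus retries with exponential backoff around an inter-service HTTP call. The service URL and retry settings are hypothetical, and production systems typically layer on circuit breakers and observability as well.

```python
# A minimal sketch of resilient inter-service HTTP calls:
# timeouts plus retries with exponential backoff.
# The service URL and retry settings are hypothetical.
import time
import urllib.request

def call_service(url: str, retries: int = 3, timeout: float = 2.0) -> bytes:
    """Call another service over HTTP, retrying transient failures."""
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except OSError:  # urllib's URLError and socket timeouts subclass OSError
            if attempt == retries - 1:
                raise  # give up after the final attempt
            time.sleep(2 ** attempt)  # back off: 1s, 2s, ...
    raise AssertionError("unreachable")

# payload = call_service("http://inventory-service/api/stock/42")
```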
The history of ABE goes back to a groundbreaking 2005 paper titled “Fuzzy Identity-Based Encryption.” Fifteen years later, recognizing the paper’s significance, the International Association for Cryptologic Research (IACR) gave it a 2020 Test of Time Award. One of its co-authors, Dr. Brent Waters, later said the paper has had a three-fold impact. First, there is the concept of ABE as its own application with distinctive new use cases, several of which are discussed below. Second, the cryptographic research community has not only spent years studying ABE but has also used it as a building block, leveraging it to obtain new results on other problems. Third, according to Dr. Waters, the work on ABE “inspired us to rethink encryption in even bigger and grander ways.” One such outgrowth has been functional encryption, which allows a user to learn only a function of a data set. For ABE, the end goal is fine-grained access to the data itself. On its own, that’s a revolution: an ABE scheme can provide the right user with a key to very specific data. Not to an entire file cabinet, so to speak, but to a single line item within a category of filed documents.
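To illustrate only the access model, not the cryptography, here is a toy sketch of the kind of attribute policy an ABE ciphertext can enforce. The attributes and policy below are hypothetical; in a real ABE scheme this check is enforced mathematically by the ciphertext and keys, not by application code.

```python
# A toy illustration of the ABE access model only; no real cryptography here.
# In actual ABE, decryption succeeds if and only if the key's attributes
# satisfy the ciphertext's policy. Attributes and policy are hypothetical.

def satisfies(key_attributes: set[str], policy: dict) -> bool:
    """Evaluate an AND/OR policy tree against a key's attributes."""
    if "attr" in policy:  # leaf node: a single required attribute
        return policy["attr"] in key_attributes
    results = [satisfies(key_attributes, child) for child in policy["children"]]
    return all(results) if policy["op"] == "AND" else any(results)

policy = {  # decryptable iff (finance AND manager) OR auditor
    "op": "OR",
    "children": [
        {"op": "AND", "children": [{"attr": "finance"}, {"attr": "manager"}]},
        {"attr": "auditor"},
    ],
}

print(satisfies({"finance", "manager"}, policy))  # True
print(satisfies({"finance"}, policy))             # False
```

This is the “single line item, not the whole file cabinet” idea: each key opens exactly the data whose policy its attributes satisfy.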