November 29, 2022
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
There's interest from the CFO organization in vendor-neutral, third-party tools for cloud cost management and optimization, especially in multicloud environments, according to Forrester analyst Lee Sustar. "The cost management tools from cloud providers are generally fine for tactical decisions on spending but do not always provide the higher-level views that the CFO office is looking for," he added. As organizations move to a cloud-native strategy, Sustar said the initiative will often come from the IT enterprise architects and the CTO organization, with backing from the office of the CIO. "Partners of various sorts are often needed in the shift to cloud-native, as they help generalize the lessons from the early adopters," he noted. "Today, organizations new to the cloud are focused not on lifting and shifting existing workloads alone, but on modernizing on cloud-native tech." Multicloud container platform vendors offer a more integrated approach that can be tailored to different cloud providers, Sustar added.
APIs are a core part of how financial services firms are changing their operations in the modern era, Akamai said, given consumers' growing appetite for app-based services. The pandemic merely accelerated an existing trend toward remote banking, which led to a corresponding growth in the use of APIs. However, every new application, and every standardization of how app functions talk to one another (which is what creates APIs), increases the potential attack surface. Only high-tech firms and e-commerce companies were more heavily targeted via API exploits than the financial services industry. “Once attackers launch web application attacks successfully, they could steal confidential data, and in more severe cases, gain initial access to a network and obtain more credentials that could allow them to move laterally,” the report said. “Aside from the implications of a breach, stolen information could be peddled in the underground or used for other attacks. This is highly concerning given the troves of data, such as personally identifiable information and account details, held by the financial services vertical.”
Gartner research estimates that the number of knowledge workers globally exceeded one billion in 2019. These workers are defined as those who need to think creatively and deliver conclusions with strategic impact, and they are the very people cloud technology was designed to serve. Cloud integrations are in many cases highly advanced and operationally mature: businesses have integrated multi-cloud solutions, containerization and continuously learning AI/ML algorithms to deliver truly cutting-edge results. But those results are often not delivered at the scale or speed necessary for the split-second decisions needed to thrive in today’s operating environment. For cloud democratization to succeed, companies need to upskill their knowledge workers and equip them with the right tools to deliver value from cloud analytics. Low-code and no-code tools lower the experience barrier to extracting value from in-cloud data, while delivering on the original vision of cloud technology: giving people the power they need to have their voices heard.
Every effective BI system has a potent DWH at its core. That is because a data warehouse is the platform used to centrally gather, store, and prepare data from many sources for later use in business intelligence and analytics: think of it as a single repository for all the data needed for BI analyses. In an analytics DWH, historical and current data are kept in a structured form that is ideal for sophisticated querying. Once BI tools are connected, they produce reports with forecasts, trends, and other visualizations that support practical insights. A business analytics data warehouse comprises ETL (extract, transform, and load) tools, the DWH database, DWH access tools, and reporting layers. These components speed up the data science process and reduce, or even eliminate, the need to hand-code data pipelines. The ETL tools handle extracting data from source systems, converting its format, and loading it into the DWH, while the database component stores and manages the structured data used for reporting.
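To make the ETL stage concrete, here is a minimal, illustrative sketch in Python. It assumes a hypothetical CSV export from a source system and uses SQLite as a stand-in for the warehouse database; the file name, table name, and column names are placeholders, not taken from any specific product.

```python
# Minimal ETL sketch: extract rows from a source CSV export, apply a
# simple format conversion, and load the results into a SQLite table
# standing in for the DWH. All names here are hypothetical examples.
import csv
import sqlite3
from datetime import datetime

def extract(path):
    """Extract: read raw rows from the source system's CSV export."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: normalise dates to ISO format and cast amounts to numbers."""
    for row in rows:
        yield (
            row["order_id"],
            datetime.strptime(row["order_date"], "%d/%m/%Y").date().isoformat(),
            float(row["amount"]),
        )

def load(records, db_path="warehouse.db"):
    """Load: write the cleaned records into the warehouse fact table."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id TEXT PRIMARY KEY, order_date TEXT, amount REAL)"
    )
    con.executemany(
        "INSERT OR REPLACE INTO fact_orders VALUES (?, ?, ?)", records
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    # Chain the three stages over a hypothetical export file.
    load(transform(extract("orders_export.csv")))
```

In a real deployment the same extract-transform-load pattern would be handled by dedicated ETL tooling and the warehouse's own bulk-loading facilities, but the division of responsibilities is the same as in this sketch.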
Ransomware and extortion groups usually release stolen data publicly if a victim doesn't pay. In many cases, the victim organization hasn't publicly acknowledged it has been attacked. Should we write or tweet about that? ... These are victims of crime, and not every organization handles these situations well, but the media can make it worse. Are there exceptions to this rule? Sure. If an organization hasn't acknowledged an incident but numerous media outlets have published pieces, then the incident could be considered public enough. But many people tweet or write stories about victims as soon as their data appears on a leak site. I think that is unfair and plays into the attackers' hands by increasing the pressure on victims. ... Using leaked personal details to contact people affected by a data breach is a touchy area. I only do this in very limited circumstances. I did it with one person in the Optus breach, because at that point there were doubts about whether the data had originated with Optus. The person also lived down the road from me, so I could talk to them in person.
NIS2 will set the baseline for cybersecurity risk management measures and reporting obligations across all sectors covered by the directive, such as energy, transport, health and digital infrastructure. The revised directive aims to harmonise cybersecurity requirements and the implementation of cybersecurity measures across member states. To achieve this, it sets out minimum rules for a regulatory framework and lays down mechanisms for effective cooperation among the relevant authorities in each member state. It also updates the list of sectors and activities subject to cybersecurity obligations and provides for remedies and sanctions to ensure enforcement. The directive will formally establish the European Cyber Crises Liaison Organisation Network, EU-CyCLONe, which will support the coordinated management of large-scale cybersecurity incidents and crises. While under the old NIS directive member states were responsible for determining which entities met the criteria to qualify as operators of essential services, NIS2 introduces a size-cap rule as the general criterion for identifying regulated entities.