July 22, 2023
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
The fact that IBM, HPE, and Microsoft made such similar data fabric and lakehouse announcements indicates there is strong market demand, Patel says. But it’s also partly a result of the evolution of data architecture and usage patterns, he says. “I think there are probably some large enterprises that decide, listen, I can’t do this anymore. You need to go and fix this. I need you to do this,” he says. “But there’s also some level of just where we’re going…We were always going to be in a position where governance and security and all of those types of things just become more and more important and more and more intertwined into what we do on a daily basis. So it doesn’t surprise me that some of these things are starting to evolve.” While some organizations still see value in choosing best-of-breed products in every category that makes up the data fabric, many will gladly give up having the latest, greatest feature in one particular area in exchange for a whole data fabric they can move into and be productive with from day one.
The integration of DAST in the early stages of development is crucial for several reasons. First, by conducting dynamic security testing from the outset, teams can identify vulnerabilities earlier, making them easier and less costly to fix. This proactive approach helps to prevent security issues from becoming ingrained in the code, which can lead to significant problems down the line. Second, early integration of DAST encourages a security-focused mindset from the beginning of the project, promoting a culture of security within the team. This cultural shift is crucial in today’s cybersecurity climate, where threats are increasingly sophisticated and the stakes are higher than ever. DAST doesn’t replace other testing methods; rather, it complements them. By combining these methods, teams can achieve a more comprehensive view of their application’s security. In a shift-left approach, this combination of testing methods can be very powerful. By conducting these tests early and often, teams can ensure that both the external and internal aspects of their application are secure. This layered approach to security testing can help to catch vulnerabilities that might otherwise slip through the cracks.
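To make this concrete, here is a minimal sketch of what an early, automated dynamic check might look like as a CI gate. It assumes a hypothetical app running at a local URL and probes two cheap black-box signals (missing security headers and naive input reflection); a real pipeline would run a full DAST scanner such as OWASP ZAP instead.

```typescript
// dast-smoke.ts -- run with: npx tsx dast-smoke.ts http://localhost:3000
// Minimal dynamic checks against a *running* app. The target URL and the
// probe marker are illustrative; this is a sketch, not a real scanner.

const target = process.argv[2] ?? "http://localhost:3000"; // hypothetical app

async function main(): Promise<void> {
  const failures: string[] = [];

  // 1. Missing security headers are a common, cheap-to-catch finding.
  const res = await fetch(target);
  for (const header of ["content-security-policy", "x-content-type-options"]) {
    if (!res.headers.has(header)) {
      failures.push(`missing security header: ${header}`);
    }
  }

  // 2. Probe for naive reflection of attacker-controlled input
  //    (a crude stand-in for a scanner's reflected-XSS checks).
  const marker = "dast-probe-7f3a";
  const probe = await fetch(`${target}/?q=${encodeURIComponent(`<i>${marker}</i>`)}`);
  const body = await probe.text();
  if (body.includes(`<i>${marker}</i>`)) {
    failures.push("input reflected unencoded: possible XSS sink");
  }

  if (failures.length > 0) {
    console.error("DAST smoke checks failed:\n" + failures.map((f) => `  - ${f}`).join("\n"));
    process.exit(1); // fail the CI stage, surfacing the finding early
  }
  console.log("DAST smoke checks passed");
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Wiring even a crude check like this into the pipeline is what makes the shift-left promise real: a finding fails the build at commit time rather than surfacing in a pre-release audit.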
In the first attack detailed by Checkmarx, which occurred on 5 April and 7 April, a threat actor leveraged the NPM platform to upload packages containing a preinstall script that executed its objective upon installation. To appear more credible, the attacker created a spoofed LinkedIn profile posing as an employee of the victim bank. Researchers originally thought this may have been linked to legitimate penetration testing services commissioned by the bank, but the bank revealed that this was not the case and that it was unaware of the LinkedIn activity. The attack itself followed a multi-stage approach, which began with running a script to identify the victim’s operating system – Windows, Linux, or macOS. Once identified, the script decoded the relevant encrypted files in the NPM package, which then downloaded a second-stage payload. Checkmarx said that the Linux-specific encrypted file was not flagged as malicious by the online virus scanner VirusTotal, allowing the attacker to “maintain a covert presence on the Linux systems” and increase its chances of success.
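To illustrate the mechanism (not the actual malicious code, which is not reproduced here): npm runs a package’s preinstall script automatically at install time, before any of the package’s code is ever imported, which is what makes the hook attractive to attackers. The sketch below is defanged, and all file names are illustrative.

```typescript
// A package registers the hook in its package.json, e.g.:
//   "scripts": { "preinstall": "node preinstall.js" }
// npm executes that script automatically during `npm install`.

import { platform } from "node:os";

// Stage 1: fingerprint the host OS so the matching blob can be selected.
const stageFiles: Record<string, string> = {
  win32: "payload-win.enc",   // illustrative names, not real artifacts
  linux: "payload-linux.enc",
  darwin: "payload-macos.enc",
};

const blob = stageFiles[platform()];
if (blob) {
  // Stage 2, as described by Checkmarx (deliberately elided here):
  // decode the encrypted file shipped in the package, then download
  // and execute a second-stage payload from attacker infrastructure.
  console.log(`stage 1 would select ${blob} and fetch the second stage`);
}
```

Splitting the logic this way is what let the Linux-specific blob slip past VirusTotal: the file that scanners see is inert until the preinstall stage decodes it on a matching host.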
By introducing domain-oriented data ownership, domain teams become accountable for their data and products, improving data quality and governance. Traditional data lakes often encounter challenges related to scalability and performance when handling large volumes of data. However, data mesh architecture solves these scalability issues through its decentralized and self-serve data infrastructure. With each domain having the autonomy to choose the technologies and tools that best suit its needs, data mesh allows teams to scale their data storage and processing systems independently. ... Data Fabric is an integrated data architecture that is adaptive, flexible, and secure. It is an architectural approach and technology framework that addresses data lake challenges by providing a unified and integrated view of data across various sources. Data Fabric allows faster and more efficient access to data by abstracting away the technological complexities involved in data integration, transformation, and movement, so that anybody can use it.
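As a rough illustration of what domain-oriented ownership can look like in practice, a mesh team often publishes an explicit contract for each data product it owns. The shape below is an assumption for illustration only; no standard schema is implied, and all names and URLs are hypothetical.

```typescript
// A minimal "data product" contract a domain team might publish.
// Field names are illustrative; no standard schema is implied.
interface DataProduct {
  domain: string;              // owning domain, e.g. "payments"
  name: string;                // product identifier
  owner: string;               // accountable team or contact
  schemaUrl: string;           // where consumers find the published schema
  freshnessSlaMinutes: number; // how stale the data is allowed to get
  accessPolicy: "internal" | "restricted" | "public";
}

// Example: the payments domain owns and publishes this product.
const settledTransactions: DataProduct = {
  domain: "payments",
  name: "settled-transactions",
  owner: "payments-data@example.com",
  schemaUrl: "https://catalog.example.com/payments/settled-transactions",
  freshnessSlaMinutes: 60,
  accessPolicy: "internal",
};
```

Making ownership, schema location, and freshness explicit in a contract like this is what lets consuming teams rely on a product without routing every question through a central data team.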
It has become evident that there is a gap between the architecture team and those who interact with the application on a daily basis. Even in the context of microservice architecture, failing to adhere to best practices can result in a tangled mess that may force a return to monolithic structures, as we have seen with Amazon Web Services. I believe it is necessary to shift architecture left and give architects better tools to proactively identify architecture drift and technical debt buildup, injecting architectural considerations into the feature backlog. With few tools available to understand the architecture or identify drift, the role of the architect has become a topic of extensive discussion. Should every developer be responsible for architecture? Most companies have an architect who sets standards, goals, and plans. However, this high-level role in a highly complex and very detailed software project will often become detached from the day-to-day reality of the development process.
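One practical way to “shift architecture left” is to encode architectural rules as automated checks that run in CI, in the spirit of tools like ArchUnit or dependency-cruiser. Below is a minimal sketch of such a fitness function; the layer layout (src/ui, persistence) and the crude regex-based import matcher are assumptions for illustration, not a production-grade analysis.

```typescript
// arch-check.ts -- run with: npx tsx arch-check.ts
// A tiny "architecture fitness function": fail the build when UI code
// imports the persistence layer directly. Layer paths are hypothetical.

import { readdirSync, readFileSync } from "node:fs";
import { join } from "node:path";

const UI_DIR = "src/ui";                      // assumed layer layout
const FORBIDDEN = /from\s+["'].*persistence/; // crude import matcher

// Recursively collect all .ts files under a directory.
function tsFiles(dir: string): string[] {
  return readdirSync(dir, { withFileTypes: true }).flatMap((entry) =>
    entry.isDirectory()
      ? tsFiles(join(dir, entry.name))
      : entry.name.endsWith(".ts")
        ? [join(dir, entry.name)]
        : [],
  );
}

const offenders = tsFiles(UI_DIR).filter((file) =>
  FORBIDDEN.test(readFileSync(file, "utf8")),
);

if (offenders.length > 0) {
  console.error("architecture drift: UI imports persistence directly:");
  offenders.forEach((f) => console.error(`  ${f}`));
  process.exit(1); // surfaces the violation in CI, not months later
}
console.log("architecture rules hold");
```

Failing the build on a rule violation surfaces drift at review time, instead of months later when the tangle has already formed and a rewrite looks like the only way out.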
The case for legacy modernization should today be clear: technical debt is like a black hole, sucking up an organization’s time and resources and preventing it from developing the capabilities needed to evolve, adapt, and drive growth. But while legacy systems can limit and inhibit business growth, from large-scale disruption to subtle but long-term stagnation, changing them doesn’t have to be a painful process of “rip-and-replace.” In fact, rather than changing everything only to change nothing, an effective program enacts change in people, processes, and technology incrementally. It focuses on the areas that will make the biggest impact and drive the most value, making change manageable in the short term, substantial in its effect on an organization’s future success, and sustainable in the long term. In an era where executives often find themselves in FOMU (fear of messing up) mode, they would be wise to start with exactly those high-impact, high-value areas of legacy modernization.