November 19, 2024
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
"There is a clear need to align quality engineering metrics with business outcomes and showcase the strategic value of quality initiatives to drive meaningful change," the survey's team of authors, led by Jeff Spevacek of OpenText, stated. "On the technology front, the adoption of newer, smarter test automation tools has driven the average level of test automation to 44%. However, the most transformative trend this year is the rapid adoption of AI, particularly Gen AI, which is set to make a huge impact." ... While AI offers great promise as a quality and testing tool, the study said there are "significant challenges in validating protocols, AI models, and the complexity of validation of all integrations. Currently, many organizations are struggling to implement comprehensive test strategies that ensure optimized coverage of critical areas. However, looking ahead, there is a strong expectation that AI will play a pivotal role in addressing these challenges and enhancing the effectiveness of testing activities in this domain." The key takeaway point from the research is that software quality engineering is rapidly evolving: "Once defined as testing human-written software, it has now evolved with AI-generated code."
Here’s where it gets complicated. Implementing least privilege requires an application’s requirements specifications to be available on demand, with details of the hierarchy and context behind every interconnected resource. Developers rarely know exactly which permissions each service needs. For example, to read an object from an S3 bucket, we also need permission to list the bucket’s contents. ... This is where we become reactive and apply tools that scan for misconfigurations. Tools like AWS IAM Access Analyzer or Google Cloud’s IAM recommender are valuable for identifying risky permissions or potential overreach. However, if these tools become the primary line of defense, they can create a false sense of security. Most permission-checking tools are designed to analyze permissions at a point in time, often flagging issues after permissions are already in place. This reactive approach means that misconfigurations are only addressed after they occur, leaving systems vulnerable until the next scan. ... The solution lies in rethinking the way we wire up these relationships in the first place. Let’s take a look at two very simple pieces of code that both expose an API with a route to return a pre-signed URL from a cloud storage bucket.
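For reference, a minimal sketch of such a route, assuming Flask and boto3; the bucket name, route path, and handler are illustrative rather than taken from the article's own samples:

```python
import boto3
from flask import Flask, jsonify

app = Flask(__name__)
s3 = boto3.client("s3")  # credentials resolved from the environment

@app.route("/files/<key>/url")
def presigned_url(key):
    # A pre-signed GET URL only requires s3:GetObject on the target key.
    # Wiring this route up with a broad policy (e.g. s3:* on the bucket)
    # is exactly the over-provisioning that scanners later flag.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "example-bucket", "Key": key},  # hypothetical bucket
        ExpiresIn=3600,  # validity window in seconds
    )
    return jsonify({"url": url})
```

Note that whether the permissions behind `generate_presigned_url` were granted narrowly or broadly is invisible at the call site, which is why least privilege is won or lost in how the relationship is wired up, not in the handler code itself.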
Inexplicable black boxes lead back to the bewitchment of the Sorting Hat; with real-life tools we need to know how their decisions are made. As for the human-in-the-loop on whom we are pinning so much: if they are to step in and override AI decisions, the humans had better be on more than just speaking terms with their tools. Explanation is their job description. And it’s where the tools are used by the state to make decisions about us, our lives, liberty and livelihoods, that the need for explanation is greatest. Take a policing example. Whether or not drivers understand them, we’ve been rubbing along with speed cameras for decades. What will AI-enabled road safety tools look and sound and think like? If they’re on speaking terms with our in-car telematics, they’ll know what we’ve been up to behind the wheel for the last year, not just the last mile. Will they be on speaking terms with juries, courts and public inquiries, reconstructing events that took place before they were even invented, together with all the attendant sounds, smells and sensations rather than just pics and stats? Much depends on the type of AI involved, but even Narrow AI has given the police new reach, like remote biometrics.
Documentation doesn’t need to be a separate task or deliverable to complete. During every meeting or asynchronous interaction, you can organically create documentation by using a virtual whiteboard to take notes, create visuals, and complete activities. ... Look for tools that can help you build and maintain your technical documentation with less effort. Modern visual collaboration solutions like Lucid offer advanced features to streamline documentation. These solutions can automatically generate various diagrams such as flowcharts, ERDs, org charts, and UML diagrams directly from your data. Some even incorporate AI assistance to help build and optimize diagrams. By using automation, teams can significantly reduce errors commonly associated with the manual creation of documentation. Another advantage of these platforms is the ability to link your data sources directly to your documents. This integration ensures your documentation stays up to date automatically, without requiring additional effort. What's more, advanced visual collaboration solutions integrate with project management tools like Jira and Azure DevOps. This integration allows teams to seamlessly share visuals between their chosen platforms, saving time and effort in keeping information synchronized across their environment.
The complexity of modern cloud environments amplifies the need for robust observability. Cloud applications today are built upon microservices, RESTful APIs, and containers, often spanning multicloud and hybrid architectures. This interconnectivity and distribution introduce layers of complexity that traditional monitoring paradigms struggle to capture. Observability addresses this by utilizing advanced analytics, artificial intelligence, and machine learning to analyze real-time logs, traces, and metrics, effectively transforming operational data into actionable insights. One of observability’s core strengths is its capacity to provide a continuous understanding of system operations, enabling proactive management instead of waiting for failures to manifest. Observability empowers teams to identify potential issues before they escalate, shifting from a reactive troubleshooting stance to a proactive optimization mindset. This capability is crucial in environments where systems must scale instantly to accommodate fluctuating demands while maintaining uninterrupted service.
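To make the three signals concrete, here is a hedged sketch of manual instrumentation using the OpenTelemetry Python API; the service, span, and counter names are illustrative, and an SDK with exporters is assumed to be configured elsewhere:

```python
import logging
from opentelemetry import trace, metrics

tracer = trace.get_tracer("checkout-service")               # traces
meter = metrics.get_meter("checkout-service")               # metrics
requests_counter = meter.create_counter("checkout.requests")
log = logging.getLogger("checkout-service")                 # logs

def handle_checkout(order_id: str) -> None:
    # One request emits all three signals: a span for the call path,
    # a counter increment for rate analysis, and a structured log line.
    # An observability backend correlates them into a single view.
    with tracer.start_as_current_span("handle_checkout") as span:
        span.set_attribute("order.id", order_id)
        requests_counter.add(1)
        log.info("processing order %s", order_id)
```

Emitting the signals is the cheap part; the analytics layer described above is what correlates span, metric, and log into a single picture of system behavior.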
The onset of widespread remote work made the strategy much more prevalent, given that many organizations already had VDI infrastructure and experience. Due to its architectural design, infrastructure requirements scale more or less linearly with usage. But that means most organizations are often upside-down in their VDI investment — given that the costs are significant — and it seems that both practitioners and users have disdain for the experience. ... Maintaining VDI can be costly due to the need for patch management, hardware upgrades and support for end-user issues. An enterprise browser eliminates maintenance costs associated with traditional VDI systems because it requires no additional hardware. It also lowers administrative costs by centralizing controls within the browser, which reduces the need for multiple security tools and streamlines policy management. ... VDI solutions and their back-end systems can have substantial licensing fees, including the VDI platform and any extra licenses for the operating systems and apps used in VDI sessions. An enterprise browser can reduce the need for VDI by 80% to 90%, saving money on licensing costs. ... Ensuring secure and compliant endpoint interactions within a VDI session often requires additional endpoint controls and management solutions.