June 14, 2024
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
State machines are instrumental in defining recovery and failover mechanisms. By clearly delineating states and transitions, engineers can identify and code for scenarios where the system needs to recover from an error, fail over to a backup system, or restart safely. Each state can have defined recovery actions, and transitions can include logic for error handling and fallback procedures, ensuring that the system can return to a safe state after encountering an issue. My favorite phrase to advocate here is: “Even when there is no documentation, there is no scope for delusion.” ... Having neurodivergent team members can significantly enhance the process of state machine conceptualization. Neurodivergent individuals often bring unique perspectives and problem-solving approaches that are invaluable in identifying states and anticipating all possible state transitions. Their ability to think outside the box and foresee various "what-if" scenarios can make the brainstorming process more thorough and effective, leading to a more robust state machine design. This diversity in thought ensures that potential edge cases are considered early in the design phase, making the system more resilient to unexpected conditions.
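To make this concrete, here is a minimal sketch of a state machine whose transition table carries explicit recovery actions; the states, events, and action names are illustrative assumptions, not drawn from the article.

```python
# A minimal sketch of a state machine with explicit recovery/failover
# transitions. States, events, and recovery actions are hypothetical.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    RUNNING = auto()
    DEGRADED = auto()
    FAILED_OVER = auto()

class Machine:
    # Each (state, event) pair maps to (next_state, recovery_action).
    TRANSITIONS = {
        (State.IDLE, "start"): (State.RUNNING, None),
        (State.RUNNING, "error"): (State.DEGRADED, "retry_last_operation"),
        (State.DEGRADED, "error"): (State.FAILED_OVER, "switch_to_backup"),
        (State.DEGRADED, "recovered"): (State.RUNNING, None),
        (State.FAILED_OVER, "primary_restored"): (State.RUNNING, "resync_state"),
    }

    def __init__(self):
        self.state = State.IDLE

    def handle(self, event: str) -> None:
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            # Undefined transitions are rejected rather than guessed at:
            # the table documents every path the system is allowed to take.
            raise ValueError(f"no transition for {event!r} in {self.state.name}")
        next_state, action = self.TRANSITIONS[key]
        if action:
            print(f"recovery action: {action}")
        self.state = next_state
```

Because every legal transition is enumerated in one table, the table itself doubles as documentation of the system's recovery behavior, which is the point of the quoted phrase above.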
Sketch a data stack architecture that delivers the capabilities you've deemed necessary for your business. Your goal here should be to determine what your ideal data stack looks like, covering not just which types of tools it will include, but also which personnel and processes will leverage those tools. As you approach this, think in a tool-agnostic way. In other words, rather than looking at vendor solutions and building a stack based on what's available, think in terms of your needs. This is important because you shouldn't let tools define what your stack looks like. Instead, you should define your ideal stack first, and then select tools that allow you to build it. ... Another critical consideration when evaluating tools is how much expertise and effort are necessary to get tools to do what you need them to do. This is important because too often, vendors make promises about their tools' capabilities, but just because a tool can theoretically do something doesn't mean it's easy to do that thing with that tool. For example, a data discovery tool might require you to install special plugins or write custom code before it can work with a legacy storage system you depend on.
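As a rough illustration of this tool-agnostic planning step, here is a sketch of a capability map written down before any vendor is considered; the layer names, requirements, and owners are hypothetical examples.

```python
# A tool-agnostic sketch of the planning step described above.
# Layers, requirements, and owners are hypothetical examples.
ideal_stack = {
    "ingestion": {
        "requirements": {"batch and streaming", "legacy storage connectors"},
        "owned_by": "data engineering",
    },
    "storage": {
        "requirements": {"structured + semi-structured", "role-based access"},
        "owned_by": "platform team",
    },
    "discovery": {
        "requirements": {"automatic cataloging", "no custom plugins required"},
        "owned_by": "data governance",
    },
}

# Vendor tools enter the picture only after the ideal stack is written down:
# each candidate is scored against the requirements, not the other way around.
def fits(tool_capabilities: set, layer: str) -> bool:
    return ideal_stack[layer]["requirements"] <= tool_capabilities
```

Writing the requirements first keeps the evaluation honest: a tool that needs special plugins or custom code to meet a requirement fails the check, no matter what the vendor promises.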
A small AI approach has worked for Dayforce, a human capital management software vendor, says David Lloyd, chief data and AI officer at the company. Dayforce uses AI and related technologies for several functions, with machine learning helping to match employees at client companies to career coaches. Dayforce also uses traditional machine learning to identify employees at client companies who may be thinking about leaving their jobs, so that the clients can intervene to keep them. Not only are smaller models easier to train, but they also give Dayforce a high level of control over the data they use, a critical need when dealing with employee information, Lloyd says. When looking at the risk of an employee quitting, for example, the machine learning tools developed by Dayforce look at factors such as the employee’s performance over time and the number of performance increases received. “When modeling that across your entire employee base, looking at the movement of employees, that doesn’t require generative AI, in fact, generative would fail miserably,” he says. “At that point you’re really looking at things like a recurrent neural network, where you’re looking at the history over time.”
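As a rough sketch of the kind of recurrent model Lloyd describes, here is a small LSTM (in PyTorch) that scores attrition risk from a sequence of per-period employee features; the feature count, dimensions, and layer sizes are illustrative assumptions, not Dayforce's actual model.

```python
# A minimal sketch of a recurrent attrition model: a sequence of
# per-review-period employee features (e.g. performance score, raises
# received) feeds an LSTM that outputs an attrition probability.
# All feature names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class AttritionRNN(nn.Module):
    def __init__(self, n_features: int = 4, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_features), one row per review period
        _, (h_n, _) = self.lstm(x)
        return torch.sigmoid(self.head(h_n[-1]))  # attrition probability

model = AttritionRNN()
history = torch.randn(8, 12, 4)  # 8 employees, 12 periods, 4 features
risk = model(history)            # shape: (8, 1)
```

A model like this trains on a modest amount of tabular history and keeps the data pipeline entirely in-house, which is the control-over-data point Lloyd is making.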
In the current IT landscape, one of the most pressing challenges is the evolving threat of cyberattacks, particularly those augmented by GenAI. As GenAI becomes more sophisticated, it introduces new complexities for cybersecurity, with cybercriminals leveraging it to create advanced attack vectors. ... Several transformative technologies are reshaping our industry and the world at large. At the forefront of these innovations is GenAI. Over the past two years, GenAI has moved from theory to practice. While GenAI fostered many creative ideas in 2023 about how it would transform business, GenAI projects are now becoming business-ready, with visible productivity gains emerging. Transformative technology also holds strong promise for cybersecurity, offering advanced capabilities for threat detection and incident response. Organisations will need to use their own data for training and fine-tuning models, conducting inference where the data originates. Although there has been much discussion about zero trust within our industry, we're now seeing it evolve from a concept into a real technology.
QA is a funny thing. It has meant everything from “the most senior engineer who puts the final stamp on all code” to “the guy who just sort of clicks around randomly and sees if anything breaks.” I’ve seen QA operating at all levels of the organization, from engineers tightly integrated with each team to an independent, almost outside organization. A basic question as we look at shifting testing left, as we put more testing responsibility with the product teams, is what the role of QA should be in this new arrangement. This can be generalized as “who should own tests?” ... If we’re shifting testing left now, that doesn’t mean that developers will be running tests for the first time. Rather, shifting left means giving developers access to a complete set of highly accurate tests, so that instead of just guessing from their understanding of API contracts and a few unit tests that their code is working, developers can be truly confident that they are handing off working code before deploying it to production. It’s a simple, self-evident principle: when QA finds a problem, it should come as a surprise to the developers.
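As one hedged example of what a developer-run, pre-handoff test might look like, the sketch below checks a service response against the fields downstream consumers rely on rather than guessing from the API contract; the endpoint, payload, and schema are hypothetical.

```python
# A sketch of a contract-style test a developer might run before handoff.
# The service URL, endpoint, and response schema are hypothetical examples.
import requests

BASE_URL = "http://localhost:8080"  # assumed local test deployment

def test_create_order_matches_contract():
    resp = requests.post(f"{BASE_URL}/orders", json={"sku": "A-1", "qty": 2})
    assert resp.status_code == 201
    body = resp.json()
    # Verify the fields downstream consumers rely on, per the contract.
    assert {"id", "sku", "qty", "status"} <= body.keys()
    assert body["status"] == "pending"
```

If tests like this pass locally before every deploy, a later QA finding really is a surprise rather than business as usual.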
Implementing identity-based passwordless authentication in workstation-independent environments poses several unique challenges. First and foremost is the issue of interoperability: ensuring that authentication operates seamlessly across a diverse array of systems and workstations. This includes avoiding repetitive registration steps, which lead to user friction and inconvenience. Another critical challenge, without the benefit of mobile devices for biometric authentication, is implementing phishing- and credential-theft-resistant authentication to protect against advanced threats. Cost and scalability also represent significant hurdles. Providing individual hardware tokens to each user is expensive in large-scale deployments and introduces productivity risks associated with forgotten, lost, damaged or shared security keys. Lastly, the need for user convenience and accessibility cannot be overstated. Passwordless authentication must not only be secure and robust but also user-friendly and accessible to all employees, irrespective of their technical expertise.