August 07, 2024
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
Training an AI model is not cheap; ChatGPT cost $10 million to train in its current form, while the cost to develop the next generation of AI systems is expected to be closer to $1 billion. Traditional AI tends to cost less than generative AI because it runs on fewer GPUs, yet even the smallest AI projects can quickly reach a $100,000 price tag. An AI model should be built only if you expect to recoup the building costs within a reasonable time horizon. ... The right partner will help integrate new AI applications into the existing IT environment and, as mentioned, provide the talent required for maintenance. Choosing an existing model tends to be cheaper and faster than building a new one. Still, the partner or vendor must be vetted carefully. Vendors with an established history of developing AI will likely have better data governance frameworks in place. Ask them about policies and practices directly to see how transparent they are. Are they flexible enough to align those policies with yours? Will they demonstrate proof of their compliance with your organization’s policies? The right partner will be prepared to offer data encryption, firewalls, and hosting facilities to ensure regulatory requirements are met, and to protect company data as if it were their own.
Artificial intelligence technologies, including machine learning and natural language processing, have revolutionized how businesses analyze and utilize data. AI systems can process vast amounts of information at unprecedented speeds, uncovering patterns and generating insights that drive strategic decisions and operational efficiencies. However, the use of AI introduces complexities to data governance. Traditional data governance practices focused on managing structured data within defined schemas. AI, on the other hand, thrives on vast swaths of information and can generate entirely new data. ... As AI continues to evolve, so too must data governance frameworks. Future advancements in AI technologies, such as federated learning and differential privacy, hold promise for enhancing data privacy while preserving the utility of AI applications. Collaborative efforts between businesses, policymakers, and technology experts are essential to navigate these complexities and ensure that AI-driven innovation benefits society responsibly.
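To make the differential-privacy idea mentioned above concrete, here is a minimal, illustrative sketch (not from the article): a query releases an aggregate count with calibrated Laplace noise added, so no single individual's record measurably changes the published answer. The function name, dataset, and epsilon value are all assumptions chosen for the example.

```python
import numpy as np


def dp_count(values, threshold, epsilon, rng):
    """Differentially private count of values above a threshold.

    A single record can change the true count by at most 1, so the
    query's sensitivity is 1 and Laplace noise with scale 1/epsilon
    gives epsilon-differential privacy for this one release.
    """
    true_count = sum(v > threshold for v in values)
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise


# Hypothetical data: publish roughly how many people are over 60,
# without letting any one record be inferred from the output.
rng = np.random.default_rng(42)
ages = [34, 51, 29, 62, 45, 70, 38]
print(dp_count(ages, threshold=60, epsilon=0.5, rng=rng))
```

Smaller epsilon means stronger privacy but noisier answers; production systems also track a cumulative privacy budget across queries, which this sketch omits.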
Forensic data analysis faces a variety of technical, legal, and administrative challenges. Technical factors that affect forensic data analysis include encryption issues, the need for large amounts of disk storage space for data collection and analysis, and anti-forensics methods. Legal challenges can confuse or derail an investigation; for example, attribution issues stemming from a malicious program capable of executing malicious activities without the user’s knowledge. Such programs can make it difficult to identify whether cybercrimes were deliberately committed by a user or executed by malware. The complexity of cyber threats and attacks can create significant difficulties in accurately attributing malicious activity. Administratively, the main challenge facing data forensics involves accepted standards and management of data forensic practices. Although many accepted standards for data forensics exist, there is a lack of standardization across and within organizations. Currently, there is no regulatory body that oversees data forensic professionals to ensure they are competent and qualified and are following accepted standards of practice.
Businesses need to start at the top and ensure all DevSecOps team members accept a continuous security focus: security isn't a one-time event; it's an ongoing process. Leaders must encourage open communication between development, security, and operations teams, which can be achieved with regular meetings and shared communication platforms that facilitate constant collaboration. Developers must learn secure coding practices when building their models, while security and operations teams need to better understand development workflows to create practical security measures. Peer-to-peer communication and training are about partnership, not conflict, and effective DevSecOps thrives on collaboration, not finger-pointing. Only once these changes are in place can a DevSecOps team successfully execute a shift-left security approach and leverage the benefits of automation and efficiency. Once internal harmony is achieved, DevSecOps teams can begin consolidating automation and efficiency into their workflows by integrating security testing tools within the CI/CD pipelines.
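As one illustration of the kind of automated check that can run in a CI/CD pipeline, here is a toy secret scanner that fails a build if hard-coded credentials appear in committed text. The rule names and patterns are assumptions for this sketch; real shift-left tools (gitleaks, truffleHog, Bandit, and the like) ship far richer rule sets and integrations.

```python
import re

# Hypothetical rule set: two common credential shapes. Production
# scanners maintain hundreds of vetted patterns plus entropy checks.
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private_key": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}


def scan_text(text):
    """Return (rule_name, matched_string) pairs found in a text blob."""
    findings = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append((name, match.group()))
    return findings


# In a pipeline, a non-empty result would fail the job before deploy.
sample = "aws_key = 'AKIAIOSFODNN7EXAMPLE'"
print(scan_text(sample))
```

Wiring a check like this into the pipeline (rather than leaving it to reviewers) is the point of shifting left: the feedback arrives at commit time, when the fix is cheapest.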
Micro-credentials, when correctly implemented, can complement traditional degree programmes in a number of ways. Take, for example, the Advance Centre, which, in partnership with University College Dublin, Technological University Dublin and ATU Sligo, offers accredited programmes and modules with the intent of addressing Ireland’s future digital skill needs. “They enable students to gain additional skills and knowledge that supplement their professional field. For example, a mechanical engineer might pursue a micro-credential in cybersecurity or data analytics to enhance their expertise and employability,” said O’Gorman. By bridging very specific skills gaps, micro-credentials can cover material that may otherwise not be addressed in more traditional degree programmes. “This is particularly valuable in fast-evolving fields where specific up-to-date skills are in high demand.” Furthermore, it is fair to say that balancing work, education and your personal life is no easy feat, but this shouldn’t mean that you have to compromise on your career aspirations.
Adopting AI technologies requires substantial computational power, storage space and low-latency networking to train and run models. These requirements make purpose-built hosting environments such as data centres a natural fit, so as the demand for AI grows, so will the demand for data centres. However, limits on connecting new data centres to the grid constrain build-out, which highlights edge data centres as one solution to the data centre capacity problem. ... Under this pressure, cloud computing has emerged as a cornerstone of these modernisation efforts, with companies choosing to move their workloads and applications onto the cloud. This shift has brought challenges for companies around managing costs and ensuring data privacy. As a result, organisations are considering cloud repatriation as a strategic option. Cloud repatriation is essentially the migration of applications, data and workloads from the public cloud environment back to on-premises or colocation infrastructure.