3 things every CTO should know about Artificial Intelligence in 2023

Just before the end of 2022, OpenAI released a new Deep Learning model called ChatGPT. Within a matter of days, over one million users registered to use its capabilities. That is mind-blowing in and of itself, but to put it in perspective: Netflix took more than three years to reach one million users. This shows that the next generation of software applications will not be engineered; they will be modelled through Artificial Intelligence (AI).

For organisations to stay ahead in this competitive age, CTOs need to have a clear understanding of how AI is impacting the software engineering lifecycle and how to adopt it. In particular, they should be aware of three things:

#1 Pouring money into AI without a proper AI strategy is a recipe for disaster

AI projects can be complex and involve many different factors, such as data collection and preparation, model training and evaluation, and deployment and maintenance. Without a clear strategy, it can be difficult to ensure that all of these factors are properly considered and that the AI solution meets the needs and goals of the organisation.

A well-defined AI strategy can help an organisation to:

  • Clearly define the business problem that the AI solution is intended to solve
  • Identify the resources and expertise needed to develop and implement the solution
  • Determine the appropriate AI technology and approach for the task
  • Establish a roadmap for implementing and deploying the solution
  • Monitor and measure the performance of the solution to ensure that it is meeting its objectives

#2 A new subfield of AI will move out of research labs and into industry

There are several types of machine learning, each with its own unique characteristics and applications. One type of machine learning is supervised learning, which involves training a model on labeled data, where the input features and corresponding output labels are provided (think of an image of a computer chip as the input features and the label being whether it contains any defects or not).
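
To make that concrete, here is a minimal supervised-learning sketch in Python using scikit-learn. The synthetic dataset and the choice of a random forest are illustrative assumptions standing in for the chip-defect example, not code from a real project:

```python
# Minimal supervised-learning sketch (scikit-learn); synthetic data stands in
# for labelled chip images, and the model choice is purely illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Each row is a feature vector; each label says defect (1) or no defect (0).
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Train on labelled examples, then check how well it generalises to held-out data.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```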

Another type of machine learning is semi-supervised learning, which involves training a model on a mixture of labeled and unlabelled data. This is useful when we have a limited amount of labeled data and a large amount of unlabelled data, as the model can still learn from the unlabelled data.
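
A rough sketch of how this can look in practice, assuming scikit-learn's self-training wrapper (the 10% labelling rate and base classifier are illustrative assumptions):

```python
# Semi-supervised sketch: unlabelled samples are marked with -1, and the base
# classifier is iteratively retrained on its own most confident predictions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)

# Pretend we could only afford to label 10% of the data.
rng = np.random.default_rng(0)
y_partial = y.copy()
y_partial[rng.random(len(y)) > 0.1] = -1

model = SelfTrainingClassifier(SVC(probability=True), threshold=0.9)
model.fit(X, y_partial)
print("Accuracy against the true labels:", model.score(X, y))
```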

Large language models, such as GPT-3, are a type of generative AI that can generate human-like text and perform a wide range of language tasks. These models are trained using a combination of supervised and unsupervised learning, allowing them to learn the structure of language and generate coherent text.
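
GPT-3 itself is only reachable through OpenAI's hosted API, but the same idea can be sketched locally with an openly available model via the Hugging Face transformers library (the model choice and prompt below are assumptions for illustration):

```python
# Text-generation sketch with GPT-2 as an open stand-in for a large language
# model; the prompt and generation settings are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "Three things every CTO should know about AI:"
result = generator(prompt, max_new_tokens=60, num_return_sequences=1)
print(result[0]["generated_text"])
```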

Finally, there is reinforcement learning, which is a type of machine learning in which an agent learns to interact with its environment in order to maximise a reward signal. This type of learning is often used to train AI agents to perform complex tasks, such as playing video games or controlling robots. The agent learns through trial and error, receiving rewards or punishments based on its actions.
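
That trial-and-error loop can be sketched with tabular Q-learning on a toy Gymnasium environment; the environment and hyperparameters below are illustrative assumptions, not a production set-up:

```python
# Tabular Q-learning sketch: the agent acts, receives rewards, and nudges its
# value estimates towards reward plus discounted future value.
import numpy as np
import gymnasium as gym

env = gym.make("FrozenLake-v1", is_slippery=False)
q = np.zeros((env.observation_space.n, env.action_space.n))
alpha, gamma, epsilon = 0.1, 0.99, 0.1  # learning rate, discount, exploration

for episode in range(2_000):
    state, _ = env.reset()
    done = False
    while not done:
        # Explore occasionally, otherwise act greedily on current estimates.
        if np.random.random() < epsilon:
            action = env.action_space.sample()
        else:
            action = int(np.argmax(q[state]))
        next_state, reward, terminated, truncated, _ = env.step(action)
        q[state, action] += alpha * (
            reward + gamma * np.max(q[next_state]) - q[state, action]
        )
        state, done = next_state, terminated or truncated

print("Learned greedy policy:", np.argmax(q, axis=1))
```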

Up until now, reinforcement learning has seen most of its successes in robotics and gaming, but we are starting to see new industries adopt this subfield of AI (e.g. finance and manufacturing). For example, a trading agent might be trained to increase profits by buying and selling stocks at the right times, while a manufacturing optimisation agent might be trained to reduce production costs by adjusting process parameters.

These models are among the most sophisticated AI models out there, and we can expect them to become more and more powerful in the years ahead. According to Mo Gawdat (former Chief Business Officer of Google X), AI could become a billion (yes, billion with a 'b') times smarter than any single person on this planet, unlocking an array of new applications along the way.

#3 Data will no longer be the limiting factor

In recent years, the importance of compute in training machine learning (ML) models has been increasing, with some experts even suggesting that compute is becoming more important than data. There are several reasons for this trend.

First, the size of ML models has been increasing dramatically, with some models having billions of parameters. Training such large models requires a significant amount of compute power, as the model must be trained on a large dataset and perform many calculations to update its parameters.
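
As a back-of-the-envelope illustration, a commonly cited rule of thumb from the scaling-law literature puts training compute at roughly 6 × parameters × tokens; the model size and token count below are assumptions chosen to resemble a GPT-3-scale run, not figures from this article:

```python
# Back-of-the-envelope training-compute estimate, assuming the rule of thumb
# C ≈ 6 * parameters * tokens. The inputs below are illustrative assumptions.
def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total floating-point operations for one training run."""
    return 6 * n_params * n_tokens

params = 175e9   # e.g. a GPT-3-sized model with 175 billion parameters
tokens = 300e9   # e.g. a few hundred billion training tokens

flops = training_flops(params, tokens)
print(f"~{flops:.2e} FLOPs")                              # ≈ 3.15e23
print(f"~{flops / (1e15 * 86_400):.0f} petaFLOP/s-days")  # at a sustained 1 PFLOP/s
```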

Second, the demand for ML models with higher performance has also been increasing, which often requires training on larger and more complex datasets. This can further increase the amount of compute required to train the model.

Third, the field of ML is constantly evolving, with new techniques and approaches being developed all the time. These new techniques often require even more compute power in order to be practical.

Finally, the availability of cloud-based compute resources has made it easier for organisations to access the large amounts of compute needed to train ML models. This has further fuelled the trend towards more compute-intensive ML training.

Overall, the increasing size and complexity of ML models, the demand for higher performance, the evolution of the field, and the availability of cloud-based compute resources have all contributed to the trend of compute becoming increasingly important in ML model training.

The takeaway

AI has had a profound impact on our society in recent years, and it will continue to do so in ever more impactful ways in the years ahead. Staying innovative requires CTOs to understand the impact of AI on their industry and to come up with a cost-aware strategy that integrates AI with their software engineering practices. Fortunately, getting started doesn't have to be that difficult. Curious how? Let's discuss that in next week's blog post!
