The Short Shelf Life of AI Models
Yes, your AI models have a “best before” date.
For anyone who comes from the world of technology, the old-school thinking and approach to software development needs to be thrown out. AI is a completely different beast, and the models you are thinking about building, training, and using in your business will have a very short shelf life.
This is not a world where you can develop an application and have it live on for many years with straightforward updates and maintenance. AI is evolving rapidly, and you'll have to deal with changes at a pace you've never experienced before.
The Rapid Evolution of AI
Think of the AI models that have become antiquated and tiresome over the past two years. Even GPT-4 is last year's news, never mind GPT-3.5 reaching end-of-life in record time, and everyone is already looking beyond GPT-4o.
Putting concept drift, data drift, and model collapse aside for the moment, the pace of change in new models alone is astounding, and the models you use this year are not going to be the same models you use next year, unless you want the business to fall further behind.
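To make the drift point concrete, here is a minimal sketch of what a data-drift check can look like in practice. The feature values, threshold, and `feature_has_drifted` helper are illustrative assumptions, not anything specific to this article or to any particular platform.

```python
# Minimal sketch of a data-drift check: a two-sample Kolmogorov-Smirnov
# test comparing one numeric feature's training distribution against
# what the model is seeing in production. Names and thresholds are
# illustrative assumptions.
import numpy as np
from scipy.stats import ks_2samp


def feature_has_drifted(train_values, live_values, p_threshold=0.01):
    """Return True if the live distribution differs significantly
    from the training distribution for this feature."""
    statistic, p_value = ks_2samp(train_values, live_values)
    return p_value < p_threshold


# Example: the live data has shifted upward relative to the training data.
rng = np.random.default_rng(42)
training = rng.normal(loc=0.0, scale=1.0, size=5_000)
production = rng.normal(loc=0.4, scale=1.0, size=5_000)
print(feature_has_drifted(training, production))  # True: drift detected
```

A check like this only covers one narrow slice of monitoring, but it illustrates why drift detection has to be automated and run continuously rather than treated as a one-time exercise.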
Additionally, the techniques being applied in AI ecosystems are becoming far more advanced and intricate. This includes the recent interest in Agentic Flows, as well as the promotion of “Baby Models” to cope with domain-specific use cases and continuous learning. It also includes new innovations in multi-modality and augmentation that will change how models are used in production environments.
Technical Debt and Lifecycle Management
Technical debt? You are going to be swimming in it unless you have a mature AI Lifecycle Management process, and that includes AI Governance. You need to become best friends with your data and extremely intimate with transformational change in the business.
Forget about downloading off-the-shelf models and training them ad nauseam. Stop focusing on training and start focusing on inferencing: that is where the money is for the business, not in the backroom exercise of train, train, train, which amounts to little more than experimentation and expensive demoware gone wild.
AI is not cheap, and the hype in the market makes it appear as though everyone can just pick up a model and suddenly get value. That's certainly not the case. AI is complex and costly, especially if there is no forethought given to the lifecycle of build, test, deploy, manage AND repeat.
The New AI Lifecycle
Don't misunderstand me: you will have to train and keep training. It's just that your *new* AI lifecycle is going to have to own this process, and you'll need to automate it fast. You will also need to figure out how to test, test, test, and this is going to hurt, because unit tests suck in the world of AI. So get comfortable with behavior testing and the intricate details that go along with it. With AI Governance looming large, your testing and monitoring skills are going to have to be much sharper and deeper than ever before.
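As an illustration of what behavior testing can look like here, below is a minimal pytest-style sketch. The `summarize` wrapper, the sample filing text, and the specific assertions are hypothetical stand-ins for whatever model and domain checks you actually deploy.

```python
# Sketch of behavior tests for a generative model, written pytest-style.
# `summarize` is a hypothetical wrapper around whatever model is currently
# deployed; it is stubbed here so the example runs stand-alone. Each test
# asserts a property of the behavior, not an exact output string.
FILING = (
    "Acme Corp reported revenue of $1.2B and net income of $150M for the "
    "fiscal year, driven by growth in its subscription business. " * 20
)


def summarize(document: str) -> str:
    # Placeholder standing in for a call to the deployed model.
    return (
        "Acme Corp revenue was $1.2B with net income of $150M, "
        "led by subscription growth."
    )


def test_summary_is_materially_shorter():
    summary = summarize(FILING)
    assert len(summary) < 0.2 * len(FILING)


def test_summary_preserves_key_facts():
    summary = summarize(FILING)
    for fact in ("revenue", "net income", "$1.2b"):
        assert fact in summary.lower()


def test_summary_avoids_speculative_language():
    summary = summarize(FILING)
    for phrase in ("i think", "probably", "as an ai"):
        assert phrase not in summary.lower()
```

Because the assertions target properties such as length, preserved facts, and banned phrasing rather than exact strings, a suite like this can survive a model swap, which is exactly the situation the new lifecycle has to handle.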
Even your version control mindset will need to shift into real-time operations for AI Governance. No longer can you, as a software developer, a DevOps/MLOps engineer, or a data scientist, remain in the comfort of your VCS. Version control takes on an entirely new life as business outcomes get decided by operational models. Nowhere is this truer than in regulated industries such as finance.
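As one sketch of what versioning beyond source code might mean in practice, the record below ties a deployed model to its prompts, data lineage, evaluation evidence, and an accountable approver. All of the field names and values are hypothetical, intended only to show the kind of information governance needs to trace an operational decision back to a specific model state.

```python
# Sketch of a deployment record that versions more than source code.
# Field names and values are illustrative; the point is that the model,
# its prompts, its data lineage, and its evaluation evidence are
# versioned together so an operational outcome can be audited.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class ModelDeploymentRecord:
    model_name: str          # foundation or fine-tuned model in use
    model_version: str       # provider or registry version identifier
    prompt_bundle_sha: str   # hash of the prompt/config bundle
    training_data_sha: str   # hash or snapshot ID of the data lineage
    eval_report_uri: str     # behavior-test results that gated the release
    approved_by: str         # accountable owner, essential in regulated settings
    deployed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


record = ModelDeploymentRecord(
    model_name="doc-analysis-model",
    model_version="2024.11.2",
    prompt_bundle_sha="a3f9c1e",
    training_data_sha="7b42d90",
    eval_report_uri="s3://governance/eval-reports/2024-11-02.json",
    approved_by="model-risk-committee",
)
print(record)
```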
Key Points that Just “Scratch the Surface”
The following points just scratch the surface of an AI Lifecycle, but they're important ingredients in developing an AI Governance framework:
And most importantly, figure out the right use cases that matter to the business for real gains in the market, not just the “scratching the surface” productivity gains that might be perceived with chatbots. Business units and business executives must get involved in understanding the costs of adopting AI.
Lessons from Experience
At Charli, we have “battle scars” that attest to the pain that comes with AI. Even our prior experience with Digital Twins proved how difficult the world of AI can be without the right lifecycle in place. This year alone, we've switched out and upgraded models so many times it could make someone's head spin, and we've relied heavily on our automation and behavior tests to ensure that outcomes don't deviate from customer expectations. We've even blocked mainstream models (you'll know their names) because they just can't make the grade.
Along with the “battle scars”, we also have proof that when done right, AI can be a game changer!
Interestingly, I received "The Information" news email this morning discussing LLMs hitting a wall (see link below). The topic was already being discussed months ago, and it reinforces the need to look at newer techniques in the world of AI to generate returns. We've seen the push to ever-larger models and bigger context windows generate mixed results in our own testing, and we live in a finance world where "mixed" doesn't cut it. https://www.theinformation.com/articles/expect-ai-developers-to-benefit-from-energy-buildouts-face-chip-uncertainty-under-trump-experts-say-are-llms-reaching-a-plateau