GenAI: 10 Key Insights for Leadership - Part 1
Mirza Rahim Baig
Top AI Voice | Educator | Author | Startup Mentor | GenAI, AI, Machine Learning
With “game changing” events occurring practically every single day, it is impractical to keep track of them all and figure out what matters. Are there key patterns emerging over the past year? What do they mean for your business or organization? How should you act? I answer these questions in this two-part article to help decision makers separate signal from noise. Presenting to you the 10 key insights from the past year, why each matters, and what you should do as a leader.
Insight 1: LLM Technology is now highly accessible, with numerous no-code solutions available.
Less than a year ago, creating a decent chatbot with function calling and memory, or a decent Retrieval-Augmented Generation (RAG) application, needed a fair amount of skill. This was the domain of coders exclusively, and you needed skilled resources. But now we have low-code AI agent frameworks like Relevance AI, StackAI, FlowiseAI (YC S23), VectorShift and others, and LLM-based agentic solutions can be built with drag-and-drop tools in minutes. Cloudflare, for example, offers a simple playground that you can try for yourself.
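To appreciate just how much these drag-and-drop tools now absorb, here is a rough sketch of what a hand-rolled RAG prototype used to involve. It is purely illustrative: the retrieval step uses a toy TF-IDF index, and call_llm is a hypothetical placeholder for whichever model API you use, not any specific vendor's SDK.

```python
# Minimal sketch of a hand-rolled RAG prototype: retrieve relevant text,
# stuff it into a prompt, call an LLM. Illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support is available Monday to Friday, 9am to 6pm CET.",
    "Enterprise plans include a dedicated account manager.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)  # index the knowledge base


def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    query_vector = vectorizer.transform([query])
    scores = cosine_similarity(query_vector, doc_vectors)[0]
    top_indices = scores.argsort()[::-1][:k]
    return [documents[i] for i in top_indices]


def call_llm(prompt: str) -> str:
    """Placeholder: swap in your LLM provider's chat-completion call here."""
    raise NotImplementedError("Wire this up to your model provider of choice.")


def answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return call_llm(prompt)
```

Even this toy version needs a developer to choose a vectoriser, wire the retrieval into the prompt and handle the model call; the low-code builders hide all of that behind a visual canvas.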
Why does it matter? These solutions have significantly lowered the barrier to entry for building AI-driven solutions. Smaller firms can now innovate and move at high velocity, quickly developing and deploying AI applications. They can outpace traditional, slower development processes, and in aggregate these firms can become considerable competition.
Action for leaders: First, get comfortable with the technology yourself. Enable your teams to explore these tools and solutions. Encourage a culture of innovation not just among developers but across departments.
Insight 2: Big tech is going big on platforms
Related to insight 1, a year ago developers needed to stack together hacky packages and utilities to make even simple applications work. Not surprisingly, the major tech companies have since built multiple solutions themselves and integrated them into their platforms. Now you don’t need to use any external utility or leave the cloud platform to use open-source models. Amazon, Microsoft, Google and IBM / IBM iX have all heavily invested in their own AI platforms. We’re increasingly moving towards a competitive landscape dominated by big players.
Why does it matter? An ecosystem increasingly defined and dominated by big players has both pros and cons. Flexibility of solutions is limited and vendor lock-in is a real thing. The good news is that integration is so much easier!
Action for leaders: Choose your technology partners wisely. Align your AI strategy with the platform that best suits your business needs, but also consider developing in-house expertise to avoid over-reliance on a single provider.
Insight 3: Inference costs are declining sharply
The costs for inference with LLMs have fallen sharply, dropping by a whopping ~85% year over year. For instance, OpenAI's GPT-4o mini is significantly cheaper to operate than even GPT-3.5 Turbo. This trend holds across providers, and year-on-year price drops may well become the new “LLM law”.
Why does it matter? Beyond one-off usage (say, personal productivity) becoming cheaper, lower inference costs mean much cheaper scaling! Expect potential cost savings and improved margins.
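A back-of-the-envelope calculation shows why this compounds at scale. The prices and volumes below are illustrative assumptions, not quotes from any provider; always check current per-token pricing before budgeting.

```python
# Back-of-the-envelope sketch of how falling inference prices change scaling math.
# All numbers are illustrative placeholders, not real provider pricing.
OLD_PRICE_PER_1K_TOKENS = 0.0015   # assumed "last year" blended price (USD)
NEW_PRICE_PER_1K_TOKENS = 0.0002   # assumed current blended price (USD)

requests_per_month = 500_000       # hypothetical workload
tokens_per_request = 1_500         # prompt + completion, rough average


def monthly_cost(price_per_1k: float) -> float:
    return requests_per_month * tokens_per_request / 1000 * price_per_1k


old_cost = monthly_cost(OLD_PRICE_PER_1K_TOKENS)
new_cost = monthly_cost(NEW_PRICE_PER_1K_TOKENS)
print(f"Old monthly cost: ${old_cost:,.0f}")
print(f"New monthly cost: ${new_cost:,.0f}")
print(f"Saving: {100 * (1 - new_cost / old_cost):.0f}%")
```

With these illustrative numbers the same workload costs roughly an order of magnitude less, which is why use cases that were uneconomical a year ago are worth revisiting.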
Action for Leaders: It is a good time to re-evaluate your AI budget. If cost was your biggest concern, now is a good time to explore new opportunities to pilot and scale AI solutions in your business. For customer delight, consider passing some of these cost savings on to customers. :)
Insight 4: LLMOps emerges as a discipline, RAGs mature, Agents pick up
LLMOps (Large Language Model Operations) is quickly emerging as a discipline in its own right. We have come a long way from very hacky chatbot interfaces with zero guardrails to performant RAG solutions with effective guardrails and controls. AI agents are all the rage now and attract the most active research and hype. There is a tendency to try to “automate everything to replace humans” using agents, which calls for caution.
Why does it matter? Businesses are now rightly demanding RoI, and reliability and security are key prerequisites for it. We are making good progress in this regard. The tendency to automate everything, however, is risky.
Action for Leaders: Focus on building balanced, human-in-the-loop systems that leverage AI while maintaining necessary oversight. Keep pushing your teams for system reliability and encourage them to monitor system health.
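As a concrete illustration of “human-in-the-loop”, consider the pattern below: low-risk agent actions run automatically, while anything above a risk threshold waits for explicit human approval. This is only a minimal sketch; the action names, risk scores and threshold are hypothetical and would come from your own risk policy.

```python
# Minimal sketch of a human-in-the-loop gate for an agent: low-risk actions run
# automatically, anything above a risk threshold waits for explicit approval.
from dataclasses import dataclass


@dataclass
class ProposedAction:
    name: str
    risk: float  # 0.0 (harmless) to 1.0 (high impact), scored by your own policy


APPROVAL_THRESHOLD = 0.5  # hypothetical cut-off


def execute(action: ProposedAction) -> None:
    print(f"Executing: {action.name}")


def request_human_approval(action: ProposedAction) -> bool:
    reply = input(f"Approve '{action.name}' (risk {action.risk:.1f})? [y/N] ")
    return reply.strip().lower() == "y"


def run_agent_step(action: ProposedAction) -> None:
    if action.risk < APPROVAL_THRESHOLD:
        execute(action)                      # safe to automate
    elif request_human_approval(action):
        execute(action)                      # human approved
    else:
        print(f"Skipped: {action.name}")     # human kept the override


run_agent_step(ProposedAction("send summary email to customer", risk=0.2))
run_agent_step(ProposedAction("issue a refund of 500 EUR", risk=0.8))
```

The important design choice is not the code but the policy: decide in advance which classes of action an agent may take on its own and which always require a person to sign off.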
Insight 5: Solution evaluation has improved, but remains a tricky task
Model and solution evaluation methods have certainly improved over the past year. For general tasks, we now have multiple benchmarking frameworks such as the LMSYS Chatbot Arena, Ragas, HELM and more. For industry-specific applications and narrow use cases, however, evaluation remains a significant challenge.
Why does it matter? Trust in AI systems, and confidence that they deliver business value, is essential. Systems should be deployed only with proven reliability and with validation against business outcomes and regulatory requirements.
Action for Leaders: Invest in developing evaluation frameworks for your industry and company, and ideally for each specific project. Ensure continuous monitoring and evaluation of AI systems, and keep them aligned with business goals and regulatory requirements.
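A use-case-specific evaluation does not have to be elaborate to be useful. The sketch below runs a tiny “golden set” of questions through a system and checks that required facts appear in each answer. The golden set and answer_question are hypothetical placeholders for your own data and your own deployed system, not a standard benchmark.

```python
# Sketch of a tiny, use-case-specific evaluation harness: run each golden
# question through the system and check that required facts appear in the answer.
golden_set = [
    {"question": "What is the return window?", "must_contain": ["30 days"]},
    {"question": "When is support available?", "must_contain": ["Monday", "Friday"]},
]


def answer_question(question: str) -> str:
    """Placeholder: call your deployed RAG / agent system here."""
    raise NotImplementedError


def evaluate() -> float:
    passed = 0
    for case in golden_set:
        answer = answer_question(case["question"])
        if all(fact.lower() in answer.lower() for fact in case["must_contain"]):
            passed += 1
        else:
            print(f"FAILED: {case['question']}")
    score = passed / len(golden_set)
    print(f"Pass rate: {score:.0%}")
    return score
```

Run something like this on every release, track the pass rate over time, and expand the golden set as new failure modes show up in production.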
That's it! These were the first five insights. I hope you found them useful!
Stay tuned for part 2 for the remaining five.
Do you agree with these insights? Would you act differently? Let me know in the comments!
#businessintelligence #artificialintelligence #data #leadership