Large Language Models from Buzz to Reality
Kashif Manzoor
Enabling Customers for a Successful AI Adoption | AI Tech Evangelist | AI Solutions Architect
Investment in Generative AI is at an all-time high. According to a Goldman Sachs report, AI investment will reach $200 billion globally by 2025. As per CB Insights, investment in Generative AI quadrupled in 2023 compared to the previous year. All of this pressures organizations to adopt Generative AI quickly.
Everyone is trying to find the best use case that can be implemented for the organization. Navigating the buzz is challenging and time-consuming for everyone, including me.
Over the last year, I have explored and curated different use cases published in the earlier newsletter, 100+ Generative AI Use Cases Across Industries. You can download the guide.
Today, we will discuss deploying large language models (LLMs) in the IT ecosystem, which involves choosing the right approach and following a strategic process. The most common question is whether to make or buy LLMs for your organization. Luckily, the Initiative for Applied Artificial Intelligence has published A Guide for Large Language Model Make-or-Buy Strategies: Business and Technical Insights, which outlines nine points to weigh before deciding.
Knowing the various deployment strategies for LLMs will help you leverage their capabilities within your existing IT ecosystem, allowing you to integrate them seamlessly and unlock your organization's full potential.
We have also discussed a few approaches for deploying Generative AI, starting with Consume, Embed, Extend, and Build. According to Gartner, here's a breakdown of the five significant approaches for deploying Generative AI in various applications:
1 - Consume: Gen AI Embedded in Applications
In this approach, generative AI is already integrated within applications. Users consume the AI capabilities as they are, without needing to modify or customize the underlying AI models.
2 - Embed: Gen AI APIs in Custom Application Frameworks
Here, generative AI capabilities are accessed through APIs and embedded into custom applications. This allows for a more tailored use of AI functionalities within a specific application framework.
3 - Extend: Gen AI Models via Data Retrieval
This method extends existing generative AI models by retrieving and incorporating external data at query time. It enhances the AI model's capabilities by leveraging additional data sources.
4 - Extend: Gen AI Models via Fine-Tuning
This approach extends generative AI models through fine-tuning rather than data retrieval. Fine-tuning adjusts the AI model's parameters for specific tasks or data sets, providing more customized outputs.
5 - Build: Gen AI Custom Models from Scratch
The most advanced approach involves building custom generative AI models from scratch. This allows complete control over the AI model's design and capabilities, which can be tailored to unique and specific requirements.
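To make approach 3 concrete, here is a minimal sketch of extending a model via data retrieval (often called retrieval-augmented generation). The keyword-overlap scoring and the in-memory document list are simplified stand-ins of my own; a production system would use an embedding model and a vector database, and the final prompt would be sent to an LLM API rather than printed.

```python
# Sketch of "Extend: Gen AI Models via Data Retrieval" (RAG).
# Assumption: a real deployment would replace retrieve() with
# vector search and pass the prompt to an LLM provider's API.

def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,  # most-overlapping documents first (stable sort)
    )
    return scored[:top_k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Prepend retrieved context so an off-the-shelf LLM answers from it."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

docs = [
    "Our refund policy allows returns within 30 days.",
    "Support is available 24/7 via chat.",
    "Shipping takes 3-5 business days.",
]
prompt = build_prompt("What is the refund policy?", docs)
```

The key design point is that the base model is never modified: extra knowledge enters only through the prompt, which is why this approach is less resource-intensive than fine-tuning (approach 4) or building from scratch (approach 5).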
Deployment Steps:
In my observation, the best approach is to start small: pick an initial use case with a simple task and a less resource-intensive approach (consume or embed) before attempting complex implementations.
Most critically, security is paramount in any approach: make sure your chosen LLM provider offers robust security measures, especially when dealing with sensitive data.
Weekly News & Updates...
This week's AI breakthroughs mark another leap forward in the tech revolution.
The Cloud: the backbone of the AI revolution
Favorite Tip Of The Week:
Here's my favorite resource of the week.
Potential of AI
Things to Know
The Opportunity...
Podcast:
Courses to attend:
Events:
Tech and Tools...
Data Sets...
Other Technology News
Want to stay on the cutting edge?
Here's what else is happening in Information Technology that you should know about:
Earlier Edition of a newsletter
That's it!
As always, thanks for reading.
Hit reply and let me know what you found most helpful this week - I'd love to hear from you!
Until next week,
Kashif Manzoor
The opinions expressed here are solely my conjecture based on experience, practice, and observation. They do not represent the thoughts, intentions, plans, or strategies of my current or previous employers or their clients/customers. The objective of this newsletter is to share and learn with the community.