The AI-Powered Enterprise: Fit-for-Purpose AI
Sebastian Krause
C-Level Executive | Go-to-market & Technology Leadership | Supervisory Board Chair/Member | Ex-IBM SVP/CRO
In the world of artificial intelligence (AI), there's been a growing emphasis on efficiency and sustainability. As companies look to harness the power of AI for business, they're increasingly turning to more specialized models.
Traditional large language models (LLMs), which are designed to handle a wide range of tasks, can be incredibly powerful but also come with significant costs and complexities. These models require vast amounts of computing resources, data storage, and energy to operate, leading to high operational expenses and a substantial environmental footprint.
IBM is focused on developing smaller, fit-for-purpose language models that not only use computing resources more efficiently, but also offer superior performance for specific tasks at a fraction of the cost.
At IBM, we have seen firsthand the benefits of these more specialized models. The IBM Granite Foundation Models have outperformed other LLMs in numerous use cases, while consuming significantly less computing power, data storage, and energy. At 13 billion parameters, the Granite models are more efficient than larger models, fitting onto a single V100-32GB GPU. This has led to substantial cost savings and a reduced environmental impact.
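As a rough sanity check on the single-GPU claim (a back-of-the-envelope sketch of my own, assuming 16-bit weights, which the article does not specify), the weight memory of a 13-billion-parameter model can be estimated in a few lines of Python:

```python
# Back-of-the-envelope check: can a 13B-parameter model fit on a 32 GB GPU?
# Assumption: 16-bit (2-byte) weights for inference. Real serving also needs
# memory for activations and the KV cache, so this is only a lower bound.
params = 13e9                # 13 billion parameters
bytes_per_param = 2          # fp16 / bf16
weights_gb = params * bytes_per_param / 1024**3

print(f"Approximate weight memory: {weights_gb:.1f} GiB")  # ~24.2 GiB
# That leaves a few GB of headroom on a V100-32GB for activations and the
# KV cache, which is why a 13B model can be served on a single card.
```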
Our approach is centered on co-creation and collaboration with our clients. We work closely with them to identify the right use case, model, and deployment plan, ensuring that they get the most value out of their AI investments. To support this effort, we have invested in designers, UX builders, and architects who are available to our clients at no cost.
At IBM we are laser-focused on building models that are targeted for business. The Granite family of models is no different: we trained these models on a variety of datasets, totaling 7 TB before pre-processing and 2.4 TB after pre-processing, to produce 1 trillion tokens. By training on enterprise-specialized datasets, we help ensure our models are familiar with the specialized language and jargon of the industries they serve, and that they make decisions grounded in relevant industry knowledge. With fit-for-purpose rather than general-purpose models, enterprises can achieve better performance, lower costs, and a reduced environmental footprint, all while staying ahead of the competition.
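For readers who want to check the arithmetic behind those figures, here is a minimal sketch with my own assumptions (decimal terabytes, and a corpus counted as raw text bytes, neither of which the article states):

```python
# Sanity-checking the quoted corpus figures.
raw_tb = 7.0       # dataset size before pre-processing, in TB (decimal assumed)
clean_tb = 2.4     # dataset size after pre-processing, in TB
tokens = 1e12      # ~1 trillion tokens produced

kept_fraction = clean_tb / raw_tb
bytes_per_token = clean_tb * 1e12 / tokens

print(f"Pre-processing kept ~{kept_fraction:.0%} of the raw data")       # ~34%
print(f"Roughly {bytes_per_token:.1f} bytes of cleaned text per token")  # ~2.4
```

A ratio of roughly 2 to 3 bytes per token is typical for subword tokenizers on English-heavy text, so the quoted numbers are internally consistent.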
The rise of models that are targeted for business represents a significant shift in the generative AI landscape. By focusing on smaller, fit-for-purpose models, enterprises can unlock the full potential of AI while minimizing costs.
How does your company prioritize efficiency and sustainability in AI? I welcome your comments.
This is the second in a series on "The AI-Powered Enterprise," offering insights on integrating generative AI into business operations to drive growth and competitive advantage. Learn more about IBM and watsonx – the AI and data platform built for business: www.ibm.com/watsonx
Empowering organizations with the trusted Hybrid Cloud & AI innovation that matters
9 months ago
My POV: Fit-for-Purpose AI emphasizes creating AI systems that are tailored to specific needs, ensuring they are more effective, sustainable, and capable of delivering targeted outcomes without the unnecessary complexity or resource consumption of more generalized AI models. This is the future of practical AI deployment: gaining tangible value while mitigating risk, enhancing security, and boosting governance.
Chief Executive Officer at NexaQuanta | AI Technology Leader | Serial Entrepreneur with Successful AI Startup Exit | Fractional CAIO and CTO | Startup Mentor
9 months ago
Sebastian Krause Great article and perspective on #AI and #Sustainability, and how the IBM watsonx AI and data platform ensures that we don't overlook #sustainability while reaping the efficiency and productivity benefits of enterprise #GenAI.
R&D / AI, Sr. Executive 20+ years at Siemens Healthcare | Led up to 50 People with $15M+ Budgets | Developed from Concept to Launch Products Used Globally by 200M People | Expertise in AI/ML and DFSS for Medical Systems
9 months ago
Good post Sebastian Krause. Most organizations fear leakage of sensitive organizational data. As a result, they would prefer to go with a RAG approach (which looks good on paper, but the jury is still out). Ideally, to avoid any security issues, organizations would prefer trainable, standalone, domain-specific models that can remain within their own walls, real or virtual. However, many organizations do not have, or cannot afford, the compute resources to retrain an entire model. I am curious whether the Granite series, or any other class of models, allows Parameter Efficient Fine-Tuning (PEFT) to limit fine-tuning costs. Does the LLM architecture need to be pre-built to allow PEFT?
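For context on the PEFT question above, here is a minimal LoRA sketch using the Hugging Face transformers and peft libraries; the model ID is a placeholder rather than a confirmed Granite checkpoint, and the target module names depend on the specific architecture, so treat this as an illustration of the technique rather than documented Granite support:

```python
# Minimal LoRA (one PEFT method) fine-tuning sketch with Hugging Face peft + transformers.
# "example-org/enterprise-13b" is a hypothetical model ID, not a confirmed
# Granite checkpoint; target_modules are model-specific.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "example-org/enterprise-13b"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

lora_config = LoraConfig(
    r=8,                                  # low-rank adapter dimension
    lora_alpha=16,                        # adapter scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections (model-specific)
    task_type="CAUSAL_LM",
)

peft_model = get_peft_model(model, lora_config)
peft_model.print_trainable_parameters()  # typically well under 1% of all weights

# Training then proceeds with a standard Trainer loop; only the small adapter
# weights are updated, so the frozen base model and the data can stay on-premises.
```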
It's great to see AI evolving towards sustainability, Sebastian Krause. How does IBM ensure its AI models prioritize environmental impact?