Newsletter #29: The Generative AI native Enterprise case studies are arriving. Here. We. Go.

Enterprise conviction in the transformative power of Generative AI can be described in many ways.


One of the more universal is the often-referenced Accenture $1B run rate of Gen AI pilots / POCs.



Another one would be the Gartner CEO survey.


Apparently the Enterprise conversion rates aren’t where folks would like to see them, but it’s only a matter of time before those pilots / POCs move into production environments to scale.


As that happens, Enterprise case studies illustrating best practices will begin to circulate and you know what will happen next and how Bain & Company will update the slide above...

(s/o Benedict Evans for the Accenture + Bain slides)

For what it’s worth, anecdotally, I’m hearing there’s a lot more progress being made at the Enterprise level but we shall see…

Now I know, suggesting there’s a significant amount of demand for tangible Gen AI case studies is a ridiculous understatement.

But speaking of tangible Gen AI case studies, how about the L’Oréal “Gen AI-as-a-Service” case study profiling their collaboration with LangChain and Google Cloud?


“As manager of L’Oréal’s AI Center for Excellence, Thomas Menard wanted to get ahead of the game. He marshaled his team to create GenAI as a Service, a set of APIs that could empower developers to leverage generative AI quickly and safely.”

“The team created GenAI as a Service, a set of declarative generative AI APIs, available in the L’OréalAPI portal. GenAI as a Service gives IT teams and developers a platform for flexible, scalable, and easy deployments with built-in time and cost savings, but with the assurance of robust security and validation.”

“Last but not least, this toolkit avoids creating silos and teams reinventing the wheel for every project. And with more than 5,000 IT employees, this will easily save time and resources.”

“The APIs are delivering key, in-demand capabilities of generative AI. GenAI as a Service can handle prompt completion, leveraging an open choice of LLMs (Gemini and others). It can handle chat, including history management, long-term memory and multimodal conversations. Very soon we will add multimodality with the ability to upload a picture or video and ask questions about it. We are also delivering APIs to generate images.”

“And last but not least, we also provide an API to do retrieval augmented generation (RAG), which provides source materials along with natural language responses.

“The original inspiration for GenAI as a Service came from an app we are particularly proud of, called L’Oréal GPT, live in production with Gemini Pro 1.5 with 1 million tokens. Users can load PDFs and other documents, and L’Oréal GPT responds to written prompts about them with natural language text or images depending on the request. Employees can also use L’Oréal GPT to query a knowledge base of internal company data.”

“Our development roadmap includes building out AI agents, graph databases, and long term memory as well.”
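The case study doesn’t publish implementation details, but the RAG capability described above follows a well-known pattern: retrieve the most relevant internal documents, then hand them to the LLM alongside the question so the answer can cite its sources. Here’s a minimal sketch of that flow in Python. Everything in it is illustrative, not L’Oréal’s actual code: the toy document store stands in for an internal knowledge base, and the bag-of-words similarity stands in for a real embedding model.

```python
from collections import Counter
import math

# Toy document store standing in for an internal knowledge base.
DOCS = {
    "doc-1": "Our sunscreen line uses mineral UV filters and is reef safe.",
    "doc-2": "The loyalty program awards points for every online purchase.",
    "doc-3": "Factory audits are performed twice a year by the quality team.",
}

def bow(text):
    """Bag-of-words vector (a crude stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=2):
    """Return the k most similar documents, with ids so answers can cite sources."""
    q = bow(query)
    ranked = sorted(DOCS.items(), key=lambda kv: cosine(q, bow(kv[1])), reverse=True)
    return ranked[:k]

def build_prompt(query):
    """Assemble the augmented prompt an LLM call would receive."""
    sources = retrieve(query)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in sources)
    return f"Answer using only these sources:\n{context}\n\nQuestion: {query}"

print(build_prompt("Is the sunscreen reef safe?"))
```

In a production system the retrieval step would hit a vector database and the final string would go to a model call, but the shape — retrieve, augment, generate — is the same.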

If you want to go deeper, here's a link to the case study and a link to the video of Thomas Menard crushing it while presenting the case study during the “Building Generative AI Apps on Google Cloud with LangChain” session at Google Cloud Next.

No question, Thomas and L’Oréal are all in on LangChain…

“And you will see that everything which is blue is already live at L’Oréal, and when you see the LangChain logo, this is where we leveraged LangChain."

"You see that LangChain is leveraged everywhere."

"We are actively working now with Harrison (Harrison Chase, Co-founder / CEO of LangChain) on the evaluation stuff and we want to deploy as soon as we can, LangSmith, because we have more and more users and more business cases. We want to be able to evaluate the prompt we’re doing, evaluate them against the models and also evaluate the right capabilities to be sure that when we deliver a new feature, the accuracy is better and not regressing."

"Last but not least, as we said, we are a great lover of LangChain.”


I might be stating the obvious here but there are 3 things that captured my attention about the L’Oréal case study:

1: The size of L’Oréal (90,000+ people)

2: The pace of idea to implementation (3 months)

3: The embrace of LangChain (a Generative AI native start-up that has built the leading developer framework for building Generative AI applications)

The F1000 companies that empower leaders like Thomas Menard, leaders who understand the evolution of the modern data stack and the requirements for building Compound AI Systems, are setting themselves up for what is certain to be one of the most important areas of competitive differentiation over the next 5+ years.


“Lack of good training data is what has caused AI capabilities to plateau, and access to the next frontier of data is what AI needs to make the jump to the next S-Curve.”

“Today’s LLMs were primarily trained on publicly available internet data, harvested from GitHub, Reddit, WordPress, and other website scraping and licensing activities. But it’s no longer enough to sustain model improvements.”

“We believe the real breakthrough that will allow humanity to jump to the next S-Curve is data produced at work."

"Workplace data—e.g. product specifications, sales presentations, and customer support interactions—is of far higher quality than what’s left of public data for training purposes, especially compared to running the dregs of the internet through the transformer mill. (The results of which may be why a lot of AI-generated content is already being called ‘slop.’)”

“The potential for AI in the B2B space is vast and largely untapped.”

“The data is there: Knowledge workers continuously produce business data at an incredible cadence.”

When it comes to harvesting Workplace Data, it’s probably not a stretch to suggest that we’re at a very interesting crossroads. We’re coming up on the two-year anniversary of ChatGPT’s launch.

Familiarity with the power of Generative AI, built through direct experience with ChatGPT and a handful of leading players, creates an openness across all levels of an organization to exploring how the underlying technology can be harnessed within the Enterprise to drive business outcomes.

Unfortunately but understandably, this openness tends to be coupled with the challenges people and divisions within bigger companies face in navigating change, risk mitigation, etc.

Without getting on the soapbox for too long, here we go…

Given the complexity of distinguishing AI signal vs noise, it’s incumbent upon technology companies to evolve their go-to-market motion.

Those that do will be much more successful, as they will empower Enterprises to self-educate more easily and opt into engaging with companies whose differentiated value propositions align with the specific business outcomes they’re looking to drive.

More specifically, the silos between Sales, Marketing and Customer Success need to be addressed so that everyone can get more of what they want / need: a more consultative collective approach, more self-service options and, no question, more Build-In-Public etc…IYKYK.

While Pilots / POCs are a great way to overcome these challenges, it’s also important to consider the time-to-value of any initiative as well as how easily the initial Pilot / POC can scale.

To make this real, a relevant example to consider would be Graph Databases.

As a highly visual learner, the Knowledge Graph has always been something that “just makes sense” to me, as it’s effectively a visual representation of the relationships between data aka “data about data.”

Knowledge Graphs visualize data relationships in a way that has a lot in common with the way we use whiteboards to explain ideas, complex topics etc. with circles and connecting lines.

Enabling you to see valuable insights effectively hidden in plain sight.

They’re especially useful in areas focused on highly connected data sets aka “Customers that bought this are also likely to buy X, Y or Z?”, “Customer 360 / CX / Next Best Action” etc.

While Knowledge Graphs and the associated Graph Database infrastructure that enables them have been an intriguing part of the modern data stack, their role within Compound AI Systems is attracting a whole lot of interest from data tech innovators capitalizing on their insights in part by connecting them to LLMs.

It wouldn’t surprise me if many of the Enterprise Gen AI pilots / POCs that convert, and that we then hear about in future case studies, achieved compressed time-to-value driven by the contributions of Graph Databases / Knowledge Graphs.

Based on the L’Oréal case study excerpt above - “Our development roadmap includes building out AI agents, graph databases, and long term memory as well.” - it seems like Graph Databases are part of what’s next for them too…

John Stegeman, a talented Applied Technologist + Evangelist at Neo4j, the pioneering Graph Database leader, recently did a fireside chat with me.

We covered a lot in 13 minutes (link) including:

1: What exactly is a Graph Database?

2: A Knowledge Graph?

3: Let’s make them real with practical examples that anyone can relate to

4: Why is there so much enthusiasm around the potential of combining Generative AI and Graph Databases / Knowledge Graphs?

5: When does the “lightbulb” start to go on with technical and non-technical teams around their interconnected data and identifying “graphy” problems that they’re excited to solve?

I think John’s explanation was phenomenal: data today is thought of in terms of spreadsheets, whereas a Knowledge Graph lets you think of it in terms of Google Maps.

If you want to go deeper on Knowledge Graphs, here are a few more links you might find interesting:

  • Deloitte research report: “Why is it essential for enterprise-level generative AI to incorporate knowledge graphs?” (link)
  • “What does an Enterprise Knowledge Graph (EKG) have in common with Steve Jobs?” (link)
  • “A Knowledge Graph is a way to understand the interconnected nature of all of your company’s data sets.” (link)

Speaking of data…Data is to AI what gas is to a car. No data? No AI.

You know how far your car is going to go when your tank is empty…

“The data and AI strategy has become a key part of the business strategy. Business leaders need to invest in their data and AI strategy—including making important decisions about the data team’s organizational structure, data platform and architecture, and data governance— because every business’s key differentiator will increasingly be its data.”

“Data practitioners are getting closer to the business, and the business closer to the data."

"The pressure to create value from data has led executives to invest more substantially in data-related functions."

"Data practitioners are being asked to expand their knowledge of the business, engage more deeply with business units, and support the use of data in the organization, while functional teams are finding they require their own internal data expertise to leverage their data.”

Let’s go a bit deeper and really geek out on the Data Stack…

I recently read “The Datapreneurs” by Bob Muglia.

Given that Bob effectively took Snowflake from $0 and 33 people to $200M and 1,300 people on the cusp of a massive IPO, seeing the evolution of the Data Stack through his eyes made picking up his book a no-brainer…


“Part new historical narrative, part memoir, and part exploration of technology we thought was only possible in Asimovian science fiction, The Datapreneurs gives us a technologist’s take on the machine revolution shaping our future.”

What I especially liked about the book was that Bob was transparent about writing it to contextualize the space through his experiences, provide his POV on where the puck is headed, and pay off that POV by highlighting companies he’s invested in that align with his thesis aka “Get long. Get loud.”

“At the center of these advances is the modern data stack, a collection of products and services that work together in the cloud, making gathering, managing, integrating, analyzing, and sharing data easier and cheaper. The modern data stack is the technological foundation of the data economy, and it promises to help people and organizations solve some of the world’s most challenging problems.”

These four key insights especially jumped out at me…

1: Computers are learning our language and in doing so, “unstructured data” is becoming a misnomer…

“Unstructured data is still underutilized. The market research company IDC projects that by 2025, 80 percent of the world’s digital data will be unstructured…I think ‘unstructured data’ is a misnomer."

“These data types are structured - just differently. People understand them, but they are opaque to computers. I prefer to call these data types ‘complex’ rather than ‘unstructured.’”

2: Now that computers are learning our language, “we open a new world of applications”...

“While people easily comprehend these data types, computers struggle with them. Machine learning and foundation models will change that…Complex data is a collection of many different, unique formats.”

“Now, for the first time, we can train computers to understand these other formats, much as kids learn to read in elementary school and, as they progress, gain proficiency in understanding more complicated topics.”

“As we develop a new generation of applications, a computer’s ability to handle and understand complex data types is vital. Once they can do this, we open a new world of applications that can extract information from these data sources, potentially combining them with other data types to reveal new insights.”

3: Here come the AI Agents and Compound AI Systems (s/o L'Oréal, LangChain + Google Cloud)...

“In today’s world, video, images, and text are ubiquitous and rich information sources. Managing and analyzing this complex data is a new frontier.”

“Machine learning and foundation models are taming that frontier, and the business world is beginning to wake up to this opportunity.”

“AI assistants built using machine learning and foundation models containing complex data types will change creative and business processes.”

4: Is it time to revisit the legendary statement “Software is eating the world”?

“In 2011, Marc Andreessen, Netscape founder and current Andreessen Horowitz venture capitalist, famously said that ‘software is eating the world.’”

“That is undoubtedly true. But today’s software defined world leaves a lot to be desired.”

“In the next 10 years, I predict that models will eat software.”


Along with the four insights above, the following two portfolio companies really jumped out at me because they take what were wildly complex technical challenges, ones that required a company to invest in expensive, specialized talent just to access the capabilities, and make them more universally accessible.

Democratizing access to this kind of horsepower provides companies with a productivity boost that can be felt in multiple ways. Very good and very exciting ways.

1: Pinecone

“Pinecone developed a specialized kind of database called a vector database, making it easier to build high-performance applications using complex data. Vector databases help applications search text, images, video and other challenging data types.”

“A vector is a numeric array that represents examples of complex data. Vectors help perform similarity searches, meaning computers try to identify things that are not identical to one another but share many traits. A vector database can store, search for, and analyze vectors.”

“Building vector databases and systems for using them is extremely difficult and typically demands PhD brainpower. Amazon, Google, and other hyper-scale businesses can afford to hire and pay the teams of talented people required to do this work, but most companies can’t touch it. That is why Edo started Pinecone in 2019.”

“Pinecone first took on the immense task of interpreting text to gain insights from email threads, product support transcripts, and Slack collaborations. The company optimized its vector database to perform a semantic search. A lexical search looks for an exact word or phrase, but a semantic search can interpret a query’s meaning to improve results.”

“The technology uses natural language processing to understand the semantics of sentences in context and stores those models as vectors in a database.”

“Until recently, organizations needed data scientists and engineers to build vector databases and develop applications. Now, vector databases and vector search serve business analysts and executives because the background complexity remains hidden.”
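The semantic-vs-lexical distinction in the excerpt above is easy to see in a few lines of code. Here’s a minimal sketch of vector similarity search using NumPy, with made-up 4-dimensional “embeddings”; a real system would get high-dimensional vectors from an embedding model and store them in a vector database such as Pinecone rather than a Python dict.

```python
import numpy as np

# Toy 4-dimensional "embeddings" (purely illustrative; a real embedding
# model would produce vectors with hundreds or thousands of dimensions).
corpus = {
    "refund policy":    np.array([0.9, 0.1, 0.0, 0.2]),
    "shipping times":   np.array([0.1, 0.8, 0.3, 0.0]),
    "return a product": np.array([0.8, 0.2, 0.1, 0.3]),
}

def cosine_sim(a, b):
    """Cosine similarity: how closely two vectors point in the same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def semantic_search(query_vec, k=2):
    """Rank documents by vector similarity rather than exact keyword match."""
    scored = sorted(corpus.items(),
                    key=lambda kv: cosine_sim(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:k]]

# A query vector that lands near the refund/return documents even though
# it shares no keywords with them - that's the semantic part.
query = np.array([0.85, 0.15, 0.05, 0.25])
print(semantic_search(query))
```

A lexical search for “refund” would never surface “return a product”; vector search ranks it highly because the embeddings are close, which is exactly the similarity search described in the excerpt.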


2: RelationalAI

“I see a colossal technology pivot coming in the years ahead. The modern data stack is powerful but immature and insufficient to handle the data management and analytic tasks we want to accomplish in the future. So, I see the ecosystem evolving to combine a maturing modern data stack with relational knowledge graphs and machine learning foundation models.”

“At the highest level, a knowledge graph is a database that models concepts, their relationships, and the associated rules and constraints. It is a network of interlinked descriptions of objects, events, situations and ideas. The heart of a knowledge graph is a model where the descriptions of entities are something that both people and computers can understand.”

“A fundamental problem with today’s approach is that business logic gets scattered into multiple systems. The same calculation is often duplicated and implemented numerous times in different applications and within BI dashboards."

"Putting all the business logic in one place, the knowledge graph model prevents errors and makes the code much easier to manage.”

"Molham Aref believes - and I agree with him - that applications built this way are more efficient with peoples’ time, require fewer computing resources, increase profit margins, and accelerate growth for organizations that use them."

"Over time, he believes these systems will be so powerful and easy to use that company leaders can model their businesses for themselves."

"Less-technical staff can describe what they want to achieve in simple terms and let the system figure out how best to meet those goals."

"As a result, businesspeople and data professionals can use the knowledge graph to automate a lot of the drudgery that exists in their jobs today."

"It will allow them to focus their precious time on the most critical business problems.”


Like most things in business, these transitions always take longer than expected.

There will be no shortage of Enterprise Gen AI case studies released over the next 6+ months, and they will include the critical impact data: outcomes generated, operating expenses reduced, etc.

An important consideration that’s a bit more nuanced and perhaps viewed differently by different companies, especially their CFOs ;), is what Allie K. Miller recently described as “a whole new dimension of AI-driven productivity”...

Never underestimate the impact of AI elevating the "quality and impact of our work"...

If there’s one thing I hope Newsletter #29 might persuade you to consider, it’s figuring out how you can start introducing Generative AI native technology into your day-to-day workflow.

It’s one thing to read about it, it’s another to see it…but experiencing the magic firsthand is something else.

Last but certainly not least, I’ve started adding each of my historical micro essays and single-shot videos to my previously dormant “AI with Alec” YouTube channel (link), along with new ones I record going forward.

I realized my essays / videos weren’t easy to find without digging through my LinkedIn or Twitter feed, so I wanted to fix that.

Also, if you’re looking for a laugh and / or need a reason to go check them out, there are a few from 2016 when I first started my AI journey which are especially embarrassing…

I mean, this whole AI learning curve is quite steep!! ;)

Have a great week. Let’s get after it.

Alec
