"Large Language Models are a watershed moment in computing": Walmart's Suresh Kumar
Ulrike Hoffmann-Burchardi
Chief Investment Officer Global Equities, UBS Global Wealth Management
Suresh Kumar is the executive vice president and global chief technology officer (CTO) and chief development officer (CDO) of Walmart, Inc. Suresh has 30 years of technology leadership experience, most recently from Google where he served as vice president and general manager of display, video, app ads, and analytics. Prior to Google, Suresh was the corporate vice president of Microsoft’s cloud infrastructure and operations and prior to that spent 15 years at Amazon in various leadership roles.
Suresh holds a PhD in engineering from Princeton University and a bachelor of technology from the Indian Institute of Technology, Madras.
Walmart is the largest retailer in the world with over $600bn in revenue (as of Q4 2022) and employs 2.1 million associates worldwide.
UHB: There have been many technological disruptions in the last few decades, now we are in the early innings of large language models. Where do you see the opportunities of this technology?
SK: We have seen a lot of technology advancements over the last ten to fifteen years or so, but large language models are a watershed moment in computing. I remember a long time ago, when I was doing my PhD, the first neural nets started coming out. One of my colleagues was working on neural nets for control system automation - it had 8 nodes. I don't think they've publicized exactly how many parameters large language models now have, but it's likely close to a trillion. Things are progressing quite dramatically, and what we are seeing is the promise of, at least from a retail perspective, a digital assistant that truly can help you with your shopping journey. It's really exciting. Large language models and AI in general are going to redefine customer expectations. Equally important will be their impact on businesses. In every industry there are a large number of people whose job is to collect data from a large number of different sources, then try to reason over it and find the insights that you want. Expect to see incredible gains in productivity as all of that becomes easier and a lot more streamlined. It's hard to overstate the impact that these breakthroughs are going to have in the medium term.
UHB: What will customer engagement look like if conversation becomes the UI?
SK: Let's look at it through a customer perspective - every industry, including retail, goes through these big cycles of disruption, and usually technology is a contributing factor. When Walmart introduced the Supercenter format and brought grocery and general merchandise together, that was disruptive, and it was enabled by technology. Not technology that directly touches the customer experience, but indirectly through the supply chain. Technology creates new capabilities. When it gets adopted, it ends up redefining the standards, and disrupts industries.
We are at the beginning of another big disruption in retail. Even before the introduction of large language models, a lot of things were already starting to get disrupted. How people find products is fundamentally changing - my daughters don't do a traditional search for products, they're influenced by their Pinterest pages, their Instagram feeds, and so on. Products are finding you, rather than you finding products.
Where the large language models are going to accelerate disruption is around the top of the funnel: how you start your customer journey, how you find products, and how you do research about products. These are going to fundamentally change as this technology is integrated into more apps and experiences. Shopping assistants, or 'co-pilots' for your shopping journey, will make the customer journey that much more intuitive and that much more fulfilling. We'll want these co-pilots to understand voice, text, and images. How these different modes come together - that is going to be the disruptive part.
On the associate side, these capabilities can help with productivity in the field or in the home office; they can make you more efficient and get you to the answers that you need quickly. We are working on unlocking data insights, making a lot of things self-serve, and being able to direct work for our associates so that they can spend more time directly focusing on helping our customers and members.
UHB: There seems to be potential to go beyond the predictive element of a customer interaction to a broader conversation that can open up new avenues.
SK: We think about this as a shopping journey - it's a mission that you want to fulfill. And the products that you select are just one part of it. If you want to plan for your daughter's birthday, then the products that you want to buy are just one aspect of it, and there is a larger piece around it - the cake that you want, what your guests want to see - that is how you fulfill the mission that you are on. And that's where LLMs become very powerful.
In the medium term, we will see an infusion of these capabilities into a lot of existing customer experiences. Whether they are expressed through apps or enhancements to current tools, all of these can show up in the middle of your customer journey. You will see many different types of use cases brought to life. And each one of these is going to be slightly different, because they are going to be tuned to unique customer expectations and what is most natural to that customer. I don't think you will see a 'one size fits all'.
UHB: Makes sense. It's exciting and disruptive, both front-facing and internally facing. In your seat, as CTO of the largest retailer in the world, how do you leverage LLMs for the organization as a whole?
SK: It's a very exciting time to be in retail. It's a very exciting time to be in Walmart. And it's an exciting time to be in technology.
This is just the latest in a long series of disruptive new technologies. Before that, we had deep learning, where ML was used more in a predictive way. In fact, Walmart won the 2023 INFORMS Edelman Award for enhanced daily retail truck routing and load planning. Our Global Tech teams examined the planning process from a holistic perspective and leveraged advanced AI algorithms to avoid millions of pounds of CO2 emissions, which also resulted in savings of tens of millions during fiscal year 2023. We have things like IoT, we have 5G that is unlocking capabilities for supply chains, and you've got blockchain. A lot of new technologies which have an impact on our industry have started accelerating in the last several years.
About four years ago, when I started at Walmart and we brought all of the technology together into one organization, we recognized the need to transform not just how we build our technology solutions, but also how we transform the entire enterprise so that technology becomes an integral part of everything that we do. That's where we started - to have, first, a clear understanding and articulation of where we want to be and how we want to operate. We want to be an enterprise that is agile, that's innovative, that's customer- and member-obsessed, while being people-led and tech-powered. At its core, it's the people who are at the center, and we want our technology to light up everything that they do.
One of the things that we did very early on is we started building fundamental, foundational capabilities for all aspects of our business, whether it is the experiences that engage our customers, or the experiences that make our associates more productive. We started building foundational platform capabilities, from traditional deep learning ML models that pull all our data and drive things like demand forecasting and supply chain optimization, to things like our Converse capability, where we can now start plugging in different natural language models. We invested in our infrastructure layer, which combines the best of public cloud with our own custom-built private cloud and pulls all of the different stores and our supply chain nodes into one seamless infrastructure.
We are now at a stage where we have capabilities and platforms that are directly tied to great customer and member experiences. As we innovate, we bring in new technology as soon as it becomes available, and we have the capability to plug it in for all of these different use cases. To unlock the power of LLMs, you need capabilities around them. You need infrastructure, and you also need intelligence that can pull all the data together. Large language models by themselves are general-purpose models; we can serve our customers best when we know them well. It's the same thing when it comes to efficiency and productivity.
UHB: As you're thinking about LLMs I'm curious where you come out in the debate on small models versus large models?
SK: There is a lot of debate - I don't think it's going to get settled one way or the other. It will continue to evolve. Our approach is pretty straightforward - we are looking at both LLMs that are very large and general purpose, as well as open-source models that have started coming out, including LLaMA and the Stanford models. We are now creating capabilities where we can plug different types of models in and make sure that we choose the ones that are appropriate for our use cases. Our approach is to make use of foundational models that are available, whether through open source or through our partners, in a way where we have good controls and trust over the data. That is an essential part - we must ensure that our customer data is protected at all times.
To answer your question - it's not a question of one or the other. I do think that, just like the traditional predictive AI models, the generative AI models will have to be chosen based upon the problem that you are trying to solve. For example, in healthcare, you don't want a large language model to take very sensitive healthcare data and mingle it with everything else. There are definite cases where you need to look at what you want to do and how you want to solve the customer problem in a way that is trustworthy, natural, and, of course, efficient and cost-effective.
UHB: There's a big debate also - prompt engineering versus fine-tuning versus training – what is your view?
SK: If you look at it from the customer perspective, you want the interaction to be as natural and intuitive as possible. Right now, the conversation that you and I are having is a natural conversation where we bring in different knowledge sets and different domains, all sitting on top of a basic understanding of English. This is where we need to get to. Whether it is prompt engineering or fine-tuning, I care less about the actual process or the means or even the technology. I care a lot more about the outcome, which is: what is it that we are trying to do for the customer? If the customer is trying to navigate through our selection and says, "Hey, I'm trying to buy a cell phone for my daughter," then what is the best way that I can understand the intent and respond back with, "The things that you need to care about are: Is it robust? Is it going to break if it drops? Does it have parental controls enabled?" How do I extract that out and present it to you in a way that is as seamless as possible, and how are we doing that?
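As a concrete illustration of the prompt-engineering side of this, here is a minimal sketch of how an intent-extraction prompt for the cell-phone example might be assembled. The function name and prompt wording are hypothetical, not Walmart's actual system; in practice the assembled prompt would be sent to an LLM API and the JSON response parsed.

```python
# Hypothetical sketch of prompt engineering for shopper intent extraction.
# The model call itself is out of scope; you would send `prompt` to an LLM API.

def build_intent_prompt(utterance: str) -> str:
    """Assemble a prompt asking an LLM to extract the shopping intent and
    the clarifying attributes worth surfacing back to the customer."""
    return (
        "You are a retail shopping assistant.\n"
        "From the customer's message, identify:\n"
        "1. product category\n"
        "2. who the product is for\n"
        "3. attributes to clarify (e.g. durability, parental controls)\n"
        "\n"
        f"Customer: {utterance}\n"
        "Answer as JSON with keys: category, recipient, clarify."
    )

prompt = build_intent_prompt("Hey, I'm trying to buy a cell phone for my daughter")
print(prompt)
```

The point of the sketch is that the "engineering" lives entirely in the instructions wrapped around the customer's utterance - the underlying model is untouched, which is what distinguishes prompting from fine-tuning.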
UHB: It's a great example because you would think one of the biggest things for you in that situation would be to gain more information. How old is the daughter? What are the use cases? Why is she getting a phone? Is it safety? And so on. Because then you get a much more complete picture of the customer.
SK: This is exactly right. The deeper we can understand both the customer and their intent, what mission they are on, what is the context, the better we can serve them. And traditional search is very transactional and essentially stateless. It forgets everything that you have just done, forgets about even the last session. Large language models help you have a stateful conversation that persists over time. You combine that with multi-modality, you could be having conversation over here and then you could walk over to the store and continue the conversation. And if we recognize you and if you know that you are on a shopping journey to pick up the best cell phone for your daughter, we can present you with five great cell phones that might suit you. So, statefulness, understanding the customer, and being multi-modal, is going to unlock a lot of the capabilities.
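The statefulness described here can be sketched as a session store that persists the mission context across channels, instead of resetting on every query the way stateless search does. This is a hypothetical, in-memory illustration (a real system would use a durable store and an LLM reasoning over the history), not Walmart's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ShoppingSession:
    """Persisted mission context, so the conversation survives across
    channels (app, web, in-store) rather than resetting like stateless search."""
    customer_id: str
    mission: str = ""
    history: list = field(default_factory=list)  # (channel, utterance) pairs

class SessionStore:
    """In-memory stand-in for a durable session store keyed by customer."""
    def __init__(self):
        self._sessions = {}

    def get(self, customer_id: str) -> ShoppingSession:
        # Return the existing session, or create one on first contact.
        return self._sessions.setdefault(customer_id, ShoppingSession(customer_id))

store = SessionStore()

# Conversation starts on the app...
s = store.get("cust-42")
s.mission = "cell phone for daughter"
s.history.append(("app", "I'm trying to buy a cell phone for my daughter"))

# ...and resumes later in the store: same state, same mission.
s2 = store.get("cust-42")
s2.history.append(("in-store", "Show me the options we discussed"))
print(s2.mission)  # prints "cell phone for daughter"
```

Because both channels resolve to the same session object, the in-store interaction can pick up exactly where the app conversation left off - which is the multi-modal continuity described above.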
UHB: That's a good point to talk about your associates and team. They have been through a lot - COVID and digital acceleration. We are barely over the digital hangover, the economy is more challenging, and now large language models have arrived. How do you get the organization aligned behind what you're seeing as this incredible opportunity, when some people may think, "this is a threat to my job"?
SK: Ultimately, the North Star is about our purpose. Our founder, Sam Walton, articulated our purpose very well, which is to help people save money and live better. From that, everything else follows. Our culture is built on our purpose - our values are timeless. And these provide anchor points when a lot of other things change.
Customer behavior changes, the pandemic changed the way in which you work. Technology changes a lot. Your day-to-day work changes a lot. In the day-to-day, you have to be adaptive. But there are certain things that don't change. Our purpose doesn't change. Our values don't change. We want to be customer- and member-obsessed and always look for ways in which we can help serve our customers better. Our culture of collaboration doesn't change.
You asked about technology as a threat to people's jobs - no, I do not think technology is a threat. What will happen - what has been the pattern through every disruption - is that some jobs will evolve and entirely new jobs will be created. The end goal is improved experiences for everyone.
UHB: And then, how are you approaching the implementation of LLMs internally? Top-down versus bottom-up?
SK: It is very early. What we want to do is two things. One, we want to create environments where different teams can innovate quickly in a way that protects our data, the customer experience, and the trust that customers have placed in us. And two, we want to do it in such a way that we can learn from each other quickly. It's very important in an organization as large as Walmart that if one team does something and has a bunch of learnings, those learnings get propagated throughout the organization as quickly as possible.
UHB: Makes sense. You've mentioned data and data infrastructure a few times now. I'm curious, what does this look like at Walmart?
SK: One of the unique things about Walmart is that, because we serve our customers and members in so many different ways - online, offline, pickup, delivery - we are in a unique position to know more about our customers and their behaviors, their shopping habits, and their preferences, much more holistically than just about anybody else. That's a great advantage for us because it allows us to serve them better than anyone else.
Four years ago, when I came in, we brought not only all of technology together, but also all the data. The cloud is a great place for us to be able to reason over massive sets of data. If you want to do things on-prem, there are scale limitations. And especially now with LLMs, we need a lot more unstructured data.
Our entire data estate is now in the cloud, where we can run different types of models and different types of analytical tools. We've got all of those pieces in place. That is now allowing us to do things like demand forecasting and finding insights around shopping behavior that influence the way in which we design our customer experiences. We share these insights with our suppliers so that they can make the products better.
UHB: So, what you said about the cloud, Suresh - does that mean that you are done with your cloud transition?
SK: We are largely done. So, two things - number one is that when I talk about cloud, I'm talking about a hybrid cloud. We have built an infrastructure layer that essentially combines the best of public and private cloud. We have built out our own private cloud that not only takes our data center investments, but also ties all of our edge together - edge means all our stores and supply chain nodes. We have built a layer that orchestrates all of these things as seamlessly as possible.
There are some things that are not necessarily the best candidates for running in the cloud- those we run on-prem. The large majority of workloads that drive customer experience, that drive associate experience, and that drive intelligence, those things need to be in the cloud.
UHB: Makes sense. Some of your peer CTOs in other verticals have talked about developing software products that could be used externally. Have you thought about a retail cloud strategy?
SK: Let's take a step back. If you put the customer at the center, we want to serve our customer in the best possible way, and that requires us to build a business made up of components that are mutually self-reinforcing. We are different from many other retailers in that we want to create an open ecosystem where multiple parties can participate and derive value. Let's take our marketplace as an example. We have a thriving, fast-growing marketplace where a lot of sellers are listing their items. Many of these sellers are retailers in their own right, and they have technology needs as well. About a year ago, we launched Walmart Commerce Technologies to sell solutions based on our proven technology and to help other businesses elevate their offerings, keep pace with a rapidly evolving marketplace, and meet changing customer expectations.
I'll give you one more example - our last-mile delivery system. We built a last-mile delivery capability called Spark. This is the technology that powers our gig workers and others to deliver products for the last mile. If you're getting products delivered from Walmart, maybe you as a customer might want to order a pizza from the pizza store next door and have both delivered at the same time. That's a capability that we are now offering through our GoLocal offering. It's more about solutions that ultimately benefit the customer by building a stronger ecosystem.
UHB: Let's pivot to a quick rapid fire to finish up. The first one is build/buy, insourcing/outsourcing - where do you land?
SK: If it is core to our business and/or if it is a competitive advantage, we build it ourselves and we insource. Demand forecasting, supply chain, and customer experience are examples of that.
If it isn't core and there are high-quality, industry-accepted solutions available that work at Walmart scale, then we buy, or ideally use open source where possible: infrastructure components, load balancers, basic platform capabilities like distributed cache, or even areas like finance.
It's the same thing when it comes to insource versus outsource - if it's core and creates competitive advantage, we want to insource. If it is a short-term project - migrating a system, modernizing something, moving something to the cloud - we can augment through outsourced work.
UHB: Any technologies that you are currently looking for externally, maybe in connection with LLMs?
SK: We have a vibrant connection with the startup community. You'd be amazed at the number of areas we look at - from supply chain to core infrastructure, to customer experience, monitoring and observability. Through our Sparkcubate program we identify start-ups and companies who can inject energy into early-stage innovation and help solve problem statements unique to Walmart. We also work closely with our big strategic partners. So again, it's an ecosystem on the technology side as well.
UHB: Augmentation versus automation?
SK: When it comes to daily operations, automation makes sense. We have made investments in automating our distribution centers with robots, for example. Augmentation is important where it helps us make strategic decisions - for example, unlocking insights from data, or selecting and optimizing our assortment. We want tools that augment human intelligence and act as a co-pilot, if you will.
UHB: You touched on this: voice as the UI of the future?
SK: I don't think voice alone is going to be the UI. Multi-modality is going to be the future UI. You want seamlessness whether you speak, text, go to a screen and look at the product, or interact through your AR/VR device. Seamless multi-modality is going to be the UI.
UHB: Most interesting and unexpected insights from data?
SK: I'll give you two of them. One thing that we found is that online, customers from different generations behave very differently. We found that Gen Zs, for example, rearrange the sorting order from low to high price when they land on a search page, and then they don't use any of the default filters of recommended or top-rated products or items. That’s not something that I would have guessed up front, but that's what we see. Whereas baby boomers, they typically follow just the default recommendations. It's very interesting to see different generations behave very, very differently.
A second example - you know that we are a very seasonal business. We run massive sales during our holiday period and everybody gets very excited. The point at which we know our holiday sales have peaked and ended is as soon as bananas become the top-selling item again. It's something that you would never have guessed. Bananas are a staple that is typically the top-selling item in normal times. During the holidays, other things peak, but you know that it's all over and we are back to normal again when the top-selling item is a banana.
UHB: And last question- most powerful innovation in retail in the next 10 years that seems futuristic right now?
SK: We have talked a lot about generative AI, and that probably is going to be the one that has the most impact. It's easy to see the type of impact that it will have in the medium term; in the long term it's perhaps going to have the largest impact, but exactly how it's going to transform the industry and the customer experience is still to be determined and discovered.
UHB: Is there a sort of multi-modal AR infused vision that you have?
SK: We are very much working on it. Large pieces need to come together - stay tuned, you will hear about a lot of new things coming out soon.
UHB: Looking forward to that! I want to again thank you for your time, Suresh.