Newsletter #6: NVIDIA CEO's AI message - "Run, don't walk... either you are running for food or you are running from becoming food."
The NVIDIA earnings beat was a shot heard around the world.
Projecting $11B in Q2 revenue, up from the street's expectation of $7.2B, is something else…
But before we dig into that, I wanted to highlight what Jensen Huang, NVIDIA's founder / CEO, said during his commencement speech to graduating seniors at NTU on Saturday.
The message isn't necessarily surprising coming from the founder of a company that just joined the trillion-dollar club, but it's important to keep in mind, as riding the massive AI wave is something we're all endeavoring to do.
Every company.
Every business unit.
Every human being.
“In 40 years, we created the PC, internet, mobile, cloud and now the AI era. What will you create? Whatever it is, run after it like we did. Run, don’t walk…Either you are running for food, or you are running from becoming food.”
“Agile companies will take advantage of AI and boost their position. Companies less so will perish…While some worry that AI may take their jobs, someone who’s expert with AI will.”
“Calling 2023 ‘a perfect year to graduate,’ Huang likened the AI revolution to ‘the rebirth of the computer industry.’”
“At NVIDIA, I experienced failures — great big ones. All humiliating and embarrassing. Many nearly doomed us….Confronting our mistakes and, with humility, asking for help, saved NVIDIA.”
#1: The NVIDIA earnings beat was a shot heard around the world.
The launch of ChatGPT on 11.30.22 - "AI might be having its iPhone moment" (link) - will be remembered as the moment Generative AI captured the imagination of the C-Suite.
CEOs experiencing the magic firsthand, comparing their experiences with their kids, comparing notes with their colleagues, etc., led companies to reconsider their corporate strategy and how to reorient their company accordingly.
An earnings report like this throws gasoline on the fire because of how real and immediate the results are, and how broad the implications are for other companies, industries, etc…
Only 6 months into the post-ChatGPT, Generative AI era of business.
Quarterly revenue topped $7.2B vs $6.5B target. Quarterly earnings of $1.09 per share vs $0.92 forecast from analysts.
Q2 revenue forecast… NVIDIA is predicting $11B in revenue for Q2 vs the street's $7.2B expectation, aka they're letting the street know they're going to absolutely smash expectations.
A 52.8% increase in quarterly revenue projections is breathtaking proof of how quickly the airplane is taking off…
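For anyone checking the math, that 52.8% figure falls straight out of the two numbers cited above (a quick back-of-the-envelope sanity check, nothing more):

```python
# Back-of-the-envelope check on the guidance jump cited above.
street_estimate = 7.2    # $B, analyst consensus for NVIDIA's Q2 revenue
nvidia_guidance = 11.0   # $B, NVIDIA's own Q2 revenue forecast
increase = (nvidia_guidance - street_estimate) / street_estimate
print(f"Guidance vs. street: +{increase:.1%}")  # -> +52.8%
```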
“The computer industry is going through two simultaneous transitions — accelerated computing and generative AI…A trillion dollars of installed global data center infrastructure will transition from general purpose to accelerated computing as companies race to apply generative AI into every product, service and business process.”
Yesterday, NVIDIA and WPP announced a partnership focused on enhancing the creative process via Generative AI-enabled workflows, which involves another major newsmaker from last week, Adobe.
“The new engine connects an ecosystem of 3D design, manufacturing and creative supply chain tools, including those from Adobe and Getty Images, letting WPP’s artists and designers integrate 3D content creation with generative AI.”
“WPP uses responsibly trained generative AI tools and content from partners such as Adobe and Getty Images so its designers can create varied, high-fidelity images from text prompts and bring them into scenes."
"This includes Adobe Firefly, a family of creative generative AI models, and exclusive visual content from Getty Images created using NVIDIA Picasso, a foundry for custom generative AI models for visual design.”
#2: Enterprise Tech Stacks are being recalibrated for the Generative AI economy, quickly.
Just like the Internet Economy that came before it, the Generative AI economy will require companies to look at their technology infrastructure through a lens recalibrated by what’s transpired over the last 6 months and the broad implications going forward.
Last week, Adobe’s Generative AI enhancements to Photoshop (link) captured a lot of attention for a variety of reasons, including the tactical implications for the creative process and associated workflows.
What they’re doing with Generative AI exemplifies what I love about AI, aka Humans + Machines, not Humans vs Machines.
Their embrace of GAI brings to life a lot of the promise around the space, making it so real.
It reminded me of 3 GAI insights / ideas to consider.
1: Reduce Friction + Enable More / Better Creativity
Not all aspects of the creative process are “fun,” but they’re all important. As a guy who LOVES photography and has shot a gazillion photos (e.g. the firework picture above), I love shooting, but sitting down to go through and edit my pictures is not so fun for me. What has captured my imagination is that Adobe’s embrace of GAI not only makes the editing process more efficient, it also makes it more creative, aka more fun.
2: “Speed is a Strategy” (s/o McKinsey's Alex Singla)
Within the workflows that are eligible to be enhanced by GAI, it’s reasonable to suggest that a person enabled by GAI can produce 30%+ more output. While that’s a statement supported by investor research and all sorts of other credible sources, what Adobe just released makes it real within the large, clearly defined area of the creative process. “Don’t tell me, show me” kind of stuff is powerful.
3: $1 Trillion
Me → Bard: “How big is the professional services industry within marketing, advertising and media?”
Bard → Me: “In the United States, the industry employs over 2 million people and generates over $1 trillion in annual revenue.”
The global data center infrastructure isn’t the only infrastructure being transformed…
#3: JP Morgan sharing their AI receipts during their Investor Day along with filing a trademark application for IndexGPT.
On May 22nd during the JPMorgan Chase Investor Day, Lori Beer (Global Chief Information Officer) shared very impressive and tangible receipts around the way JPM is modernizing its tech stack in response to the Generative AI era of business.
Pulled from page 9 of the Investor Day transcript (link), I especially appreciated this section:
"The third pillar of our strategy is unlocking the power of data and AI."
"The importance of this effort has never been more clear with artificial intelligence appearing regularly in the headlines. We have made tremendous progress building what we believe is a competitive advantage for JPMorgan Chase."
"We have over 900 data scientists, 600 machine learning engineers and about 1,000 people involved in data management. We also have a 200-person top notch AI research team looking at the hardest problems in the new frontiers of finance."
"We were recently ranked number 1 on Evident AI’s Index, the first public benchmark of major banks on AI maturity."
"Demonstrating the progress these teams have made, last year we committed to delivering $1 billion in business value by the end of 2023."
"We are close to realizing that goal ahead of schedule and are therefore increasing our target to $1.5 billion by the end of this year."
"This value is driven by more than 300 AI use cases in production today for risk, prospecting, marketing, customer experience and fraud prevention."
"Last year, I highlighted trading and risk use cases, and we continue to see great value from both. But today I'll highlight two other examples, which are generating real revenue for the firm."
"In the retail space, AI is helping us offer more personalized products and experiences to our customers, such as credit card upgrades. And collectively, this work has delivered over $220 million in benefit in the last year alone."
"And we aren't just focused on retail. We are also leveraging AI in sales to generate insights to deepen our relationship with clients across our lines of business, such as in the Commercial Bank, where AI is helping provide growth signals and product suggestions for bankers. These efforts delivered $100 million of benefit in 2022."
"Our ability to drive this level of value is driven not only by the sheer volume of data we possess, but also our modernization investments, which have enabled us to migrate large amounts of data to the public cloud and enhance the capabilities in our underlying data platforms."?
"These platforms enable us to develop models faster with embedded governance, demonstrating our investment discipline as we deploy AI across the firm. We are seeing strong returns and have increased our ROI 25% from 2021 to 2022. And we expect this to continue in the future.”
Raising the projection for the EOY impact from $1B to $1.5B is a 50% increase. It’s not 52.8% like NVIDIA's, but 50% is pretty good… ;)
*s/o to VentureBeat and Sharon Brown for the image above and the great JPM coverage (link).
Also, the recent trademark application by JPM for IndexGPT captured a lot of attention because the product will leverage “cloud computing software using artificial intelligence” for “analyzing and selecting securities tailored to customer needs,” according to the filing and reporting by Hugh Son / CNBC.
The part that I found especially interesting is the implication of enabling a Large Language Model to be interacted with directly by JPM consumers / clients.
Given today’s general concerns around hallucinations, which are heightened in regulated industries, this speaks to the confidence JPM has in the pace of innovation and quality of LLMs.
It also speaks to how they’re thinking about the relationship between their proprietary data and GAI systems, and how those data sets will be enriched and actioned going forward to address the jobs-to-be-done framework of their consumers / clients.
#4: A research paper by Google DeepMind and Princeton University focused on the Tree of Thoughts (ToT) framework might be a preview of the next-level application of LLMs to solve more complex, strategic questions.
Reading “Tree of Thoughts: Deliberate Problem Solving with Large Language Models” (link) was a reminder of how early we are in exploring Generative AI and its application.
Oversimplifying, but today’s LLMs are super smart in the sense that you ask them a question and they come back with an answer informed by effectively all the content available on the public Internet. So what about questions that are a bit more nuanced?
Questions where we, as human beings, have to think through multiple scenarios, weigh the pros and cons of each, do a bunch of research, weigh the probabilities of “if this, then that,” etc., to eventually come up with some sort of conclusion rooted in an acknowledged lack of complete information?
This is the territory the research paper explores by introducing what could be a groundbreaking framework that enables LLMs to think through nuanced questions in a way that mimics a more human, complex thought process (a minimal sketch of the idea follows the excerpts below).
In doing so, the framework addresses some of the limitations of today’s LLMs.
“Language models are increasingly being deployed for general problem solving across a wide range of tasks, but are still confined to token-level, left-to-right decision-making processes during inference."
"This means they can fall short in tasks that require exploration, strategic lookahead, or where initial decisions play a pivotal role."
"To surmount these challenges, we introduce a new framework for language model inference, “Tree of Thoughts” (ToT), which generalizes over the popular “Chain of Thought” approach to prompting language models, and enables exploration over coherent units of text (“thoughts”) that serve as intermediate steps toward problem solving.”
“ToT allows LMs to perform deliberate decision making by considering multiple different reasoning paths and self-evaluating choices to decide the next course of action, as well as looking ahead or backtracking when necessary to make global choices."
"Our experiments show that ToT significantly enhances language models’ problem-solving abilities on three novel tasks requiring non-trivial planning or search: Game of 24, Creative Writing, and Mini Crosswords."
"For instance, in Game of 24, while GPT-4 with chain-of-thought prompting only solved 4% of tasks, our method achieved a success rate of 74%.”
“Deliberate search such as ToT might not be necessary for many existing tasks that GPT-4 already excels at, and as an initial step this work only explores three relatively simple tasks that challenges GPT-4 and calls of better search and planning abilities incorporated with LMs."
"However, as we begin to deploy LMs for more real-world decision making applications (e.g. coding, data analysis, robotics, etc.), more complex tasks could emerge and present new opportunities to study these research questions.”
Anyhow, never a dull moment in this Generative AI economy.
I hope this contributed to your weekend reading.
Enjoy your Memorial Day and have a great week.
Alec