NVidia’s AI Advantage

Why the Capability to Build LLMs is No Longer a Sustainable Strategic Advantage

Article Summary

The article I used for this week’s analysis is “2 Reasons to Buy Nvidia Stock Like There’s No Tomorrow” by Dani Cook, featured in The Motley Fool. The article gives a few reasons why investors should consider NVidia stock. I found the analysis brief and somewhat lacking in substance when measured against our study of the Resource-Based View of strategic resources, especially sustainable strategic resources. The author’s reasons stressed the stock price and NVidia’s AI advantage but didn’t go into significant depth or compare that advantage to other AI leaders in technology. (Cook, 2024) Let’s take a deeper dive with more research and find out where NVidia stands compared to other big AI leaders.

Strategic Management Integration

When studying the internal environment of a firm, it is useful to identify the strategic advantages the firm holds. Resource-based theory defines strategic resources as capabilities that are valuable, rare, difficult to imitate, and organized to capture value. (Kennedy, 2020) According to that theoretical approach, only NVidia has a sustainable strategic advantage in AI technology, because it can produce the H100 chips that power the large language models, or LLMs, built by the top AI software developers. (Cook, 2024) For this analysis, I’ll contrast that capability with the capability to produce LLMs held by OpenAI, Google, and Amazon. Let’s walk through the criteria for a resource-based strategic advantage and decide whether each company’s AI capabilities give it a sustainable competitive advantage.

A strategic resource must fulfill four criteria to be considered sustainable. (Kennedy, 2020) Strategic resources help a firm compete in the market and produce value, and they are sustainable if they help the firm maintain that competitive advantage over time. By the first criterion, value, all four companies mentioned have a valuable strategic resource in their capability to produce AI-based products. NVidia’s engineers can design and produce hardware that allows OpenAI, Google, and Amazon to create AI software solutions for their customers. (David, 2024) Given the popularity of AI and the pace of its adoption across the industry since the end of 2022, the capability to create AI-based hardware and software solutions is very valuable to all four firms.

The next criterion is that the strategic resource must be rare. (Kennedy, 2020) This is where the four companies’ capabilities diverge. Only NVidia has a rare resource in its ability to produce the H100 and similar processors it sells to companies for AI computation. (David, 2024) For now, no other company matches NVidia’s capability in AI computation. The H100 is built on proprietary designs and technology that competitors have not reproduced. A few weeks ago, NVidia announced its newest processor platform, which is even more powerful, smaller, and uses less electricity, meaning the company will likely continue to dominate the market. (Cook, 2024)

OpenAI’s capability to create ChatGPT, the AI chatbot everyone wanted, was unique at first. Still, Google quickly copied it, desperate not to miss out on the next big answer source for its search engine. (Raffo, 2024) Another huge contender is Anthropic’s Claude, backed by Amazon, which held the record for the largest context window for a few weeks and is also quite powerful. (Kelly & Bathgate, 2023) Because there are now at least three major large language models, the rareness of OpenAI’s capability to create an LLM-based chatbot was short-lived. Hundreds of large language models exist, though those are the three most common commercial solutions. With so many LLMs available, the capability to create one is not rare; anyone with a powerful computer can spin one up at home and start using it with their own data, as the short sketch below illustrates.
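To make that last claim concrete, here is a minimal sketch of what running an open-weight LLM on your own machine can look like. It assumes Python with the Hugging Face transformers and torch packages installed, a reasonably capable GPU, and an open model whose weights you are licensed to download; the model ID and the local file name are illustrative placeholders, not a recommendation.

```python
# Minimal sketch (assumptions: transformers + torch installed, and access to
# an open-weight chat model; the model ID and file name are illustrative).
from transformers import pipeline

# Downloads the weights on first use and loads them onto local hardware.
chat = pipeline(
    "text-generation",
    model="meta-llama/Llama-2-7b-chat-hf",  # any open-weight chat model you can access
    device_map="auto",                      # spread layers across available GPU(s)/CPU
)

# Point the model at your own data: here, a prompt built from a local document.
with open("quarterly_notes.txt") as f:
    context = f.read()

prompt = f"Summarize the following notes in three bullet points:\n{context}\n"
result = chat(prompt, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])
```

The specific model hardly matters; the point is how little specialized capability the exercise requires compared with designing the processors that run it.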

The next criterion is that the resource must be difficult to imitate. (Kennedy, 2020) Meta, formerly Facebook, removed much of this difficulty when it released its LLM, called LLaMA, as open source, meaning any firm can now create or modify its own LLM. (Kim, 2023) As a result, LLMs are no longer difficult to imitate: any firm with the required technical resources can build its own, which again rules out the three software leaders mentioned above, because anyone can imitate them (see the adapter-tuning sketch below). NVidia’s ability to create the best-performing chips, by contrast, has yet to be imitated. AMD and Intel are capable chip designers, but neither has technology or market share anywhere close to NVidia’s platforms, which suggests they haven’t figured out NVidia’s secret yet. Score another point for NVidia’s chip-making capability as a sustainable strategic resource.
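To illustrate how low the imitation barrier has become, here is a minimal sketch of customizing an open LLaMA-style model with LoRA adapters via the Hugging Face peft library. The model ID, hyperparameters, and target modules are illustrative assumptions, and the training loop over a firm’s own data is omitted; the point is only that “creating or modifying your own LLM” now starts from public weights rather than from scratch.

```python
# Minimal sketch (assumptions: transformers, peft, and torch installed, plus
# access to an open-weight base model; the model ID below is illustrative).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "meta-llama/Llama-2-7b-hf"  # illustrative open-weight base model
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")

# Attach small, trainable low-rank adapters; the original weights stay frozen.
lora = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections in LLaMA-style models
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # typically well under 1% of all parameters

# From here, a standard training loop over a firm's own text data yields a
# customized model, with no chip-design capability required.
```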

The final criterion for a sustainable strategic resource is that the firm must be organized to capture value from the capability (Kennedy, 2020). All four firms are very mature, organized to capture value, and capable of creating valuable products and services for their customers.

Views and Opinions

If NVidia can keep its capability to produce the chips powering the AI revolution we’re currently witnessing, it is the only company with a truly sustainable strategic advantage in AI. Other firms are attempting to imitate NVidia’s strategic resources and take some of its market share, but for now they have a long way to go. (David, 2024) The software developers mentioned above use their capabilities to provide products that help firms improve their business practices, but they compete in an increasingly crowded market for AI tools, where any other firm can spin up its own LLM and start processing data. Only NVidia’s capability to create the underlying processors that make AI possible is a sustainable strategic advantage, one that should keep the company an AI contender for years.


Works Cited:

Cook, D. (2024, March 28). 2 Reasons to Buy Nvidia Stock Like There’s No Tomorrow. The Motley Fool. https://www.fool.com/investing/2024/03/28/2-reasons-to-buy-nvidia-stock-like-theres-no-tomor/

David, E. (2024, February 1). Chip race: Microsoft, Meta, Google, and Nvidia battle it out for AI chip supremacy. The Verge. https://www.theverge.com/2024/2/1/24058186/ai-chips-meta-microsoft-google-nvidia

Kelly, R., & Bathgate, R. (2023, November 22). Anthropic just released Claude 2.1, and it vastly outperforms GPT-4 on token capacity. ITPro. https://www.itpro.com/technology/artificial-intelligence/anthropic-just-released-claude-21-and-it-offers-more-than-double-the-token-capacity-of-gpt-4

Kennedy, R. (2020). Strategic Management. Blacksburg, VA: Virginia Tech Publishing. https://doi.org/10.21061/strategicmanagement (CC BY-NC-SA 3.0)

Kim, S. (2023, March 30). List of Open Sourced Fine-Tuned Large Language Models (LLM). Medium. https://sungkim11.medium.com/list-of-open-sourced-fine-tuned-large-language-models-llm-8d95a2e0dc76

Raffo, D. (2024). Gemini vs. ChatGPT: What’s the difference? TechTarget Enterprise AI. https://www.techtarget.com/searchenterpriseai/tip/Gemini-vs-ChatGPT-Whats-the-difference
