Computex: The AI Chip Wars in Focus
Copyright 2024 J.Gold Associates, LLC.

The recent Computex event had an underlying theme of AI, much like the industry in general these days. With the hype cycle going full bore on AI enablement and vendor positioning, it was a time for the base-level AI enablers, the chip companies, to strut their stuff. But while each positions itself as a leader of its respective market, there is a good deal of overlap and head-to-head competition even as the AI market diversifies, which makes judging the leaders and the market opportunities anything but easy. Below is our analysis of how these companies fit into the overall AI puzzle and where we see them being successful.

It’s Not Just About Cloud and Data Center

The race is on in AI, but AI is not one market. While Nvidia and AMD concentrate on the high-end training market in cloud and data centers, a great deal of work is going on at the edge and at the personal-device level. Within the next 2-3 years, we expect the AI PC to account for 65%-75% of the overall PC market, with an even higher share in the enterprise. That means Intel, AMD, and now Qualcomm will be fighting for market share. We expect Intel to hold the majority of enterprise market share, followed by AMD, with Qualcomm being more popular in the high-end consumer and SMB space but still a minority player. We also expect virtually all mid- to high-end smartphones to include very capable AI acceleration, reinforcing the strategies of Qualcomm, MediaTek, and Arm.

There will also be a major battle in the edge space for AI capability. Nvidia wants to be a player there, but we don't expect it to have the product breadth necessary to capture much share except at the very high end. Arm-based solutions will capture the lower end of the edge market, while Intel and AMD will likely battle for the mid to high end, with Intel currently having the "edge" when it comes to product diversity and the ability to add workload-specific accelerators (e.g., for networking, wireless, etc.). We expect 65%-75% of AI workloads to be running on edge devices in the next 2-3 years, with only high-end AI running in the data center on high-end compute.

Picking an Optimum Processor for Enterprise Workloads

Nvidia has gained a lot of attention in the AI chip space, but AMD, Qualcomm and Intel offer AI chips that would be suitable for enterprise AI workloads as well. What factors should a company consider when deciding which chip vendor to purchase from?

The ultimate issue is going to be performance per dollar and performance per watt. The roughly 1 kW per device required to run high-end chips like Nvidia's Blackwell is problematic from a power-availability and cost perspective when you are running 30K or more chips per system. Inference workloads, by contrast, will happily run on more generic CPUs with some AI acceleration added as needed. And since inference will ultimately constitute the bulk of AI processing, that gives Intel and AMD some advantages, as Nvidia doesn't really have a competitive CPU offering. Even at the AI accelerator level, Intel claims its Gaudi chips are much more power efficient for inference, with better performance per watt than Nvidia's (3 times more efficient, according to Intel). AMD, too, is beginning to focus its messaging on comparing its efficiency against Nvidia's. We expect diversity in AI processing to be a major factor in the expansion of the market for AI workloads over the next 2-3 years.
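To make the scale of the power problem concrete, here is a rough back-of-the-envelope sketch. The 30K-chip and 1 kW figures come from the discussion above; the PUE (cooling/overhead factor) and electricity price are illustrative assumptions, not vendor-published numbers.

```python
# Back-of-the-envelope power/cost estimate for a large AI training cluster.
# The chip count and per-chip wattage follow the figures discussed above;
# PUE and electricity price are assumed, illustrative values.

chips = 30_000            # accelerators in the cluster (the "30K or more" figure)
watts_per_chip = 1_000    # ~1 kW per high-end accelerator
pue = 1.3                 # assumed power usage effectiveness (cooling/overhead)
usd_per_kwh = 0.10        # assumed industrial electricity price, USD per kWh

it_load_mw = chips * watts_per_chip / 1e6   # IT load in megawatts
facility_mw = it_load_mw * pue              # total facility draw incl. overhead
annual_cost = facility_mw * 1_000 * 24 * 365 * usd_per_kwh  # MW -> kW, then kWh

print(f"IT load: {it_load_mw:.1f} MW")                   # 30.0 MW
print(f"Facility draw: {facility_mw:.1f} MW")            # 39.0 MW
print(f"Annual electricity: ${annual_cost / 1e6:.1f}M")  # ~$34.2M per year
```

Even under these conservative assumptions, a single such cluster draws tens of megawatts continuously, which is why performance per watt, not just raw throughput, dominates the purchasing decision at this scale.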

Some Specific Vendor Analysis

Intel doesn't have the lowest-power or highest-performance chips for AI, but it does have one advantage. As we move to a distributed AI world, where AI workloads run across a variety of platforms from personal devices to edge computers to data centers and cloud, being able to easily move workloads to the most appropriate platform is important. With the exception of its Gaudi devices, Intel's compatible x86 architecture means porting apps across those tiers is relatively simple. No other chip provider has such a breadth of compatible compute platforms (the closest is AMD).

Qualcomm is adding NPUs to its mainline Snapdragon chips, bringing high-performance AI to the mobile marketplace; most premium phones now offer AI capabilities from the Android side and, in some cases, through phone makers' own enhancements (e.g., Samsung). But the biggest potential new market for Qualcomm is the AI PC. We expect Qualcomm's PC efforts to capture mostly the prosumer space, as enterprises are hesitant to fully switch to a new compute platform due to app and driver concerns. The lack of an integrated 5G modem in all Snapdragon-powered PCs, analogous to the WiFi and Bluetooth radios standard today, has also removed a potential advantage for Qualcomm. PC vendors remain unsure of the uptake and are especially hesitant to add the costs associated with cellular connectivity. But a built-in modem, much like what is done in the smartphone market, would have made significant price reductions possible, eased carrier provisioning via eSIM, and greatly stimulated the market.

MediaTek has upped its game, creating new mobile chips aimed at the high end of the mobile market and at competing directly with Qualcomm; it is no longer focused solely on the mid- and low-tier volume markets, especially in the Far East, that it was known for previously. But most of MediaTek's AI future is focused on IoT and edge capability. It currently has no interest in competing in the PC market, and we don't expect that to change in the short term. We expect MediaTek to compete aggressively in the IoT, automotive, and edge space, putting significant competitive pressure on Qualcomm, as this is where MediaTek can best leverage its strengths.

Arm – With all of its licensees around the world, Arm will be a major player in nearly all aspects of AI computing. It continues to add AI IP to its products, both at the lower-end IoT tier and in its edge- and data-center-focused products. The result is an increased capacity to enable the wide variety of chip companies that make up its unique ecosystem. We expect Arm to continue its AI enablement, resulting in a wide array of AI-accelerated chip offerings from a variety of companies. This is clearly a growth opportunity for Arm, and it plans to reap the benefits going forward.

Marvell is not primarily a CPU/GPU vendor, so its position in the market is a bit different from that of the companies mentioned above. With its expertise in networking, data movement, and storage, it is trying to become the default provider of advanced high-speed networking for AI at the high end of the market. To this end, it has secured relationships with AI chip and system vendors. We expect Marvell to continue to be a primary implementer of high-speed networks for data centers and of internal high-speed system connectivity (e.g., distributed memory). With AI's insatiable need for high-speed data transfer, Marvell is well positioned to take advantage of the AI boom.

The Rest of the Pack – There are a number of startups focused on specific target AI markets, as well as the major hyperscalers (e.g., AWS, Google Cloud, Microsoft Azure) creating their own AI accelerators. We expect such diversification to continue for at least the next 2-3 years until the market shakes out. The hyperscalers will likely continue to create their own optimized accelerators for their infrastructure and price/performance needs for an extended period. But with a plethora of startups, it's likely that almost all will either be acquired (as Intel did with Habana Labs) or fade away. Nevertheless, new entrants always bring new perspectives that benefit the market longer term, and the AI space is likely no exception.

Bottom Line: The expanding market for AI-enabled systems and applications means that a variety of workloads will need to be processed on a variety of systems. With workload diversity, and a shift from primarily high-end model training on supercomputer-class systems to a distribution of inference-based AI workloads across many different sizes and types of systems, there is a wide range of potential markets that chip vendors can address. We expect a very healthy competitive situation to emerge over the next 1-2 years and continue for years to come. While very high-end data center solutions (e.g., Nvidia's) will continue to grow, the expansion will be much greater in many other parts of the market, lifting a variety of chip vendors in the move to AI everywhere.

J.Gold Associates provides advisory services, syndicated research, strategic consulting, and in-context analysis to help its clients make important technology choices, improve product deployment decisions, and refine go-to-market strategies. Email - info (at) jgoldassociates (dot) com

Comments

Omar ElShazli (Campus Strategist @Perplexity | EE @AUC), 9 months ago:

Insightful
Bill Sheppard (Business Development and Partnerships in IoT, Smart Home, Consumer Electronics/Media, and Voice), 9 months ago:

Any thoughts about Groq? They have shown incredibly impressive performance.
