AMD Shows its Breadth of Processor Coverage Powering All Areas of AI

AI is not one market served by one class of processor. It covers a wide gamut of processing needs and continues to expand from today's sweet spot of high-end, massive training systems running high-performance GPUs, to inference-focused edge servers relying primarily on accelerated CPUs, to next-generation NPU-enhanced chips for AI-enabled PCs. It's not easy for a single company to cover the complete spectrum, but AMD is closer to being such a full-spectrum supplier than any other major player.

At AMD’s AI Platform day, the company showed a variety of products and services that significantly increase its competitiveness in the marketplace and put it in a position to challenge the incumbent leaders in AI solutions across servers, software, networking and the AI PC. With its breadth of products, it leads others in covering most of the expanding markets for AI solutions, making it a strong choice for solution vendors and enterprises and offering the additional processor choice necessary for a healthy ecosystem. This is an evolving market in which vendors continue to update their solutions at a fast pace, and AMD plans to maintain a yearly cadence of product updates to remain competitive.

Server/Cloud

AMD is providing a strong alternative to Intel with its EPYC CPUs, which continue to challenge Intel Xeon in both on-prem and cloud systems. Most of the hyperscalers have EPYC solutions available for their customers, and AMD claims the new EPYC devices will provide up to a 7X consolidation potential over current on-prem servers of 4-5 year old vintage. This provides a major TCO advantage if companies choose to upgrade. TCO advantages can be realized by upgrading any older hardware, so while this is an impressive cost-savings potential, it is equally true of competitors' processor upgrades; newer-generation gains in compute and power efficiency are not unique to AMD. But AMD remains a force in servers and has increased market share over the past 2-3 years with premium products and large customer wins (e.g., Meta says it has 1.5M+ EPYC processors deployed).
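To make the consolidation math concrete, the sketch below shows how a claimed 7:1 ratio flows into server count and rough power savings. Only the 7:1 ratio comes from AMD's claim; the fleet size and per-server wattages are hypothetical placeholders, not vendor or customer data.

```python
# Illustrative arithmetic only: how a consolidation ratio translates into server
# count and approximate power savings. The 7:1 ratio reflects AMD's claim; the
# fleet size and wattages below are assumptions, not measured figures.
legacy_servers = 700        # hypothetical fleet of 4-5 year old servers
consolidation_ratio = 7     # AMD's claimed "up to 7X" consolidation
legacy_power_w = 500        # assumed average draw per legacy server (watts)
new_power_w = 800           # assumed average draw per new EPYC server (watts)

new_servers = legacy_servers / consolidation_ratio
power_saved_kw = (legacy_servers * legacy_power_w - new_servers * new_power_w) / 1000

print(f"{legacy_servers} legacy servers -> {new_servers:.0f} new EPYC servers")
print(f"Approximate power reduction: {power_saved_kw:.0f} kW, before cooling overhead")
```

The actual TCO case, of course, also depends on licensing, rack space, cooling and migration costs, which this simple sketch omits.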

AI

AMD’s new Instinct MI325X, coming in Q4 2024, is positioned as a direct challenger to Nvidia's industry-leading H100/H200 processors, especially since its performance per watt and AI training performance are compelling. It has garnered attention from all of the hyperscalers, who have already deployed the earlier MI300X as an alternative to Nvidia processors for customers who want a more cost-competitive option. It is also a viable alternative given the continuing limited availability of Nvidia processors. Although AMD showed some impressive test results compared to the H-series devices, it is not yet clear how it will compare to the just-launching Nvidia Blackwell series. It's likely the “leapfrogging” of performance will continue, as AMD has an enhanced processor in the pipeline (the MI350, slated for launch in 2H 2025). Having credible alternatives to single-vendor dominance is a positive for the entire marketplace.

Software and Networking

AMD has adopted an open approach to the software frameworks powering its GPUs. With the ROCm software stack, it claims to be able to easily run nearly any model (e.g., Hugging Face has 1M+ AMD-optimized models) and framework (e.g., PyTorch, TensorFlow). AMD rightly understands that the move to open industry standards is required to advance its market position and the AI solutions market in general (Intel is taking a similar approach). This is a direct challenge to the proprietary nature of Nvidia's software frameworks. AMD is also taking an open approach in its networking solutions with its Pensando programmable DPUs, emphasizing the market's move toward high-speed Ethernet connectivity rather than proprietary interconnects such as InfiniBand. We expect higher-speed Ethernet options to become the leading choice in the next 2-3 years, largely replacing proprietary options.
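As a minimal sketch of what that portability claim means in practice (not AMD sample code): the ROCm build of PyTorch exposes the same "cuda" device interface as Nvidia builds, so standard Hugging Face code can run unchanged on an AMD Instinct GPU. The model name below is chosen only as a small illustrative checkpoint.

```python
# Minimal sketch, assuming a ROCm-enabled PyTorch install plus the transformers
# library: the ROCm build reports the AMD GPU through the standard "cuda" device,
# so unmodified Hugging Face code runs on Instinct hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"  # "cuda" also covers ROCm builds

model_name = "gpt2"  # any Hugging Face checkpoint; gpt2 used here only as a small example
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

inputs = tokenizer("AMD Instinct accelerators can run", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```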

AI PC

AMD continues to increase its market share in the PC space, primarily at Intel’s expense (Qualcomm-powered AI PCs are still a very small part of the market). Its Ryzen processors have become more acceptable to enterprise users, and AMD’s focus on the AI PC with its new Ryzen AI PRO series and 50 TOPS of NPU capability sets an industry-leading benchmark. However, it’s important to note that TOPS alone is not necessarily the most appropriate performance measurement, as overall system design and implementation have a major effect on AI performance. But just as in the GPU area, a viable alternative processor in this space is good for the overall marketplace.

Bottom Line:

AMD is aggressively pursuing its strategy of being a full-breadth processor provider with a heavy emphasis on the emerging AI market. It is successfully positioning itself both as an alternative to Intel and Nvidia and as an innovator in its own right. Will this increased competition ultimately bring the high cost of AI chips down to earth? Probably not in the short term, as there is still more demand than supply. We do expect AMD to perform well and increase market share, although the marketplace is changing continuously and its competitors are releasing their own updated and compelling products. But organizations looking for enhanced, leading-edge performance would do well to consider AMD products for their AI computing needs.


Copyright © 2024 J.Gold Associates, LLC.

J.Gold Associates provides advisory services, syndicated research, strategic consulting and in-context analysis to help its clients make important technology choices, enable improved product deployment decisions and sharpen go-to-market strategies. Join our mailing list to receive updates on our research and overviews of our reports. Email us at: info (at) jgoldassociates (dot) com
