Everyone's Making Semiconductors

Excerpt from September 28 Yardeni Research Morning Briefing.

Jackie Doherty & Ed Yardeni

Large tech companies have been jumping into the semiconductor industry. Amazon, Google, Tesla, and others have developed semiconductors for use in their own operations instead of buying all of their semiconductors from Nvidia, Intel, and the like. Custom-made chips tailored to a company’s specific requirements can perform better and cost less than chips bought on the open market.

In the case of AI server chips, companies undoubtedly are looking to save money by developing alternatives to Nvidia’s chips; Nvidia’s A100 GPUs can sell for $20,000 to $25,000 each on eBay. The costs add up quickly: OpenAI, for example, will need more than 30,000 of Nvidia’s A100 GPUs for the commercialization of ChatGPT, an April 18 article at TheVerge.com reported.
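To put those figures in perspective, here is a back-of-the-envelope calculation using only the numbers cited above; the 30,000-unit count and the $20,000 to $25,000 price range come from the reporting quoted here, not from Nvidia’s list prices:

  # Rough implied hardware spend from the figures cited above (an estimate, not a quote).
  gpu_count = 30_000                      # A100 GPUs reportedly needed by OpenAI
  price_low, price_high = 20_000, 25_000  # per-GPU secondary-market price range, in dollars

  low_total = gpu_count * price_low       # $600 million
  high_total = gpu_count * price_high     # $750 million
  print(f"Implied GPU spend: ${low_total/1e6:,.0f}M to ${high_total/1e6:,.0f}M")

That works out to roughly $600 million to $750 million for the GPUs alone, before servers, networking, and power.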

Here’s a look at the progress some tech companies are making in designing their own chips:

(1) Amazon chips in. Earlier this week, Amazon said it will invest up to $4 billion in Anthropic, an artificial intelligence (AI) firm with an AI chatbot called “Claude 2.” Anthropic will use Amazon Web Services (AWS) as its primary cloud provider, and it will use AWS-designed semiconductors when training the AI models on vast amounts of data.

Anthropic will use AWS Trainium and Inferentia chips to build, train, and deploy future foundation models. The two companies will also collaborate on the development of future Trainium and Inferentia technology. The two chips are considered a less expensive, more accessible alternative to the Nvidia chips used for the same purposes.

Amazon jumpstarted its efforts in chip development in 2015 when it purchased Annapurna Labs, an Israeli startup. Since then, it has produced Graviton and Nitro, chips used in its servers. Now Amazon has an AI package to offer clients. In addition to Anthropic, Amazon can offer clients its Trainium and Inferentia chips; Titan, a large language model; and Bedrock, a service to help developers enhance software using generative AI. Some believe that having its own AI chips—which Microsoft does not have—will become a differentiator for Amazon, an excellent August 21 CNBC article on Amazon’s efforts reported.

(2) Google has AI chips too. Google has custom-developed Tensor Processing Units (TPUs), chips designed to accelerate machine-learning tasks such as image recognition, natural language processing, and predictive analysis. Only Google Cloud customers can access the chips.

Google has also developed Tensor chips for its Pixel phones in conjunction with Samsung. Google is working to design its first fully custom chipset, the Tensor G5, by 2025 without Samsung’s help, a July 7 Tom’s Guide article reported. Taiwan Semiconductor Manufacturing Co. (TSMC) would handle production of the chip.

(3) Tesla has Dojo. Tesla has built the Dojo chip to train AI networks in data centers. The chips are designed and built for “maximum performance, throughput and bandwidth at every granularity,” the company’s website states. The chips power the company’s Dojo supercomputer, first revealed in 2021 and used to train Tesla’s self-driving AI models. The supercomputer can quickly analyze the extensive video collected from the company’s fleet of vehicles, a September 25 article on DriveTeslaCanada.ca reported. The system could also be used in robotics and other autonomous systems. Tesla uses TSMC to manufacture the Dojo chips and reportedly has doubled its order this year, the article stated.

(4) Meta & Microsoft in the mix. Microsoft is developing the Athena AI chip, which could replace Nvidia chips. The project, which began in 2019, reportedly will result in chips that will be made available to Microsoft and OpenAI employees as soon as next year.

Meta is also working on a chip for its AI services. The Meta Training and Inference Accelerator—or MTIA chip—in combination with GPUs purportedly delivers better performance, decreased latency, and greater efficiency, a May 18 article at TheVerge.com reported. It’s not expected to come out until 2025.

(5) Semi industry performance data. The S&P 500 Semiconductors stock price index has risen 66.2% ytd through Tuesday’s close, though it’s down 11.5% from its record high on August 1 (Fig. 1). Nvidia has had a huge impact on the industry this year. Its shares have risen 188.9% ytd through Tuesday’s close. If the company were removed from the S&P 500 Semiconductors stock price index, the index would be up only 23.2% ytd.
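The ex-Nvidia figure follows from a simple weighted-return identity. As a rough consistency check (a sketch that treats the index as a fixed-weight portfolio, ignoring rebalancing and constituent changes during the year), the three returns above imply Nvidia’s weight in the index at the start of the year:

  # index_return = w * nvda_return + (1 - w) * ex_nvda_return, solved for w.
  # A rough approximation; actual index math uses float-adjusted, drifting weights.
  index_return   = 0.662   # S&P 500 Semiconductors, ytd
  nvda_return    = 1.889   # Nvidia, ytd
  ex_nvda_return = 0.232   # index excluding Nvidia, ytd (as cited above)

  w = (index_return - ex_nvda_return) / (nvda_return - ex_nvda_return)
  print(f"Implied Nvidia weight at the start of the year: {w:.1%}")  # roughly 26%

On these implied figures, Nvidia began the year at roughly a quarter of the index, which is how its 188.9% gain pulls the index return from 23.2% up to 66.2%.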

Semiconductor stocks have rallied in advance of the rebound anticipated in revenues and earnings growth next year. The industry is expected to see its revenue growth flip from a decline of 2.1% this year to an increase of 17.4% in 2024 (Fig. 2). Likewise, earnings are expected to decline 7.5% this year but surge 37.0% in 2024 (Fig. 3). If Nvidia’s earnings were eliminated from the Semiconductors industry, the industry’s forward revenue growth rate would drop to 7.9% from 15.6%, and its forward earnings growth would drop to 20.6% from 35.5%.

The Semiconductors industry’s forward P/E peaked at 29.5 in mid-July, and it currently stands at 22.7 (Fig. 4). But as earnings rebound next year, the cyclical industry’s forward P/E should fall. If Nvidia’s forward P/E of 28.3 were eliminated from the calculation, the industry’s forward P/E would be only 19.0.
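For readers who want the mechanics: an industry forward P/E is aggregate market capitalization divided by aggregate forward earnings, not a simple average of constituent P/Es. As a rough illustration (a sketch that backs Nvidia’s implied market-cap share out of the three multiples above; the actual index weights are float-adjusted and not shown here):

  # Ex-Nvidia P/E identity: (1 - s) / (1/pe_industry - s/pe_nvda) = pe_ex_nvda,
  # where s is Nvidia's share of the industry's market cap. Solve for s.
  pe_industry = 22.7
  pe_nvda     = 28.3
  pe_ex_nvda  = 19.0

  s = (1 - pe_ex_nvda / pe_industry) / (1 - pe_ex_nvda / pe_nvda)
  print(f"Implied Nvidia share of industry market cap: {s:.0%}")  # roughly half

With Nvidia accounting for roughly half of the industry’s market value on these implied figures, its above-average multiple lifts the aggregate forward P/E from about 19.0 to 22.7.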

Try our research service. See our Predicting the Markets book series on Dr. Ed's Amazon Author Page. Please see our hedge clause.


