Electric Power Supply Shouldn’t Slow The Development Of AI Data Centers, & More
Want more research from the ARK Team? Have feedback on our publications? Click here to help inform our content creation.
1. Electric Power Supply Shouldn’t Slow The Development Of AI Data Centers
By: Sam Korus
ARK’s research suggests that power shortages are unlikely to impede the expansion of AI data centers. Indeed, the economics suggest that the higher electricity costs associated with the rapid development of AI data centers will not impact their profitability significantly. Recently, for example, Elon Musk used generators to power xAI's data center in Memphis, Tennessee, bypassing full grid interconnection altogether.
While growth in global electricity production has averaged ~2.7% at an annual rate for the past five years,[1] ARK’s research estimates that incremental demand from AI data centers will add ~0.7 percentage points, pushing growth in global electricity demand to ~3.4% at a compound annual rate through 2030, as shown below.
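To make the arithmetic concrete, the short sketch below compounds both growth rates from a 2023 baseline. The ~29,500 TWh starting point for global generation is an illustrative assumption, not an ARK or Energy Institute figure; only the 2.7% baseline growth and the 0.7-percentage-point AI increment come from the paragraph above.

# Back-of-the-envelope projection of global electricity demand through 2030.
# The ~29,500 TWh 2023 baseline is an illustrative assumption, not an ARK figure;
# the 2.7% baseline growth and 0.7-point AI increment come from the text above.
BASELINE_TWH_2023 = 29_500          # assumed global generation in 2023 (TWh)
BASELINE_GROWTH = 0.027             # ~2.7% historical annual growth
AI_INCREMENT = 0.007                # ~0.7 pp of incremental demand from AI data centers
YEARS = 2030 - 2023

without_ai = BASELINE_TWH_2023 * (1 + BASELINE_GROWTH) ** YEARS
with_ai = BASELINE_TWH_2023 * (1 + BASELINE_GROWTH + AI_INCREMENT) ** YEARS

print(f"2030 demand without AI data centers: {without_ai:,.0f} TWh")
print(f"2030 demand with AI data centers:    {with_ai:,.0f} TWh")
print(f"Incremental demand attributable to AI: {with_ai - without_ai:,.0f} TWh")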
For perspective, electricity generation in China has increased 5.7% at an annual rate over the past five years, as shown below. Indeed, according to our estimates, in 2023 alone China installed more electric generating capacity than will be needed to meet all of the likely incremental global demand from AI data centers in 2030.[2]
According to ARK’s research, electricity accounts for only ~9% of total AI data center costs, leaving ample room for companies to invest in expedited, non-grid power solutions without disrupting data center economics—especially given the high returns on investment that we expect from AI advancements.
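A quick sensitivity check illustrates the point: because electricity is a small slice of total cost, even a steep premium for non-grid power moves the total only modestly. In the sketch below, the 2x price premium is a hypothetical scenario; only the ~9% cost share comes from ARK’s research.

# Simple sensitivity check: if electricity is ~9% of total AI data center cost,
# how much does a premium for expedited, non-grid power raise the total?
# The 2x price premium is a hypothetical scenario, not an ARK estimate.
ELECTRICITY_SHARE = 0.09   # ~9% of total data center cost (ARK estimate cited above)
PRICE_PREMIUM = 2.0        # hypothetical: non-grid power costs 2x grid power

new_total = (1 - ELECTRICITY_SHARE) + ELECTRICITY_SHARE * PRICE_PREMIUM
print(f"Total cost increase: {(new_total - 1) * 100:.1f}%")   # -> 9.0%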
2. Doubling Down On Its Robotaxi Future, Tesla Took A Giant Step Forward With Full Self-Driving v12.5
By: Tasha Keeney, CFA & Daniel Maguire, ACA
Last week, Tesla released Full Self-Driving (FSD) v12.5 to a select group of customer vehicles, delivering a hands-free driving experience powered by its latest Hardware 4 (HW4) inference chip, manufactured by Taiwan Semiconductor Manufacturing Company (TSMC). With 5x the parameter count of previous FSD models,[3] v12.5 is a dramatic upgrade in performance, providing smooth, uninterrupted, and confident AI-controlled drives for 45 minutes or longer, according to early users.[4]
ARK’s research suggests that the next update, FSD v12.5.X, will offer more step-function changes in performance, as it integrates the highway and city driving software stacks into a single end-to-end solution. Tesla will continue to reduce the intervention rate—how often drivers need to take control manually—before launching its robotaxi network.[5]
In last week’s earnings call, Musk emphasized that Tesla plans to globalize its autonomous driving technology by seeking regulatory approval for FSD deployment in Europe, China, and other countries.[6] He also projected that full autonomy should be achievable by late 2024 and highly probable in 2025. Similarly, ARK's updated Tesla valuation[7] model projects 2025 as the most likely year in which Tesla will launch a robotaxi service. Our conservative model suggests a ~40% probability of slippage into 2026, as shown below. We look forward to monitoring FSD’s progress toward full autonomy.
3. Have Open-Source Models Hit The Frontier Of AI Capability?
By: Frank Downing
Last week, Meta released[8] its latest group of AI models, Llama 3.1. Sized at 405 billion parameters, Llama 3.1 405B is the largest open model in the world and the first to beat leading models from OpenAI and Anthropic. As shown in the table below, Llama 3.1 405B scored consistently high on key performance benchmarks.[9]
To broaden access to Llama 3.1’s models, Meta partnered with established companies in the cloud computing and AI services spaces, including Amazon, Databricks, Dell, Nvidia, Groq, IBM, Google, Microsoft, Scale AI, and Snowflake. Beyond hosting inference application programming interfaces (APIs) for Llama, those providers will help enterprises fine-tune Llama 3.1, implement safety guardrails, and generate synthetic data—the last a key component of the training strategy[10] that boosted Llama 3.1’s performance.
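In practice, many of these hosting partners expose Llama 3.1 behind OpenAI-compatible chat-completion endpoints, so moving an application onto an open model can be closer to a configuration change than a rewrite. The sketch below shows the general pattern; the base URL, environment variable, and model identifier are placeholders that vary by provider, not documented values.

# Minimal sketch of calling a hosted Llama 3.1 405B endpoint.
# Assumes the provider exposes an OpenAI-compatible chat-completions API;
# the base_url, env var, and model name below are placeholders that vary by provider.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example-provider.com/v1",   # replace with your provider's endpoint
    api_key=os.environ["PROVIDER_API_KEY"],           # provider-issued credential
)

response = client.chat.completions.create(
    model="llama-3.1-405b-instruct",                  # provider-specific model identifier
    messages=[{"role": "user", "content": "Summarize the tradeoffs of open-weight models."}],
    temperature=0.2,
)
print(response.choices[0].message.content)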
Although benchmarks are important measures of performance, speed and cost can be as important or more important, depending on the use case. Independent research from Artificial Analysis[11] suggests that, while scoring highly in overall benchmark metrics, Llama 3.1 405B is slower and more costly to run than other frontier models like Google’s Gemini 1.5 and Anthropic’s Claude 3.5.
Upsizing from 70 billion to 405 billion parameters appears to be the source of the largest Llama 3.1 model's performance edge, as well as its higher costs and slower speeds. All else equal, larger models cost more to run per token. Despite the apparent tradeoffs, the open nature of Meta’s Llama models provides greater optionality to businesses that require higher levels of privacy and control over their AI implementations.
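As a rough rule of thumb, inference compute per generated token for a dense transformer scales roughly linearly with parameter count, which is why the 405B model runs slower and costs more than the 70B variant. The sketch below makes that ratio explicit; it ignores quantization, batching, and hardware differences, so treat it as a ballpark rather than a benchmark.

# Rough first-order estimate: dense-transformer inference FLOPs per token ~ 2 * parameters.
# Ignores quantization, batching, memory bandwidth, and hardware differences.
def flops_per_token(params_billion: float) -> float:
    return 2 * params_billion * 1e9

small, large = flops_per_token(70), flops_per_token(405)
print(f"405B / 70B compute per token: ~{large / small:.1f}x")   # -> ~5.8x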
In our view, Llama 3.1 is an important step forward for the open-source model community. Last week, Mark Zuckerberg presented his vision[12] for an open-source AI future, emphasizing that foundational open-source projects like Linux have become the most performant and secure options. Zuckerberg anticipates a similar future for AI models, with Meta at the forefront: AI models giving back to the community while powering future Meta services. Along those lines, Zuckerberg suggested that Meta’s AI chatbot could be the world’s most popular chatbot by the end of this year.
[1] Energy Institute. 2024. “Statistical Review of World Energy, 73rd Edition.”
[2] Climate Cooperation for China. 2023. “2022 Energy Statistics Show Rapid Development of Renewable Energy in China.” See also Reuters. 2024. “China's installed solar power capacity rises 55.2% in 2023.”
[3] Musk, E. 2024. “We are focusing on just Model Y with HW4 for the initial release...” X.
[4] Whole Mars Catalog. 2024. “Last night as Tesla FSD 12.5 drove me home…” X.
[5] Tesla Newswire. 2024. “Tesla FSD (Supervised) v12.5 is now rolling out…” X.
[6] Seeking Alpha Transcripts. 2024. “Tesla Inc. (TSLA) Q2 2024 Earnings Call Transcript.” Seeking Alpha.
[7] Keeney, T. et al. 2024. “ARK’s Expected Value For Tesla In 2029: $2,600 Per Share.” ARK Investment Management LLC.
[8] Meta. 2024. “Meet Llama 3.1.”
[9] Meta. 2024. “Meet Llama 3.1.”
[10] Llama Team. 2024. “The Llama 3 Herd of Models.” Meta AI.
[11] Artificial Analysis. 2024. “Llama 3.1 Instruct 405B: API Provider Benchmarking & Analysis.”
[12] Zuckerberg, M. 2024. “Open Source AI Is the Path Forward.” Meta.