The Promise and Reality of Generative AI: Goldman Sachs In-Depth Report
Janet Brandon
Technical SEO Specialist at Experian, author and owner of Green Fingers blog, and Author of 'Mastering ChatGPT: for Marketers, Content Writers and SEO Professionals'
Goldman Sachs recently released an in-depth and rather scathing look at the current state and future outlook of generative AI, which is worth reading in full. Here's an overview of the key points.
Recently, the transformative potential of generative AI has been a hot topic, with promises to revolutionize companies, industries, and societies. This optimism has driven tech giants, other companies, and utilities to commit an estimated $1 trillion in capital expenditure over the coming years, spanning data centers, chips, AI infrastructure, and enhancements to the power grid. Despite this massive spending, however, the tangible benefits of AI remain elusive, with efficiency gains among developers the primary achievement so far. Even Nvidia, a company that has reaped significant benefits from AI, has seen its stock correct sharply.
One of the critical voices in this debate is Daron Acemoglu, an Institute Professor at MIT. Acemoglu is notably skeptical about the immediate economic benefits of AI. He estimates that only a quarter of AI-exposed tasks will be cost-effective to automate within the next decade, meaning AI will impact less than 5% of all tasks. Contrary to the belief that technologies become cheaper and more effective over time, Acemoglu argues that AI advancements may not occur as quickly or impressively as many expect. He also questions whether AI adoption will create new tasks and products, noting that such impacts are "not a law of nature."
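To see how the sub-5% figure follows, here is the rough arithmetic. Note that the article states only the conclusion; the ~20% exposure share used below is an illustrative assumption, not a figure from the text:

```python
# Illustrative arithmetic behind Acemoglu's estimate. The 20% exposure
# share is an assumption for illustration; the article only gives the
# conclusion (under 5% of all tasks).
exposed_share = 0.20        # assumed fraction of all tasks exposed to AI
automatable_share = 0.25    # "only a quarter of AI-exposed tasks" (from the article)

affected = exposed_share * automatable_share
print(f"Share of all tasks affected: {affected:.0%}")  # prints "Share of all tasks affected: 5%"
```

Any lower exposure share would push the affected fraction even further below 5%, which is consistent with the hedged "less than 5%" phrasing.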
It is true that generative AI holds immense potential to change scientific discovery, research and development, innovation, new product and material testing, and more. However, these transformative changes won't happen quickly. Over the next ten years, AI technology is more likely to increase the efficiency of existing production processes by automating certain tasks or making workers who perform these tasks more productive. Estimating the productivity and growth gains from AI in the short term depends entirely on the number of production processes the technology will impact and how much it increases productivity or reduces costs.
The Reality of AI Impact on Jobs
Acemoglu's perspective is grounded in the reality that many tasks humans perform today, particularly in sectors like transportation, manufacturing, and mining, require real-world interaction and multifaceted skill sets that AI won't be able to materially improve anytime soon. The largest impacts of AI in the coming years will therefore likely center on purely mental tasks, which are non-trivial in number and size but not vast. So while generative AI is expected to automate certain tasks, its impact on complex physical-world jobs will remain limited.
The Fallacy of Scaling Laws
Many in the industry believe that doubling data and compute capacity will double AI capabilities. Acemoglu challenges this view, asking what it even means to double AI's capabilities. For open-ended tasks like customer service or understanding and summarizing text, no clear metric exists to demonstrate that the output is twice as good.
The quality of data also matters, and it's unclear where more high-quality data will come from and whether it will be easily and cheaply available to AI models. Feeding twice as much Reddit data into the next version of GPT may improve its ability to predict the next word in an informal conversation, but it won't necessarily improve a customer service representative's ability to help a customer troubleshoot problems with their video service.
Current AI models, such as large language models (LLMs), have proven impressive, but a significant leap of faith is required to believe that the architecture of predicting the next word in a sentence will achieve capabilities akin to HAL 9000 from "2001: A Space Odyssey." It's almost certain that current AI models won't achieve anything close to such a feat within the next ten years.
Demands on the Power Grid and Supply Chain
A significant concern is whether the US power grid can keep up with the demands of AI development. The projected $1 trillion in AI capital expenditure has yielded little so far, and questions remain about whether this massive spend will ever pay off. Even if AI can generate significant economic benefits, shortages of key inputs, namely chips and power, might prevent the technology from delivering on its promise.
Goldman Sachs US semiconductor analysts Toshiya Hari, Anmol Makkar, and David Balaban argue that chips will constrain AI growth over the next few years. Additionally, the proliferation of AI technology and data centers will significantly increase power demand, which US utilities may not be prepared to meet due to aged infrastructure and regulatory constraints. This could lead to a power crunch that might limit AI's growth.
Expanding the power grid to meet AI demands is a complex and lengthy process. The electric utility industry is highly regulated, and utility companies must undergo a lengthy permitting and approval process before constructing new capacity. Supply chain constraints also pose a significant challenge, as the current supply chain is not prepared for a sudden surge in equipment orders.
The share of total US power demand accounted for by data centers is expected to rise from around 3% today to 8% by 2030, translating into a 15% compound annual growth rate (CAGR) in data center power demand from 2023 to 2030. Roughly $50 billion of investment through 2030, or about $7 billion annually, is needed for new power generation alone. Utility companies will also need to build supporting infrastructure, pushing overall investment even higher.
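As a sanity check on these figures, the share growth alone implies roughly 15% annual growth, assuming total US power demand stays roughly flat over the period (a simplifying assumption; any growth in total demand would push the data center CAGR higher still):

```python
# Back-of-envelope check of the implied growth rate. Assumes total US power
# demand is roughly flat over 2023-2030, a simplifying assumption.
share_2023 = 0.03   # data centers' share of US power demand today (~3%)
share_2030 = 0.08   # projected share by 2030 (~8%)
years = 2030 - 2023

cagr = (share_2030 / share_2023) ** (1 / years) - 1
print(f"Implied data center power CAGR: {cagr:.1%}")  # prints "Implied data center power CAGR: 15.0%"
```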
The Economic and Regulatory Constraints
Affordability is a crucial constraint for utilities, as regulators focus on ensuring that electricity bills remain affordable for residential customers. This puts a cap on the rates utility companies can charge and still get their projects approved. Utilities don't generate a significant amount of free cash flow, so they will need to add debt capacity or issue equity to facilitate this massive investment.
Jim Covello, Goldman Sachs' Head of Global Equity Research, is chiefly concerned that the substantial cost to develop and run AI technology means AI applications must solve extremely complex and important problems for enterprises to earn an appropriate return on investment (ROI). He estimates that the AI infrastructure buildout will cost over $1 trillion in the next several years alone, including spending on data centers, utilities, and applications. So the crucial question is: what $1 trillion problem will AI solve?
Many people compare AI today to the early days of the internet. But even in its infancy, the internet was a low-cost technology that enabled e-commerce to replace costly incumbent solutions. Amazon could sell books more cheaply than Barnes & Noble because it didn't have to maintain brick-and-mortar locations. Three decades on, Web 2.0 is still delivering cheaper solutions that disrupt more expensive incumbents, such as Uber displacing limousine services. The question remains whether AI will ever deliver on the promise so many are excited about today.
The idea that technology typically starts out expensive before becoming cheaper is revisionist history: e-commerce was cheaper from day one, not ten years down the road.
The starting point for costs is also so high that even if they decline, they would have to fall dramatically to make automating tasks with AI affordable. People point to the steep drop in server costs within a few years of their inception in the late 1990s, but the number of $64,000 Sun Microsystems servers required to power the internet transition then pales in comparison to the number of expensive chips required to power the AI transition today. And that comparison doesn't even include the power grid replacement and other supporting costs, which are enormously expensive on their own.
More broadly, people generally overestimate what the technology is capable of today. Covello struggles to believe that it will ever achieve the cognitive reasoning required to substantially augment or replace human interactions. Humans add the most value to complex tasks by identifying and understanding outliers and nuance, in a way that is difficult to imagine a model trained on historical data ever matching.
Future Growth Predictions
While the future of generative AI holds promise, its path is fraught with challenges. Studies show that AI data centers can consume up to ten times more energy than traditional data centers, particularly during their training phase. In Europe, even in a base case scenario, the incremental power consumption expected from AI and traditional data centers over the next decade would be equivalent to the current consumption of countries like the Netherlands, Portugal, and Greece combined.
Realizing generative AI's full potential will require significant investment in AI infrastructure, chips, and power grid enhancements, and the benefits may not be as substantial or immediate as tech giants suggest. The barriers to entry and costs are high, the power grid's ability to meet AI's demands is a critical concern, and regulatory and infrastructure challenges must be overcome. As the industry evolves, it will be essential to approach AI development with a balanced perspective, recognizing both its potential and its limitations, to harness the true power of generative AI and ensure its sustainable growth.
+++++++++++++++++
Follow Janet Brandon on LinkedIn.
Struggling to create quality content that ranks on Google? Get your copy of Mastering ChatGPT: for Marketers, Content Writers and SEO Professionals and start dominating search engine results with ease!
Grab a copy of my new ebooks: Framework Secrets for Effective Copy, and ChatGPT for Business.
Author of GreenFingers.info
Hi there! I am a seasoned digital marketer with a wealth of experience across a range of marketing activities, from SEO, digital marketing, GPT prompting, social media to email automation and ecommerce, with a love for oil painting, reading, houseplants and mountain biking.