The Rise of Necessity-Driven Cooling: What Cold Plate Adoption Means for Immersion

One year ago, in my keynote at the OCP Global Summit, I published my forecast data for the liquid cooling market with a compound annual growth rate (CAGR) approach, projecting strong and steady growth based on existing sales cycles, adoption trends, and expected market volume.

But I was wrong…

Despite my best efforts, the market was on the brink of a breakthrough no standard forecast could capture.

Source: Cartoonstock

At the time, I was fully aware of the upcoming shift to cold plate technology, which was poised to address the emerging thermal demands of high-performance IT equipment. Despite this insight, I deliberately refrained from incorporating a step change into my projections. Instead, I followed the traditional growth model, opting for a “gradual” curve (a CAGR of 37-52%) rather than anticipating the rapid acceleration that hindsight has now proven inevitable.

Looking back, this decision reveals a key challenge in forecasting: the tension between incremental growth assumptions and the potential for disruptive inflection points. With recent data showing a sharp surge in cold plate adoption, it’s clear that necessity-driven demand can rewrite growth trajectories almost overnight. This article re-examines that original forecast, overlaying the realities of the cold plate adoption spike to illustrate how the industry is accelerating toward a new cooling paradigm, one that points to a crystal-clear inflection point for immersion cooling in the near future.

Liquid Cooling’s Path Forward: Beyond Organic Growth

As the demand for data center performance surges, amid a confluence of catalysts ranging from GPU power consumption to generative AI workloads to increased rack densities, liquid cooling has rapidly moved from a niche innovation to a critical infrastructure requirement. While extensive analysis exists on the drivers behind this shift, including the limitations of air cooling, the rise of high-density computing, and sustainability imperatives, we will focus on a forward-looking perspective: the adoption forecasts for liquid cooling through 2030.

With these industry drivers well documented, we will analyze the trajectory for two key technologies, cold plate and immersion cooling, as they navigate distinct adoption paths. Using recent data on market revenues and uptake trends, we will explore why the traditional CAGR-based projections may not capture the rapid acceleration needed to meet the sector’s evolving thermal demands by 2030.

Cold Plate Cooling: An Unanticipated Surge in Demand

The adoption of liquid cooling technologies, while once limited to specialized environments, has seen substantial growth in recent years. Using traditional metrics like CAGR and revenue cycles, we observe a rapid but steady increase in cold plate cooling deployments, with adoption expanding beyond early adopters to broader segments across data centers. This is largely driven by the need to manage higher thermal densities in modern IT hardware, particularly with advances in AI and high-performance computing (HPC).

Source: Promersion

However, this year, we’ve seen an unprecedented acceleration. The market for cold plate cooling has effectively quadrupled, reaching a value of $2 billion, and if the NVIDIA Blackwell series and the associated NVL72 platforms ship in time, it is likely to surpass that mark. This sharp surge raises a critical question: why was this exponential growth never forecast, despite the apparent logic behind it?

To understand this, we look beyond typical adoption metrics like CAGR and examine the underlying drivers of this spike. Unlike traditional incremental demand, this surge represents a fundamental shift: cold plate cooling has moved from enhancement to necessity. The tipping point has arrived, driven by chip ODMs who now require cold plate technology to meet the thermal performance needs of their hardware. In industry terms, this shift reflects a “technology requirement” rather than “organic demand,” as this rapid uptake is fueled by a critical push to address escalating thermal constraints.

This context sets the stage for a closer look at how liquid cooling forecasts can evolve to reflect the emerging necessity rather than simply projecting gradual, organic growth trends.

Why Traditional Forecasts Miss Necessity-Driven Growth

Despite years of anticipation that the industry would reach this threshold, most forecasts failed to predict the timing or scale of this shift. The failure is rooted in methodology: traditional forecasts typically rely on historical data and organic growth trends to estimate future demand, which works well in stable or gradually evolving markets. These models, however, often lack the flexibility to factor in abrupt demand shifts and struggle to incorporate “step changes,” where demand is no longer driven by gradual market adoption but by urgent technological needs or constraints. The sudden demand for cold plate cooling, driven by the thermal performance requirements of chip manufacturers, is a prime example of such a step change.

Additionally, standard forecasting metrics, like CAGR, assume a degree of consistency in growth, which does not capture disruptive spikes or tipping points. Because of this, models can fail to predict sudden accelerations that arise from necessity rather than market-driven adoption.
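
The gap between the two models can be sketched in a few lines of Python. The 37-52% band is from the original forecast above; the $0.5B base-year market size and the four-fold step are illustrative assumptions, not published figures:

```python
# Sketch: compound-growth (CAGR) projection vs. a necessity-driven step change.
# The 37-52% band matches the forecast discussed above; the $0.5B base-year
# market size and the four-fold step are illustrative assumptions.

def cagr_projection(base: float, rate: float, years: int) -> list[float]:
    """Project a market size forward at a constant compound annual growth rate."""
    return [base * (1 + rate) ** y for y in range(years + 1)]

base_revenue = 0.5  # $B, assumed starting market size
low = cagr_projection(base_revenue, 0.37, 3)   # conservative bound of the band
high = cagr_projection(base_revenue, 0.52, 3)  # aggressive bound of the band

# A step change: the market quadruples in year 1 instead of compounding.
step = [base_revenue, base_revenue * 4]

print(f"year 1, CAGR band: ${low[1]:.2f}B-${high[1]:.2f}B")
print(f"year 1, step change: ${step[1]:.2f}B")
```

Under these assumptions, even the aggressive bound of the band reaches only about $0.76B after one year, while the step change lands at $2B, which is why a smooth CAGR curve cannot represent a necessity-driven surge.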

The Shift from Demand-Pull to Requirement-Driven Adoption

The concept of demand driven by necessity rather than organic growth is well supported in markets influenced by technological constraints. When certain technologies become critical for maintaining or advancing performance (e.g., the need for cold plate cooling to handle high TDP CPUs and GPUs), demand becomes less flexible. In this situation, adoption occurs not because of incremental interest, but because there is no viable alternative, causing an accelerated and unavoidable market response.

This shift from demand-pull (driven by market interest) to supply-push (driven by necessity or requirement) can be difficult for traditional forecasts to incorporate because they often lack a framework for technology-driven mandates.

Rethinking Forecasts with Scenario Planning

A fair counterargument is that advanced forecasting offers tools, such as scenario analysis and stress testing, that can help identify potential tipping points. Scenario-based forecasts consider various outcomes (e.g., a high-demand scenario triggered by a critical technological need). However, this approach is still less commonly used in standard market forecasts, which tend to prioritize stable growth projections.

Source: Promersion

If scenario analysis were more widely applied, forecasters might have included a “technology requirement” scenario that anticipated accelerated adoption if chip ODMs reached thermal constraints. However, implementing these scenarios requires deep industry insight and data, which are not always readily available to analysts.
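
As a sketch of what such scenario planning could look like, the snippet below blends a handful of hypothetical scenarios into a probability-weighted forecast. The scenario names, weights, and revenue figures are all assumptions for illustration, not Promersion data:

```python
# Sketch: probability-weighted scenario forecast. All names, weights, and
# revenue figures are illustrative assumptions, not published data.
scenarios = {
    # name: (probability, projected year-1 market size in $B)
    "organic_growth":         (0.60, 0.7),  # CAGR-style continuation
    "technology_requirement": (0.30, 2.0),  # ODM thermal mandate forces a step change
    "slowdown":               (0.10, 0.4),  # supply-chain or macroeconomic drag
}

# Sanity check: the scenario probabilities must sum to 1.
assert abs(sum(p for p, _ in scenarios.values()) - 1.0) < 1e-9

expected = sum(p * size for p, size in scenarios.values())
spread = sorted(size for _, size in scenarios.values())

print(f"probability-weighted market size: ${expected:.2f}B")
print(f"range across scenarios: ${spread[0]:.1f}B to ${spread[-1]:.1f}B")
```

The value of the exercise is less the blended number than the explicit range: publishing a “technology requirement” scenario alongside the base case makes the tipping point visible without committing the headline forecast to a hockey stick.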

The Challenge of Credibility in Rapid Growth Forecasts

Moreover, investors and stakeholders have often seen overly optimistic growth projections fall short, especially those with “hockey stick” shapes that suggest explosive, sustained growth. Such patterns are frequently associated with overly ambitious technology projections or speculative market dynamics, which may raise concerns about reliability. Rapid adoption forecasts sometimes fail to consider real-world constraints, such as supply chain bottlenecks, infrastructure readiness, and economic viability, which makes them seem unrealistic or dismissive of practical limitations.

Source: Cartoonstock

Balancing Realism and Insight in Scenario Forecasts

Tying scenario forecasts to tangible, imminent shifts (like ODM mandates for cold plate cooling due to thermal performance thresholds) can provide more grounding. This way, the “necessity-driven” adoption forecast is linked directly to observable, quantifiable constraints rather than hypothetical interest alone.

On the other hand, industry experts have been signaling the imminent breakthrough of cold plate cooling for years, with organizations like ASHRAE and the Open Compute Project issuing urgent calls for the industry to increase its readiness. Despite these warnings, major IT equipment brands showed little to no visible adoption of cold plate technology in their mainstream systems, leaving the calls largely unheeded. Even NVIDIA (or its representatives), a leader in high-performance computing, denied the need for cold plate technology until just a few months before launching its Blackwell GPUs, hardware that now requires cold plates to function. This gap between urgent notification and actual adoption raises a key question: how can we accurately assess a technology breakthrough when it is driven by necessity rather than market enthusiasm?

The case of cold plate cooling offers valuable lessons, showing that even when a technology’s critical need is clear, industry adoption often lags until it becomes unavoidable.

Understanding Platform Density’s Impact on Cooling Needs

Cold plate cooling excels in delivering pinpoint precision cooling to high-power chips, such as CPUs and GPUs, which are the primary sources of heat in most IT systems. However, as platform density has evolved, the thermal demands are no longer confined to xPU components alone. Increasingly, non-xPU components like network cards, memory, and storage accelerators are also significant contributors to system-wide heat, creating a more dispersed thermal load that traditional methods struggle to manage.

By examining the historical and current thermal power requirements of various components, we assess how platform density, measured as total power consumption per server excluding xPUs, has developed over time.

Source: Promersion

The data above highlights key developments in power requirements to date:

  • Power Delivery Losses: Voltage Regulation Modules (VRMs) and Power Delivery Networks (PDNs) can add 66W for a 500W CPU and up to 157W for a 1200W GPU, representing non-negligible heat sources in high-density platforms.
  • Network Components: With SmartNICs, FPGAs, and high-bandwidth Ethernet cards drawing between 60W and 225W per port, the network layer alone imposes significant thermal demands.
  • Memory Modules: High-bandwidth memory (HBM) modules and DDR memory have also become major heat contributors, consuming 50W per HBM module and up to 240W across multiple DDR modules.
  • Storage and Power Supply: Storage accelerators and NVMe SSDs add heat, and even the most efficient PSUs contribute to the system’s thermal load with around 53W per kW of total system power.
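
To see how these figures accumulate, the sketch below tallies a rough non-xPU thermal budget for one hypothetical server, using the per-component numbers quoted above. The component counts and the 500W CPU plus dual 1200W GPU mix are illustrative assumptions, not a specific vendor platform:

```python
# Sketch: cumulative non-xPU thermal load for one hypothetical server with
# one 500W CPU and two 1200W GPUs, using the per-component figures above.
# Component counts are illustrative assumptions, not a vendor platform.

non_xpu_watts = {
    "vrm_pdn_cpu":  66,       # power-delivery loss for the 500W CPU
    "vrm_pdn_gpus": 2 * 157,  # power-delivery loss for two 1200W GPUs
    "network":      2 * 225,  # two high-bandwidth ports at the upper bound
    "hbm":          4 * 50,   # four HBM modules at 50W each
    "ddr":          240,      # DDR modules, upper bound from the text
}

subtotal = sum(non_xpu_watts.values())
xpu_watts = 500 + 2 * 1200  # the xPUs themselves

# PSU losses scale with system power: roughly 53W per kW of load.
psu_loss = 53 * (subtotal + xpu_watts) / 1000

total_non_xpu = subtotal + psu_loss
total_input = subtotal + xpu_watts + psu_loss

print(f"non-xPU thermal load: {total_non_xpu:.0f}W")
print(f"share of total input power: {total_non_xpu / total_input:.0%}")
```

Under these assumptions, roughly a third of the server's input power is dissipated outside the xPUs, heat that per-chip cold plates never touch.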

This accumulated data shows that platform density, particularly the cumulative thermal load from non-xPU components, has grown considerably as each generation of components brings higher power and thermal requirements. Plotting these cumulative power demands over time provides a clear visualization of how platform density has evolved to date, marking an ongoing shift toward higher total system loads. Unlike traditional cooling methods, which are limited in scope to specific high-power components, immersion cooling is uniquely equipped to manage this broad, system-wide thermal demand.

Source: Promersion

By analyzing the upward trend in platform density, we can observe that cooling requirements are shifting from isolated hotspots to a more comprehensive system-wide need. This evolution points toward a future where immersion cooling will both enhance cooling performance and be increasingly necessary for high-density platforms, effectively managing the growing heat output across the full array of components.

Efficiency Advances: A Potential Counterbalance

Ongoing advances in power efficiency could alter this trajectory for non-xPU components. Innovations such as shifting from 12V or 48V to 400V power distribution, adopting low-power optics (LPO) over traditional optical components, and reducing digital signal processing (DSP) power overhead could lower the power draw of non-xPU components. If adopted at broad scale, these improvements could ease some of the thermal burden outside the xPUs, which might delay the timeline for immersion cooling’s inflection point.

These efficiency advancements may help ease some thermal pressure in the short term, but they do not fundamentally alter the long-term trajectory toward increased overall power density driven by rising performance demands. The cumulative demands from high-density, high-performance systems continue to push beyond traditional cooling’s capabilities. Even if certain efficiency optimizations emerge, immersion cooling remains uniquely suited to address the distributed thermal requirements of next-generation data centers, where overall system density and heat generation trends are expected to remain high.
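
A rough sensitivity check illustrates the point; all baseline wattages and savings percentages below are hypothetical assumptions, not measured data:

```python
# Sketch: sensitivity of the non-xPU thermal load to efficiency advances.
# Baseline wattages and savings percentages are hypothetical assumptions.
baseline_watts = {"power_delivery": 380, "network": 450, "memory": 440}

assumed_savings = {
    "power_delivery": 0.15,  # assumed gain from 400V distribution
    "network":        0.30,  # assumed gain from LPO replacing DSP-heavy optics
    "memory":         0.00,  # no major efficiency lever assumed here
}

improved = {k: w * (1 - assumed_savings[k]) for k, w in baseline_watts.items()}
delta = sum(baseline_watts.values()) - sum(improved.values())

print(f"load reduced by {delta:.0f}W "
      f"({delta / sum(baseline_watts.values()):.0%} of the non-xPU total)")
```

Even these fairly generous assumptions recover only a modest fraction of the non-xPU load, consistent with the argument that efficiency gains defer the inflection point rather than cancel it.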

Preparing for a Necessity-Driven Future in Liquid Cooling

The rapid adoption of cold plate technology has shown us that, in an industry built on incremental change, sudden inflection points are not just possible—they’re necessary. This shift from gradual demand growth to an urgent requirement highlights the importance of adapting our forecasting methods to account for necessity-driven adoption, especially as platform densities and thermal loads continue to rise.

Immersion cooling, once viewed as a niche solution, is now positioned to address the broad, distributed thermal demands of next-generation data centers. As we’ve seen with cold plate cooling, the transition from optional enhancement to essential infrastructure happens swiftly when technological demands hit a tipping point. For immersion cooling, that tipping point is on the horizon, driven by the cumulative heat loads of high-density systems that go beyond the capabilities of traditional cooling solutions.

Source: Promersion

Looking forward, we, as the liquid cooling industry and the data center ecosystem as a whole, must prepare for these shifts by adopting dynamic forecasting models that anticipate and incorporate technological necessities. Waiting for gradual, organic growth to drive adoption will no longer be sufficient in a landscape where performance requirements can redefine market demand overnight.

By recognizing these trends and embracing a readiness mindset, the industry can not only meet the demands of high-performance computing but also maintain the flexibility to adapt to future technological shifts. Liquid cooling is no longer just a forecast; it is the next essential step in sustaining the data centers of tomorrow.


Questions: What lessons can we learn from this breakthrough? How can the industry better prepare for the next inflection point in cooling? What major technologies or developments could influence the inflection point for immersion cooling?

Share your insights below and join me next month as we explore the diverse cooling ecosystem beyond the inflection points.


About Promersion

Promersion is a trusted partner for organizations navigating the dynamic landscape of liquid cooling technologies. Committed to collaboration, Promersion works with a wide network of industry stakeholders to accelerate adoption, drive innovation, and establish best practices across the liquid cooling ecosystem. By bridging the gap between technological advancements and real-world business strategies, Promersion empowers the industry to unlock the full potential of liquid cooling technologies.

Acknowledgments

Special thanks to Donna Prideaux, MPS (KeyBank Capital Markets), for her editorial contributions, and to Jaime Comella Gómez-Aller for proofreading and providing valuable insights.

Image Use Policy:

Images by Promersion may be freely used with appropriate credit. Licensed images from Cartoonstock are not available for reuse.

Recreating Graphs

Readers are welcome to recreate graphs based on the data presented in this article for non-commercial purposes, provided proper credit is given to Promersion. Please ensure that any recreated visuals accurately represent the data and context provided here.

Source Data Access

The data presented in this article is intended to provide high-level insights. Access to more detailed source data and context is reserved for Promersion clients as part of Promersion's consulting services. For inquiries, please contact me directly.

#empoweringopen #ocpvolunteers #edge #OCP #datacenter #innovation #opensource #sustainable #sustainability #datacentres #datacentre #DontFearLiquidCooling #immersion #immersioncooling #liquidcooling #netzero #efficiency #heatreuse #supplychain #PlanforIT

Rolf Brink

Driving the global growth and adoption of liquid cooling technologies for data centers

The second article of this series can be found here: https://www.dhirubhai.net/pulse/projecting-cooling-landscape-2030-rolf-brink-mkjrc/
