Retrofitting vs Building New: Data Centres in the AI Era
Introduction
Data centres (DCs) have evolved dramatically over the past two decades, much like cars. Think of it like buying a car: when you purchase the vehicle, it comes with the finest features and the most power available to consumers. You drive it for 10 years, but along the way, newer and more powerful models keep appearing on the road.
That leaves you with two options: either you retrofit your old car with better features and more power, or you get rid of it and buy a new car built for the era you are in.
Now let us apply the same mental model to data centres. Alongside the exponential rise of artificial intelligence (AI), DCs face mounting pressure to be more powerful, more efficient, and capable of large-scale computing at higher density.
Today’s organizations have the same choice: should they retrofit existing DCs to meet the requirements of AI or invest in brand-new, AI-ready infrastructure?
The Growing Demands of AI on Data Centres
AI has completely changed the operational needs of data centres. Traditional workloads, such as conventional database queries, run on general-purpose CPUs that are far less power-hungry and generate far less heat than AI accelerators. In particular, AI workloads require:
• Higher power draw: Training AI models typically needs 2–3 times more power than standard cloud workloads.
• Advanced cooling: Liquid or immersion cooling becomes attractive because traditional air-cooling methods are often insufficient for AI workloads.
• Scalability and modularity: AI models keep growing, so data centres need to be flexible and scalable.
• High-speed networking: InfiniBand and NVLink interconnects are needed for high-speed data transfer between AI processors.
These demands force organisations to decide whether to upgrade existing infrastructure or invest long-term in a new facility; the rough sizing sketch below shows how quickly the numbers add up.
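To make these demands concrete, here is a minimal Python sketch, assuming purely illustrative figures (roughly 700 W per accelerator, four 8-GPU servers per rack, a 70B-parameter model trained across 64 GPUs), that estimates the power draw of one AI rack and the gradient traffic a ring all-reduce pushes across the fabric on every training step. None of these numbers come from the article; they are placeholders chosen to show the scale involved.

```python
# Illustrative sizing sketch for an AI training rack and its interconnect traffic.
# All figures below are assumptions chosen for illustration, not vendor specifications.

def rack_power_kw(gpus_per_server: int, servers_per_rack: int,
                  gpu_watts: float, overhead_watts_per_server: float) -> float:
    """Approximate IT power draw of one rack, in kilowatts."""
    per_server_watts = gpus_per_server * gpu_watts + overhead_watts_per_server
    return servers_per_rack * per_server_watts / 1000.0


def allreduce_gb_per_step(model_params_billions: float, bytes_per_param: int,
                          num_gpus: int) -> float:
    """Approximate data each GPU moves in one ring all-reduce of the gradients, in GB."""
    model_bytes = model_params_billions * 1e9 * bytes_per_param
    # Ring all-reduce: each rank sends (and receives) roughly 2 * (N - 1) / N of the payload.
    return 2 * (num_gpus - 1) / num_gpus * model_bytes / 1e9


if __name__ == "__main__":
    # Assumed rack: four servers, each with eight ~700 W accelerators plus ~1.5 kW of overhead.
    power = rack_power_kw(gpus_per_server=8, servers_per_rack=4,
                          gpu_watts=700, overhead_watts_per_server=1500)
    print(f"Estimated rack draw: {power:.1f} kW")  # ~28 kW vs ~5-10 kW for a typical CPU rack

    # Assumed job: a 70B-parameter model, 2-byte (fp16) gradients, 64 GPUs.
    traffic = allreduce_gb_per_step(model_params_billions=70, bytes_per_param=2, num_gpus=64)
    print(f"Per-GPU gradient traffic per training step: about {traffic:.0f} GB")
```

Even with these modest assumptions, a single rack lands well above what most air-cooled halls were built to feed, and each training step moves hundreds of gigabytes per GPU, which is why the interconnect matters as much as the power feed.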
Option 1: Retrofitting Existing Data Centers
Retrofitting means upgrading current data centres with AI-ready infrastructure, installing the best available technology in existing rooms. Many enterprises benefit from this option.
Pros of Retrofitting:
1. Lower Cost: Retrofitting can be more economical than constructing a new facility, and it allows companies to reuse existing assets while limiting downtime.
2. Faster Deployment: Upgrading an existing data centre is quicker than building a new one, so businesses can scale their AI capabilities sooner.
3. Sustainability Benefits: Retrofitting reduces electronic waste and supports greener operation by improving energy efficiency instead of tearing down and rebuilding facilities.
4. Better Use of Limited Space: Space for new data centres is scarce in highly developed urban areas, so retrofitting lets companies make the best use of their existing real estate.
Challenges of Retrofitting:
While helpful, retrofitting programmes have several considerable disadvantages:
• Power Limitations: Data centres built even a few years ago were not designed for the power density that AI workloads demand.
• Cooling Constraints: Retrofitting advanced cooling into existing facilities can be prohibitively costly or run into hard technical limits.
• Inefficient Layout: Traditional data centres were not laid out for high-performance AI clusters, so retrofits often end up inefficient.
In most situations, retrofitting offers short-lived benefits rather than long-term gains; the simple feasibility check below illustrates how quickly the power and cooling limits bite.
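As a loose illustration of that point, the sketch below checks how many AI racks a hypothetical legacy hall could actually power and whether its air cooling could cope. The 1.2 MW power budget, 40 kW racks, and 15 kW air-cooling ceiling are assumptions for illustration only; a real retrofit study would use the site's own electrical and airflow data.

```python
# Hedged feasibility sketch: can an existing hall's power and cooling envelope host AI racks?
# The limits below (usable IT power, per-rack draw, air-cooling ceiling) are illustrative
# assumptions, not figures from any particular site.

def max_ai_racks(site_power_kw: float, rack_power_kw: float,
                 air_cooled_limit_kw: float) -> tuple[int, bool]:
    """Return (number of AI racks the power budget allows, whether air cooling is enough)."""
    racks_supported = int(site_power_kw // rack_power_kw)
    air_cooling_ok = rack_power_kw <= air_cooled_limit_kw
    return racks_supported, air_cooling_ok


if __name__ == "__main__":
    # Assumed legacy hall: 1.2 MW of usable IT power, air cooling good to ~15 kW per rack.
    racks, air_ok = max_ai_racks(site_power_kw=1200, rack_power_kw=40,
                                 air_cooled_limit_kw=15)
    print(f"Power budget supports {racks} AI racks")                     # 30 racks
    print("Air cooling is sufficient" if air_ok
          else "Liquid or rear-door cooling retrofit required")          # retrofit required
```

With numbers like these, the electrical budget fills up after a few dozen racks and the cooling plant needs a liquid or rear-door retrofit well before that, which is the pattern behind the challenges listed above.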
Option 2: Building AI-Native Data Centers
For companies with large AI workloads, however, building a new data centre dedicated to AI processing is likely the better option.
Pros of New AI-Native Data Centers:
1. Optimised for AI from Day One: New facilities can be designed around liquid cooling, energy-efficient power distribution, and modular scalability.
2. Future-Proofing: New data centres can be built to accommodate future hardware innovations and very high-density compute clusters as AI continues to evolve rapidly.
3. Sustainability by Design: New facilities can be designed with renewable energy sources, heat-reuse technologies, and water-efficient cooling, aligning with global carbon-reduction goals.
4. Edge Computing & AI Specialisation: Rather than concentrating AI-specific data centres in one place, they can be distributed to strategic locations closer to where data is processed and analysed, reducing latency.
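To put point 3 in rough perspective, the short sketch below compares annual facility energy for a legacy hall against an AI-native design with a lower PUE and district heat reuse. The PUE values (1.6 vs 1.2) and the 40% heat-reuse fraction are illustrative assumptions, not measured figures.

```python
# Illustrative energy comparison between a legacy hall and an AI-native build.
# The PUE values and the heat-reuse fraction are assumptions, not measurements.

HOURS_PER_YEAR = 8760

def annual_facility_mwh(it_load_mw: float, pue: float) -> float:
    """Total facility energy per year: IT load plus cooling and power-distribution overhead."""
    return it_load_mw * pue * HOURS_PER_YEAR


if __name__ == "__main__":
    it_load_mw = 10.0  # assumed steady IT load for both facilities

    legacy = annual_facility_mwh(it_load_mw, pue=1.6)     # assumed legacy, air-cooled hall
    ai_native = annual_facility_mwh(it_load_mw, pue=1.2)  # assumed liquid-cooled new build

    # Assume 40% of the heat rejected by the IT equipment is exported to a district heating loop.
    reused_mwh = it_load_mw * HOURS_PER_YEAR * 0.4

    print(f"Legacy hall:     {legacy:,.0f} MWh/year")
    print(f"AI-native hall:  {ai_native:,.0f} MWh/year "
          f"({legacy - ai_native:,.0f} MWh/year saved)")
    print(f"Heat reused:     {reused_mwh:,.0f} MWh/year")
```

The exact savings depend entirely on the design, but the shape of the comparison is what makes the sustainability case for purpose-built facilities.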
Challenges of Building New AI-Native Data Centers:
• High Capital Costs: Construction of a new facility is highly capital-intensive and requires a large up-front investment.
• Longer Time to Deployment: A new build takes 12–36 months before any AI capacity comes online, whereas retrofitting delivers capacity incrementally.
• Location Constraints: Finding the right site, with adequate power, connectivity, and infrastructure, can be difficult.
Market Trends: A Hybrid Approach
As AI adoption accelerates, the market is tilting towards hybrid solutions that combine retrofitting with building new AI-native DCs.
Some key trends include:
• Google, AWS, Core42 and Microsoft have started building hyperscale DCs focused on AI, adding liquid cooling and modular designs to maximise power efficiency.
• Enterprise data centres are investing in selective retrofits, upgrading critical infrastructure while shifting the most intensive AI workloads to colocation or cloud providers.
• Private and real-estate firms are buying older DCs to repurpose them for general cloud services, while AI workloads move to brand-new, purpose-built facilities.
Deciding the Right Path for Your Business
Several key factors determine whether it is worth retrofitting or building a new AI-ready data centre.
1. Workload Requirements – How much AI processing will your organisation require over the next 5–10 years?
2. Network Fabric – For RDMA connections there are two generic choices of non-blocking network fabric, InfiniBand and Omni-Path; which can your facility support at scale?
3. Budget Horizon – Is retrofitting the cost-effective short-term solution, or is a new build the right choice for the long term? (A simple cost sketch follows this list.)
4. Sustainability Goals – Would a retrofit help achieve carbon neutrality, or is a new build more efficient in the long run?
5. Time to Market – How quickly do you need to scale your AI capacity: right away, or does your strategy allow the longer timeframe of developing a new facility?
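As a loose worked example for points 3 and 5, the sketch below compares cumulative spend on a retrofit against a new AI-native build over a ten-year horizon. Every figure (capex, annual opex, even the horizon itself) is a placeholder assumption; the point is only to show how an option that is cheaper up front can lose its advantage over time.

```python
# Rough decision sketch: cumulative spend on a retrofit vs a new AI-native build.
# Every figure (capex, annual opex, the 10-year horizon) is a placeholder assumption;
# the point is the shape of the comparison, not the specific numbers.

def cumulative_cost(capex_musd: float, annual_opex_musd: float, years: int) -> list[float]:
    """Cumulative spend at the end of each year, in millions of USD."""
    return [capex_musd + annual_opex_musd * year for year in range(1, years + 1)]


if __name__ == "__main__":
    horizon = 10
    # Assumed: the retrofit is cheap up front but costs more to run (power, cooling, inefficiency).
    retrofit = cumulative_cost(capex_musd=30, annual_opex_musd=16, years=horizon)
    # Assumed: the new build costs far more up front but is cheaper to operate.
    new_build = cumulative_cost(capex_musd=110, annual_opex_musd=6, years=horizon)

    crossover = next((year + 1 for year, (r, n) in enumerate(zip(retrofit, new_build)) if n < r),
                     None)
    print(f"Year-{horizon} totals: retrofit ${retrofit[-1]:.0f}M vs new build ${new_build[-1]:.0f}M")
    print(f"The new build becomes cheaper in year {crossover}" if crossover
          else "The retrofit stays cheaper over this horizon")
```

On these particular assumptions the retrofit stays ahead for most of the decade before the new build overtakes it, which is exactly the tension that pushes many organisations towards the hybrid approach described above.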
Conclusion
One of the defining decisions organisations face in today's digital world is whether to retrofit their current data centre or build a new one. Like cars, data centres have to keep pace: they need to evolve to cater for the advances of technology.
Retrofitting may work in the short run, offering cost savings and fast deployment, but it may not be sustainable over the long term. Building new AI-native data centres, on the other hand, brings future-proofing, efficiency, and headroom for future workloads, but the cost and the time required are substantial.
Ultimately, a hybrid strategy of retrofitting existing facilities while building new, purpose-built ones is emerging as the preferred approach. With AI pushing the limits of computational power, the data centre industry needs to keep up with 2025 demands and be ready for the future of AI innovation.