High efficiency at low speeds
With proper design of the cooling system, the Low Speed Ventilation concept can deliver an excellent data center PUE of 1.1 or even lower. Experience in Moscow convincingly proves this.
How can you get maximum performance from a data center's engineering subsystems at minimum energy cost? How do you reach an optimal PUE without inflating the budget? One of the best answers to these questions is a cooling system built on the Low Speed Ventilation (LSV) concept. But rather than limit ourselves to assertions, let us illustrate the claim with an example and back it up with calculations.
Terms of reference for a highly energy-efficient cooling system for the data center
In our example, we will consider a notional project to build a relatively large data center. Facilities with similar characteristics already operate in the Russian Federation, so we are dealing with a stated practical problem, not just theoretical calculations. Electrical equipment, including the UPS, will be installed in separate technical rooms; heat gains from it (as well as from lighting and through the walls) will not be taken into account when designing the cooling system for the IT equipment.
The data center building is new and will be built from scratch on a 200 x 100 m plot - more than enough land, even if capacity is increased later. The plot is a regular rectangle located 25 km north-west of the Moscow Ring Road, with its long sides facing north-west and south-east. Within a 3 km radius there are no motorways, railways or other sources of man-made air pollution. The required electrical power is not yet available on site, but there is a confirmed technical possibility of supplying up to 12 MW from two independent 35 kV substations. Both are located fairly far from the site: one 7 km away, the other 5.5 km. Guaranteed power supply for the data center will be provided by classical diesel generator sets (a dynamic UPS will not be used). To keep the server rooms cooled without interruption, it is permissible, where necessary, to use cold storage tanks or to connect part of the cooling equipment to dedicated uninterruptible power supplies.
Since the task is to make the data center as energy-efficient as possible and the building has not yet been built, the architectural and planning solutions will be developed around the layout recommendations of the cooling system's designer (as the most energy-intensive engineering subsystem). The project developer must propose a cooling solution for the data center's server rooms that provides maximum energy efficiency while requiring minimum power supply. The maximum permissible annual average PUE is 1.37.
A cooling system based on direct cooling with outdoor air (fresh air cooling) is also permissible, but only if the developer of the server-room cooling system offers it as an optional solution that reduces initial costs by at least 15% and operating costs by 20% or more per year (compared with a cooling system having a PUE of 1.37).
When such a free cooling system is used, a short-term rise in the temperature at the server inlets up to 40°C is allowed (no more than 400 hours per year in total). However, the TIER III operational availability requirements must still be met, although in this case an Uptime Institute certificate is not a necessary condition.
Let us also specify that the customer will choose the optimal variant by comparing the cost of creating and operating the proposed uninterrupted cooling solutions over 10 years of data center operation.
Three options for different conditions
To solve this task, it is worth considering three variants that reflect different approaches: a system with direct free cooling, one with year-round indirect free cooling, and a classical chiller-based scheme. They share only one key feature: all of them use the Low Speed Ventilation concept. The system operates on the principle of cold-air availability, with control performed using a Smart Measure Tube. In every case this approach eliminates negative factors such as overpressure, the Venturi effect, and local overheating zones ("hot spots"). The PUE is guaranteed to be low, as Table 1 clearly shows. Note that increasing the air flow would significantly raise the cooling system's power consumption and would bring back all the problems listed above, which we have so carefully avoided.
General considerations
For the solution, we take ten powerful air coolers with a capacity of 315 kW each. The parameter ranges of the incoming water and the indoor air are 15-23°C and 24-36.5°C, respectively. Calculations show that in this case the required volume of air circulated through the room is 777.5 thousand m³/h.
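As a quick plausibility check, the stated air volume can be reproduced from the sensible-heat balance. The sketch below assumes typical warm-air properties; the density and specific heat are our assumptions, not project figures.

```python
# Back-of-envelope check of the circulated air volume via the sensible-heat
# balance Q = rho * cp * V * dT; air properties are assumed typical values.
Q = 10 * 315e3        # total cooling capacity, W (10 coolers x 315 kW)
dT = 36.5 - 24.0      # air temperature span across the hall, K
rho = 1.16            # air density at ~30 C, kg/m^3 (assumption)
cp = 1005.0           # specific heat of air, J/(kg*K)

V = Q / (rho * cp * dT)                          # volumetric flow, m^3/s
print(f"{V * 3600 / 1000:.1f} thousand m^3/h")   # ~778, matching the stated 777.5
```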
Based on the proposed layout, no raised floor will be used: air is supplied into the single volume of the cold hall from two sides along nine corridors, so each supply cross-section must be 248 m² (given that the air velocity in LSV-type systems is 1.5 m/s, and taking the number of air coolers into account). The cold section requires 2 meters between the rows of racks and 0.9 meter above them; the hot section requires 1 meter between the rows and 1.2 meters above the cold plenum (Figure 1).
Fig. 1. Cooling scheme without a raised floor, with direct free cooling
An important feature of the system's external components - the dry coolers - is that they can be installed on the roof of the building. In that case, the building needed to house our data center measures only 52 x 27 x 4 m (L x W x H), so, while providing all the necessary cooling parameters, we occupy less than 7% of the available plot! As for energy costs, power is needed only for the fans of the ten LSV coolers; at 270 W per fan, they consume a total of 27 kW - less than 1% of the IT load. Now let's turn to the details of the solution. What will achieve maximum efficiency?
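The fan-power share is a one-line check; note that the total fan count below is inferred from the article's 270 W per fan and 27 kW total, not stated explicitly.

```python
# Fan-power share of the IT load (fan count inferred: 27 kW / 270 W = 100).
fan_power_w = 270.0       # per EC fan, from the article
n_fans = 100              # inferred total across the ten coolers
it_load_w = 10 * 315e3    # 3.15 MW of IT load

total_kw = fan_power_w * n_fans / 1e3
print(f"{total_kw:.0f} kW, {total_kw * 1e3 / it_load_w:.1%} of IT load")
# -> 27 kW, 0.9% of IT load
```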
PUE 1.1
Such a PUE can be achieved using a free cooling system based on V-shape dry coolers combined with chiller cooling in a closed air circuit. The principal scheme of the solution is shown in Fig. 2.
Fig. 2. The effectiveness of the LSV system with a dry cooler and chiller - PUE 1.1
Against the classical scheme (Table 1), you can see the significant energy savings delivered by the Low Speed Ventilation concept. This solution guarantees a hall temperature of 24°C all year round and allows free cooling whenever it is 8°C or colder "overboard". The additional investment covers the chillers and a prefabricated cooling station with plate heat exchangers (Figure 4). Operating and service costs will also rise. But given the 50% cost reduction compared even with highly efficient CRAC systems, it is a fully justified step.
Fig. 4. Prefabricated heat exchanger system - a complete solution for data center cooling
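To make the energy argument concrete, here is a minimal annual-PUE estimate for this variant. Only the IT load, the LSV fan power and the 8°C changeover come from the article; the hour split, the chiller COP and the lumped pump/dry-cooler power are illustrative assumptions.

```python
# Minimal annual-average PUE sketch for the dry-cooler + chiller variant.
it_load = 3150.0     # kW of IT load
fan_power = 27.0     # kW, LSV cooler fans, running year-round (from the article)
aux_other = 50.0     # kW, pumps and dry-cooler fans, lumped (assumption)

free_hours = 5500    # hours/year at or below ~8 C near Moscow (assumption)
chiller_cop = 5.0    # assumed seasonal COP of the chillers
chiller_kw = it_load / chiller_cop

avg_aux = fan_power + aux_other + chiller_kw * (8760 - free_hours) / 8760
print(f"PUE ~ {(it_load + avg_aux) / it_load:.2f}")  # ~1.10 with these inputs
```

With less conservative assumptions (a higher seasonal COP, more free cooling hours), the same arithmetic lands below 1.1, consistent with the claim above.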
PUE 1.075
A similar level of efficiency can be achieved with a free cooling system based on adiabatic coolers (Fig. 5) in a closed air circuit. Such solutions can supply water at 25°C at ambient temperatures from 25°C to 40°C, which occur for about 260 hours a year. In this case, eleven more powerful LSV coolers will operate with air in the 24-27°C range in a closed circuit (Fig. 6).
Fig. 6. Operating scheme of the free cooling system based on the adiabatic cooler
For the case under consideration, let us examine the performance of the climate system in more detail: Figure 7 shows the parameters of eleven adiabatic coolers with a cooling capacity of 286.5 kW each, which together provide the required 3.15 MW.
Fig. 7. Characteristics of the adiabatic coolers in Moscow
At the same time, there will be 12 independent air cooler + adiabatic cooler pairs, which allows us to avoid redundant pipelines. Water consumption will be 1444 m³ per year, with an allowable salt concentration of 350 ppm, so not even chemical water treatment or reverse osmosis is required. In addition, the adiabatic cooler is low-noise, reliable, and easy to operate. Thanks to the closed air circulation circuit, no outdoor-air filters are needed, and the guaranteed temperature in the cold corridor does not exceed 27°C. This is an excellent alternative for cities with a polluted atmosphere, providing the best ratio of PUE to investment.
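The water figure is easy to sanity-check against the 260 hours of adiabatic operation; the per-cooler split below is derived, not stated in the article.

```python
# Sanity check of adiabatic water use: 1444 m^3/year over 260 h of operation.
annual_water = 1444.0   # m^3/year, from the article
hours = 260             # hours/year of adiabatic assistance, from the article
n_coolers = 11

total_per_hour = annual_water / hours   # across the whole plant
print(f"{total_per_hour:.1f} m^3/h total, "
      f"{total_per_hour / n_coolers:.2f} m^3/h per cooler")
# -> 5.6 m^3/h total, 0.50 m^3/h per cooler
```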
PUE 1.065
Finally, consider the most energy-efficient solution in our review: free cooling organized with V-shape dry coolers (Figure 8) and direct supply of outdoor air. Here the dry coolers supply a 40% ethylene glycol solution at 15-23°C while the air circuit in the server room stays closed. When the outdoor temperature rises above 8°C, we begin to mix in outside air to achieve the required performance (see the sketch after Fig. 8); the server air temperature will exceed 24°C for only 260 hours per year.
Fig. 8. V-shape dry cooler
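The changeover logic described above can be summarized in a few lines. This is a hypothetical sketch: only the 8°C threshold and the 24°C target come from the article; the proportional control and its gain are illustrative.

```python
# Hypothetical outdoor-air mixing logic for the dry-cooler + fresh-air variant.
# Below 8 C outdoors the glycol loop alone holds the 15-23 C water range;
# above it, outdoor air is mixed in proportionally to the supply-air error.
def damper_position(t_outdoor: float, t_supply: float,
                    t_target: float = 24.0) -> float:
    """Outdoor-air damper opening, from 0.0 (closed) to 1.0 (fully open)."""
    if t_outdoor <= 8.0:
        return 0.0              # dry coolers cover the load on their own
    gain = 0.5                  # opening fraction per K of error (assumption)
    error = t_supply - t_target
    return max(0.0, min(1.0, error * gain))

print(f"{damper_position(t_outdoor=14.0, t_supply=25.2):.2f}")  # -> 0.60
```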
In addition to the LSV air coolers and the V-shape dry coolers with EC fan drives, it will be necessary to purchase a frame with filters for the LSV units and shutters for controlling the supply and discharge of outdoor air. The installed capacity of the system will be 143.7 kW. For data centers where a room temperature above 24°C is unacceptable, this solution can be supplemented with a chiller that, together with the LSV, will cool the outdoor air when it is between 25°C and 40°C. This will noticeably affect the price but barely affect the PUE, which in this case will be 1.08.
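For orientation, the installed 143.7 kW bounds this variant's cooling contribution to PUE from above, as the simple ratio below shows; the gap to the stated annual 1.065 presumably covers losses outside the cooling system itself.

```python
# Upper bound on the cooling system's PUE contribution from installed power.
it_load = 3150.0    # kW of IT load
installed = 143.7   # kW: fans, EC drives, shutters (from the article)

pue_bound = (it_load + installed) / it_load
print(f"PUE bound: {pue_bound:.3f}")  # ~1.046 even at full auxiliary power
```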
Additional features with LSV
The dimensions of the room in which we decided to place the data center allow three more LSV units to be added, which makes it possible to raise the water temperature from 15-23°C to 21-29°C, extend the free cooling hours, reduce the use of the chiller or of the outdoor-air supply, or increase the IT load. The LSV concept ensures a stable climate in the server room, eliminates excess pressure and local overheating zones, and guarantees an excellent PUE.
Table 1. Comparative characteristics of different solutions for the given data center cooling system