Microsoft Dunks Servers in Boiling Liquid to Keep Them Cool


Microsoft has been exploring innovative ways to cool its data center servers for several years. The company previously made headlines for cooling offshore data centers with seawater via its Project Natick. It has now detailed a two-phase liquid cooling solution that, according to the company, enables even higher server densities.

The new system uses a non-conductive coolant. Microsoft doesn't identify it exactly, but it sounds similar to 3M's Novec 1230, with a very low boiling point of around 122°F (Novec 1230 boils at 120.6°F). As the coolant boils, it creates a vapor cloud that rises and contacts a cooled condenser at the top of the tank. The condensed liquid then rains back down into the sealed server tank, keeping the systems supplied with fresh, cooled fluid in a closed loop. Heat is carried from the condenser to a dry cooler outside the enclosure, where it is dissipated. Immersion cooling works because direct contact with a non-conductive liquid offers far better thermal dissipation than an ordinary air cooler.
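To get a rough sense of why boiling a fluid directly off hot silicon is so effective, here is a minimal back-of-envelope sketch in Python. It assumes a fluid with properties in the ballpark of 3M's published figures for Novec 1230 (latent heat of vaporization around 88 kJ/kg); the component wattages are illustrative assumptions, not Microsoft's numbers.

```python
# Back-of-envelope: how much coolant must boil off per second to absorb
# a component's heat output via latent heat of vaporization alone.
# Assumed fluid property (roughly Novec 1230-like): h_vap ~ 88 kJ/kg.

H_VAP_J_PER_KG = 88_000  # latent heat of vaporization, J/kg (assumed)

def boil_off_rate(power_watts: float) -> float:
    """Return kg of coolant vaporized per second to carry away power_watts."""
    return power_watts / H_VAP_J_PER_KG

for label, watts in [("300W CPU", 300), ("700W GPU", 700)]:
    grams_per_second = boil_off_rate(watts) * 1000
    print(f"{label}: ~{grams_per_second:.1f} g of coolant boiled off per second")
```

Because the condenser returns that vapor as liquid, the working fluid is recirculated rather than consumed, which is what makes a sealed, closed-loop tank practical.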

“We are the first cloud provider to run two-phase immersion cooling in a production environment,” said Husam Alissa, a principal hardware engineer on Microsoft’s advanced data center development team in Redmond, Washington.

Above, Ioannis Manousakis, a principal software engineer for Azure, removes a Xeon server from its bath. Photo by Gene Twedt for Microsoft.

Microsoft’s blog post frames the growth of immersion cooling as a good thing, emphasizing that it can reduce server power consumption by 5-15 percent. The company also notes that immersion cooling makes it possible to send burst workloads to specific servers, since those servers can be overclocked to serve requests faster.

Microsoft’s Project Natick experiment showed that filling a data center module with nitrogen and sinking it in the ocean can pay off: the submerged servers failed at one-eighth the rate of replica servers on land. The lack of humidity and oxygen is thought to be responsible for that excellent underwater reliability, and this system should enjoy similar benefits. If the liquid cooling approach proves sustainable, the company aims to deploy data centers built for low latency, high performance, and minimal maintenance.

Microsoft’s blog post claims that immersion cooling lets data centers follow a ‘Moore’s Law’ of their own, since the move reduces power consumption and increases server density, though that comparison seems like a bit of a stretch. The reason companies are evaluating techniques like immersion cooling is that CPUs and GPUs now struggle to deliver higher performance without drawing ever-increasing amounts of power. CPUs can now pull 300W at the socket, while data center GPUs scale up to 700W. CPUs are becoming more efficient, but rising core counts and added functionality push their absolute power consumption upward even with those gains.

An interesting question is whether we will ever see immersion cooling in the consumer market. Technologies that start in the data center often trickle down to personal computers over time, but building an affordable aftermarket kit that lets a home user take on this type of cooling is a tall order. A cooling solution like this will never be cheap, but there may be a market for it in boutique gaming PCs and high-end workstations.

Feature image by Gene Twedt for Microsoft.
