Microsoft Dunks Servers Into Boiling Fluid to Cool Them Off

Microsoft has been exploring innovative ways to cool its data center servers for some years now. The company previously made waves with Project Natick, an offshore data center cooled by seawater. Now, it’s showing off a two-phase liquid cooling solution it says enables even higher server densities.

The system uses a non-conductive cooling fluid. Microsoft doesn’t precisely identify it, but it sounds similar to 3M’s Novec 1230, with a very low boiling point of around 122F (Novec 1230 boils at 120.6F). Boiling off the coolant creates a vapor cloud, which rises and contacts a cooled condenser in the tank lid.

The liquid then rains back down into the closed-loop server chassis, resupplying the systems with freshly cooled coolant. Heat is also transferred from the server tank to a dry cooler outside the enclosure and dissipated there as well. Immersion cooling works because direct contact with a non-conducting fluid offers far better thermal dissipation than a conventional air cooler.
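To get a feel for the scale of that phase-change cycle, a rough energy balance is enough: in steady state, the heat a server dumps into the bath equals the coolant’s latent heat of vaporization times the mass of fluid boiled off per second. The sketch below assumes a Novec 1230-like fluid with a latent heat of roughly 88 kJ/kg; the server wattages are illustrative, not figures from Microsoft.

```python
# Rough steady-state energy balance for a two-phase immersion tank.
# Assumption: Novec 1230-like fluid with latent heat ~88 kJ/kg near its
# 120.6F boiling point; server wattages are illustrative only.

LATENT_HEAT_J_PER_KG = 88_000

def boiloff_rate_kg_per_s(server_watts: float) -> float:
    """Mass of coolant vaporized per second to absorb the server's heat."""
    return server_watts / LATENT_HEAT_J_PER_KG

for watts in (300, 700):  # e.g., a high-end CPU socket vs. a data center GPU
    rate = boiloff_rate_kg_per_s(watts)
    print(f"{watts} W load boils off about {rate * 3600:.1f} kg of fluid per hour")

# The condenser in the tank lid returns the same mass as liquid,
# so the loop is closed and no coolant is actually consumed.
```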

“We are the first cloud provider that is running two-phase immersion cooling in a production environment,” said Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development in Redmond, Washington.

Ioannis Manousakis, a principal software developer for Azure, is shown removing a Xeon server from its bath. Photo by Gene Twedt.

Microsoft’s blog post paints the growth of immersion cooling as a good thing, highlighting that it can reduce server power consumption by 5 to 15 percent. The company notes that immersion cooling lets it direct burst workloads to those specific servers, since it can overclock them to serve requests more quickly.
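Microsoft doesn’t describe how that routing works, but the idea is easy to sketch: tag immersion-cooled hosts as overclock-capable and prefer them for latency-sensitive burst traffic. Everything below (the host flags, the `dispatch` helper) is a hypothetical illustration, not Azure’s actual scheduler.

```python
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    immersion_cooled: bool  # hypothetical flag: host sits in a two-phase tank
    load: float             # current utilization, 0.0-1.0

def dispatch(hosts: list[Host], burst: bool) -> Host:
    """Prefer immersion-cooled hosts for burst work, since they can overclock."""
    candidates = [h for h in hosts if h.immersion_cooled] if burst else hosts
    if not candidates:      # fall back to the general pool if no tanks exist
        candidates = hosts
    return min(candidates, key=lambda h: h.load)

pool = [Host("air-01", False, 0.3), Host("tank-01", True, 0.6), Host("tank-02", True, 0.2)]
print(dispatch(pool, burst=True).name)  # -> tank-02, the least-loaded tank host
```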

Microsoft’s Project Natick experiment proved that filling a data center module with nitrogen and dropping it into the ocean can be quite beneficial: the submerged servers suffered one-eighth the failure rate of replica servers on land. The lack of humidity and oxygen is said to be responsible for the superior underwater reliability, and this system should enjoy similar benefits. If the liquid cooling system proves sustainable, the company envisions deploying data centers wherever low latency, high performance, and minimal maintenance are needed.

Microsoft’s blog post claims that adopting immersion cooling allows data centers to follow a “Moore’s Law” of their own, because the shift decreases power consumption and allows for greater server density, but this seems like a bit of a reach. The reason companies are evaluating immersion cooling is that CPUs and GPUs now struggle to deliver higher performance without drawing ever-larger amounts of power. CPUs can now hit 300W at the socket, while data center GPUs scale up to 700W. CPUs continue to become more efficient, but rising core counts and additional on-die capabilities drive up their absolute power consumption even with those gains.
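The density argument is just arithmetic: divide an enclosure’s thermal budget by the per-server draw. The budgets below are illustrative assumptions (air-cooled racks are commonly cited in the tens of kilowatts, immersion tanks considerably higher), not published Microsoft figures.

```python
# Illustrative server-density arithmetic; the thermal budgets are
# assumptions for the sake of example, not Microsoft's published figures.
SERVER_WATTS = 300 + 700  # one 300 W CPU socket plus one 700 W GPU

budgets_kw = {"air-cooled rack": 20, "two-phase immersion tank": 100}
for enclosure, kw in budgets_kw.items():
    servers = (kw * 1000) // SERVER_WATTS
    print(f"{enclosure}: ~{servers} servers within a {kw} kW thermal budget")
```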

One interesting question is whether we’ll ever see immersion cooling come to the consumer market. Technologies that debut in data centers often scale their way down into personal computing over time, but building an affordable aftermarket kit for a home user to tackle this kind of cooling is a tall order. This kind of cooling solution would never come cheap, but there might be a market for it in boutique gaming PCs and high-end workstations.

Feature image by Gene Twedt.
