Datacenters are both the heroes and villains of the digital age. On one hand, these facilities power the technologies that increasingly run our world. On the other hand, datacenters devour enormous amounts of energy and face growing opposition in many communities. As the world careens deeper into climate crises, finding ways to reduce power consumption is nothing less than critical.
One idea that has lurked in the background for decades is using special fluids to directly or indirectly cool computing devices and other electronic systems. While the concept may seem shocking, dielectric liquids and immersion cooling methods have come of age. These technologies, which already are making inroads in high-performance computing and cryptocurrency mining circles, bathe electronic components in a dielectric (nonconductive) liquid or coolant with strong insulation properties. The circulation of the fluid draws off heat as it comes into contact with the electronics.
Immersion cooling is poised to make a major impact on datacenters. Although IBM and Cray experimented with the technology during the 1960s through the 1980s, advances in design, engineering and fluids are finally making the technology viable and more affordable. "Immersion cooling has advanced remarkably over the last few years," states Lucas Beran, principal analyst at market research and consulting firm Dell'Oro Group. "As the pressure mounts to control power consumption and heat generation in datacenters, the technology has an important role to play."
As businesses, government entities, and others look for ways to reduce their carbon footprints and trim energy costs, it is clear that conventional air cooling is not up to the task, and neither are other fluid-based technologies such as Direct Liquid Cooling (DLC), which applies special cold plates to CPUs and GPUs.
It also is becoming difficult to squeeze out further energy efficiency gains in datacenters by refreshing legacy servers, optimizing data, virtualizing workloads, and turning to green hosting. Thermal management now accounts for 30% to 40% of a datacenter's annual energy consumption, according to Dell'Oro Group.a In fact, datacenters are forecast to consume 8% of the world's electricity by 2030, up from about 1% in 2018.b
Energy consumption is spiking in datacenters because per-socket demand for power is increasing rapidly, says Jim Rogers, computing and facilities director for the National Center for Computational Science at Oak Ridge National Laboratory in Tennessee. "Just a few years ago, a CPU might consistently consume about 100 watts. Today's CPUs and GPUs are consistently using more than twice that amount," he says. As the number of computers in a single rack has grown, density has also become an issue. "The net result is that the aggregate or total amount of heat that must be managed has steadily climbed to 30, 40, or 50 kilowatts in a single rack," he says.
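Rogers' arithmetic is easy to check. The sketch below multiplies per-device draw by device count to arrive at an aggregate rack load; the server count and wattages are illustrative assumptions, not Oak Ridge figures.

```python
# Illustrative rack heat-load arithmetic. All counts and wattages below
# are assumptions for the sake of example, not measured figures.
servers_per_rack = 24
cpus_per_server = 2
gpus_per_server = 4
cpu_watts = 250   # roughly 2.5x the ~100 W a server CPU drew a few years ago
gpu_watts = 300

watts_per_server = cpus_per_server * cpu_watts + gpus_per_server * gpu_watts
rack_kw = servers_per_rack * watts_per_server / 1000
print(f"Aggregate rack heat load: {rack_kw:.1f} kW")
```

With these assumptions the total lands around 41 kW, squarely inside the 30- to 50-kilowatt range Rogers describes for a single dense rack.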
These two factors together are pushing conventional cooling systems to their limits. "When manufacturers move above about 15 kilowatts per rack, the strategy to eject the waste heat to the datacenter space by just moving air with server fans effectively ends," Rogers explains. At that point, rear-door heat exchangers, cold plates, and other technologies are necessary. What makes immersion cooling stand out is that the heat generated by various components in the server is directly absorbed by a fluid, which is far more efficient than air at dissipating heat.
As heat densities grow, the challenges mount. "Blowing air across components is inefficient and removing heat from air is mechanically expensive," Rogers says. DLC, also known as Direct-to-Chip cooling, at best can extract only about 70% of the heat.c Meanwhile, water-based cooling systems, a subset of DLC, are used broadly, but because they circulate water through cold plates positioned near hot components, their thermal transfer is limited. "Immersion cooling solves that specific issue by offering great specific heat absorption capability using non-conductive fluids," according to Rogers.
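The physics behind Rogers' point can be sketched by comparing the volumetric heat capacity of air with that of a typical single-phase dielectric fluid; the property values below are textbook-range figures for air and mineral oil, not vendor data.

```python
# Back-of-the-envelope comparison: volumetric heat capacity
# (density * specific heat) measures how much heat one cubic meter of
# coolant absorbs per degree of warming. Values are typical textbook
# figures at room temperature, not measurements of any specific product.
air         = {"density": 1.2, "cp": 1005}   # kg/m^3, J/(kg*K)
mineral_oil = {"density": 850, "cp": 1900}   # representative immersion fluid

def volumetric_heat_capacity(fluid):
    """Heat absorbed per cubic meter per kelvin, in J/(m^3*K)."""
    return fluid["density"] * fluid["cp"]

ratio = volumetric_heat_capacity(mineral_oil) / volumetric_heat_capacity(air)
print(f"Per unit volume, the oil absorbs ~{ratio:.0f}x more heat than air")
```

The ratio works out to three orders of magnitude, which is why a quiet bath of fluid can do the work of a wall of fans.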
The real-world impact is significant. The Uptime Institute reports the average datacenter's power usage effectiveness (PUE) rating in 2020 was 1.58, a figure that has been largely stagnant since 2013.d Highly efficient air-based cooling typically delivers a PUE of about 1.2 to 1.4 (the lower figure is more common in cloud datacenters such as AWS, the higher is more representative of an enterprise datacenter), but immersion cooling methods lower that figure to approximately 1.03 or better, Beran points out. Depending on climate conditions, the availability of renewables, and other factors, immersion cooling can make a profound difference in both energy consumption and costs. The end-game is to get PUE ratings as close as possible to 1.0.
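Since PUE is defined as total facility energy divided by IT equipment energy, the non-IT overhead implied by each rating is easy to compute. The sketch below compares the figures quoted above for a hypothetical 1-megawatt IT load (the load itself is an assumption for illustration).

```python
# PUE = total facility energy / IT equipment energy, so the non-IT
# overhead (cooling, power delivery, etc.) implied by a given PUE is
# (PUE - 1) * IT load. The 1 MW IT load is a hypothetical example.
it_load_kw = 1000

def overhead_kw(pue, it_kw=it_load_kw):
    """kW drawn by everything other than the IT equipment itself."""
    return (pue - 1) * it_kw

scenarios = [("2020 industry average", 1.58),
             ("efficient air cooling", 1.20),
             ("immersion cooling",     1.03)]
for label, pue in scenarios:
    print(f"{label}: PUE {pue:.2f} -> {overhead_kw(pue):.0f} kW overhead")
```

Under these assumptions, moving from the industry-average 1.58 to an immersion-cooled 1.03 shrinks the overhead from roughly 580 kW to roughly 30 kW for the same IT load.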
However, PUE, which is the de facto standard for defining energy efficiency in datacenters, does not entirely capture the significance of more advanced liquid and immersion cooling methods because there is a ripple effect, including reducing dependence on fans and other electrical components. Jacqueline Davis, a research analyst at the Uptime Institute, points out that liquid cooling and immersion techniques "profoundly change the profile of datacenter energy consumption."e
Two primary types of immersion cooling exist. For now, the most widely used technique is Single Phase Immersion, which relies on an accessible enclosure filled with dielectric fluid. A pump circulates the dielectric fluid or deionized water through the bath to a heat exchanger, which pulls out the heat and transfers it to a water circuit before returning the cooled fluid to the enclosure. With Single Phase Immersion, the coolant never boils or freezes, and there is little or no risk of evaporation. Typically, servers are installed vertically inside a horizontally oriented cooling bath.
Single Phase Immersion products are widely available. For example, Green Revolution Cooling (GRC), acknowledged as a leader in the field, has seen its technology deployed at several supercomputing sites, including facilities operated by the U.S. National Security Agency (NSA), the U.S. Air Force, and the Tokyo Institute of Technology. The company says its technology slashes datacenter cooling costs by as much as 95% while reducing overall power consumption by 50% or more. At a typical datacenter, switching to liquid immersion can also cut carbon output by 31%. "Immersion cooling reduces the cost, complexity, and the environmental impact of the world's digital infrastructure," says Gregg Primm, vice president of marketing for GRC.
In Two-Phase Immersion (TPI) cooling, electronic components are placed in a hermetically sealed enclosure filled with dielectric fluid. The electronics release heat into the fluid, causing it to boil at approximately 50 degrees Celsius. The resulting vapor condenses on a heat exchanger within the tank, and the heat is transferred to water that flows outside the facility. The process, which offers the added benefit of using environmentally friendly non-flammable fluids, delivers far greater heat transfer than single-phase immersion, though the approach remains at an early stage of development.
Not surprisingly, the fluids themselves also are advancing. Although some dielectric substances used for immersion cooling are derived from mineral, vegetable, fluorocarbon, or synthetic oils, 3M and other companies have developed inert, fully fluorinated liquids that are clear, odorless, non-flammable, non-oil-based, low in toxicity, and non-corrosive. It is possible to match these products specifically to heat-transfer requirements. In addition, some of these dielectric fluids have been formulated for low global warming potential (GWP) and zero ozone depletion potential (ODP). Primm says GRC products utilize fluids that are biodegradable, non-toxic, designed to last 15 years or more, and are fully recyclable.
If it sounds as though immersion cooling makes perfect sense—and significantly cuts costs in datacenters—a basic question arises: why hasn't the technology been widely adopted? Rogers says immersion cooling is viable, but it can be somewhat messy and involve ongoing operation and maintenance (O&M) expenses related to managing the systems and fluids. What's more, the up-front price tag for immersion cooling can be steep. Depending on the structure of an existing datacenter, the return on investment (ROI) is distant or non-existent, he points out.
Oak Ridge is among the organizations approaching immersion cooling methodically. It has adopted cold plate technology, but balked at adopting Single Phase Immersion cooling. "The biggest obstacle is substantiating the return on investment when, with existing technology, we can already capture over 95% of the heat from systems that are generating hundreds of kilowatts of waste heat," Rogers says. He adds that there is no perfect approach: cold plates, as well as enhanced water-cooling systems, create their own sets of headaches, including introducing thousands of points of failure across large systems.
Beran says that ultimately, most resistance is cultural: "There's a fear that systems will leak." However, immersion cooling has advanced to the point where the risks of a spill and contamination are remote. "While it's necessary to have a spill containment strategy in place, it really isn't all that different than having a fire extinguisher in your home. A lot of times people go their entire lives without using it, but it's there if you need it," he says. Other concerns, such as fluids dissolving stickers that display serial numbers, can easily be solved by laser etching the numbers onto equipment, he notes.
Meanwhile, researchers continue to explore different technology components and frameworks for immersion cooling, as well as how to incorporate natural and synthetic fluids more effectively. For example, researchers are now looking for ways to enable the boiling of a cooling fluid directly in contact with electronic components.f There's also a focus on adapting and expanding the technology for battery systems, solar panels, and other devices that generate heat. For instance, one current research method centers on the use of modular jet oil cooling technology to draw heat from lithium-ion packs used in stationary electrical storage and transportation applications.g
To be sure, immersion cooling appears to be ready for prime time. Vendors such as Submer now package systems in pods and micropods that are essentially plug-and-play, with coolant that can last 20 years.h Beran says the approach can cut operating costs by 33% compared to traditional air-based cooling.i Market research firm Market Study Report predicts adoption will grow by 24% from 2020 to 2025.j
"There are many ways to reduce energy consumption in datacenters," Beran concludes. "We're seeing new innovations and technologies emerge all the time. But immersion cooling is ready to make a major impact. We're very close to reaching the tipping point where it will make a major difference."
Birbarah, P., Gebrael, T., Foulkes, T., Stillwell, A., Moore, A., Pilawa-Podgurski, R., and Miljkovic, N. Water Immersion Cooling of High Power Density Electronics. ScienceDirect, Vol. 147, February 2020.
Trimbake, A., Pratap Singh, C., and Krishnan, S. Mineral Oil Immersion Cooling of Lithium-Ion Batteries: An Experimental Investigation. American Society of Mechanical Engineers (ASME) Digital Collection, 19(2): 021007, May 2022.
Matsuoka, M., Matsuda, K., and Kubo, H. Liquid Immersion Cooling Technology with Natural Convection in Data Center. 2017 IEEE 6th International Conference on Cloud Networking (CloudNet), October 19, 2017.
Pérez, S., Arroba, P., and Joya, J.M. Energy-conscious optimization of Edge Computing through Deep Reinforcement Learning and two-phase immersion cooling. Future Generation Computer Systems, ScienceDirect, July 31, 2021.
©2022 ACM 0001-0782/22/6