New fanless cooling technology improves energy efficiency for AI workloads, cutting cooling power consumption by 90%
- New HPE fanless cooler cuts cooling power per server blade by 37%
- The system uses direct liquid cooling, well suited to power-hungry AI hardware
- The architecture is designed to scale with business needs
Hewlett Packard Enterprise (HPE) recently hosted its AI Day 2024 event, introducing the industry’s first 100% fanless direct liquid cooling architecture.
As artificial intelligence (AI) technologies continue to advance, the power consumption of next-generation accelerators has risen beyond what traditional air cooling methods can handle.
Organizations running large-scale AI workloads are now looking for more efficient ways to manage the energy demands of their infrastructure. HPE is a pioneer in direct liquid cooling, which has become one of the most effective methods for cooling high-performance AI systems; the approach has enabled HPE to deliver seven of the ten most energy-efficient supercomputers on the Green500 list.
100% fanless direct liquid cooling addresses cooling challenges in AI systems
The new cooling system is designed to improve efficiency in several key areas. HPE says the fanless architecture reduces cooling energy consumption by 90% compared with traditional air cooling systems, providing significant environmental and financial benefits.
The system is built on four core elements. First, a comprehensive eight-element cooling design covers the GPU, CPU, server blade, local storage, network fabric, rack, cluster, and coolant distribution unit (CDU).
Second, the fanless design delivers high-density performance in compact configurations, backed by rigorous testing, monitoring software, and on-site services to ensure smooth deployment.
Third, an integrated network fabric enables large-scale connectivity at lower cost and with less power consumption, making the architecture more sustainable. Finally, an open system design supports a range of accelerators, giving organizations the flexibility to select the solutions that best suit their needs.
The fanless architecture reduces cooling power consumption by 37% per server blade compared with hybrid liquid-cooled systems, cutting energy costs and CO2 emissions while eliminating data center fan noise. The design also supports higher server cabinet density, allowing organizations to halve their floor space requirements.
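To put those figures in perspective, here is a minimal back-of-envelope sketch of the claimed savings. The baseline values (cluster cooling power, per-blade cooling power, floor space) are hypothetical assumptions chosen purely for illustration; only the 90% and 37% reductions and the halved floor space come from HPE’s announcement.

```python
# Back-of-envelope sketch of the stated savings.
# The baseline numbers below are hypothetical assumptions for illustration only;
# the 90%, 37%, and 50% figures are the claims from the announcement.

AIR_COOLED_CLUSTER_COOLING_KW = 100.0   # assumed cooling power for an air-cooled cluster
HYBRID_BLADE_COOLING_KW = 1.0           # assumed per-blade cooling power (hybrid liquid-cooled)
CURRENT_FLOOR_SPACE_M2 = 200.0          # assumed current data center floor space

# Apply the stated reductions
fanless_cluster_cooling_kw = AIR_COOLED_CLUSTER_COOLING_KW * (1 - 0.90)  # 90% less vs. air cooling
fanless_blade_cooling_kw = HYBRID_BLADE_COOLING_KW * (1 - 0.37)          # 37% less per blade vs. hybrid
fanless_floor_space_m2 = CURRENT_FLOOR_SPACE_M2 * 0.5                    # higher density halves floor space

print(f"Cluster cooling power: {AIR_COOLED_CLUSTER_COOLING_KW:.0f} kW -> {fanless_cluster_cooling_kw:.0f} kW")
print(f"Per-blade cooling power: {HYBRID_BLADE_COOLING_KW:.2f} kW -> {fanless_blade_cooling_kw:.2f} kW")
print(f"Floor space: {CURRENT_FLOOR_SPACE_M2:.0f} m^2 -> {fanless_floor_space_m2:.0f} m^2")
```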
“As organizations embrace the capabilities created by generative AI, they must also advance sustainability goals, combat escalating power demands and reduce operational costs,” said Antonio Neri, president and CEO of HPE.
“The architecture we unveiled today uses liquid cooling exclusively, delivering greater energy and cost-efficiency benefits than alternative solutions on the market. In fact, this direct liquid cooling architecture delivers a 90% reduction in cooling energy consumption compared to traditional air-cooled systems.”