Cool computing brings astronomical results
The Pawsey Supercomputing Centre in Perth was established primarily to analyse the flood of data coming from a new generation of current and future radio telescopes located in the remote Western Australian desert. Equipped with two supercomputers, called Magnus and Galaxy, the centre deals with an almost unbelievable torrent of data. Magnus is currently the most powerful supercomputer in the Southern Hemisphere, performing at one petaflop: a thousand trillion floating-point operations per second. If every one of the 7.5 billion people on the planet were given a standard calculator, it would take them more than 10 years to process the amount of data that Magnus processes in one second.
“Magnus has the power of approximately 40,000 laptops running at the same time,” said Dr Neil Stringfellow, the executive director of the centre. “This enables researchers to process data and approach real scientific problems, achieving what we never thought possible.”
All of that computing power has to be kept at precisely the right temperature for optimum performance. Cooling requirements are becoming more stringent as server computing power increases on the same footprint, raising power consumption and heat output. This drives up the packing density per rack and demands new ways of cooling the equipment: no matter how high the air volume, a fan can never cool below the surrounding ambient temperature, which is why a more active cooling regime is needed, particularly for critical, high-performance applications.
At the facility level, the centre is cooled by an innovative system that draws water from underground aquifers. This groundwater system saves approximately 14 million litres of water every year compared with standard cooling towers. At the equipment level, part of the centre's computing infrastructure is kept at the right temperature by Varistar LHX+ Rack Integrated Coolers supplied by Pentair. Their main benefit over previous systems is that the cooling sits close to the equipment, so sensors can immediately detect changes in compute utilisation, and thus in heat load. Active fans mean there is no undercooling of the equipment and no problems with air pressure drop.
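The close-coupled control behaviour described above, where fan output tracks the rack's own measured heat load rather than a room average, can be sketched as a simple proportional policy. The function name, thresholds and speed range below are illustrative assumptions for this article, not Varistar LHX+ specifications:

```python
def fan_speed_percent(heat_load_kw, max_load_kw=10.0,
                      min_speed=20.0, max_speed=100.0):
    """Map a rack's measured heat load (kW) to a fan speed (%).

    Hypothetical proportional policy: an idle rack gets a low floor
    speed, a fully loaded rack gets full airflow. All numbers are
    illustrative, not vendor specifications.
    """
    fraction = min(max(heat_load_kw / max_load_kw, 0.0), 1.0)
    return min_speed + fraction * (max_speed - min_speed)

if __name__ == "__main__":
    # Fan output scales with the rack's own heat load, so a spike in
    # compute utilisation is answered immediately at that rack.
    for load in (0.0, 3.0, 6.0, 10.0):
        print(f"{load:4.1f} kW -> fan at {fan_speed_percent(load):5.1f}%")
```

Because the sensor and fan serve a single rack, the loop reacts to a utilisation spike in that rack directly, instead of waiting for the change to show up in a diluted room-level temperature reading.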
The LHX+ comes into its own when the heat load exceeds 6–10 kW and, by baying several cabinets together, small room-independent cubes can be set up. The units also suit applications where EMC shielding or reduced noise is required.
Fifty Varistar LHX+ units were supplied to the Pawsey Supercomputing Centre, with the solution provided as a complete system including cabinets, cooling and remote management.
Phone: 1300 137 344