Day of the Trifid: high-performance computing and education


Wednesday, 07 November, 2012


Academia is a cutthroat business. Far from being a relaxing haven where scientists endlessly pontificate on matters eternal at their own leisure, it is a constant battleground over continued funding.

Researchers can’t afford to be slowed by technology - they need to process large batches of calculations quickly, or they risk a rival publishing a finding first. Being first to publish can mean a greater likelihood of securing future funding. Being beaten to the punch can leave you as a footnote, a mere verifier of existing findings.

As such, researchers who need to perform complicated calculations on giant datasets rely on what’s called high-performance computing (HPC).

HPC facilities are generally clusters of networked computers that allow researchers to log in remotely and submit datasets and computations for processing. They typically run some form of Linux and carry many hundreds or thousands of processor cores, realising speeds of many trillions of calculations per second.

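In practice, ‘submitting’ a job usually means describing the resources it needs in a short batch script and handing that script to the cluster’s job scheduler. The following is a minimal sketch in Python, assuming a PBS-style scheduler - the qsub command, resource requests and program name are illustrative, not the workflow of any particular facility mentioned here:

    import subprocess
    import tempfile

    # Batch script describing the job: CPUs, memory, wall-time limit,
    # then the program to run (every value here is illustrative).
    job_script = "\n".join([
        "#!/bin/bash",
        "#PBS -l ncpus=64,mem=128GB,walltime=12:00:00",
        "cd $PBS_O_WORKDIR",
        "mpirun ./simulate input.dat > output.log",
    ]) + "\n"

    # Write the script to a file and hand it to the scheduler; the
    # cluster queues the job and runs it when nodes become free.
    with tempfile.NamedTemporaryFile("w", suffix=".pbs", delete=False) as f:
        f.write(job_script)
        script_path = f.name

    subprocess.run(["qsub", script_path], check=True)
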
They’re not appropriate for all research - if you’re trying to find a correlation between just a few variables with a small sample size, your average desktop PC is likely powerful enough to get the job done pretty quickly. But if you’re modelling the molecular make-up of a star system, you’re going to need more grunt. HPC lends itself to fields such as the life sciences, engineering and computing.

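To make the first case concrete: a correlation matrix over a handful of variables and a small sample is near-instant work on any modern desktop. A quick sketch using Python’s NumPy, with randomly generated stand-in data:

    import numpy as np

    # A small "study": 3 variables measured across 500 samples -
    # trivial work for a desktop CPU, no cluster required.
    rng = np.random.default_rng(seed=42)
    data = rng.normal(size=(3, 500))

    # Pairwise correlations between the three variables; this
    # completes in well under a millisecond.
    print(np.corrcoef(data))
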
As of June 2012, Australia had six systems in the Top500 - a ranking of the world’s 500 fastest supercomputers.

Many universities have their own HPC facilities, or use the services of an outside provider. Australia has several major HPC facilities, starting with that of the National Computational Infrastructure (NCI), which is funded by the federal government and hosted at the Australian National University (ANU). With 36 TB of RAM, 800 TB of storage and 11,936 cores, it has a peak performance of 140 TFlops (trillions of floating point operations per second).

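That peak figure is straightforward arithmetic: core count multiplied by clock speed multiplied by floating point operations per core per cycle. As a sanity check - assuming, purely for illustration, a 2.93 GHz clock and 4 double-precision operations per cycle per core, typical of Intel Xeons of the period:

    # Peak performance = cores x clock (Hz) x FLOPs per cycle per core.
    # The clock speed and FLOPs/cycle below are illustrative assumptions,
    # not published NCI specifications.
    cores = 11936
    clock_hz = 2.93e9         # assumed clock speed
    flops_per_cycle = 4       # assumed double-precision ops per cycle

    peak_tflops = cores * clock_hz * flops_per_cycle / 1e12
    print(f"{peak_tflops:.1f} TFlops")  # ~139.9, in line with the quoted 140
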
The NCI facility has been used for simulating reactions between molecules, gaining insight into collisions between subatomic particles, modelling the dynamics of the Southern Ocean and much more.

Intersect, an organisation that serves NSW researchers, has an HPC system called McLaren, an SGI Altix 4700 with 128 dual-core CPUs, 1 TB of RAM and 12 TB of disk space. The organisation plans to replace McLaren with Orange, a system with 100 cluster nodes, 1600 cores and 101 TB of shared storage, capable of delivering 33.3 TFlops.

VPAC (the Victorian Partnership for Advanced Computing), an organisation established to provide computing services to Victorian research institutions, has just launched a new HPC facility named Trifid.

Trifid offers 45.9 TFlops of performance via 2880 cores and a 165 TB storage array. It will be available to researchers at VPAC’s member institutions, including Deakin, RMIT and Monash universities, among others.

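Dividing each quoted peak by its core count gives a rough, back-of-envelope comparison of per-core throughput across the three systems, using only the figures above:

    # Quoted (peak FLOPS, core count) pairs from the figures above.
    systems = {
        "NCI": (140e12, 11936),
        "Orange": (33.3e12, 1600),
        "Trifid": (45.9e12, 2880),
    }
    for name, (peak, cores) in systems.items():
        print(f"{name}: {peak / cores / 1e9:.1f} GFLOPS per core")
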
According to RMIT’s Professor Heinz Schmidt, Trifid will enable researchers to “better understand the nature of proteins and complex materials for next-generation solar cells, more effectively model medical radiation used in cancer treatments and support the breadth of engineering applications”.
