Cluster power ending Australia's research computing drought

By David Braue
Friday, 05 December, 2003

If computing power were water, Australia's drought would be rapidly coming to a close. Thanks to lowering costs and steadily improving technology, Australian life scientists and other researchers are rapidly gaining access to unprecedented amounts of computing power to support their work.

The latest feather in their cap came in late November, with the launch of Australia's newest and largest supercomputer: a massive assemblage of 155 3GHz Dell servers that together deliver around 1.8 teraflops (trillion floating-point operations per second) of computing power.

That makes the system -- nicknamed 'Barossa' and housed at the Australian Centre for Advanced Computing and Communication -- one of the world's hundred fastest computers, according to the Top 500 list (www.top500.org) that has become the Bible of high-performance computing (HPC) pundits.

Barossa's power is already helping researchers such as University of New South Wales PhD student Chris Cotsapas, who has used it to run massively complex calculations relating to his work with the mouse genome. With a recent project involving some 2 million analyses -- each comprising 60 billion calculations -- Cotsapas says the work would have taken 5700 years running on his own desktop PC; Barossa churned through it in 32 hours.

"Our ability to describe the nature of living systems relies more and more on collaboration between biologists, mathematicians and computer scientists," says mouse genome project chief investigator Prof Peter Little, at UNSW. "Many of the new advances in genomics are a result of high-performance computing."

Realising these advances is frustrating for researchers who have to wait for lower-powered systems to wade through masses of calculations. With readier access to HPC systems, researchers in the life sciences will be able to expand the scope and complexity of their calculations whilst getting results more quickly.

Expensive and proprietary supercomputers have been setting the standard for high-performance computing for decades, but manufacturers' expertise in clustering together hundreds of inexpensive Intel-based processors has led to faster computers that cost a fraction of the price.

That has meant open season on computing power for universities and research institutions, where expensive high-end computers have traditionally been hard to access -- and prohibitively expensive, given the limitations of research budgets.

The shift is reflected in the latest Top 500 list, released in November, which reveals that such systems now account for 189 of the 500 fastest systems in the world -- making commodity processors the single most common architecture on the list. One year ago, the number was just 56.

While that's the kind of announcement that launches a thousand press releases, it has far more significant implications for an industry that has long wrestled with issues of inequity in access to computing resources. This has particularly been the case in Australia, where CSIRO, CSL, the Bureau of Meteorology, Department of Defence and other well-funded organisations have had the majority of supercomputing facilities while university researchers competed for slices of time on heavily shared HPC systems.

For life science researchers, the trend towards commoditisation means that faster, cheaper systems could quickly become commonplace within Australia -- enabling the more ready testing of new hypotheses and allowing more people to access HPC resources at once. In addition to Barossa, state computing partnerships in Brisbane, Adelaide, and Melbourne have recently invested in new HPC systems, and more are likely to follow as funding allows.

In some cases, universities are taking completely new approaches to harness large numbers of systems. At Virginia Tech in the United States, researchers were pleasantly surprised when an experiment -- linking together hundreds of Apple Power Mac G5 desktops -- produced the world's third-fastest supercomputer.

Other efforts, such as Lawrence Livermore National Laboratory's plan to produce a nearly 4000-CPU HPC cluster by January, will threaten that system's position, but bragging rights in the Top 500 are a side benefit of the trend towards commoditisation. The real beneficiaries will be those whose research has previously been limited by the practicalities of access and available computing power.
