The most powerful supercomputer in Australia was unveiled today at the National Computational Infrastructure (NCI) facility at the Australian National University in Canberra by Senator Kim Carr, the Minister for Innovation, Industry, Science and Research.
The supercomputer was supplied by Sun Microsystems and contains 3,000 quad-core Intel Nehalem processors with 36TB of memory and a petabyte filesystem.
Its computational performance is 140 teraflops, or 140 trillion floating-point operations per second.
David Singleton, systems manager at the NCI facility, said that the supercomputer was built using Intel's 45nm Nehalem processors.
"Intel just had a bump in performance so it is a really good time for us to be procuring the machine right now," he said.
However, Singleton explained that a real supercomputer not only requires thousands of processors but also needs those processors to exchange information with the rest of the machine without any bottlenecks.
"You can have all those CPUs sitting there but if they can't talk to each other they are just a bunch of CPUs. We have the latest quad data-rate network (40 gigabits per second) ... everything can talk to everything at that speed so there isn't a bottleneck anywhere.
"On top of that it has a very serious global file system that sits on 1,200 disks and will achieve a 20 gigabit I/O data rate. If you get 50 megabits out of your laptop, you are doing well," he said.
According to Singleton, the supercomputer is 12 times faster than its predecessor but uses only about 50 percent more power.
Power consumption has not increased at the same rate as processing power because of multi-core technology, which packs more than one core onto each CPU, doubling processing power without using more electricity.
"Each core is twice as fast as the previous CPUs and four cores use no more power than the previous generation. Previously we had 64 CPUs in a rack, now we have 768 CPUs in a rack," he added.
Senator Carr said that modern science depends on the ability to crunch numbers.
"It has been estimated that 487 billion gigabytes of new digital data was created in 2008. The Large Hardon Collider (LHC) will generate up to 1,600MB of data per second when it is in full operation ... data generated from Square Kilometer Array radio telescope, in one week of operation, will equal all the words ever spoken by humanity.
"All these numbers tell us we will struggle to achieve the research outcomes we are looking for without sufficient investment in cutting edge communications technology," added Carr.