
Exascale: Faster Than 50 Million Laptops

March 30, 2012

Today’s most powerful supercomputers could soon become obsolete if computer scientists can develop a new line of superfast calculating machines that perform more than 1,000 times faster than those currently in use.

In fact, scientists predict these exaFLOP computers will become available by the end of the decade, as the US, China, Japan, Europe and Russia invest hundreds of millions of dollars in supercomputer research.

FLOPS (FLoating Point Operations per Second) is the measure used to gauge the performance of supercomputers. Today’s fastest supercomputers achieve speeds in the petaFLOPS range, or millions of billions of operations per second, said Dimitrios Nikolopoulos, professor at the School of Electronics at Queen’s University Belfast in the UK.

Scientists, however, hope to create supercomputers that go far beyond those speeds, into the exaFLOPS range, or billions of billions of operations per second. “It is the next frontier for high-performance computing,” Nikolopoulos told CNN.
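Those units also explain the headline. An exaFLOP is 1,000 times a petaFLOP, and dividing it by the throughput of an ordinary laptop gives the “50 million laptops” comparison. A minimal back-of-the-envelope sketch (the ~20 gigaFLOPS laptop figure is an assumption for illustration, not a number from the article):

    # Scale of the units discussed above.
    PETA = 10**15  # petaFLOPS: millions of billions of operations per second
    EXA = 10**18   # exaFLOPS: billions of billions of operations per second

    # Assumed throughput of a 2012-era laptop (~20 GFLOPS); hypothetical figure.
    laptop_flops = 20e9

    print(EXA / PETA)          # 1000.0 -> an exaFLOP is 1,000x a petaFLOP
    print(EXA / laptop_flops)  # 50000000.0 -> roughly 50 million laptops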

The first computer to reach a petaFLOP was IBM’s Roadrunner in 2008. But its reign was short-lived; the Cray Jaguar, installed at Oak Ridge National Laboratory in the US, broke past 1.75 petaFLOPS the following year.

That record did not stand for long either, as Japan’s K computer, developed by RIKEN and Fujitsu, sailed far beyond 1.75 petaFLOPS, reaching record speeds of over 10 petaFLOPS, nearly four times faster than its closest rival, China’s 2.57-petaFLOP NUDT YH MPP supercomputer.

But developing a supercomputer like the K computer or IBM’s Roadrunner is no easy task. “The kind of space that you need is similar to that of a football field. You’re talking about many, many lanes of computer racks and thousands of processors,” said Nikolopoulos.

The K computer contains an unimaginable 88,128 computer processors and is made up of 864 refrigerator-sized cabinets.

Exascale machines probably won’t be much bigger than their petascale counterparts, added Nikolopoulos. And with engineers constantly shrinking the space needed to store data, it seems plausible that exascale systems could take up even less room, he noted, although the number of processors needed will increase substantially.

Despite the real possibility of exascale computing, there remain “severe technology barriers,” Nikolopoulos said. “Power consumption of supercomputers in general is not sustainable.” Current projections suggest that the power consumption of an exascale computer will be about 100 megawatts. “It is impossible to build a suitable facility and have enough power,” he added.
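The 100-megawatt projection implies a concrete efficiency target, which can be worked out from the figures quoted above (a sketch using only the article’s own numbers):

    # Efficiency implied by running an exaFLOP machine on 100 megawatts.
    exa_flops = 10**18       # one exaFLOP: operations per second
    power_watts = 100e6      # projected power draw: 100 MW

    print(exa_flops / power_watts)  # 1e10 -> 10 billion operations per watt

Ten gigaFLOPS per watt would be well beyond the efficiency of the most frugal petascale systems of the time, which is the crux of the sustainability concern Nikolopoulos describes.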

But even so, there are many benefits exascale computing could bring if engineers can overcome these barriers.

One benefit would be enabling discoveries across many areas of science, Nikolopoulos said. “Aerospace engineering, astrophysics, biology, climate modeling and national security all have applications with extreme computing requirements.”

Bill Cabage, public information officer at Oak Ridge National Laboratory, said exascale computing will help confront serious challenges in energy supply and sustainability.

“These are very difficult problems and will require the development of new forward-thinking technologies to deal with them,” he told Matthew Knight at CNN. “We are bringing all our resources to bear on these problems.”

Nikolopoulos said exascale computing could also benefit social sciences.

“More and more people are interested in understanding the behaviors of societies as a whole. These require simulations — how people interact, communicate, how they move. That will require exascale computing,” he said.

Source: RedOrbit Staff & Wire Reports


