Sequoia Passes One Million Cores
January 28, 2013

Researchers Set New Record With Sequoia Supercomputer

Peter Suciu for — Your Universe Online

The most advanced gaming PCs rely on quad-core processors, but that pales in comparison to what researchers at Stanford Engineering's Center for Turbulence Research (CTR) have achieved in setting a new record in computational science. The Stanford researchers successfully used a supercomputer with more than one million computing cores to solve a complex fluid dynamics problem: the prediction of the noise generated by a supersonic jet engine.

The work was conducted on the newly installed Sequoia IBM Blue Gene/Q system at Lawrence Livermore National Laboratory. The system was delivered to the lab in 2011 but was not fully deployed until June of last year. Shortly after, the TOP500 Project Committee announced that Sequoia had replaced the K Computer as the world's fastest supercomputer, the first to cross 10 petaflops of sustained performance. The IBM Sequoia runs on Linux.

The Sequoia features 1,572,864 compute cores -- or processors -- and 1.6 petabytes of memory connected by a high-speed, five-dimensional torus interconnect.

Joseph Nichols, a research associate at CTR, was among the researchers who proved that the million-core fluid dynamics simulation was possible, and his work also contributed to research aimed at designing quieter aircraft engines. Thus, a high-performance computer was used to aid in the development of high-performance aircraft engines.

The research examined the exhausts of high-performance aircraft during takeoff and landing, which are among the most powerful human-made sources of noise and can be dangerous to the hearing of those on the ground, even with the most advanced hearing protection available. The sound from the aircraft also creates an acoustically hazardous environment, and for the communities around an airport it can be a major annoyance as well.

One solution has been to design new nozzle shapes, and predictive simulations performed with advanced computer models can aid in that design. These simulations let scientists probe processes occurring within the exhaust environment, an area that is otherwise inaccessible to experimental equipment. The data from the simulations is already helping to drive computation-based scientific discovery as researchers look deeper into the physics of noise.

This is just one area of research in which supercomputers are enabling advanced simulations, including those of computational fluid dynamics.

“Computational fluid dynamics (CFD) simulations, like the one Nichols solved, are incredibly complex. Only recently, with the advent of massive supercomputers boasting hundreds of thousands of computing cores, have engineers been able to model jet engines and the noise they produce with accuracy and speed,” said Parviz Moin, the Franklin M. and Caroline P. Johnson Professor in the School of Engineering and Director of CTR.

Supercomputers such as Sequoia can tackle CFD simulations in part by divvying up the complex math into smaller pieces that can be computed simultaneously. This, in turn, means that the more cores working together, the faster and larger the calculation can be. However, as the difficulty of the calculations increases, the work can challenge even a supercomputer, becoming the “kryptonite” of even the most super of supercomputers.
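The divide-and-conquer idea can be illustrated with a toy sketch. This is not Sequoia's actual solver, and the function names are illustrative: a one-dimensional diffusion step is split into subdomains, each updated concurrently with “halo” values borrowed from its neighbors, and the pieces are then stitched back together.

```python
from concurrent.futures import ThreadPoolExecutor

def diffuse_serial(u):
    # one explicit diffusion step with copied (zero-gradient) boundaries
    p = [u[0]] + u + [u[-1]]
    return [p[i] + 0.1 * (p[i-1] - 2 * p[i] + p[i+1])
            for i in range(1, len(u) + 1)]

def diffuse_chunk(job):
    # the same update applied to one subdomain, padded with halo cells
    left, chunk, right = job
    p = [left] + chunk + [right]
    return [p[i] + 0.1 * (p[i-1] - 2 * p[i] + p[i+1])
            for i in range(1, len(chunk) + 1)]

def parallel_step(u, nworkers=4):
    # split the grid into subdomains, attaching halo values from neighbors
    n = len(u)
    size = n // nworkers
    jobs = []
    for w in range(nworkers):
        lo = w * size
        hi = n if w == nworkers - 1 else lo + size
        left = u[lo - 1] if lo > 0 else u[0]    # neighbor halo or boundary copy
        right = u[hi] if hi < n else u[-1]
        jobs.append((left, u[lo:hi], right))
    with ThreadPoolExecutor(max_workers=nworkers) as ex:
        parts = ex.map(diffuse_chunk, jobs)     # subdomains update concurrently
    return [x for part in parts for x in part]  # stitch subdomains back together

u = [float(i * i % 7) for i in range(16)]
print(parallel_step(u) == diffuse_serial(u))    # the decomposition is exact
```

Because each subdomain only needs its two halo values per step, the work scales naturally as more workers are added, which is the same principle, at vastly larger scale and in three dimensions, that lets a million cores share one simulation.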

The Stanford researchers have had to deal with such setbacks, and in the past week their first full-scale CFD simulation successfully initialized, scaled up, and crossed the all-important one-million-core threshold.

“These runs represent at least an order-of-magnitude increase in computational power over the largest simulations performed at the Center for Turbulence Research previously,” said Nichols. “The implications for predictive science are mind-boggling.”