Stanford geniuses break million-core supercomputer barrier
Photo: Courtesy of Lawrence Livermore National Laboratory
Stanford Engineering's Center for Turbulence Research (CTR) has set a new record in computational science by successfully using a supercomputer with more than one million computing cores to solve a complex fluid dynamics problem -- the prediction of noise generated by a supersonic jet engine. Joseph Nichols, a research associate in the center, worked on the newly installed Sequoia IBM Blue Gene/Q system at Lawrence Livermore National Laboratory (LLNL), funded by the Advanced Simulation and Computing (ASC) Program of the National Nuclear Security Administration (NNSA). Sequoia once topped the list of the world's most powerful supercomputers, boasting 1,572,864 compute cores (processors) and 1.6 petabytes of memory connected by a high-speed five-dimensional torus interconnect.
Because of Sequoia's impressive number of cores, Nichols was able to show for the first time that million-core fluid dynamics simulations are possible -- and also to contribute to research aimed at designing quieter aircraft engines.
The physics of noise
The exhausts of high-performance aircraft at takeoff and landing are among the most powerful human-made sources of noise. For ground crews, even those wearing the most advanced hearing protection available, this creates an acoustically hazardous environment. To the communities surrounding airports, such noise is a major annoyance and a drag on property values.
Understandably, engineers are keen to design new and better aircraft engines that are quieter than their predecessors. New nozzle shapes, for instance, can reduce jet noise at its source, resulting in quieter aircraft.
Predictive simulations -- advanced computer models -- aid in such designs. These complex simulations allow scientists to peer inside and measure processes occurring within the harsh exhaust environment, which is otherwise inaccessible to experimental equipment. The data gleaned from these simulations are driving computation-based scientific discovery as researchers uncover the physics of noise.
Image: Courtesy of the Center for Turbulence Research, Stanford University
More cores, more challenges
"Computational
fluid dynamics (CFD) simulations, like the one Nichols solved, are incredibly
complex. Only recently, with the advent of massive supercomputers boasting
hundreds of thousands of computing cores, have engineers been able to model jet
engines and the noise they produce with accuracy and speed," said Parviz
Moin, the Franklin M. and Caroline P. Johnson Professor in the School of
Engineering and Director of CTR.
CFD simulations test all aspects of a supercomputer. The waves propagating throughout the simulation require a carefully orchestrated balance between computation, memory and communication. Supercomputers like Sequoia divvy up the complex math into smaller parts so they can be computed simultaneously. The more cores you have, the faster and more complex the calculations can be.
And yet, despite the additional computing horsepower, the calculations only become more challenging to manage as cores are added. At the one-million-core level, previously innocuous parts of the computer code can suddenly become bottlenecks.
Ironing out the wrinkles
Over the past few weeks, Stanford researchers and LLNL computing staff have been working closely to iron out these last few wrinkles. This week, they were glued to their terminals during the first "full-system scaling" to see whether initial runs would achieve stable run-time performance. They watched eagerly as the first CFD simulation passed through initialization, then thrilled as the code performance continued to scale up to and beyond the all-important one-million-core threshold, and as the time-to-solution declined dramatically.
"These
runs represent at least an order-of-magnitude increase in computational power
over the largest simulations performed at the Center for Turbulence Research
previously," said Nichols "The implications for predictive science
are mind-boggling."
A homecoming
The current simulations were a homecoming of sorts for Nichols. He was inspired to pursue a career in supercomputing as a high-school student when, in 1994, he attended a two-week summer program at the Lawrence Livermore computing facility sponsored by the Department of Energy. Back then he worked on the Cray Y-MP, one of the fastest supercomputers of its time.
"Sequoia
is approximately 10 million times more powerful than that machine,"
Nichols noted.
The Stanford ties go deeper still. The computer code used in this study, named CharLES, was developed by former Stanford senior research associate Frank Ham. The code uses unstructured meshes to simulate turbulent flow in the presence of complicated geometry.
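As a rough illustration of what "unstructured" means here, the sketch below (plain C) stores a mesh as cells and the faces that connect them rather than as a regular grid, which is what lets a solver follow complicated nozzle shapes. The struct fields and values are invented for the example and are not CharLES's actual data structures.

    /*
     * Toy unstructured-mesh layout: two cells joined by one interior face,
     * plus two boundary faces. Solvers loop over faces, not grid directions,
     * so the same code handles arbitrary geometry.
     */
    #include <stdio.h>

    typedef struct {
        int    cell0, cell1;  /* the two cells sharing this face (-1 = boundary) */
        double area;          /* face area, used in flux calculations */
    } Face;

    typedef struct {
        double volume;        /* cell volume */
        double u;             /* a quantity stored at the cell center */
    } Cell;

    int main(void)
    {
        Cell cells[2] = { {1.0, 0.0}, {1.0, 1.0} };
        Face faces[3] = { {0, 1, 0.5}, {0, -1, 0.5}, {1, -1, 0.5} };

        for (int f = 0; f < 3; f++) {
            if (faces[f].cell1 < 0) continue;   /* skip boundary faces */
            double jump = cells[faces[f].cell1].u - cells[faces[f].cell0].u;
            printf("face %d: area %.2f, jump %.2f\n", f, faces[f].area, jump);
        }
        return 0;
    }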
In addition to jet noise simulations, Stanford researchers in the Predictive Science Academic Alliance Program (PSAAP), sponsored by the Department of Energy, are using the CharLES code to investigate advanced-concept scramjet propulsion systems used in hypersonic flight -- flight at many times the speed of sound -- and to simulate the turbulent flow over an entire airplane wing.
Source: Stanford School of Engineering
Posted by Unknown on Tuesday, January 29, 2013. Filed under Chemistry and Physics.