Simulation and High-Performance Computing

As part of the Department of Energy’s mission to enhance U.S. competitiveness by accelerating innovation, DOE is working to make today’s world-class supercomputers available for scientific and industrial simulation and to expand the accessibility of software and hardware tools. To that end, last fall my office held a Simulations Summit here in Washington, where I hosted more than 70 leaders from academia, industry, government, and the national research laboratories to discuss policies and plans for bringing science and simulations to bear on our national competitiveness mission. During his keynote address at the Summit, Secretary Chu said that “the DOE strategy should be to make simulation part of everyone’s toolbox.”

The Department of Energy’s Office of Science (SC) is addressing that need by pushing the boundaries of computing and simulation to advance key science, math, and engineering challenges facing the nation. SC makes advanced supercomputers available and supports high-fidelity simulations that give scientists the power to test theories and explore experiments that are dangerous, expensive, or impossible to conduct. Scientific simulations are used to understand everything from stellar explosion mechanisms to the quarks and gluons that make up a proton. They can tell us how blood flows through the body and how to make a more efficient combustion engine. And they can do much more.

Climate simulations model the past and future global climate with input from observational data and mathematical models. Advanced computational techniques and computer capabilities allow researchers to increase the resolution and refine the grids of climate simulations, enabling better regional and global predictions. High-performance computing allows us to model the current state of the ice sheets in Antarctica and Greenland more accurately, as well as the motions of water, gases, and storm systems in the atmosphere. Understanding the fracturing and melting of these ice sheets is important so that we can predict how those changes will affect sea level and ocean temperature.
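
To give a feel for why finer grids demand so much more computing power, here is a back-of-the-envelope sketch in Python. It is not any particular climate code; the 111 km-per-degree figure is the standard approximation for one degree of latitude, and the cubic cost scaling (two horizontal dimensions plus a timestep that must shrink with the grid spacing) is a common rule of thumb.

```python
# Back-of-the-envelope scaling for a global atmosphere grid (illustrative only).
KM_PER_DEGREE = 111.0  # approximate length of one degree of latitude

def grid_cost(resolution_deg, levels=30):
    """Estimate cell count and grid spacing for a simple lat-lon grid."""
    n_lat = int(180 / resolution_deg)
    n_lon = int(360 / resolution_deg)
    cells = n_lat * n_lon * levels
    spacing_km = resolution_deg * KM_PER_DEGREE
    return cells, spacing_km

for res in (2.0, 1.0, 0.5, 0.25):
    cells, spacing = grid_cost(res)
    # Relative cost: cells grow ~1/res^2, and the stable timestep
    # shrinks ~res, so total work grows roughly ~1/res^3.
    rel_cost = (2.0 / res) ** 3
    print(f"{res:5.2f} deg (~{spacing:6.1f} km): "
          f"{cells:>12,} cells, ~{rel_cost:5.0f}x cost of a 2-degree run")
```

Halving the grid spacing thus costs roughly eight times the computation, which is why each jump in supercomputer capability translates directly into sharper regional prediction.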

[q:media id="{F415FC48-C186-460F-B4AC-4F0BF1754474}" width="512" height="288"]

[No audio.]
Climate models are based on our observations of the many environmental conditions on Earth. The data collected during these observations are fed into supercomputers. With the help of these computers, scientists use quantitative methods to create simulations such as the ones shown here. Using the Community Climate System Model, which links independent models of Earth’s oceans, lands, ice, and atmosphere, this simulation shows a complex range of processes that circulate water in the atmosphere.

Credits: Scientific Visualization, Jamison Daniel and James Hack. Post-Production, NCCS


Computer-generated simulations of flames allow scientists to distinguish the underlying features of their ignition and composition, including their flow characteristics, chemical make-up, and temperature profile. The above image displays a volume rendering of a hydrogen/air jet flame with hydroperoxy radical (ignition marker, red and yellow) and hydroxyl radical (flame marker, blue). Image courtesy of K. L. Ma, University of California-Davis.

Simulations of combustion allow the design of cleaner-burning automobile engines and power-generation devices, reducing the environmental impact of fuel burning. Models of turbulence in flames allow scientists to understand the chemistry of fuel burning, which helps them design fuel-efficient engines that produce less pollution. Modeling also allows industrial boilers to be designed for greater efficiency and reduced emissions. Engineering-based simulations of reactors generating power via nuclear fission are key to continuing and expanding the use of nuclear power – a clean and abundant energy source. Additionally, simulations of fusion plasmas help us better understand how power may be harnessed from nuclear fusion – the energy source of the sun.
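
At the heart of these combustion models are temperature-dependent reaction rates. As a minimal, hypothetical illustration (real engine simulations track hundreds of coupled reactions, not one), the sketch below evaluates the classic one-step Arrhenius rate law, k = A·exp(−Ea/RT); the pre-exponential factor and activation energy used here are placeholder values, not fitted to any real fuel.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius_rate(temperature_k, a=1.0e10, ea=150e3):
    """One-step Arrhenius rate constant k = A * exp(-Ea / (R * T)).

    `a` (pre-exponential factor) and `ea` (activation energy, J/mol)
    are illustrative placeholders, not data for any real fuel.
    """
    return a * math.exp(-ea / (R * temperature_k))

# A modest temperature rise produces an enormous jump in reaction rate,
# which is why resolving local hot spots matters so much for predicting
# ignition in a turbulent flame.
for t in (800, 1000, 1200, 1500):
    print(f"T = {t:4d} K  ->  k = {arrhenius_rate(t):.3e}")
```

The exponential sensitivity shown here is one reason combustion simulation demands high-performance computing: small errors in the local temperature field translate into large errors in where and when a flame ignites.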

High-performance computing makes detailed simulations of materials and their properties possible, improving our understanding of phenomena like superconductivity. Superconducting materials could have a dramatic impact on energy use and our economy by improving the efficiency of power delivery and the performance of vehicles, trains, and electric machinery, as well as lowering their maintenance costs. Advanced simulations have allowed scientists to better understand how some materials are able to conduct electricity without any resistance, and they have enabled improved research into the spin effects that determine superconducting transition temperatures. Some specific examples of breakthrough science achieved via HPC are described here.

Advanced simulations inform policy makers on how to address social and environmental issues such as climate change (as described above).

[q:media id="{0323BED3-5439-43F3-9546-BC61EECF4A8E}" width="512" height="288"]

[No audio.]
This simulation illustrates the carbon give-and-take between land and atmosphere. Green represents the biosphere’s uptake of carbon dioxide during daylight hours when plants are engaged in photosynthesis. Red represents the net return of carbon dioxide to the atmosphere during nighttime, when respiration dominates. The Earth appears to “breathe.” A cycle of photosynthesis and respiration is a major feature of the long-term maintenance of Earth’s balance of carbon dioxide and oxygen.

Credits: Scientific Visualization, Jamison Daniel and Forrest Hoffman. Post-Production, NCCS.

No other technology has improved as quickly and as steadily as computing – today’s supercomputers are one trillion (1,000,000,000,000) times faster than the supercomputers of the 1950s. The performance of supercomputers is measured in flops, shorthand for floating point operations per second, the speed at which the computer can perform mathematical calculations. The most powerful supercomputers today can achieve petaflop performance, or one quadrillion (1,000,000,000,000,000) flops. The U.S. is a leader in high-performance computing: of the top 500 supercomputers in the world, more than half are in the U.S., and 90% were built by U.S. hardware vendors. Much of the progress in computing can be attributed to efforts by the Department.
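
To put those units in perspective, here is a short Python sketch. The 1950s machine is assumed to run at roughly a thousand flops, an order-of-magnitude figure consistent with the trillion-fold comparison above rather than a measurement of any specific computer.

```python
# Illustrative flops arithmetic; the 1950s figure is a rough assumption
# consistent with the trillion-fold claim above.
PETAFLOP = 1e15                 # 10^15 floating point operations per second
FLOPS_1950S = PETAFLOP / 1e12   # a trillion times slower -> ~1e3 flops

operations = PETAFLOP * 1.0     # one second of work on a petaflop machine
seconds_needed = operations / FLOPS_1950S

SECONDS_PER_YEAR = 3600 * 24 * 365
print(f"A 1950s-class machine would need ~{seconds_needed:.1e} seconds "
      f"(~{seconds_needed / SECONDS_PER_YEAR:,.0f} years) "
      "to match one second of petaflop computing.")
```

One second of petaflop computing would occupy such a machine for roughly 30,000 years, which is the scale of progress a trillion-fold speedup represents.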

I am spearheading work within the Department to develop the next generation of supercomputers, which will achieve exaflop performance – a factor of 1,000 beyond today’s most powerful machines – in order to solve problems of scientific and technical importance to DOE missions. These machines will require paradigm shifts in both hardware and software: they must use significantly less energy per unit of computation and be more resilient against hardware failures. High-performance computing is a technology in which each generation of performance builds on the last, and we will use our most powerful computers to design and simulate the next generation.
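
The scale of the required energy-efficiency gain is easy to quantify. Assuming, for illustration, a power envelope of about 20 megawatts for an exascale system – an often-discussed planning target rather than a fixed specification – the arithmetic below shows the per-operation energy budget such a machine would have to meet.

```python
# Energy budget per floating point operation for an exascale machine,
# assuming an illustrative ~20 MW facility power envelope.
EXAFLOP = 1e18          # operations per second
POWER_BUDGET_W = 20e6   # 20 megawatts (assumed planning target)

joules_per_flop = POWER_BUDGET_W / EXAFLOP
print(f"Exascale budget: {joules_per_flop:.1e} J/flop "
      f"(= {joules_per_flop * 1e12:.0f} picojoules per operation)")

# Rough comparison: a petaflop-class system drawing ~7 MW spends about
# 7e6 / 1e15 = 7e-9 J per operation.
petaflop_j_per_flop = 7e6 / 1e15
print(f"Petaflop-era reference: {petaflop_j_per_flop:.1e} J/flop "
      f"(~{petaflop_j_per_flop / joules_per_flop:.0f}x the exascale budget)")
```

Under these assumptions, energy per operation must fall by a factor of a few hundred, which is why exascale cannot be reached simply by building a larger copy of today’s machines.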

Raising the computing power ceiling an additional factor of 1000 will enhance all sectors of the U.S. economy, national security, and scientific research. Improvements in high-performance computing benefit all computer users, not just those who use these world-class machines. Hardware innovation to drive down the energy consumption of processors and memory for exascale machines will be directly applicable to commodity electronics, making portable computers and smart phones much more powerful. Private sector consumers of high-performance computing use simulation to accelerate and reduce the cost of innovation in the design and manufacturing of their products, in applications stretching from advanced materials for engines and airplane wings to advanced chemicals for household products to the design of newer and faster consumer electronics.

Industrial computing users at the Simulations Summit were enthusiastic about DOE’s plans to pursue exascale capability, as they understand that the hardware and software innovations necessary for this leap in power will also allow them to increase the pace of their own innovation. SC sponsored a series of workshops in 2008 and 2009 to explore the role and future impact of extreme-scale scientific computing, such as exascale, on the grand challenges in science. Covering climate science, high-energy physics, astrophysics, nuclear physics, materials sciences, chemistry, fusion plasmas, and nuclear security, these forward-looking workshops identified critical areas where exascale computing could provide significant gains.

The U.S. is not alone in developing world-class supercomputing capacity. Chinese high-performance computing researchers have announced that they have achieved a performance of 2.5 petaflops with their Tianhe-1A supercomputer. This feat makes Tianhe-1A the world’s most powerful computer, 40% faster than the top American machine, located at Oak Ridge National Laboratory. The computing capability represented by these new machines will provide Chinese consumers of high-performance computing with world-class facilities and threaten the competitive advantage the U.S. has held in this area for many years. At DOE, we have been anticipating this competitive challenge and are actively working to maintain and expand our world-class computing capability. As Secretary Chu said at the Simulations Summit, “the obligation of our leadership to the next generation of computation is a challenge and an opportunity.”

A golden moment has presented itself for continuing U.S. leadership in simulation, but concerted action and continued DOE leadership are necessary to turn this opportunity into reality. The announcement of China’s supercomputing capabilities reinforces the need to act decisively and promptly: we must develop the next generation of supercomputers while building the programmatic and software tools to harness existing computing power and drive U.S. leadership in the scientific, national security, and industrial applications of high-performance computing.