Since this summer, the Summit supercomputer at Oak Ridge National Laboratory in Oak Ridge, Tenn., has been the world’s fastest supercomputer. | Photo by Boris Ladwig

OAK RIDGE, Tenn. — About a four-hour drive south-southeast of Louisville, in the bowels of a modern building surrounded by national forests, a low hum mixes with the swooshing of rushing water and mechanical hammering to create a cacophony that resembles a loud vacuum cleaner.

On a spotless white floor behind a key-card entry door rest rows upon rows of sleek, black cabinets, each 6 feet 4.5 inches tall, the fronts of some of which feature a blue-and-black zigzag design reminiscent of a bolt of lightning. Overhead, thick black tubes twist from the ceiling to deliver and withdraw their payloads of water, electricity and data.

The mundaneness of the server farm belies its extraordinariness: It houses the world’s most powerful computer, Summit, which was unveiled this summer and will begin its work in earnest early next year.

Insider Louisville visited Oak Ridge National Laboratory as part of its series on robots, artificial intelligence and automation.

The interior of one of Summit’s cabinets. | Photo by Boris Ladwig

Summit, funded by the U.S. government, cost about $200 million and is capable of making 200 quadrillion (that’s 200,000,000,000,000,000) calculations per second. That roughly means that if all people on earth completed one calculation every second, it would take humanity 305 days to do what Summit can do in one second.
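To see how that comparison works out, the arithmetic can be reproduced in a few lines of Python; the world-population figure of roughly 7.6 billion is an assumption based on 2018 estimates, not a number from Oak Ridge.

```python
# Back-of-the-envelope check of the "305 days" comparison.
SUMMIT_OPS_PER_SECOND = 200e15   # 200 quadrillion calculations per second
WORLD_POPULATION = 7.6e9         # assumed 2018 world population

seconds_for_humanity = SUMMIT_OPS_PER_SECOND / WORLD_POPULATION
days_for_humanity = seconds_for_humanity / (60 * 60 * 24)

print(round(days_for_humanity))  # about 305 days
```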

Summit has about 10 times the computing power of its predecessor, Titan, and scientists hope to use that power to analyze enormous data sets to tackle the world’s biggest challenges, from predicting extreme weather events to unlocking the secrets of supernovae to understanding which genes predispose humans to opioid addiction.

Summit’s computing resources are made available to national laboratories, federal agencies, universities and industry and are allocated based on scientific merit. The selection process is ongoing.

Each of Summit’s 256 cabinets holds 18 nodes, or computers, each of which contains two IBM Power9 central processing units and six NVIDIA Volta graphics processing units.
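Multiplied out, those per-cabinet figures give a sense of the machine’s overall scale; the totals in this short sketch follow directly from the numbers above.

```python
# Totals implied by the per-cabinet figures above.
CABINETS = 256
NODES_PER_CABINET = 18
CPUS_PER_NODE = 2   # IBM Power9
GPUS_PER_NODE = 6   # NVIDIA Volta

nodes = CABINETS * NODES_PER_CABINET
print(f"{nodes:,} nodes")                  # 4,608 nodes
print(f"{nodes * CPUS_PER_NODE:,} CPUs")   # 9,216 CPUs
print(f"{nodes * GPUS_PER_NODE:,} GPUs")   # 27,648 GPUs
```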

An additional 40 storage racks can hold 250 petabytes of data, roughly as much as 262 million personal computers, or enough for about 74 years’ worth of high-definition video.

The setup stretches over 5,600 square feet, or about two tennis courts. The U.S. Department of Energy, which oversees the facility, had to install concrete floors to hold the computer, as its cabinets and overhead infrastructure weigh 340 tons, more than a commercial aircraft.

Predicting deadly tornadoes

While Summit’s staggering specs — its annual electric bill is estimated at $10 million — are easily compiled, they are difficult to grasp, as are its capabilities and utility.

Jack Wells

And despite its heft, Summit actually will tackle only about 20 projects — though really big ones — said Jack Wells, director of science at the Oak Ridge Leadership Computing Facility.

Researchers come to Oak Ridge only with really big science problems that other computers, even supercomputers, cannot handle. The price researchers pay for access to the facility is that they must publish their research so it can benefit the entirety of America’s academic and industrial complex.

The DOE has operated supercomputers since the early 1990s, but about 15 years ago, the U.S. government realized that it was falling behind competitors, especially Japan at the time, Wells said. Congress in 2004 passed the High-End Computing Revitalization Act, which charged the DOE with implementing a program to advance high-end computing systems.

And to be sure, Summit represents America’s latest gauntlet thrown in the ever more competitive global battle for tech supremacy. Before Summit, the world’s fastest supercomputer was Sunway TaihuLight, built in Wuxi, China.

But with Summit and its predecessors, the U.S. government wants to provide researchers with the capability to tackle the most difficult challenges faced by humankind.

On a basic level, supercomputers are used to model and simulate complex, dynamic systems that would be too expensive, impractical or impossible to physically demonstrate. Models with greater detail provide scientists with a greater understanding of anything from renewable energy to biological systems and weather forecasting.

Adrian Lauf, assistant professor in the University of Louisville’s Department of Computer Engineering and Computer Science, said that many areas humans are studying are much too complex to model with the kind of accuracy that’s required to draw definitive conclusions. Lower computing capability requires scientists to make more approximations, which reduce the quality of the simulations — and their value to scientists.

More detailed models frequently provide scientists with knowledge they did not have before, Lauf said. And sometimes they refute prior assumptions and point researchers into directions in which they would not have looked otherwise.

For example, he said, a recent tornado simulation created on a supercomputer provided researchers with critical information about what fuels a tornado’s strength.

The simulation, by Leigh Orf, a scientist with the Cooperative Institute for Meteorological Satellite Studies at the University of Wisconsin-Madison, recreated a deadly storm system from May 2011 that included an EF5 tornado. The simulated domain covered an area of 75 square miles and extended 12.5 miles into the atmosphere, and the model was based on data, including temperature, air pressure, moisture and wind speed, that Orf’s team captured with a weather balloon in the area before the storms formed.

Orf told Insider that the team used that input data to simulate what would happen.

“The output data is then analyzed, and that output data can contain a ton of different fields, and take hundreds of (terabytes) of disk space, making it a challenge to manage and analyze,” Orf said. “The output data fields I analyze are: wind, temperature, pressure, vorticity, rain/snow/hail/cloud, and some other derived quantities.”

The model Orf’s team produced required as much storage space as about a million home computers. And generating it took the supercomputer at the University of Illinois more than three days — though the school said it would have taken a desktop computer decades.

Aiding industry

Wells said some projects that Summit’s circuits will tackle will help U.S. companies solve challenges that they otherwise could not.

For example, Summit’s predecessor, Titan, helped General Electric after the company uncovered an unexpected combustion instability during tests of a gas turbine.

The company needed to understand the instability to determine whether it was related to design and therefore might emerge again in future turbines. GE lacked the capacity to adequately model the combustion physics.

The DOE said in a news release that figuring out instabilities in gas turbine combustors was “one of the most complex problems in science and engineering.”

“Simultaneously increasing the efficiency and reducing the emissions of natural gas-powered turbines is a delicate balancing act,” the DOE said. “It requires an intricate understanding of these massive energy-converting machines — their materials, aerodynamics, and heat transfer, as well as how effectively they combust, or burn, fuel. Of all these factors, combustion physics is perhaps the most complex.”

The GE H-class turbines come with 6-foot-long chambers in which combustion takes place at high pressure and temperature. Each turbine has a ring of at least 12 combustors, each of which can burn about three tons of air and fuel per minute at firing temperatures exceeding 2,700 degrees Fahrenheit, or about the temperature at which iron melts. Each turbine burns about 20 tractor-trailers full of fuel-air mixture.

Finding the right mixture of fuel and air allows turbine makers to reduce emissions, but precision burning can cause an unstable flame inside the combustor, which can generate noise-induced pressure waves that can reduce turbine performance or even wear out the machine in a matter of minutes.

GE suspected the waves were coming from interaction between adjacent combustors but had no way to physically test its hypothesis, in part because of limits in camera technology. Instead, the company decided to try reproducing the instability in the virtual world but lacked, by far, the computing power necessary to simulate multiple combustors.

With the help of scientists at Oak Ridge and software developed by Palo Alto, Calif.-based Cascade Technologies, GE ran simulations on Titan that required the use of up to 16,000 computer cores simultaneously, producing a fine-mesh grid of nearly one billion cells, with each providing a microsecond-scale snapshot “of the air-fuel mix during turbulent combustion, including particle diffusion, chemical reactions, heat transfer, and energy exchange.”

A simulation of combustion within two adjacent gas turbine combustors. GE researchers created the model on the Titan supercomputer. | Courtesy of Oak Ridge

The simulation proved more insightful than a physical test, according to Joe Citeno, combustion engineering manager for GE Power.

“These simulations are actually more than an experiment,” he said. “They provide new insights which, combined with human creativity, allow for opportunities to improve designs within the practical product cycle.”

The DOE said that even small increases in gas turbine efficiency can yield enormous payoffs in cost and emissions reductions. A 1 percent improvement for a 1,000 megawatt power plant, which can power about 650,000 homes, could reduce carbon dioxide emissions by 17,000 metric tons per year, the equivalent of removing about 3,500 vehicles from the road. If applied to the entire natural gas power plant fleet, it would cut electricity generating costs by more than $1 billion every two years.
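The vehicle comparison can be sanity-checked with a rough sketch; the figure of about 4.6 metric tons of carbon dioxide per passenger vehicle per year is an assumed EPA-style estimate, not a number from the DOE release.

```python
# Rough consistency check of the DOE's vehicle comparison.
CO2_SAVED_TONS_PER_YEAR = 17_000   # DOE figure for a 1 percent improvement
CO2_PER_VEHICLE_TONS = 4.6         # assumed typical passenger vehicle, per year

vehicles_equivalent = CO2_SAVED_TONS_PER_YEAR / CO2_PER_VEHICLE_TONS
print(round(vehicles_equivalent))  # roughly 3,700, close to the quoted 3,500
```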

Companies that work with the DOE have to publish the results of their studies, which, Wells told Insider, advances the state of the art in making turbines more reliable and efficient and benefits all turbine makers — and their customers. The collaboration keeps U.S. businesses more competitive, creates jobs, reduces the turbines’ environmental impact and can cut costs for consumers.

Life, the universe and everything

Many supercomputer projects deal with basic unanswered science questions, Wells said.

For example, the Oak Ridge astrophysics group is going to create much more detailed models of exploding stars, or supernovae, to gain a better understanding of the elements that stars produce when they die.

Bronson Messer, a computational scientist with Oak Ridge, said his group hopes “to quantify how stardust is made and disseminated in the galaxy.”

Comparison of the size of the sun (upper left) to other stars. | Courtesy of NASA

Modeling a supernova is tricky, though, in part because it requires complex calculations of everything from hydrodynamics to small-scale particle interactions. And then there’s the sheer size of such events. Our own sun is so large that it could hold the planet Earth about 1.3 million times, and near the end of its life, billions of years from now, it is expected to swell so much that its outer layers may reach Earth, which currently is about 93 million miles away.

And the sun is but an average-sized star. NASA has found stars with diameters 100 times bigger.
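Both size comparisons fall out of the fact that volume scales with the cube of the radius; the sketch below assumes standard mean radii of roughly 696,000 kilometers for the sun and 6,371 kilometers for Earth.

```python
# Volume scales with the cube of the radius.
SUN_RADIUS_KM = 696_000    # assumed mean solar radius
EARTH_RADIUS_KM = 6_371    # assumed mean Earth radius

earths_in_sun = (SUN_RADIUS_KM / EARTH_RADIUS_KM) ** 3
print(f"{earths_in_sun:,.0f}")   # about 1.3 million Earths

# A star with 100 times the sun's diameter has a million times its volume.
print(f"{100 ** 3:,}")           # 1,000,000
```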

Wells said that Summit will allow Messer’s team to create a much more accurate model of the nuclear burning that takes place in a supernova because it can handle much more complex calculations. For example, when current state-of-the-art supercomputers create models of nuclear burning, they include 13 elements. Summit will handle 160.
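The jump from 13 to 160 elements is steeper than it might sound. As a rough illustration only, and assuming the common case in which an implicit reaction-network solver’s cost grows roughly with the cube of the number of species (an assumption about the numerical method, not a figure from Oak Ridge), the extra detail implies well over a thousand times more work per simulated zone:

```python
# Illustrative only: assumes solver cost grows roughly with the cube of
# the number of species, as with a dense implicit reaction-network solve.
SMALL_NETWORK = 13
LARGE_NETWORK = 160

cost_ratio = (LARGE_NETWORK / SMALL_NETWORK) ** 3
print(round(cost_ratio))   # about 1,865 times more work per zone
```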

Other projects on Summit will involve training computers to read large volumes of medical data to help doctors determine best treatments for cancer patients, creating a virtual fusion reactor, and, for the first time, creating an accurate image of the Earth’s mantle on a global scale, in part by analyzing data from seismographs from all over the world.

From predicting deadly weather to generating new ways to treat human diseases and exploring the furthest reaches of space, Summit will help scientists answer some of the most fundamental questions about the universe, Wells said.

Opioid addiction

Though Summit won’t officially tackle projects until early next year, it already has set a record for the fastest scientific calculation (1.88 exaops or nearly two quintillion calculations per second) while analyzing genomics data.

Summit enabled ORNL computational biologist Dan Jacobson and his team to analyze datasets of millions of genomes, a scale they previously could not study effectively. The DOE said that one hour on Summit allowed scientists to solve problems that would have taken 30 years on a desktop computer.
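The DOE’s comparison implies a staggering speed-up ratio; a minimal sketch of the arithmetic:

```python
# Implied speed-up from "one hour on Summit vs. 30 years on a desktop".
DESKTOP_YEARS = 30
HOURS_PER_YEAR = 365 * 24

speedup = DESKTOP_YEARS * HOURS_PER_YEAR
print(f"about {speedup:,} times faster")   # about 262,800 times faster
```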

Dan Jacobson

In a joint project with the U.S. Department of Veterans Affairs, Jacobson and his team plan to use Summit to analyze the genomes of 600,000 veterans who volunteered their data for research “to better understand the genetic factors that contribute to conditions such as cardiovascular disease, prostate cancer, suicide prevention, Alzheimer’s disease and opioid addiction.”

About 10 percent of people who use opioids develop an addiction, and scientists believe at least part of the reason to be hidden in our genes. Opioid addiction in the last few years has harmed communities all across the country, including Louisville.

Jacobson’s team hopes to identify which genetic markers predispose people to opioid addiction so that doctors can offer such patients who are in need of pain medication an alternative to opioids and reduce the rate of addiction.

“Machines like Summit are poised to turbocharge our understanding of genomes at a population scale, enabling a whole new range of science that was simply not possible before it arrived,” Jacobson said.

Wells said that Jacobson’s research exemplifies a shift in the role computers are playing in many areas of science. Progress in many sciences that rely in large part on observation, including zoology, oceanography, human biology, botany, climatology, and others, used to be curbed by the limited availability of data. After all, studying snow leopards is difficult if it takes three weeks to find one.

However, now that scientists can equip animals with various sensors and collect large amounts of information, from video to GPS data and heart rates, scientific progress is limited primarily by researchers’ capacity to analyze such large volumes of data, which makes computing power ever more important.

Some of the machine’s usefulness may not yet be known, even by the people who are working on it, because the scientists sometimes don’t know what science they can do until new tools are made available to them.

“It’s going to be transformative in several ways,” Wells said.

Jacobson’s exaops milestone is one of the finalists for the 2018 Gordon Bell Prize, the Oscar of achievements in high-performance computing. Five of this year’s six nominees performed their work on Summit.

Part of an occasional series on research at Oak Ridge National Laboratory.