Researchers at a US university have made a breakthrough in the field of turbulence studies, using a supercomputer and a new programming strategy to simulate some of the most complex and intricate processes taking place in the universe.

Turbulence everywhere

The effects of turbulence can be felt and observed everywhere in our day-to-day lives: in tap water spraying from a shower head, in coffee sloshing in our morning cups, in the way fuel behaves in an engine, and in the air disturbances we might feel on a plane trip – the last of which climate scientists predict will worsen as the Earth warms.

Turbulence also takes place on cosmic scales, and understanding it better could help unravel astrophysical mysteries such as how stars form and why certain elements are so plentiful.

A major computing challenge

However, the calculations needed to simulate and observe turbulence have long been considered beyond even supercomputer capabilities. Direct numerical simulation (DNS), which solves the governing mass and momentum equations directly, is the go-to approach, but it requires a huge amount of processing power. Interesting Engineering explains that “Fluid turbulence, with its irregular fluctuations across various scales in time and 3D space, is a complex science problem and a major challenge in high-performance computing.”
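For readers curious about the underlying mathematics, here is a minimal sketch of those governing equations, assuming the incompressible, constant-property form most commonly used in turbulence DNS (the published work may solve a different formulation):

```latex
% Mass (continuity) and momentum (Navier-Stokes) equations,
% incompressible form with constant density rho and kinematic viscosity nu
\nabla \cdot \mathbf{u} = 0, \qquad
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u}
```

Here u is the velocity field, p the pressure, ρ the density and ν the kinematic viscosity. DNS resolves these equations at every grid point and time step, with no turbulence model to smooth over the small scales – which is exactly what makes it so computationally expensive.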

But a team at Georgia Tech’s Daniel Guggenheim School of Aerospace Engineering, working at the Oak Ridge National Laboratory, has achieved a turbulence simulation at a record resolution of up to 35 trillion grid points, using Frontier, the world’s first exascale computer, capable of more than a quintillion (10¹⁸) operations per second.
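To give a sense of that grid size, a quick back-of-the-envelope calculation (the per-side figure below is our illustration; the article only quotes the 35 trillion total):

```latex
% A cube with 32,768 points along each side contains
% roughly 35 trillion points in total:
32{,}768^{3} = 2^{45} \approx 3.5 \times 10^{13}
```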

How?

The scientists used “key programming strategies designed to take maximum advantage of the machine architecture”, which involved performing almost all computations on the graphics processing units (GPUs), leaving the central processing unit (CPU) “idle” and reducing “the overhead of data movement between host and device”.
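As a rough illustration of that GPU-resident pattern (a minimal sketch in CUDA with a toy update kernel; none of these names come from the team’s actual code), the key point is that the simulation state stays on the device for the entire run, so the only host–device copies happen at the very start and the very end:

```cuda
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

// Toy stand-in for one solver step; the real code would apply the
// discretised governing equations here.
__global__ void advance(float* field, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        field[i] += dt * field[i];  // placeholder update, not real physics
    }
}

int main() {
    const int n = 1 << 20;      // toy grid size; Frontier runs used trillions
    const int steps = 1000;
    size_t bytes = n * sizeof(float);

    // Initialise once on the host and copy to the device once.
    float* h_field = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) h_field[i] = 1.0f;

    float* d_field;             // resident on the GPU for the whole run
    cudaMalloc(&d_field, bytes);
    cudaMemcpy(d_field, h_field, bytes, cudaMemcpyHostToDevice);

    const int block = 256;
    const int grid = (n + block - 1) / block;
    for (int s = 0; s < steps; ++s) {
        // The CPU does nothing but launch kernels; crucially, there is
        // no cudaMemcpy inside the time-stepping loop.
        advance<<<grid, block>>>(d_field, n, 1e-3f);
    }
    cudaDeviceSynchronize();

    // Copy results back a single time at the end.
    cudaMemcpy(h_field, d_field, bytes, cudaMemcpyDeviceToHost);
    printf("first value after %d steps: %f\n", steps, h_field[0]);

    cudaFree(d_field);
    free(h_field);
    return 0;
}
```

The sketch shows only the single-device principle of avoiding host–device traffic; at Frontier’s scale the same idea would be combined with communication between many GPUs, which is beyond this illustration.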

“In many scientific fields, people thought calculations of this magnitude were not possible, but now we are there, perhaps earlier than anticipated,” said P.K. Yeung, a professor at the university.

R. Vaideswaran, Prof. P.K. Yeung, and D.L. Dotson pictured at a recent User Meeting at the Oak Ridge Leadership Computing Facility © Carol Morgan/Oak Ridge National Laboratory

Game-changing handling of diverse phenomena

The record-high resolution means the simulation can handle complicated turbulence phenomena such as chemical reactions, magnetic fields, mixing and particle movement, while also allowing greater precision in modelling extreme small-scale behaviour. Published in Computer Physics Communications, the method and its findings could prove to be a game changer across a range of potential applications.

“The science impacts of our extreme scale simulations are expected to be further enhanced by public data-sharing in partnership with the National Science Foundation-supported Johns Hopkins Turbulence Database project,” said Yeung.

Last month, a team used Frontier to solve the decades-long scientific mystery of calcium-48’s magnetism.


