Science Computation Flies High on Frontera Supercomputer

Texascale Days see massive simulations from the cosmic web to the atomic nucleus

By Jorge Salazar | April 2, 2024 | Feature Story | Frontera

Texascale Days on TACC's Frontera supercomputer gives scientists full access to the most powerful supercomputer at any U.S. university. Shown is a volume-rendered view of a thin equatorial slice of the model 25 Mʘ star. A surface convection zone interacts with gravity waves excited by the convection zone in the core region of this star. Credit: Paul Woodward, University of Minnesota.

One of the biggest thrills for scientists who code is scaling up their simulations to push the limits of the most powerful supercomputers. Texascale Days at the Texas Advanced Computing Center (TACC) gives scientists that rare opportunity. The quarterly event awards a handful of research groups full use of the National Science Foundation-funded Frontera supercomputer, the fastest supercomputer at any U.S. university and the leading capability system in the national cyberinfrastructure for large applications that require thousands of compute nodes.

Frontera, the fastest academic supercomputer in the U.S., is a strategic national capability computing system funded by the National Science Foundation. Credit: Jorge Salazar, TACC.

"Texascale Days gives the researcher an opportunity to run code on problems at a scale that is not available during regular production on any NSF system," said John Cazes, director of High Performance Computing at TACC.

Normally, at any given time, dozens of scientists share Frontera, running computational jobs that each need less than a quarter of its 8,300 Intel Cascade Lake Xeon nodes, which are supplemented by 90 graphics processing unit (GPU) nodes equipped with NVIDIA Quadro RTX 5000 GPUs. Allocations are requested through the Frontera user portal and the National Artificial Intelligence Research Resource.
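Why must a code prove it can scale on smaller jobs before earning a full-machine run? A back-of-the-envelope answer comes from Amdahl's law: any fixed serial fraction of the work caps the speedup as node counts grow. The sketch below is illustrative only — the serial fraction of 1e-4 is an assumed value for demonstration, not a measurement from any code mentioned in this story; only the node counts echo Frontera's configuration.

```python
# Illustrative sketch: Amdahl's law for estimating parallel efficiency
# at Frontera-like node counts. The serial fraction is a hypothetical
# value chosen for demonstration, not a measured property of any code.

def amdahl_speedup(serial_fraction: float, n_units: int) -> float:
    """Ideal speedup on n_units parallel units given a fixed serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_units)

def parallel_efficiency(serial_fraction: float, n_units: int) -> float:
    """Speedup divided by unit count; 1.0 would mean perfect scaling."""
    return amdahl_speedup(serial_fraction, n_units) / n_units

if __name__ == "__main__":
    serial = 1e-4  # assumed serial fraction of the workload
    # Small production job, roughly half of Frontera, and the full machine.
    for nodes in (128, 4150, 8300):
        eff = parallel_efficiency(serial, nodes)
        print(f"{nodes:5d} nodes: parallel efficiency {eff:.1%}")
```

Even this tiny serial fraction erodes efficiency noticeably at thousands of nodes, which is why demonstrating half-machine scaling ahead of time is a meaningful bar.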
"Texascale Days is different in that the simulations have demonstrated through smaller jobs that they can scale up to at least half the nodes on Frontera. It takes quite a bit of expertise and work to optimize the researcher's code to hit that scale," Cazes added.

Following are highlights of production and benchmarking runs from the latest Texascale Days in February 2024.

Wish Upon a Star

The magnitude of the horizontal velocity component in stellar convection simulations is shown volume rendered in a thin slice through the center of a 25 Mʘ star at intervals of 8.202 days, beginning at 8.202 days. The internal gravity waves excited by the core convection zone grow and work their way outward in time to influence the wave oscillations in the entire stably stratified envelope between the two convection zones. Credit: Paul Woodward, University of Minnesota.

The team of astronomer Paul Woodward at the University of Minnesota, in collaboration with Falk Herwig's team at the University of Victoria, has been studying convection and its effects on the deep interiors of massive stars for several years. The gravity wave oscillations that can be seen using instruments like the Kepler space telescope and the Transiting Exoplanet Survey Satellite provide a unique window into the interior structure of massive stars.

"These internal gravity waves (IGWs) can provide a connection between simulations and observations," Woodward said.

Stellar hydrodynamic simulations have demonstrated that IGWs are excited by convection in the stellar core, and Woodward's team has shown that features in the spectrum of excited waves and their stochastic time dependence resemble the low-frequency excess that is observed. However, open questions about the origins of IGWs prevent the scientific community from fully exploiting asteroseismic observations of massive stars.
To resolve this question, the teams need simulations that reveal how the low-frequency waves are excited by core convection in the inner regions of the stable layer.

"The fine simulation grids and the high computational performance made possible with our PPMstar code running on Frontera enable us to resolve the core convection, the near-surface convection, and the proper excitation and damping of IGWs in the stably stratified envelope between these convection zones," Woodward said. "Our team exploited the most recent Texascale Days opportunity to perform some first experiments at scale, up to a maximum of 3,510 nodes, in which we include nearly the entire star in our computational domain."

"These are our first simulations at scale of full star models," he added. "We learn from these numerical experiments how much gravity wave signals can tell us about the structure of a massive star's deep interior."

Cosmic History

The ASTRID cosmological simulation models large volumes of the cosmos spanning hundreds of millions of light years yet can zoom in to very high resolution. Credit: ASTRID team.

ASTRID, one of the largest-ever cosmological simulations, was developed on Frontera, and it too had its day during Texascale Days. It maxed out Frontera at 8,192 nodes during the peak of the simulation. The goal is to study galaxy formation, supermassive black hole coalescence, and re-ionization over cosmic history.

"The Texascale Days run was very successful," said Nianyi Chen of Carnegie Mellon University, "and utilized an optimized version of our cosmological hydrodynamics code MP-Gadget.
"We evolved the ASTRID simulation by about 100 million years while efficiently processing galaxy and black hole catalogs on the fly.", The science team includes Tiziana Di Matteo (CMU); Simeon Bird (UC Riverside); Yueying Ni (Harvard); and graduate students Yihao Zhou (CMU) and Yanhui Yang (UC Riverside).“We finished massively parallelized I/O for a total of a few hundred terabytes of data during the 24-hour run. The adaptation of our code to the Frontera cores produced a speed-up of about 10 percent on our problem," she added. “The Texascale Day resources are crucial for this part of the ASTRID production run: our simulation is at the peak of cosmic star formation, and we need a larger memory to accommodate the information from the newly formed stars and galaxies. It provides a precious opportunity to test the scalability and reliability of our simulation code in a massively parallel context, allowing us to make further improvements to our simulation code for robust performance on large machines like Frontera and continue to push the simulation to the present day universe,” Chen said.That’s A Moiré The average computed density of the electron comprising one of the strongly bound excitonic states in a 55 atom silver nanoparticle (overlaid). Credit: The Jornada Group. When two layers of atomically thin materials overlap, they can produce a moiré pattern that creates intriguing electronic phenomena such as superconductivity and ferromagnetism. 
What's more, bouncing light off overlapping sheets of exotic materials can produce excitons, quasiparticles being studied for applications in new optical sensors and communication technologies such as optical fibers and lasers.

"Using TACC's Frontera supercomputer, we performed first-principles density functional theory calculations of the electronic ground state energies and wave functions for a plasmonic nanoparticle of experimentally relevant size," said Felipe Jornada, an assistant professor in the Department of Materials Science and Engineering at Stanford University and a principal investigator at the SLAC National Accelerator Laboratory.

The Jornada Group needed more than 4,000 Frontera nodes to capture the atomistic details of these nanoparticles and the complex way their electrons interact with light, using computationally demanding quantum-mechanical theories.

"This is the first calculation of its kind that addresses the intricate nature of the atomic structure of such nanoparticles, and the resultant correlations left behind in the electronic system after photoexcitation," Jornada said.

Plasmonic nanoparticles can be used to drive chemical reactions such as the production of ammonia fertilizer and hydrogen fuel, as well as plastic decomposition, using light instead of the costly high-temperature, high-pressure conditions created by burning fossil fuels.

"We think this is an exciting time where our theories, codes, and computational resources finally let us make practical predictions for new, light-driven chemical reactions," added Ph.D. student Akash Ramdas of the Jornada Group.

Going Nuclear

Density and shape of the 24Mg ground state from first-principles nuclear theory calculations. The apparent deformation of the 24Mg nucleus is visible in the simulation obtained during the February 2024 Texascale Days on Frontera. Credit: Kristina Launey, LSU.
The isotope magnesium-24 (24Mg) is a heavy hitter in the universe. It's one of the 10 most common elements in our galaxy and is vital in the synthesis of nuclei in stars. During the Texascale Days event, a team led by Kristina Launey at Louisiana State University and Grigor Sargsyan at Michigan State University performed several large-scale simulations of the atomic nucleus of 24Mg across the entire 8,000+ nodes of Frontera.

Launey's team uses a many-body method based on first-principles approaches, which takes into account the underlying interactions of protons and neutrons (https://doi.org/10.1103/PhysRevLett.128.202503). Descriptions of alpha-conjugate nuclei — nuclei composed of multiples of alpha particles, i.e., two protons and two neutrons, such as 24Mg — are challenging to derive from first-principles approaches.

"Texascale Days allowed us to utilize the full power of one of the largest supercomputers in the world to expand the first-principles simulations to heavier and more challenging nuclei," Launey said.

"Almost all chemical elements on Earth have been created in the stars thanks to complex chains of nuclear processes," said Sargsyan, a co-PI on the Frontera allocation and a member of Launey's team that performs these first-principles calculations. "To understand how these chains proceed, a reliable description of nuclear properties is needed. Thanks to modern-day supercomputers and advances in nuclear modeling, we are greatly expanding our knowledge of nuclear properties and complementing the measurements at state-of-the-art nuclear physics laboratories."
A New Vista

The large-scale experience gained from Texascale Days on Frontera applies to new systems on the horizon for TACC, such as Vista, slated for production in summer 2024 with an artificial intelligence focus.

"Texascale Days have been a great success for TACC in helping stress-test our flagship system, and for researchers in optimizing their codes to run at the scale of the largest supercomputers in the world," Cazes said. "We look forward to more years of Texascale Days on Frontera and on new, exciting systems to come."