Credit: Dorland photo by Alison Harbaugh; Stellar photos by Kevin Abbey, PICSciE; Collage by Elle Starkman/PPPL Office of Communications
Stellar, a computing cluster that Princeton University is installing in its High-Performance Computing Research Center, will sharply advance research at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) to bring to Earth the fusion energy that powers the sun and stars. The computer, which the Laboratory will share with a broad range of University departments, will be available to the entire PPPL scientific community, including engineers.
“Stellar will greatly expand our computing capacity,” said Bill Dorland, associate director for computational science at PPPL. “We’ll now be able to do more computations, and make more calculations, to figure out the physics and engineering problems that we work on.”
Huge leap forward
The new machine “is a huge leap forward in local computing capacity,” said Steve Cowley, PPPL director. “Bill Dorland has solidified our partnership with Princeton University and delivered an extraordinary computing resource.”
Stellar complements Traverse, a high-performance computing (HPC) cluster that Princeton installed in 2019, which is equipped with graphics processing units (GPUs) that facilitate artificial intelligence research. “Stellar provides leading-edge capacity for those codes that can’t run on GPUs,” said Curt Hillegas, associate chief information officer for Princeton University’s Office of Information Technology and the Princeton Institute for Computational Science and Engineering (PICSciE). “Think of Traverse as the place to go for GPUs,” added Bill Wichser, PICSciE’s director for advanced systems and data storage. “Codes that don’t use GPUs run on clusters like Stellar.”
The new Stellar HPC cluster will give users swift access, in contrast to the weeks-long waits that researchers can encounter on far larger and more powerful national computers such as Cori, the newest machine at the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science user facility at Lawrence Berkeley National Laboratory. Jobs up to a certain size can run on Stellar, reserving the DOE facilities for the truly large jobs.
Fusion reactions combine light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei that makes up 99 percent of the visible universe — to generate massive amounts of energy. Researchers at PPPL study the behavior of plasma confined by magnetic fields in fusion facilities called tokamaks. Scientists around the world use mainly tokamaks in the quest to produce and control fusion reactions to create a safe, clean and virtually inexhaustible supply of power to generate electricity.
At PPPL, “I want Stellar to be for teams of people that are trying to do mission-critical calculations,” Dorland said. These include users of key codes such as M3D-C1, which simulates plasma disruptions in tokamaks, and Gkeyll (pronounced Jekyll), which simulates the orbiting of plasma particles around the magnetic field lines at the edge of fusion plasmas. Among the planned tasks is moving the code TRANSP to Stellar to provide much faster operation for worldwide users of the integrated tokamak modeling program. TRANSP analyzes and predicts the diffusion, or transport, of particles in fusion experiments.
Among other Stellar applications will be testing and tuning key PPPL codes to ensure that they can run on the major DOE leadership computing facilities. “We want to be able to do development work to prepare for hard-to-get national time,” Dorland said. “And we need to be able to vary the parameters of a problem to see how the system works.”
The dual architecture of Stellar allows it to do double duty. Its Intel computer chips will run codes for PPPL and University departments ranging from Geosciences and Astrophysical Sciences to Physics and Chemical and Biological Engineering. The computer’s AMD chips, which make up about a third of the machine, will run climate models for Princeton’s Geophysical Fluid Dynamics Laboratory (GFDL), which collaborates with the U.S. National Oceanic and Atmospheric Administration (NOAA).
This dual capability pleases all hands. “I see Stellar as a really exciting development,” said Jeroen Tromp, the Blair Professor of Geology and Professor of Geosciences and Applied and Computational Mathematics at Princeton and also director of PICSciE, which has been instrumental in developing partnerships with PPPL. “We’ve talked about engaging more closely with PPPL for a long time,” Tromp said. “We’ve always felt that we have synergy with the laboratory and GFDL and this machine is concrete evidence that I’m really pleased to see.”
Sharing such views is Anatoly Spitkovsky, a Department of Astrophysical Sciences professor who works on theoretical high-energy astrophysics. “There is a lot of need on campus for a mid-scale computer like Stellar that allows for fast turnaround time,” he said. “The queues are short and while the problems we run will not be as big as those we run on national machines, we can do more of them and do them more often. And that is extremely important for graduate students and postdocs who need their research done today.”
For Dorland, who joined the Laboratory in 2020, Stellar is central to increasing outside recognition of computing at PPPL. “Every lab has something it is great at and although PPPL does have great computation, it is not recognized for it,” he said. “Stellar will help get the word out so that people understand that computation really is a major strength of this Laboratory.”
PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit energy.gov/science.