RICHLAND, Wash. – When Shuaiwen Leon Song boots up Doom 3 or Half-Life 2, he does so in the name of science. Song studies high performance computing at Pacific Northwest National Laboratory, with the goal of making computers smaller, faster and more energy efficient. A more powerful computer, simply put, can take on greater scientific challenges, like modeling complex chemical reactions or monitoring the electric power grid.
The jump from supercomputers to video games began when Song asked if hardware called 3D stacked memory could do something it was never designed to do: help render 3D graphics. 3D rendering has advanced science with visualizations, models and even virtual reality. It's also the stuff of video games.
"We're pushing the boundaries of what hardware can do," Song said. "And though we tested our idea on video games, this improvement ultimately benefits science."
Song collaborated with researchers from the University of Houston to develop a new architecture for 3D stacked memory that increases 3D rendering speeds by up to 65 percent. The researchers exploited a hardware feature called "processing in memory" and presented their results at the 2017 IEEE Symposium on High Performance Computer Architecture, or HPCA.
A normal graphics card uses a graphics processing unit, or GPU, to create images from data stored in memory. 3D stacked memory has an added logic layer that allows the memory to do some processing itself, hence the name "processing in memory." This reduces the amount of data that has to travel from memory to the GPU cores. And like an open highway, less traffic means faster speeds.
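The traffic savings can be illustrated with a toy back-of-the-envelope model. This sketch is not the paper's actual architecture; the texel size and filter footprint below are invented for illustration. The point is simply that filtering one pixel needs many texels, but only one filtered value needs to cross the memory bus if the memory's logic layer does the filtering itself.

```python
# Toy traffic model: how much data crosses the memory-to-GPU bus,
# with and without processing in memory (PIM). Numbers are assumptions
# chosen for illustration, not measurements from the research.

BYTES_PER_TEXEL = 4              # assumed RGBA8 texture sample
TEXELS_PER_FILTERED_PIXEL = 16   # assumed 16-tap filter footprint

def traffic_without_pim(pixels: int) -> int:
    # Every texel in the filter footprint must travel to the GPU cores.
    return pixels * TEXELS_PER_FILTERED_PIXEL * BYTES_PER_TEXEL

def traffic_with_pim(pixels: int) -> int:
    # The logic layer filters in place; only one result per pixel travels.
    return pixels * BYTES_PER_TEXEL

pixels = 1920 * 1080
print(traffic_without_pim(pixels) // traffic_with_pim(pixels))  # prints 16
```

Under these assumed numbers, filtering in memory moves 16 times less data across the bus, which is the "open highway" effect described above.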
The researchers found that the last step in rendering, called anisotropic filtering, creates the most traffic. By moving anisotropic filtering to the first step in the pipeline and performing it in memory, they achieved the greatest performance boost.
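The reordering idea can be sketched schematically. The stage names and byte counts below are hypothetical placeholders, not the actual pipeline stages or measurements from the HPCA paper; the sketch only shows that relocating the traffic-heaviest stage into memory removes its data movement from the bus.

```python
# Hypothetical pipeline sketch: each stage records where it runs and how
# many units of data it pulls across the memory bus. Stages running in
# the memory's logic layer generate no bus traffic.

def bus_traffic(pipeline):
    # Only stages executing on the GPU fetch their data across the bus.
    return sum(cost for _, where, cost in pipeline if where == "gpu")

baseline = [
    ("vertex processing", "gpu", 10),
    ("rasterization", "gpu", 20),
    ("anisotropic filtering", "gpu", 60),  # last stage, dominates traffic
]

reordered = [
    ("anisotropic filtering", "memory", 60),  # moved first, done in memory
    ("vertex processing", "gpu", 10),
    ("rasterization", "gpu", 20),
]

print(bus_traffic(baseline), bus_traffic(reordered))  # prints 90 30
```

With the made-up costs above, the reordered pipeline moves a third of the data over the bus, which is where the speedup comes from.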
Song tested the architecture on popular games such as Doom 3 and Half-Life 2. Virtual aliens and demons aside, this research is not so different from Song's other work. For example, Song is exploring how high performance computers can model changing networks of information, and how to predict changes in these graphs. With research questions like these, Song means to push the boundaries of what computers can do.
The work was funded by DOE's Office of Science.
Interdisciplinary teams at Pacific Northwest National Laboratory address many of America's most pressing issues in energy, the environment and national security through advances in basic and applied science. Founded in 1965, PNNL employs 4,400 staff and has an annual budget of nearly $1 billion. It is managed by Battelle for the U.S. Department of Energy's Office of Science. As the single largest supporter of basic research in the physical sciences in the United States, the Office of Science is working to address some of the most pressing challenges of our time. For more information on PNNL, visit the PNNL News Center, or follow PNNL on Facebook, Google+, Instagram, LinkedIn and Twitter.