
CPU-GPU optimization could offer big power savings for drones, data centers

October 3, 2012 By Mark Riechers

The speed boost that a powerful computer processor can provide seems great, but the electric bill can be a real shocker. As with choosing between a Ford Mustang and a Toyota Prius, the tradeoff is stark: faster processors require more energy to run, making them more expensive to operate.

Photo: Nam Sung Kim

While speed versus power consumption is an ever-present tradeoff in designing computer processors, Nam Sung Kim is optimizing processors in a way that delivers both speed and energy efficiency, an approach that could, for example, save data centers billions of dollars in electricity costs.

Kim, a University of Wisconsin–Madison assistant professor of electrical and computer engineering, seeks significant energy savings by optimizing processors that integrate a central processing unit (CPU), which handles complex operations, with a graphics processing unit (GPU), which typically handles the repetitive task of calculating and drawing thousands of pixels on the user’s screen. These processors, known as accelerated processing units (APUs), can process large amounts of complex information more efficiently than either a CPU or a GPU alone.

Tightly integrating a CPU and a GPU on a single chip considerably reduces the power and performance overhead normally incurred as separate CPUs and GPUs communicate over long electrical connections. “Letting the two processors work together at a close distance increases their efficiency,” says Kim.
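
To see why distance matters, consider a back-of-the-envelope sketch. The picojoule-per-byte figures below are hypothetical round numbers, not measurements from Kim’s work; they are chosen only to show the shape of the tradeoff, since off-chip wires typically cost substantially more energy per byte than short on-die connections.

```python
# Back-of-the-envelope energy cost of shipping one frame of data
# between a CPU and a GPU, discrete parts vs. an integrated APU.
# The picojoule-per-byte figures are hypothetical round numbers.

FRAME_BYTES = 8 * 1024 * 1024          # 8 MB shipped per frame

OFF_CHIP_PJ_PER_BYTE = 100.0           # discrete parts: board-level bus
ON_DIE_PJ_PER_BYTE = 5.0               # APU: short on-chip interconnect

def transfer_energy_mj(n_bytes, pj_per_byte):
    """Energy in millijoules to move n_bytes at pj_per_byte."""
    return n_bytes * pj_per_byte * 1e-12 * 1e3   # pJ -> J -> mJ

discrete_mj = transfer_energy_mj(FRAME_BYTES, OFF_CHIP_PJ_PER_BYTE)
apu_mj = transfer_energy_mj(FRAME_BYTES, ON_DIE_PJ_PER_BYTE)
print(f"discrete: {discrete_mj:.2f} mJ/frame, APU: {apu_mj:.2f} mJ/frame")
```

Under these assumed figures, the integrated chip spends a twentieth of the communication energy per frame, and that ratio holds regardless of the exact numbers chosen.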

Manufacturers can streamline APU energy use further by building more intelligent energy management systems into their architecture. “Even while the CPU or the GPU is working, there are many components that tap into the processor, but aren’t used at the same time,” says Kim. “They can be put into sleep mode or low-power mode as well. The overall power budget can be used more efficiently for higher computing performance.”
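
The policy Kim describes can be sketched in a few lines. The following is a minimal illustration of budget-aware power gating, assuming a chip that exposes per-block active and low-power states; the block names, wattages, and policy are invented for the example and are not Kim’s actual design.

```python
# Minimal sketch of budget-aware power gating on an APU.
# Block names, wattages, and policy are hypothetical.

POWER_BUDGET_W = 100.0  # total chip power budget (watts)

# Per-block power draw when active vs. in a low-power state.
blocks = {
    "cpu_core0":     {"active_w": 25.0, "idle_w": 4.0, "busy": True},
    "cpu_core1":     {"active_w": 25.0, "idle_w": 4.0, "busy": False},
    "gpu":           {"active_w": 40.0, "idle_w": 8.0, "busy": True},
    "media_decoder": {"active_w": 10.0, "idle_w": 2.0, "busy": False},
}

def manage_power(blocks, budget_w):
    """Gate idle blocks into their low-power state and report how much
    of the budget is freed up to boost the blocks doing real work."""
    total_draw = 0.0
    for name, blk in blocks.items():
        if blk["busy"]:
            blk["state"] = "active"
            total_draw += blk["active_w"]
        else:
            blk["state"] = "low_power"   # sleep/clock-gate the idle block
            total_draw += blk["idle_w"]
    headroom = budget_w - total_draw     # budget available for boosting
    return total_draw, headroom

draw, headroom = manage_power(blocks, POWER_BUDGET_W)
print(f"draw: {draw:.0f} W of {POWER_BUDGET_W:.0f} W; headroom: {headroom:.0f} W")
```

In this toy run, gating the two idle blocks cuts the draw from 100 W to 71 W, freeing 29 W of headroom the chip could spend boosting the busy CPU core and GPU.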

Even tiny power savings could make a big difference at the scale of a modern data center. “It costs billions of dollars in electricity to keep servers running each year,” says Kim. “If we can reduce the power consumption by 10 percent, that can translate to millions, potentially billions in savings. It has a huge economic impact.”

Kim’s research funding includes a $450,000 grant from the National Science Foundation and $900,000 from the Defense Advanced Research Projects Agency (DARPA), the latter part of a $2.7 million collaboration with Josep Torrellas of the University of Illinois at Urbana-Champaign and Radu Teodorescu of Ohio State University. Through that collaboration, Kim will apply his work to a DARPA initiative that aims to maximize processing capability in unmanned aerial vehicles with limited energy resources.

“The program focus is on how much more you can do per second, and how much power is needed to do it,” says Kim. “They need a high-performance computing system that consumes very low amounts of power for UAV drones and other unmanned vehicles, because of the energy constraints they face.”
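
In other words, the figure of merit is performance per watt rather than raw speed. As a toy comparison (all numbers hypothetical), a design that is slower in absolute terms can still be the better choice under a drone’s fixed energy budget:

```python
# Toy comparison of two hypothetical chip designs by performance
# per watt. Numbers are invented for illustration.

designs = {
    "fast_but_power_hungry": {"gflops": 500.0, "watts": 125.0},
    "integrated_apu":        {"gflops": 400.0, "watts": 50.0},
}

for name, d in designs.items():
    print(f"{name}: {d['gflops'] / d['watts']:.1f} GFLOPS per watt")

# fast_but_power_hungry: 4.0 GFLOPS per watt
# integrated_apu:        8.0 GFLOPS per watt
# The slower design does twice the work per joule, which is what
# counts when a UAV's battery or fuel fixes the total energy budget.
```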

Cutting power consumption without sacrificing processing power could also have a huge impact on fields long hamstrung by hardware that hasn’t been up to the task. “The effectiveness of many artificial intelligence techniques that can benefit our everyday life has been limited by computing performance,” says Kim. “With much greater power efficiency, much greater performance can be attained and more possibilities will open up.”