
How CMU Is Curbing Energy Demands From AI Data Centers
News Clip · Carnegie Mellon University · PA · 3/16/2026
Researchers at Carnegie Mellon University are developing new technology, including energy-efficient computer chips and dynamic workload adjustments, to reduce the strain that AI data centers place on the US energy grid. The goal is to lower the energy demands of data centers and provide more sustainable solutions.
Researchers at Carnegie Mellon University are developing new technology that could lower how much energy data centers need to operate, reducing the strain on the energy grid that Americans rely on.
Akshitha Sriraman, an assistant professor at CMU, and her team are designing "carbon-efficient servers" that require less energy over their lifecycle. Their approach blends new and old technology, applying sustainable resource management principles to server hardware design. Microsoft is exploring the adoption of Sriraman's designs to help meet its decarbonization targets.
CMU professors Brandon Lucia and Nathan Beckmann have also created a new type of computer chip architecture through their company Efficient Computer. This new processor design can perform general-purpose computing using just a fraction of the energy required by traditional CPUs and GPUs.
Additionally, CMU professor Peter Zhang is exploring whether dynamic workload adjustments could help stabilize the electricity demand profiles of data centers, for example by shifting more AI workloads to overnight hours when grid demand is lower. This could further ease the strain that AI's growing energy needs place on the aging US energy grid.