The Myth of Data Center Inefficiency
With their constantly whirring server fans and enormous cooling systems in cavernous buildings, data centers seem to be sucking up a lot of energy and contributing more than their fair share to climate change. But are they?
Dr. Eric Masanet and several colleagues decided to put that assumption to the test. What they found is that, while demand for compute instances has grown sixfold, data center energy use has stayed close to flat.
As servers have become more efficient and virtualization has gained wider adoption, the energy used per compute instance has plummeted.
“That has gone down by about 20 percent every year, and that pace of efficiency improvement is far greater than we can see in any other sector of the energy system,” Masanet said. “We wanted to point out that, even though the data center industry sometimes gets beaten up about its energy use or its contributions to climate change, that is really a remarkable energy improvement and better than nearly any other sector for which we have data.”
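These two figures are consistent with each other, which a back-of-the-envelope calculation makes clear. The sketch below is illustrative only, not the study's actual model: it assumes an eight-year window (a hypothetical choice) and shows that a ~20 percent annual drop in energy per compute instance roughly cancels a sixfold rise in demand.

```python
# Illustrative check: does 20%/year efficiency improvement offset
# sixfold demand growth? The 8-year window is an assumption.

years = 8
efficiency_gain_per_year = 0.20   # ~20% annual drop in energy per instance
demand_growth = 6.0               # sixfold increase in compute instances

# Energy per instance decays geometrically under a constant annual improvement.
energy_per_instance = (1 - efficiency_gain_per_year) ** years

# Total energy relative to the baseline year.
total_energy = demand_growth * energy_per_instance

print(f"energy per instance: {energy_per_instance:.2f}x baseline")  # 0.17x
print(f"total energy:        {total_energy:.2f}x baseline")         # 1.01x
```

Under these assumed numbers, total energy lands at roughly 1x the baseline, matching the near-flat trend the researchers report.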
While the news is mostly positive for data centers, Masanet and his team also want to push for more data to be made available on data center energy consumption and sound a warning about major challenges on the horizon.
“We think there are three trends that really need to be better understood,” he said. “One is artificial intelligence, which should require a lot of computational intensity. The second is 5G, which could spur lots of new demand for data center services, and the third is edge computing, moving potentially some workloads and compute instances much closer to the end user in smaller data centers. We really don’t know yet how that’s going to play out from an energy perspective.”
The path certainly is not linear, but with continued innovation, data centers can keep their energy use in check even as demand grows.