Your online shopping is costing more than you think.
Every year, more and more companies are shifting their business-critical systems and data online, into what has been nicknamed “the cloud” – and we’re just beginning to understand the real-world cost of doing that.
In an ideal world, the internet would be underpinned by technology running at peak efficiency. As a year-long study conducted by the New York Times has discovered, the world is far from ideal. Data centers worldwide are kept running 24 hours a day, in line with the 24-hour nature of online business, and their servers consume some 30 billion watts of electricity: the output of roughly 30 nuclear power plants, just to keep them ticking over. In the United States, they account for an estimated 2% of national electricity consumption.
There’s no doubt that this figure is going to rise. What’s appalling is that it really should be dropping.
The study found that up to 90% of the electricity reaching these servers was being wasted, mainly because servers run flat-out around the clock and convert so much of that energy into excess heat. Furthermore, data centers rely on backup generators, usually diesel-powered, and the study uncovered multiple examples of clean-air regulation violations. Worse still, some of these servers are doing nothing useful with the power they receive: one 2010 sample of 333 servers found more than half of them in a “comatose” state, while three-quarters of them were using less than 10% of their computational power.
And how’s this for a depressing summary?
In a typical data center, those losses combined with low utilization can mean that the energy wasted is as much as 30 times the amount of electricity used to carry out the basic purpose of the data center.
James Glanz, New York Times
In a world of business-critical redundant hardware and the need to meet instantly scaled-up demand, does the world have to put up with such chronic inefficiency? Apparently not. The National Energy Research Scientific Computing Center (which, of course, you’d expect to champion computing efficiency) managed 96.4% server utilization in July of this year, according to its director of operations. It did so by queuing tasks and making sure the servers were packed with work throughout the day, rather than compensating for inefficient processor use with more energy-guzzling hardware.
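To see why a deep queue of work makes such a difference, here is a toy scheduling sketch: each “server” greedily takes the next queued job the moment it finishes the previous one, and utilization is measured as total busy time over total available server time. The workload numbers below are invented purely for illustration, not drawn from the study.

```python
def utilization(num_servers, jobs):
    """Greedy back-to-back scheduling: each queued job goes to the
    server that becomes free earliest. Returns busy time as a
    fraction of total server time (num_servers * makespan)."""
    finish = [0] * num_servers           # time at which each server is next free
    for duration in jobs:
        i = finish.index(min(finish))    # pick the earliest-free server
        finish[i] += duration            # it runs this job next, back-to-back
    makespan = max(finish)               # when the last server goes idle
    return sum(jobs) / (num_servers * makespan)

# A full queue keeps 4 servers saturated: utilization stays high.
busy_queue = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3, 2, 3, 8, 4]
print(round(utilization(4, busy_queue), 3))   # → 0.866

# A trickle of work leaves most server time idle, the "comatose" pattern.
trickle = [3, 1]
print(round(utilization(4, trickle), 3))      # → 0.333
```

The point of the sketch is only the contrast: with a backlog of queued jobs, servers spend nearly all their powered-on hours computing; with sparse work, the same always-on hardware burns electricity while idle.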
The good news is that this is happening in an industry obsessed with innovation. Google, Facebook and Intel are extremely vocal about their efforts to streamline power usage, improve server-cooling technology and pursue a host of other measures (for example, super-efficient cloud-based infrastructure that would manage clients’ data more efficiently than they could themselves) that would make a dent in their energy costs. If data center owners find that investing in efficiency can lead to greater running profits, this situation could be rectified in record time – but as one commenter notes, electricity is cheap, and wasting it to ensure uninterrupted service might be more profitable in the short term. If that’s the case, getting companies to clean up their act will get tricky.
For some people, though, it’s not the data centers that are the root of the problem:
“That’s what’s driving that massive growth — the end-user expectation of anything, anytime, anywhere,” said David Cappuccio, a managing vice president and chief of research at Gartner, the technology research firm. “We’re what’s causing the problem.”
Image: Server room at CERN – torkildr
All quotes: James Glanz, New York Times