But visible or not, the ecological and economic costs of those servers are massive. A report released last week by the Environmental Protection Agency estimated that U.S. data centers (collections of computers used to power businesses' and government agencies' IT infrastructures and Web sites) consumed around 61 billion kilowatt-hours in 2006 at a cost of about $4.5 billion.
That's about 1.5 percent of total U.S. electricity consumption, more than the electricity used by American televisions and roughly equivalent to the output of about 15 typical power plants.
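Those figures hang together if you do the arithmetic. Here's a quick sanity check in Python; the "typical" plant size and capacity factor are my assumptions, since the report excerpt doesn't define them:

# Back-of-envelope check of the EPA figures.
annual_energy_kwh = 61e9   # 61 billion kWh consumed in 2006
annual_cost_usd = 4.5e9    # ~$4.5 billion

# Implied average electricity price.
price_per_kwh = annual_cost_usd / annual_energy_kwh
print(f"Implied price: ${price_per_kwh:.3f}/kWh")  # ~$0.074/kWh

# "About 15 typical power plants": assume a 500 MW plant at a
# 90% capacity factor (my assumption, not the report's definition).
plant_output_kwh = 500 * 1000 * 0.9 * 8760  # kW x hours per year
print(f"Equivalent plants: {annual_energy_kwh / plant_output_kwh:.1f}")  # ~15.5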
Whoa... that's some consumption! And how about the desktops? All those folks who leave their desktops running all day and night!
For every server that's virtualized, a company saves around $560 a year, according to VMware, the fast-growing technology company that pioneered the process. Three California power companies are also offering cash rebates for every server their customers remove through virtualization.
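VMware's $560 figure looks plausible if you assume a mid-2000s server drawing a few hundred watts around the clock, with cooling and power-distribution overhead on top. The wattage, overhead multiplier and electricity price below are my assumptions, not VMware's published methodology:

# Rough check of the ~$560/year savings per decommissioned server.
server_kw = 0.5            # ~500 W draw, running 24/7 (assumption)
overhead = 1.7             # cooling + power distribution (assumption)
price_per_kwh = 0.074      # implied by the EPA figures above

annual_kwh = server_kw * 8760 * overhead
print(f"Annual energy: {annual_kwh:.0f} kWh")             # ~7,400 kWh
print(f"Annual cost: ${annual_kwh * price_per_kwh:.0f}")  # ~$550

Close enough to $560 to make the claim credible.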
"The Googles of the world are growing up, and using more online applications than ever," says Mark Bramfitt of Pacific Gas and Electric, which offers its customers rebates of as much as $300 per virtualized server. "There are phenomenal opportunities for energy conservation. These data centers use 50 to 100 times the power per square foot of an office building."
In fact, Google and PG&E are two of the big names that launched an environmental technology consortium in June known as the Climate Savers Computing Initiative. The group, which also includes Yahoo!, Intel, Hewlett-Packard, Dell, Sun Microsystems and Advanced Micro Devices, aims to drastically reduce the energy wasted by computing devices, with the goal of cutting greenhouse gas emissions by 54 million tons a year and saving $5.5 billion in energy costs by 2010.
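Out of curiosity, that $5.5 billion target translates back into energy using the same implied price (again my assumption; the consortium may use a different rate):

# What does $5.5 billion/year in savings imply in energy terms?
target_savings_usd = 5.5e9
price_per_kwh = 0.074  # implied by the EPA figures above (assumption)

implied_kwh = target_savings_usd / price_per_kwh
print(f"Implied savings: {implied_kwh / 1e9:.0f} billion kWh/year")  # ~74

That's more than the entire 2006 data-center consumption of 61 billion kWh, which makes sense given that the initiative targets desktop PCs as well as servers.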
Staggering figures indeed. Read more... (A couple of interesting slideshows over at Forbes.com, BTW.)