
Server Power Consumption Doubled Over Past 5 Years

Watt's up writes "A new study shows an alarming increase in server power consumption over the past five years. In the US, servers (including their cooling equipment) consumed 1.2% of all electricity in 2005, up from 0.6% in 2000. The trend is similar worldwide. 'If current trends continue, server electricity usage will jump 40 percent by 2010, driven in part by the rise of cheap blade servers, which increase overall power use faster than larger ones. Virtualization and consolidation of servers will work against this trend, though, and it's difficult to predict what will happen as data centers increasingly standardize on power-efficient chips.'" We also had a recent discussion of power consumption in consumer PCs that you might find interesting.
  • by bilbravo ( 763359 ) on Friday February 16, 2007 @05:10PM (#18044172) Homepage
    Nah... the figure doubled. I'm sure the overall power consumption in the US (or elsewhere) has not lessened while servers have doubled.
     
    Nitpicking, I know...
  • by bilbravo ( 763359 ) on Friday February 16, 2007 @05:13PM (#18044244) Homepage
    Hit submit, not preview...

    I wanted to add, I'm sure that means the number has more than doubled; I'm sure power consumption has grown, so if the percentage doubled, that needs to be multiplied by whatever factor energy consumption OVERALL has increased.

    I got too excited about my nitpicking to post my actual thought.
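
    To make the arithmetic concrete, here is a quick Python sketch (the nationwide totals below are illustrative assumptions, not figures from the study):

        total_2000 = 3500.0  # TWh, assumed US electricity use in 2000 (illustrative)
        total_2005 = 3800.0  # TWh, assumed US electricity use in 2005 (illustrative)

        share_2000 = 0.006   # 0.6% server share, from the summary
        share_2005 = 0.012   # 1.2% server share, from the summary

        servers_2000 = share_2000 * total_2000  # 21.0 TWh
        servers_2005 = share_2005 * total_2005  # 45.6 TWh

        # The share doubled, but absolute usage more than doubled,
        # because the total it is a share of also grew.
        print(servers_2005 / servers_2000)  # ~2.17x with these assumed totals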
  • Moore's law (Score:5, Insightful)

    by k3v0 ( 592611 ) on Friday February 16, 2007 @05:27PM (#18044434) Journal
    Considering that processing power has more than doubled over that span, it would seem that we are still getting more bang per watt than before.
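
    A rough sanity check in Python (the compute-growth rate is an assumed Moore's-law-style figure, not something from the study):

        # Assume installed compute grew ~5.7x over five years (doubling
        # every ~2 years), while server power draw doubled per the study.
        compute_growth = 2 ** (5 / 2)   # ~5.66x (assumed)
        power_growth = 2.0              # from the summary

        print(compute_growth / power_growth)  # ~2.8x more work per watt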
  • by G4from128k ( 686170 ) on Friday February 16, 2007 @05:29PM (#18044456)
    Why does this alarm anyone and is it even really true? Several factors conspire to make this statistic both bogus and unalarming.

    1. More computers are classed as "servers." I'd bet that many workgroup and corporate IT computers and mainframes weren't previously classed as "servers." It's the trend toward hosted services, web farms, ASPs, etc. that is moving more computers from dispersed offices into concentrated server farms.

    2. More of the economy runs on servers - this would be like issuing a report during the industrial revolution that power consumption by factories increased at an "alarming" rate. Moreover, I'd wager that a good chunk of that server power is paid for by exporting internet and IT-related services.

    3. Electricity is only a small fraction of U.S. energy consumption. Most of the energy (about 2/3) goes into transportation (of atoms, not bits).

    It's only natural and proper that server power consumption should rise with the increasing use of the internet in global commerce. This report should be cause for celebration, not alarm. (But then celebration doesn't sell news, does it?)
  • Re:Solution (Score:2, Insightful)

    by Anonymous Coward on Friday February 16, 2007 @05:32PM (#18044496)
    The frustrating part is that some of the equipment has that ability built in; it's just not standardized enough to be used. A bunch of our Cisco gear has a plug for backup power, and we had some DEC equipment years back that did too, but they were different plugs and different voltages. If it were standardized, life would be good.
    I think what it would take is for UPS manufacturers to standardize on a set of voltages (12, 5, 3.3 perhaps) and a plug, so that it would be very easy to replace standard power supplies with a standard DC-in power supply.
  • by stratjakt ( 596332 ) on Friday February 16, 2007 @05:42PM (#18044622) Journal
    "If current trends continue" is almost always followed by a fallacious argument. Current trends rarely continue. Be it world population, transistor density, climatology, and especially at the blackjack table.

    Just pointing that out.
  • by fred fleenblat ( 463628 ) on Friday February 16, 2007 @05:52PM (#18044784) Homepage
    More to the point, energy-wise, people using those servers (for on-line shopping, telecommuting, etc.) are saving tons of energy by not driving to the store, the mall, or the office to accomplish everything.
  • Bullshit (Score:3, Insightful)

    by MindStalker ( 22827 ) <mindstalker@[ ]il.com ['gma' in gap]> on Friday February 16, 2007 @06:09PM (#18045012) Journal
    Trend continues? That's like saying people have moved from 60W bulbs to 120W bulbs, so if this trend continues everyone will be using 500W bulbs by 2015.

    Yeah, as computing has gotten cheaper, people are using more of it, but that's because the relative cost of powering it has remained low. Don't expect the trend to continue once power becomes expensive compared to other things.
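
    The bulb gag as a toy Python sketch (made-up doubling period, just to show how naive extrapolation works):

        # Naive extrapolation: wattage doubled once, so assume it keeps
        # doubling on the same schedule forever.
        watts = 60.0
        for year in (2007, 2011, 2015):  # one doubling every ~4 years (assumed)
            watts *= 2
        print(watts)  # 480.0, i.e. the absurd ~500W bulb by 2015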
  • Re:Solution (Score:5, Insightful)

    by NerveGas ( 168686 ) on Friday February 16, 2007 @07:01PM (#18045676)

    Get a grip on reality.

    Even if you switch to 48V DC, you still have to convert 120 VAC to 48 V DC, then down to 12/5/3.3/1.x volts for motors and logic, so all you're doing is moving the conversion from a decentralized setup (a power supply in each computer) to a centralized one (a single large power supply). In the end, however, you still have to get from 120 down to around 1 volt for the CPU, and you're not going to suddenly make an order-of-magnitude change in the efficiency of that - or even near a doubling.

    To keep it in perspective, though, there are vastly overshadowing losses which make the small differences in centralized/decentralized conversion efficiency moot. Your 120 VAC leg is probably coming from a 440 VAC feed into the building, going through a very large transformer to get 120 VAC - and the 440 VAC itself comes from a much higher voltage that was stepped down at least once (and perhaps more) after being transmitted over very long distances. The losses in all of that are much, much higher than the conversion losses you mention.

    Sure, if you could generate and transmit a nice, smooth, regulated 48V DC from the power station to your computer, that would be great - but that's so infeasible that you might as well wish for a pink unicorn while you're at it.
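
    To put rough numbers on it in Python (the per-stage efficiencies below are illustrative guesses, not measurements):

        # End-to-end efficiency is the product of the per-stage efficiencies.
        # All figures are illustrative assumptions.

        # Decentralized: building transformer -> AC PSU in each server -> VRM
        decentralized = 0.98 * 0.80 * 0.90        # ~0.71

        # Centralized: building transformer -> large 48V rectifier -> DC-DC -> VRM
        centralized = 0.98 * 0.92 * 0.92 * 0.90   # ~0.75

        print(decentralized, centralized)
        # A few percentage points at best - nowhere near a doubling, and
        # small next to the transmission losses upstream of the building.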
