Price of Power in a Data Center 384

mstansberry writes "Much like the rest of the country, IT is facing an energy crisis. Utilities are bracing companies for price spikes this winter, and according to experts and IT pros, those prices aren't going to come down any time soon. This is the first article in a four-part series investigating the impact of energy issues on IT."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Moore's law? (Score:4, Insightful)

    by ChrisGilliard ( 913445 ) <christopher.gilliard@nOSpAM.gmail.com> on Thursday October 27, 2005 @05:27PM (#13892533) Homepage
    "It's the other side of Moore's Law," Sneider said. "As the cost of [buying] these machines decreases, the cost of powering and cooling them increases."

    I don't agree with this. How power-efficient was ENIAC? Also, my laptop lasts much longer than the one I had a few years back. I think we're making progress on the power front, but the demand for computing power is attracting more and more dollars; the power cost is largely insignificant compared to the return on investment.
  • by Black-Man ( 198831 ) on Thursday October 27, 2005 @05:29PM (#13892543)
    And the cost of extracting a ton of coal hasn't changed much from 1995 to 2005. But it shows what a sham commodity trading is: the price of a ton of coal (because it is 'energy' related) is traded relative to the price of a barrel of oil or the cost of a ccf of natural gas.

    All this does is further underline the boom/bust cycles of the energy business and how it negatively affects the economy.

  • by Jeff DeMaagd ( 2015 ) on Thursday October 27, 2005 @05:29PM (#13892544) Homepage Journal
    The cooling expense isn't as bad as the heat load itself. I think the theoretical minimum energy for cooling is about 10% of the heat to be removed, where it would take 100W to remove 1000W of heat. In practice it's about 30%, so it's not as bad as some people think.

    One thing I am skeptical of is the need to cool to something like 60 degrees F, which I've heard of (and felt in one room). Good cooling is nice, but I know one guy who says they never see problems until the temperature is above 80F, so businesses could save a lot by not being so freaking cold.
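    The arithmetic in the parent can be sketched in a few lines; the 10% and 30% figures are the commenter's, restated here as coefficients of performance (COP = heat removed per unit of work input):

    ```python
    # Electrical input needed to remove a given heat load, for an assumed
    # coefficient of performance. "Work is 10% of heat removed" means COP = 10.

    def cooling_watts(heat_load_w, cop):
        """Watts of electrical input needed to remove heat_load_w of heat."""
        return heat_load_w / cop

    # Theoretical case from the comment: 100 W input removes 1000 W of heat.
    print(cooling_watts(1000, 10))        # -> 100.0

    # Practical case: input is ~30% of the heat load, i.e. COP of about 3.3.
    print(cooling_watts(1000, 1000 / 300))  # -> 300.0
    ```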
  • Re:Unctuous (Score:2, Insightful)

    by wilsonjd ( 597750 ) on Thursday October 27, 2005 @05:31PM (#13892565)
    Have you ever heard of supply and demand? If we fix oil prices at $25 per barrel, the oil companies will just sell it to China at $70, and we will have NONE. If China convinces Venezuela to sell them all their oil, we will see $100 oil very soon.
  • by pcguru19 ( 33878 ) on Thursday October 27, 2005 @05:34PM (#13892589)
    HP just came out with multicore half-height blades. Their latest requirements are 30-amp, three-phase power per PDU for a blade rack, with 4 PDUs per rack for redundancy. That's enough power service to cover three modern 3000-square-foot homes when you convert the energy back to 240 volts.

    Getting the power to something this silly isn't the pain. COOLING something that consumes 14KW in a 4 square foot space is the challenge anyone in data center management faces. Both HP and IBM have come out with the "innovation" of heat exchangers that run off your chilled water loop. Some of us have been there and done that and don't want to try it again.

    Every time someone comes to me selling density and physical consolidation, I throw them out on their ass. It's cheaper to just buy or build more traditional raised floor space and run good old fashioned 6, 4, or 2u servers than to cool a bunch of blade racks.
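    A rough sketch of the power math behind the parent's numbers. The 208 V line-to-line voltage and the 80% continuous-load derating are assumptions for illustration; the comment doesn't specify either.

    ```python
    import math

    def three_phase_kw(volts, amps, power_factor=1.0):
        """Power of a balanced three-phase feed, in kW."""
        return math.sqrt(3) * volts * amps * power_factor / 1000

    # One 30 A three-phase PDU at an assumed 208 V line-to-line:
    pdu_kw = three_phase_kw(208, 30)   # ~10.8 kW nameplate
    usable_kw = pdu_kw * 0.8           # assumed 80% continuous derating

    # Two active PDUs (the other two for redundancy) comfortably cover
    # the ~14 kW blade rack discussed above.
    print(round(pdu_kw, 1), round(usable_kw, 1))
    ```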
  • One question (Score:4, Insightful)

    by Spy der Mann ( 805235 ) <`moc.liamg' `ta' `todhsals.nnamredyps'> on Thursday October 27, 2005 @05:39PM (#13892623) Homepage Journal
    Shouldn't there be an initiative to certify computer systems as "low energy", i.e. ones that use low-power processors, come with LCD monitors, etc.?

    Just as the state of Massachusetts chose to use F/OSS to save on office software, why not ask government offices to replace CRTs with LCD monitors?
  • by pete6677 ( 681676 ) on Thursday October 27, 2005 @05:41PM (#13892639)
    When demand increases, price will increase regardless of supply cost. Commodity trading isn't a sham, it's just the way the economy works. If oil and coal were mandated to be sold at a constant price regardless of demand, the supply would run out quickly as people would have no incentive to conserve or to explore for new sources.
  • by grqb ( 410789 ) on Thursday October 27, 2005 @05:44PM (#13892658) Homepage Journal
    I bought myself a watt meter to measure the power draw of some of my home electronics. I tested my friend's laptop, a Dell with a 15-inch screen and a P4. Under Linux the laptop was drawing 50-100 watts (which is very high for a laptop); under Windows it was drawing 30-50 watts. Linux on desktops has the same power management as Windows on desktops, though.
  • Blame XML and Java (Score:2, Insightful)

    by fuzzy12345 ( 745891 ) on Thursday October 27, 2005 @05:50PM (#13892700)
    I've been in the coding game a long time, and a lot of the new technologies of the last few years are just insanely wasteful of computing power. Computers are, of course, supposed to relieve humans of drudgery, and this includes programmers just as much as the clerk/minions that are our end-users.

    Nonetheless, having the computer repetitively recompute the exact same answers (parse that huge XML config file! JIT-compile that Java app, AGAIN AND AGAIN AND AGAIN!) is an exercise in keeping your hardware vendor happy, and a sign of laziness on the part of programmers. Who among us doubts that one AMD64 with a few gigs of RAM could, if programmed properly, calculate the payroll for the entire USA every night?

  • by davidwr ( 791652 ) on Thursday October 27, 2005 @05:53PM (#13892736) Homepage Journal
    Capitalism is all about supply and demand and the cost of buying A vs B vs doing without.

    A barrel of oil may cost $x to pump out of the ground, deliver, process, and burn and coal may cost a fraction of that for the same energy-equivalent.

    But it doesn't matter. As long as the demand at either of those prices exceeds supply, the open-market price of both will be about the same and will be higher than the "production" costs.

    When the demand is between the two "production costs" that one will be heavily favored, possibly knocking the more expensive one off the market entirely until prices rise or production costs go down.

    By the way, even within the same commodity, you have this effect:
    Oil in some places is dirt-cheap to produce. In others it is so expensive to extract that nobody bothers unless they think oil prices will stay high enough to make it worthwhile. But once it gets out of the ground, it's just oil.
  • by demigod ( 20497 ) on Thursday October 27, 2005 @05:54PM (#13892743)
    One thing I am skeptical of is the need to cool to like 60 degrees F that I've heard (and felt in one room). Good cooling is nice, but I know one guy that says they don't ever see problems until the temperature is above 80F, so businesses can save a lot by not being so freaking cold.

    I always considered that a buffer for when you lose one of the AC units. That way, if it takes all day to get it fixed, you're only up to 80F and still OK.

  • Re:Folding (Score:2, Insightful)

    by ichigo 2.0 ( 900288 ) on Thursday October 27, 2005 @05:57PM (#13892769)
    Now that "free" idle CPU cycles are getting more expensive to produce, and with newer processors going into power-save modes when idle, I wonder if we'll see a distributed computing project that buys CPU cycles from its participants. It would probably only make sense for companies to do something like that, but it could still be cheaper than building or renting your own supercomputer.
  • Re:Unctuous (Score:3, Insightful)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday October 27, 2005 @06:02PM (#13892821) Homepage Journal
    Uh, this is not true. China's demand is new demand. As demand increases, and supply does not, what happens? Note: this is not a bonus question, it is the entire quiz. Granted, production IS increasing, but I expect China's demand to outstrip it dramatically as they haul themselves into the modern age.
  • by fejikso ( 567395 ) on Thursday October 27, 2005 @06:03PM (#13892827) Homepage
    As an international student currently living in the US, I find it quite shocking to see how Americans waste electric power. It is simply not logical that people have to bring sweaters to be comfortable during the scorching summer (because the thermostat is set too low), whereas in winter, buildings become furnaces.

    I won't even get started on the obscene generation of trash.

    Hopefully these crises will force Americans to find ways of making themselves more efficient.
  • by Jeff DeMaagd ( 2015 ) on Thursday October 27, 2005 @06:07PM (#13892852) Homepage Journal
    At what point do the heat effects of their computers get folded into the climate simulation parameters themselves?

    It probably isn't a big enough factor yet. Keep in mind that one car outputs nearly 10x as much heat energy as a desktop PC.
  • Elsewhere? (Score:1, Insightful)

    by DarkIye ( 875062 ) on Thursday October 27, 2005 @06:35PM (#13893030) Journal
    As is to be expected with Slashdot, the article only discusses the current situation in the USA. Can anyone shed some light on whether this is being reflected elsewhere in the world?

    What's important to realise is that this power isn't just being consumed by servers doing the flops, but (as anyone living in Las Vegas will well know) it's cooling that's soaking up all the juice. The article's probably right about the cost soaring in the near future, but mainly because cooling systems will rely ever more heavily on liquid and active cooling measures.

    On an unrelated note, I wonder if anyone (like our good friends at Microsoft) will do some studies into which OS consumes the most energy? Would it be Windows, turning up the thermostat with its multiple unused processes, or Linux, its kernel threading model making it the most efficient multi-purpose space-heater?

  • by Tiger4 ( 840741 ) on Thursday October 27, 2005 @06:38PM (#13893047)
    That depends on the room and the equipment. A single small box in a large room might take all day, or all week, to heat it up to 80F. On the other hand, a lot of boxes in a small room might jump to 80F within minutes of losing the cooling. There is no substitute for good engineering. Do the calcs and set it up right.
  • by TRRosen ( 720617 ) on Thursday October 27, 2005 @07:19PM (#13893290)
    In theory it costs the same to keep a data room at 60 degrees as it does to keep it at 80; in both cases you're just removing the heat you're adding with the equipment. The only difference comes as you increase the gap between the outside temp and the room temp, thereby increasing the rate at which heat leaks in. If your data room is well insulated there should be little difference.
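    The parent's leakage point can be sketched with a simple steady-state conduction model. The U-value, envelope area, and temperatures below are made-up illustrative numbers, not measurements:

    ```python
    def leakage_watts(u_value, area_m2, t_out_c, t_in_c):
        """Conductive heat gain through the room envelope, in watts.
        u_value is in W/(m^2*K); heat flows in when outside is warmer."""
        return u_value * area_m2 * (t_out_c - t_in_c)

    # Hypothetical well-insulated room: U = 0.5 W/(m^2*K), 200 m^2 envelope,
    # 30 C outside. Holding 15.5 C (~60 F) versus 26.7 C (~80 F):
    cold = leakage_watts(0.5, 200, 30, 15.5)   # 1450 W leaks in
    warm = leakage_watts(0.5, 200, 30, 26.7)   # 330 W leaks in
    print(round(cold), round(warm))
    ```

    The extra cooling load at 60F is only the difference in leakage, which for a well-insulated room is small next to a multi-kilowatt equipment load.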
  • What to do (Score:3, Insightful)

    by burnin1965 ( 535071 ) on Thursday October 27, 2005 @07:46PM (#13893461) Homepage
    While killing services and cutting back on powered equipment is an option, people should consider efficiency improvements.

    Speaking from experience, a large number of x86 boxes out there are running on power supplies in the 60 to 70 percent efficiency range. By replacing old low-efficiency power supplies with some of the newer 80 Plus supplies, you will save on electricity for the box and for cooling.

    I did some tests replacing a cheap 250-watt low-efficiency power supply with Seasonic 250-watt 80 Plus supplies and found a 20%+ reduction in power consumption at the AC outlet. When I ran the numbers, the savings in electricity to the power supply alone would pay for the new supply in one year. And that doesn't include the savings in air conditioning costs.

    http://www.seasonic.com/ [seasonic.com]

    And no, I don't work for them or own stock. :) There are other 80 Plus manufacturers; it's just that this is the only one I tested.

    burnin
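    A back-of-the-envelope version of that payback calculation. The wall draw, electricity price, and the assumption of 24/7 operation are illustrative, not the poster's figures; only the measured 20% reduction comes from the comment above:

    ```python
    def annual_kwh(watts, hours_per_day=24, days=365):
        """Energy consumed over a year by a constant load, in kWh."""
        return watts * hours_per_day * days / 1000

    # Assume a box drawing 150 W at the wall on an old supply; a 20%
    # reduction at the outlet (as measured above) saves 30 W continuously.
    saved_w = 150 * 0.20
    kwh_saved = annual_kwh(saved_w)      # ~263 kWh per year
    cost_saved = kwh_saved * 0.10        # at an assumed $0.10/kWh

    print(round(kwh_saved), round(cost_saved, 2))  # ~263 kWh, ~$26/yr
    ```

    At roughly $26/year per box, a replacement supply in the $25-$50 range pays for itself in one to two years, consistent with the poster's experience.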
  • by CausticPuppy ( 82139 ) on Thursday October 27, 2005 @10:50PM (#13894299)
    Yeah, I've done payroll systems, too. You seem to be confusing difficulty of implementation with the speed the calculations could be done once the program's written.

    We're talking about a meaningless hypothetical situation. Yeah, if such a program existed, it would certainly save a lot of power. But how much power will be used by all of the development systems, servers, QA environment, staging, etc etc in order to produce the program in the first place? I think you'd lose out in the long run, unless there's a lone genius that can crank out the program in assembly in a week (though in reality it would take probably 20-30 years just to digest all the business rules).

    BTW, most large scale payroll systems I know of still run on AS/400's. No java or XML in sight, except for external interfaces.
  • by smilemaster_12 ( 812877 ) on Friday October 28, 2005 @12:11PM (#13897193)
    Why aren't all large computer facilities located in cold-weather climates to begin with? Cooler summers and colder winters could cut electricity bills substantially. Special outside vents could be set up to let colder air into the building during the winter.
