
The Risks and Rewards of Warmer Data Centers

1sockchuck writes "The risks and rewards of raising the temperature in the data center were debated last week in several new studies based on real-world testing in Silicon Valley facilities. The verdict: companies can indeed save big money on power costs by running warmer. Cisco Systems expects to save $2 million a year by raising the temperature in its San Jose research labs. But nudge the thermostat too high, and the energy savings can evaporate in a flurry of server fan activity. The new studies added some practical guidance on a trend that has become a hot topic as companies focus on rising power bills in the data center."
This discussion has been archived. No new comments can be posted.

  • by Geoffrey.landis ( 926948 ) on Thursday October 22, 2009 @09:30AM (#29834425) Homepage
    Locate the server farm in Antarctica!
    • It seems to me that computers produce X BTUs of heat that must be taken out of the server room. They will produce this heat regardless of the temperature in the server room. So... with great insulation around the room, the temperature INSIDE the room should not matter much with regard to the cost of keeping it cold. Ideally, I think you'd want a temperature where the fans never come on at all. How about making the server room a large Dewar flask, filling it with liquid nitrogen, and running the servers in it?
      • Good thoughts, but I think that cooling equipment works better when the internal temperature is higher. That way the coolant can collect more heat in the evaporation phase, which can then be dumped outside from the condenser. So, by keeping the temperature warmer, the return coolant has more heat in it, but the coolant still evaporates at the same temperature, so you get a larger delta. Yes, the compressors would have to work a little harder, but overall, apparently it is a net savings in energy.
        • "That way the coolant can collect more heat in the evaporation phase"
          WHAT?!?

          The heat of vaporization doesn't change based on temperature. What are you talking about?
      • Re: (Score:3, Insightful)

        by autora ( 1085805 )
        I see you've really thought this one through... A warehouse full of servers that need regular maintenance filled with liquid nitrogen is sure to lower costs.
        • Actually I haven't thought much about it at all. For all I know, liquid nitrogen is a conductor, which would be like filling the server room with ice water. Then again, maybe liquid nitrogen is a good insulator.
        • Re: (Score:3, Funny)

          by lgw ( 121541 )

          I always figured the best approach was a combined server room/aquarium. But that assumes you can train some fish to do your server maintenance. I hear the octopus is quite smart, and could easily move around inside of cases. I wonder though, will this provoke cries of "fight octopus outsourcing now!" from the Slashdot crowd?

      • Re:Quick solution (Score:5, Informative)

        by jschen ( 1249578 ) on Thursday October 22, 2009 @10:39AM (#29835059)

        It is true that if you are producing X BTUs of heat inside the room, then to maintain temperature, you have to pump that much heat out. However, the efficiency of this heat transfer depends on the temperature difference between the inside and the outside. To the extent you want to force air (or any other heat transfer medium) that is already colder than outside to dump energy into air (or other medium) that is warmer, that will cost you energy.

        Also, too cold, and you will invite condensation. In your hypothetical scenario, you'd need to run some pretty powerful air conditioning to prevent condensation from forming everywhere.
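
        A rough illustration of that efficiency point (not from the studies; the numbers below are made up): the ideal, Carnot-limited coefficient of performance of a chiller depends only on the cold-side and hot-side temperatures, so the smaller the lift from the room to the outdoors, the less work each unit of heat removal costs.

        ```python
        def carnot_cop(room_temp_c, outside_temp_c):
            """Ideal (Carnot) coefficient of performance: heat removed per unit of work."""
            t_cold = room_temp_c + 273.15      # where heat is absorbed (the room)
            t_hot = outside_temp_c + 273.15    # where heat is rejected (outdoors)
            return t_cold / (t_hot - t_cold)

        # Hypothetical setpoints on an assumed 35 C day outside.
        for room_temp in (18, 24, 27):
            print(room_temp, "C room: ideal COP ~", round(carnot_cop(room_temp, 35), 1))
        ```

        Real chillers fall well short of these ideal numbers, but the trend is the same: with 35 C assumed outside, the ideal COP more than doubles between an 18 C and a 27 C room.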

        • Re:Quick solution (Score:4, Informative)

          by Yetihehe ( 971185 ) on Thursday October 22, 2009 @12:43PM (#29836769)
          Condensation happens on surfaces colder than surrounding air. If you have computers which are warmer than your cooling air, it would not be a problem.
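
          More precisely, condensation forms on surfaces colder than the dew point of the surrounding air. A small sketch using the standard Magnus approximation (the intake conditions here are hypothetical):

          ```python
          import math

          def dew_point_c(temp_c, rel_humidity_pct):
              # Magnus approximation; the coefficients are one common published fit.
              a, b = 17.625, 243.04
              gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
              return b * gamma / (a - gamma)

          # Intake air at 27 C and 60% relative humidity: only surfaces below ~18.6 C
          # will sweat, so hardware running warmer than the air stays dry.
          print(round(dew_point_c(27, 60), 1))
          ```
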
        • Re:Quick solution (Score:4, Interesting)

          by DavidTC ( 10147 ) <slas45dxsvadiv.v ... m ['box' in gap]> on Thursday October 22, 2009 @02:03PM (#29837991) Homepage

          I think we're spending way too much time trying to 'cool' things that do not, in fact, need to be cooler than outside. Nowhere on earth is so hot that servers won't run, unless you've built a server room over an active volcano or something.

          All we actually need to do is remove the heat from the servers to the air, and then keep swapping the air with the outside.

          Which happens automatically if you let heat out the top and air in the bottom. Even if you have to condition the incoming air to remove moisture, that's cheaper than actual 'cooling' with AC. So the second part, replacing the room air, is easy.

          As for the first, I've always wondered why they don't use chimney-like devices to generate wind naturally and send it through server racks, instead of fans. I think all the heat in a server room could, on exit, suck incoming air in fast enough to cool the computers if it actually hit the right places on the way in.

          Heck, this would apply anyway. Instead of having AC vent into server rooms, why not have AC vent into server racks? Hook up the damn AC to the fan vent on each server and blow cold air straight in. The room itself could end up not cold at all.

          • by Zerth ( 26112 )

            Many racks already do this. Plus, if you aren't looking to pipe AC into each rack, just rotate your racks into alternating hot/cold aisles and seal them off so air can only pass through intake/outflow vents.

          • Re:Quick solution (Score:4, Informative)

            by Chris Burke ( 6130 ) on Thursday October 22, 2009 @07:04PM (#29841267) Homepage

            Nowhere on earth is so hot that servers won't run, unless you've built a server room over an active volcano or something.

            Given a sufficiently powerful fan, then yes.

            All we actually need to do is remove the heat from the servers to the air, and then keep swapping the air with the outside.

            Which becomes more difficult the higher the ambient air temperature becomes. Heat transfer is proportional to heat delta, so the closer the air temperature is to the heat sink temperature, the more air you need to blow to remove the same amount of heat. Eventually, the amount of electricity you are spending blowing air over the heat sinks is greater than the savings of using less AC.

            This was half the point of the article -- you can save a lot of money by raising server room temperatures, but eventually (at a temperature well below outdoor ambient around here) you actually start to lose money due to all the extra fan activity.
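
            A toy model of that tradeoff (not from the studies; every constant below is invented purely to shape the curves): fan power grows roughly with the cube of airflow per the fan affinity laws, the airflow needed to move a fixed heat load grows as the sink-to-air delta shrinks, and chiller work shrinks as the room runs closer to outdoor temperature. Total power bottoms out somewhere in between.

            ```python
            HEAT_LOAD_KW = 100.0     # hypothetical IT load
            SINK_TEMP_C = 38.0       # assumed allowable heat-sink surface temperature
            OUTSIDE_C = 30.0
            FAN_COEFF = 2e-3         # invented scaling constants
            CHILLER_COEFF = 0.05

            def fan_kw(room_c):
                # Airflow needed scales ~1/(sink - air); fan power ~ airflow cubed.
                airflow = HEAT_LOAD_KW / max(SINK_TEMP_C - room_c, 1e-6)
                return FAN_COEFF * airflow ** 3

            def chiller_kw(room_c):
                # Work to pump heat "uphill" shrinks as the room nears outdoor temperature.
                return CHILLER_COEFF * HEAT_LOAD_KW * max(OUTSIDE_C - room_c, 0.0) / 10.0

            best = min(range(18, 36), key=lambda t: fan_kw(t) + chiller_kw(t))
            print("cheapest setpoint in this toy model:", best, "C")
            ```

            Whether that optimum lands above or below outdoor ambient depends entirely on the made-up constants, which is exactly why real-world measurements like these studies matter.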

            Which happens automatically if you let heat out the top and air in the bottom.

            Yes but much too slowly to be of use. Convection is also proportional to temperature difference. By the time your server room temperature is enough higher than outside temperature to create significant airflow, your servers are toast.

            As for the first, I've always wondered why they don't use chimney-like devices to generate wind naturally and send it though server racks, instead of fans.

            Go ahead and try it. A lot of cases already have ducting that funnels air directly from outside the case to the CPU. A few more pieces of cardboard, a hole and chimney in the top of your case, and you should be ready to remove the fan and see what convection can do for you. Sneak preview: unless you've specifically picked components that can run off passive cooling, you'll be in the market for a new one. Especially if you live in a hot place and turn off your AC for this experiment.

            While it's conceivable to build an effective server room entirely out of low-power chips that require no active cooling, space is still a major concern in the server room. The desire for greater compute density directly fights against using a large number of low-power chips spread out. Thus performance/watt becomes a major metric for the server room, because they want the most performance for a fixed amount of space and thus cooling.

            why not have AC vent into server racks?

            That's actually a good idea, and a lot of places do it.

      • Re: (Score:3, Interesting)

        by lobiusmoop ( 305328 )

        Data centers would be much more efficient if blade servers had modular water cooling instead of fans. Water is much better at transferring heat than air. Then you could just remove all the fans from the data center and add a network of water pipes (alongside the spaghetti of network and power cabling) around the data center. Then just pump cold water in and dispose of the hot water (pretty cheap to do). Should be reasonably safe too, really - the water should only be near low-voltage systems (voltage

        • I thought the reasons for water cooling systems not being placed in data centers were:
          1) A failure causing coolant leakage could potentially destroy tens of servers.
          2) Maintenance of these systems is quite expensive (mold and such growing in the lines that needs to be periodically cleaned out.)
          3) Failure of a main pump could bring down the entire data center (although I assume there would be redundant systems in place)
        • Re:Quick solution (Score:4, Informative)

          by billcopc ( 196330 ) <vrillco@yahoo.com> on Thursday October 22, 2009 @01:14PM (#29837201) Homepage

          You mean like Crays used to have ?

          The problems with water are numerous: leaks, evaporation, rust/corrosion, dead/weak pumps, fungus/algae, even just the weight of all that water can cause big problems and complicate room layouts.

          Air is easy. A fan is a simple device: it either spins, or it doesn't. A compressor is also rather simple. Having fewer failure modes in a system makes it easier to monitor and maintain.

          You also can't just "dispose of the hot water". It's not like you can leave the cold faucet open, and piss the hot water out as waste. Water cooling systems are closed loops. You cool your own water via radiators, which themselves are either passively or actively cooled with fans and peltiers. You could recirculate the hot water through the building and recycle the heat, but for most datacenters you'd still have a huge thermal surplus that needs to be dissipated. Heat doesn't just vanish because you have water; water only lets you move it faster.

      • "Why should it cost any more to maintain the room at 0 degrees than it would to maintain the room at 100 degrees. I would expect quite the opposite ( with great insulation AROUND the room. )"
        Yeah, therein lies the problem. This "great insulation":
        A) Doesn't exist.
        B) Is horrendously expensive.

        Yes, in an ideal environment this makes sense, but we're not working in one. You have energy leaking in from the outside. In addition to that, there's no device that can move energy ideally. There are inefficiencies in e
      • You get better efficiency trying to cool a warmer environment, and better efficiency when the outside temperature is colder. The smaller the delta T (assuming cooler inside, warmer outside), the better your efficiency.
        • What about filling the server room with Freon and letting it boil away at 1 atm? Then recompress the Freon, radiating the heat away (the radiator can be arbitrarily hot depending on the power of the compressor, providing a very hot thing and as large a temp difference as you could want), then use the (cool) liquid Freon to top off the server room (since some has boiled away)?
      • by sjames ( 1099 )

        The temperature gradient between the hot and cold side of the cooling system makes a big difference, just like it takes more work to pump 100 gallons of water to a height of 100 feet than it does to pump it 1 foot. Meanwhile, if the outside is cooler than the inside, the heat will flow out with no effort at all.

    • I know it was meant as a joke, but moving to colder climates may not be such a bad idea. Moving to a northern country such as Canada or Norway, you would benefit from the colder outside temperature, in the winter, to keep the servers cool, and then any heat produced could be funnelled into keeping nearby buildings warm. The real challenge will be keeping any humidity out, but considering how dry the air can get there during the winters, it may not be an issue.

      All this said and done, trying to work out the swee

      • Re: (Score:3, Interesting)

        by asdf7890 ( 1518587 )

        I know it was meant as a joke, but moving to colder climates may not be such a bad idea. Moving to a northern country such as Canada or Norway, you would benefit from the colder outside temperature, in the winter, to keep the servers cool and then any heat produced could be funnelled to keeping nearby buildings warm.

        There has been a fair bit of talk about building so-called "green" DCs in Iceland, where the lower overall temperatures reduce the need for cooling (meaning less energy used, lowering operational costs) and there is good potential for powering the things mainly with power obtained from geothermal sources.

        There was also a study (I think it came out of Google) suggesting that load balancing over an international network, like Google's app engine or similar, be arranged so that when there is enough slack to make

        • That was my first thought a year ago, when Iceland was going bankrupt: Google should buy the whole place. They have nearly free power because of all the hydroelectric, the ambient temperature is low, they have gobs of smart engineering and IT people looking for work, and Icelandic women are really hot.
          • Re: (Score:3, Funny)

            by speculatrix ( 678524 )
            when Iceland was going bankrupt: Google should buy the whole place....Icelandic women are really hot

            Ah, that's why you never see Icelandic women working in data centres, they overload the air-con!!!
      • by ffejie ( 779512 )
        You mean something like this? Cisco Building Mediator [cisco.com].
    • Locate the server farm in Antarctica!

      Perhaps not quite Antarctica, but according to the BBC's Click program [bbc.co.uk] Iceland is bidding for server business based on the low temperatures and lots of cheap geothermal power.

    • by xaxa ( 988988 )

      Very high altitude, very cold, very low humidity -- you regularly lose hard drives from head crashes.

  • Possible strategy (Score:4, Interesting)

    by Nerdposeur ( 910128 ) on Thursday October 22, 2009 @09:38AM (#29834507) Journal

    1. Get a thermostat you can control with a computer
    2. Give the computer inputs of temperature and energy use, and output of heating/cooling
    3. Write a program to minimize energy use (genetic algorithm?)
    4. Profit!!

    Possible problem: do we need to factor in some increased wear & tear on the machines for higher temperatures? That would complicate things.
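
    A minimal sketch of steps 1-3, assuming hypothetical read_total_power_kw() and set_cooling_setpoint_c() hooks for whatever metering and thermostat hardware is actually installed. Instead of a genetic algorithm, it just creeps the setpoint upward until total power (chillers plus server fans) stops improving, or a safety ceiling is reached:

    ```python
    import time

    SAFETY_CEILING_C = 27.0    # assumed limit; don't trade hardware life for pennies
    STEP_C = 0.5
    SETTLE_SECONDS = 15 * 60   # let the room reach equilibrium between readings

    def tune_setpoint(read_total_power_kw, set_cooling_setpoint_c, start_c=21.0):
        setpoint = start_c
        set_cooling_setpoint_c(setpoint)
        time.sleep(SETTLE_SECONDS)
        best_power = read_total_power_kw()
        while setpoint + STEP_C <= SAFETY_CEILING_C:
            set_cooling_setpoint_c(setpoint + STEP_C)
            time.sleep(SETTLE_SECONDS)
            power = read_total_power_kw()
            if power >= best_power:              # fans ate the savings; step back
                set_cooling_setpoint_c(setpoint)
                break
            setpoint += STEP_C
            best_power = power
        return setpoint
    ```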

    • Re: (Score:3, Funny)

      by jeffmeden ( 135043 )

      Careful with that, there are numerous patents to that effect. You wouldn't want to be suggesting IP theft, now, would you?

      • by dissy ( 172727 )

        Careful with that, there are numerous patents to that effect. You wouldn't want to be suggesting IP theft, now, would you?

        Of course not! We don't steal IP here. In fact, that sheet of paper with their IP on it (the patent) will forever remain safely tucked away on file at the patent office, safe from all thieves.

        We are, however, suggesting that you ignore the fact that the patent exists, and use that knowledge anyway.

        Even if you want to be anti-capitalist and follow patent law, it is easy enough to use only the methods provided by IBM's expired patents and thus not run afoul of any laws.

    • Erm... Surely you just replace a room thermostat with a CPU temp probe and the biiiiiiig chillers on the back wall with smaller chillers feeding directly into the cases?

      Feedback loop to turn on the chillers above a certain temp and... Bob's your mother's brother.
    • by Jurily ( 900488 )

      Possible problem: do we need to factor in some increased wear & tear on the machines for higher temperatures? That would complicate things.

      And the increased burnout rate of your sysadmins. But who cares about them, right?

    • by Monkeedude1212 ( 1560403 ) on Thursday October 22, 2009 @10:18AM (#29834827) Journal

      Sadly, in an effort to save money, we hired some developers with little to no experience, and zero credentials. Turns out the program they wrote to control the thermostat eats up so many compute cycles that it visibly raises the temperature of whatever machine it's running on. So we ran it in the server room, because that's where temperature is most important. However, by the time it had adjusted the temperature, the room would have risen 1 degree. Then it would have to redo its analysis and adjustments.

      Long story short, the building burned down and I'm now unemployed.

      • Re: (Score:3, Funny)

        by JustinRLynn ( 831164 )
        Wow, I never thought I'd observe a thermal cascade reaction outside of a chemistry lab or a nuclear power plant. Thanks slashdot!
        • This reminds me of Apple's advertising in 1982, when they were trying to get people to buy a muffin fan for their Apple IIe's. Their slogan was "The Only Thing You Can Do With a Baked Apple is Eat It," with a picture below of a partially incinerated Apple IIe.
    • Re:Possible strategy (Score:4, Interesting)

      by Linker3000 ( 626634 ) on Thursday October 22, 2009 @10:20AM (#29834843) Journal

      Interestingly enough, I recently submitted an 'Ask Slashdot' (pending) about this, as my IT room is also the building's server room (just one rack and 5 servers). We normally just keep the windows open during the day and turn on the aircon when we close up for the night, but sometimes we forget and the room's a bit warm when we come in the next day! We could just leave the aircon on all the time, but that's not very eco-friendly.

      I was asking for advice on USB/LAN-based temp sensors and also USB/LAN-based learning IR transmitters so we could have some code that sensed temperature and then signalled to the aircon to turn on by mimicking the remote control. Google turns up a wide range of kit from bareboard projects to 'professional' HVAC temperature modules costing stupid money so I was wondering if anyone had some practical experience of marrying the two requirements (temp sensor and IR transmitter) with sensibly-priced, off-the-shelf (in the UK) kit.

      Anyone?

    • You don't need anything as complicated as a genetic algorithm. You have a defined control (thermostat), a defined state (temperature), some external but relatively predictable variables (outside temperature and server load), and a decent model (should be a straightforward ODE) that defines the relationship between those. Define a cost function, balancing the need to keep both temperatures and energy costs low, and you've got a very straightforward optimal control problem.

      Because it's continuous, not partic
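
      A rough sketch of that framing (a lumped thermal ODE, a cost that charges for energy plus an over-temperature penalty, and a brute-force sweep over constant setpoints standing in for a proper optimal-control solver); every constant here is assumed, not measured:

      ```python
      C_THERMAL_J_PER_K = 5.0e6   # lumped heat capacity of room plus racks
      SERVER_HEAT_W = 50_000.0
      LEAK_W_PER_K = 800.0        # conduction to/from a 30 C exterior
      OUTSIDE_C = 30.0
      COP = 3.0                   # assumed chiller coefficient of performance
      DT_S = 60.0                 # one-minute Euler steps over a day
      STEPS = 24 * 60

      def simulate(setpoint_c, start_c=24.0):
          temp, energy_j = start_c, 0.0
          for _ in range(STEPS):
              # Proportional cooling toward the setpoint, capped at 80 kW thermal.
              cooling_w = min(max(temp - setpoint_c, 0.0) * 20_000.0, 80_000.0)
              leak_w = LEAK_W_PER_K * (OUTSIDE_C - temp)
              temp += DT_S * (SERVER_HEAT_W + leak_w - cooling_w) / C_THERMAL_J_PER_K
              energy_j += DT_S * cooling_w / COP    # electrical input to the chiller
          return temp, energy_j

      def cost(setpoint_c):
          final_temp, energy_j = simulate(setpoint_c)
          over_c = max(final_temp - 27.0, 0.0)      # penalize exceeding 27 C
          return energy_j / 3.6e6 + 100.0 * over_c  # kWh plus a comfort penalty

      candidates = [18.0 + 0.5 * i for i in range(20)]   # 18.0 .. 27.5 C
      print("best constant setpoint in this toy model:", min(candidates, key=cost), "C")
      ```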

    • by PPH ( 736903 )

      What? Do you live in some science-fictiony, futuristic world? That sort of thing would be great. But here's how our PHB management deals with such problems. Note that this is an office environment, not a server room:

      For years, we've worked in an office that is usually maintained at about 80F, summer or winter. Management mandated that the thermostats all be set to this point so as to save energy consumed by the AC system. During the summer, the math was simple. It's an old building (1950s era), single story

  • Sure, the fans kick in and you aren't saving as much, but are you still saving? I suspect you still are; there is a reason you are told to run ceiling fans in your house even with the AC on.

    The thermal modeling for all this isn't that difficult. You can get power consumption, fan speeds, temps, etc. and feed them into a pretty accurate plant model that should be able to adjust temperature on the fly for optimal efficiency. Or I guess we can hire a company to form a bunch of committees to do a bunch of studies
    • Re: (Score:3, Insightful)

      by mea37 ( 1201159 )

      "Sure, the fans kick in and you aren't saving as much, but are you still saving? I suspect you still are, there is a reason you are told to run ceiling fans in your house even with the AC on."

      If only someone would do a study based on real-world testing, we could be sure... Oh, wait...

      There are several differences between ceiling fans and server fans. You can't use one to make predictions about the other. "Using one large fan to increase airflow in a room is a more efficient way for people to feel cooler

      • by greed ( 112493 ) on Thursday October 22, 2009 @10:48AM (#29835167)

        For starters, people sweat and computers do not. So, airflow helps cool people by increasing evaporation, in addition to direct thermal transfer. Even when you think you aren't sweating, your skin is still moist and evaporative cooling still works.

        Unless someone invents a CPU swamp cooler, that's just not happening on a computer. You do need airflow to keep the hot air from remaining close to the hot component (this can be convection or forced), but you don't get that extra... let's call it "wind chill" effect that humans feel.

  • I thought the internet was free (or so people keep telling me). You mean it actually costs these companies money to maintain the connections??? Wow. I guess my $15/month bill actually serves a purpose after all.

  • UNITS? (Score:2, Interesting)

    80 whats? Obviously they mean 80F (running a temperature at 80K, 80C or 80R would be insane), but you should always specify units (especially if you're using some backwards units like Fahrenheit!)

    • Re: (Score:3, Funny)

      by friedo ( 112163 )

      Fahrenheit backwards? That shit was metric before the Metric System even existed.

      To wit:

      0F is about as cold as it gets, and 100F is about as hot as it gets.

      See? Metric.

      • lol not around here [google.ca] my friend.

        Have you ever gone outside when it's -40 (C or F, it's the same)? The air is so cold that it hurts to breathe, but I love it. There is nothing like it. The humidity from your breath sticks to your eyelashes and they freeze together and you have to pick the ice off so you can open your eyes. It's amazing human beings even live here.

      • Re: (Score:3, Informative)

        by tepples ( 727027 )

        Fahrenheit backwards? That shit was metric before the Metric System even existed.

        To wit:

        0F is about as cold as it gets, and 100F is about as hot as it gets.

        You're right for the 40th parallel or so. But there are parts of the world that routinely dip below 0 deg F (-18 deg C) and other parts that routinely climb above 100 deg F (38 deg C). Things like that are why SI switched from Fahrenheit and Rankine to Celsius and Kelvin.

        • Things like that are why SI switched from Fahrenheit and Rankine to Celsius and Kelvin.

          Sure, blame climate change. Everyone else does!

      • Re:UNITS? (Score:4, Interesting)

        by nedlohs ( 1335013 ) on Thursday October 22, 2009 @11:08AM (#29835437)

        And yet the temperature here measured in F gets negative every winter. And where I previously lived it got above 100F every summer (and it also does where I am now, but only a day or three each year).

        But in both those places a temperature of 0C was the freezing point of water, and 100C the boiling point. Yes, the 100C one isn't so useful in terms of daily temperature, but the 0C one is, since whether water will freeze or not is the main transition point in daily temperatures.

        • by ndege ( 12658 )

          in both those places a temperature of 0C was the freezing point of water, and 100C the boiling point

          Wow...cool! I have always wanted to live at a location whose conditions matched the International Standard Atmosphere [wikipedia.org]: i.e., you live at sea level with the temperature at +15 deg C and the pressure at 101,325 Pa?

          Btw, if it had been said that those values are approximate, I would have let it go. ;)

          • by DavidTC ( 10147 )

            How could the freezing point of water at exactly 0C require that the temperature is 15C? That makes no sense.

  • Ducted cabinets (Score:3, Interesting)

    by tom17 ( 659054 ) on Thursday October 22, 2009 @10:04AM (#29834707) Homepage
    So what about having ductwork as another utility that is brought to each individual server? Rather than having thousands of tiny inefficient fans whirring away, you could have a redundant farm of large efficient fans that pull in cool air from outside (cooling only required then in hot climates or summer) and duct it under the floor in individual efficient ducts to each cabinet. Each cabinet would then have integral duct-work that would connect to individual servers. The servers would then have integral duct-work that would route the air to the critical components. There would have to be a similar system of return-air duct-work that would ultimately route back to another redundant farm of large efficient fans that scavenge the heated air and dump it outside.

    I realise that this is not something that could be done quickly; it would require co-operation from all major vendors, and then only if it would actually end up being more efficient overall. There would be lots of hurdles to overcome too... efficient ducting (no jagged edges or corners like in domestic HVAC ductwork), no leaks, easy interconnects, space requirements, rerouting away from inactive equipment, etc. You would still need some AC in the room, as there is bound to be heat leakage from the duct-work, as well as heat given off from less critical components, but the level of cooling required would be much less if the bulk of the heat was ducted straight outside.

    So I know the implementation of something like this would be monumental, requiring redesigning of servers, racks, cabinets and general DC layout. It would probably require standards to be laid out so that any server will work in any cab etc (like current rackmount equipment is fairly universally compatible), but after this conversion, could it be more efficient and pay off in the long run?

    Just thinking out loud.

    Tom...

    • Re: (Score:2, Informative)

      by Cerberus7 ( 66071 )

      THIS. I was going to post the same thing, but you beat me to it! APC makes exactly what you're talking about. They call it "InfraStruXure." Yeah, I know... Anywho, here's a link to their page for this stuff [apc.com].

      • Ah, erm, no. That's not what InfraStruXure is. And there is a good reason. What happens when you need to work in the front or back of the cabinet? All of a sudden your cooling mechanism is offline and you have precious few minutes without forced air before your servers roach themselves.

        The reason this has never (and probably will never) been done is the amount of form factor standardization required from top to bottom in the vendor lineup. Even if the heavens parted and God himself handed down a standa

        • "asking every bit of equipment to conform to the same standard, and to stick to that standard for more than one product release cycle, is something of a pipe dream."

          Yeah... I dream of the day they decide, well, I don't know, something like requiring all server-grade equipment to fit into a cabinet 482.6mm wide, with height in multiples of 44.45mm

    • Re: (Score:3, Informative)

      by Linker3000 ( 626634 )

      While I was looking at aircon stuff for our small room, I came across a company that sold floor-to-ceiling panels and door units that allowed you to 'box in' your racks and then divert your aircon into the construction rather than cooling the whole room. Seems like a sensible solution for smaller data centres or IT rooms with 1 or 2 racks in the corner of an otherwise normal office.

    • by LMacG ( 118321 )

      I know just the man to work on this -- Archibald 'Harry' Tuttle. [wikipedia.org]

    • Why even have individual cases? It seems to be rare nowadays that a full rack isn't just full of computers. Why not have one massive door and a bunch of naked computers on the racks? Set up the air flow in your building such that one side is high pressure, the other side is low, and blow air across the entire thing.

  • Well, if you have a large cluster, you can load-balance based on CPU temp to maintain a uniform junction temp across the cluster. Then all you need to do is maintain just enough A/C to keep the CPU cooling fans running slow (so there is excess cooling capacity to handle a load spike, since the A/C can only change the temp of the room so quickly).

    Or, you can just bury your data center in the antarctic ice and melt some polar ice cap directly.
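
    A hedged sketch of the first idea: hand each incoming job to the node currently reporting the lowest CPU temperature, so heat (and therefore fan effort) spreads across the cluster. In a real deployment the readings would come from IPMI or lm-sensors; here they are simply passed in.

    ```python
    import heapq

    def schedule(jobs, node_temps_c):
        """node_temps_c: mapping of node name -> latest CPU temperature (deg C)."""
        heap = [(temp, node) for node, temp in node_temps_c.items()]
        heapq.heapify(heap)                    # coolest node surfaces first
        placement = {}
        for job in jobs:
            temp, node = heapq.heappop(heap)
            placement[job] = node
            # Crude assumption: each job warms its host by ~2 C until re-measured.
            heapq.heappush(heap, (temp + 2.0, node))
        return placement

    print(schedule(["job1", "job2", "job3"],
                   {"node-a": 62.0, "node-b": 55.0, "node-c": 58.0}))
    ```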

  • I used to have a Pentium 4 Prescott; the truth is processors can run significantly above spec (hell, the thing would go above the "max temp" just opening Notepad). It's already been shown that higher temps don't break HDDs. Are the downsides of running the processor a few degrees hotter significant, or can they be ignored?

    • With the P4, there are significant disadvantages with running it too hot. The CPU contains a thermal sensor and automatically reduces the clock speed when it gets hot to prevent damage. When it's running at 100MHz, it takes a while to get anything done...
  • by Yvan256 ( 722131 ) on Thursday October 22, 2009 @10:24AM (#29834887) Homepage Journal

    If you save energy by having warmer data centers, but it shortens the MTBF, is it really that big of a deal?

    Let's say the hardware is rated for five years. Let's say that running it hotter than the recommended specifications shortens that to three years.

    But in three years, new and more efficient hardware will probably replace it anyway, because it will require, let's say, 150 watts instead of 200 watts. The old hardware would get replaced regardless, since the new hardware will cost less to run over those lost two years.
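
    Back-of-the-envelope numbers for that argument (the electricity price and cooling overhead are assumed, not taken from the summary):

    ```python
    WATTS_SAVED = 200 - 150
    HOURS = 2 * 365 * 24        # the two "lost" years
    PRICE_PER_KWH = 0.10        # assumed USD per kWh
    COOLING_OVERHEAD = 1.5      # assumed multiplier for removing the extra heat

    kwh = WATTS_SAVED * HOURS / 1000
    print(f"{kwh:.0f} kWh saved, roughly ${kwh * PRICE_PER_KWH * COOLING_OVERHEAD:.0f} per server")
    ```

    Whether roughly $130 per box over two years justifies retiring hardware early depends on the replacement cost, which is the judgment call being made here.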

    • Re: (Score:3, Interesting)

      by amorsen ( 7485 )

      But in three years, new and more efficient hardware will probably replace it anyway because it will require, let's say, 150 watts instead of 200 watts

      That tends to be hard to get actually, at least if we're talking rack-mountable and if you want it from major vendors.

      Rather you get something 4 times as powerful which still uses 200W. If you can then virtualize 4 of the old servers onto one of the new, you have won big. If you can't, you haven't won anything.

  • Longer Study (Score:4, Insightful)

    by amclay ( 1356377 ) on Thursday October 22, 2009 @10:37AM (#29835039) Homepage Journal
    The studies were not long enough to constitute a very in-depth analysis. A study would have to run for multiple months, or up to a year, to capture all the effects of raising temperatures.

    For example, little was considered with:

    1) Mechanical Part wear (increased fan wear, component wear due to heat)

    2) Employee discomfort (80 degree server room?)

    3) Part failure*

    *If existing cooling solutions had issues, it would be a shorter time between the issue and additional problems since you have cut your window by ~15 degrees.
  • Yes, but if you have the room at the tipping point, what does this do to your ability to recover from a fault? I know one reason many datacenters have experienced outages even with redundant systems is that the AC equipment is almost never on UPS, so it takes some time for it to recover after switching to generators. If you are running 10F hotter, doesn't that mean you have that much less time for the AC to recover before you start experiencing problems? For a large company with redundant datacenters or
  • by BenEnglishAtHome ( 449670 ) on Thursday October 22, 2009 @11:32AM (#29835809)

    I'm less concerned with the fine-tuning of the environment for servers than I am with getting the basics right. How many bad server room implementations have you seen?

    I'm sitting in one. We used to have a half-dozen built-for-the-purpose Liebert units scattered around the periphery of the room. The space was properly designed and the hardware maintained whatever temp and humidity we chose to set. They were expensive to run and maintain but they did their job and did it right.

    About seven years ago, the bean-counting powers-that-be pronounced them "too expensive" and had them ripped out. The replacement central system pumps cold air under the raised floor from one central point. Theoretically, it could work. In practice, it was too humid in here the first day.

    And the first week, month, and year. We complained. We did simple things to demonstrate to upper management and building management that it was too humid in here, things like storing a box of envelopes in the middle of the room for a week and showing management that they had sealed themselves due to excessive humidity.

    We were, in every case, rebuffed.

    A few weeks ago, a contractor working on phone lines under the floor complained about the mold. *HE* got listened to. Preliminary studies show both Penicillium (relatively harmless) and black (not so harmless) mold in high concentrations. Lift a floor tile near the air input and there's a nice thick coat of fluffy, fuzzy mold on everything. There's mold behind the sheetrock that sometimes bleeds through when the walls sweat. They brought in dehumidifiers that are pulling more than 30 gallons of water out of the air every day. The incoming air, depending on who's doing the measuring, is at 75% to 90% humidity. According to the first independent tester who came in, "Essentially, it's raining" under our floor at the intake.

    And the areas where condensation is *supposed* to happen and drain away? Those areas are bone dry.

    IOW, our whole system was designed and installed without our input and over our objections by idiots who had no idea what they were doing.

    So, my fellow server room denizens, please keep this in mind - When people (especially management types) show up with studies that support the view that the way the environment is controlled in your server room can be altered to save money, be afraid. Be very afraid. It doesn't matter how good the basic research is or how artfully it could be employed to save money without causing problems, by the time the PHBs get ahold of it, it'll be perverted into an excuse to totally screw things up.

    • Re: (Score:3, Funny)

      by infinite9 ( 319274 )

      About a year ago, I worked on a project in a backwards location that was unfortunately within driving distance of the major city where I live. The rate was good, though, so I took the job. These people were dumb for a lot of reasons (it takes a lot for me to call my customers dumb), but the one that really made me laugh was the server rack strategically placed in the server room so that the door would smack into it whenever someone came in.

    • Re: (Score:3, Insightful)

      by bill_kress ( 99356 )

      You left out what is usually the best part!

      For his valiant efforts in preventing waste, did the bean counter get promoted to VP level or directly to an officer of the company? Or did he quit (get pushed out) and get a higher-paying job elsewhere? This kind of stupidity never goes unrewarded.

  • I was at a Google presentation on this last night. If I remember correctly, I believe they found the 'ideal' temperature for running server hardware without decreasing lifespan to be about 45 C.
    • by jbengt ( 874751 )
      Are you sure they weren't measuring this temperature somewhere inside the server rack rather than the ambient temperature in the room? 45C is around 113F, which is hotter than most daytime highs in Phoenix, Arizona, and typical motors are only rated for ambient conditions of 104F, i.e. 40C (not sure about the little motors inside drives, etc.). See Skapare's post below about the failures he experienced.
  • Risk of AC failure (Score:5, Interesting)

    by Skapare ( 16644 ) on Thursday October 22, 2009 @12:09PM (#29836315) Homepage

    If there is a failure of AC ... that is, either Air Conditioning OR Alternating Current, you can see a rapid rise in temperature. With all the systems powered off, the heat stored inside the equipment, which is much hotter than the room, spreads out and raises the room temperature rapidly. And if the equipment is still powered (via UPS when the power fails), the rise is much faster.

    In a large data center I once worked at, with 8 mainframes and 1800 servers, power to the entire building failed after several ups and downs in the first minute. The power company was able to tell us within 20 minutes that it looked like a "several hours" outage. We didn't have the UPS capacity for that long, so we started a massive shutdown. Fortunately it was all automated and the last servers finished their current jobs and powered off in another 20 minutes. In that 40 minutes, the server room, normally kept around 17C, was up to a whopping 33C. And even with everything powered off, it peaked at 38C after another 20 minutes. If it weren't so dark in there I think some people would have been starting a sauna.

    We had about 40 hard drive failures and 12 power supply failures coming back up that evening. And one of the mainframes had some issues.

  • by speculatrix ( 678524 ) on Thursday October 22, 2009 @12:28PM (#29836545)
    UPS batteries are sealed lead-acid and they definitely benefit from being kept cooler; it's also good to keep them in a separate room, usually close to your main power switching. As far as servers are concerned, I've always been happy with an ambient room temp of about 22 or 23C, provided air-flow is good so you don't get hot-spots, and it makes for a more pleasant working environment (although with remote management I generally don't need to actually work in them for long periods of time).
    • by jbengt ( 874751 )
      You are definitely right about the UPS benefitting from cooler air, but I have been involved in several projects where A/C was not installed for the UPS rooms, just ventilation; and this in a Chicago climate where the summer temperature often exceeds 95F (35C). The systems ran fine, though we did let the owner know that they were shortening the life of the system.
      As far as 22C or 23C goes, that is cooler than typically required now; 75F to 78F (24C to 27C) is usually perfectly fine for UPS rooms or even s
  • describes temperatures using the Fahrenheit scale.
