The Risks and Rewards of Warmer Data Centers
1sockchuck writes "The risks and rewards of raising the temperature in the data center were debated last week in several new studies based on real-world testing in Silicon Valley facilities. The verdict: companies can indeed save big money on power costs by running warmer. Cisco Systems expects to save $2 million a year by raising the temperature in its San Jose research labs. But nudge the thermostat too high, and the energy savings can evaporate in a flurry of server fan activity. The new studies added some practical guidance on a trend that has become a hot topic as companies focus on rising power bills in the data center."
Quick solution (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
WHAT?!?
The heat of vaporization doesn't change based on temperature. What are you talking about?
Re: (Score:3, Insightful)
Re: (Score:2)
Re: (Score:3, Funny)
I always figured the best approach was a combined server room/aquarium. But that assumes you can train some fish to do your server maintenance. I hear the octopus is quite smart, and could easily move around inside of cases. I wonder though, will this provoke cries of "fight octopus outsourcing now!" from the Slashdot crowd?
Re:Quick solution (Score:5, Informative)
It is true that if you are producing X BTUs of heat inside the room, then to maintain temperature, you have to pump that much heat out. However, the efficiency of this heat transfer depends on the temperature difference between the inside and the outside. To the extent you want to force air (or any other heat transfer medium) that is already colder than outside to dump energy into air (or other medium) that is warmer, that will cost you energy.
Also, too cold, and you will invite condensation. In your hypothetical scenario, you'd need to run some pretty powerful air conditioning to prevent condensation from forming everywhere.
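To see why that temperature difference matters for the air conditioning itself, here is a back-of-the-envelope sketch using the ideal (Carnot) coefficient of performance; the room and outdoor temperatures below are made-up examples, not figures from the thread.

```python
# Rough illustration: the ideal (Carnot) cooling COP shrinks as the gap
# between the cold side (the room) and the hot side (outdoors) grows, so
# keeping a room far colder than it needs to be costs disproportionately more.

def carnot_cop(room_c: float, outdoor_c: float) -> float:
    """Upper bound on cooling COP for given room/outdoor temperatures."""
    room_k = room_c + 273.15
    outdoor_k = outdoor_c + 273.15
    return room_k / (outdoor_k - room_k)

for room in (18, 24, 27):
    print(f"room {room}C vs 35C outside: ideal COP ~ {carnot_cop(room, 35.0):.1f}")
```

Real chillers fall well short of the Carnot bound, but the trend is the same: a few degrees of setpoint buy a noticeable efficiency change.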
Re:Quick solution (Score:4, Informative)
Re:Quick solution (Score:4, Interesting)
I think we're spending way too much time trying to 'cool' things that do not, in fact, need to be cooler than outside. Nowhere on earth is so hot that servers won't run, unless you've built a server room over an active volcano or something.
All we actually need to do is remove the heat from the servers to the air, and then keep swapping the air with the outside.
Which happens automatically if you let heat out the top and air in the bottom. Even if you have to condition the incoming air to remove moisture, that's cheaper than actually 'cooling' it with AC. So the second part, replacing the room air, is easy.
As for the first, I've always wondered why they don't use chimney-like devices to generate wind naturally and send it through server racks, instead of fans. I think the heat in a server room could, on exit, suck incoming air in fast enough to cool the computers if it hit the right places on the way in.
Heck, this would apply anyway. Instead of having the AC vent into server rooms, why not have the AC vent into server racks? Hook up the damn AC to the fan vent on each server and blow cold air straight in. The room itself could end up not being cold at all.
Re: (Score:2)
Many racks already do this. Plus, if you aren't looking to pipe AC into each rack, just rotate your racks into alternating hot/cold aisles and seal them off so air can only pass through the intake/outflow vents.
Re:Quick solution (Score:4, Informative)
Nowhere on earth is so hot that servers won't run, unless you've built a server room over an active volcano or something.
Given a sufficiently powerful fan, then yes.
All we actually need to do is remove the heat from the servers to the air, and then keep swapping the air with the outside.
Which becomes more difficult the higher the ambient air temperature becomes. Heat transfer is proportional to heat delta, so the closer the air temperature is to the heat sink temperature, the more air you need to blow to remove the same amount of heat. Eventually, the amount of electricity you are spending blowing air over the heat sinks is greater than the savings of using less AC.
This was half the point of the article -- you can save a lot of money by raising server room temperatures, but eventually (at a temperature well below outdoor ambient around here) you actually start to lose money due to all the extra fan activity.
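To put rough numbers on the "more air for a smaller delta-T" point, a small sketch using Q = m_dot * cp * dT; the rack wattage and temperature rises are made-up examples.

```python
# Hedged example (made-up numbers): the airflow needed to carry away a fixed
# heat load grows as the intake-to-exhaust temperature difference shrinks.
# Q = m_dot * cp * dT  =>  m_dot = Q / (cp * dT)

CP_AIR = 1005.0      # J/(kg*K), specific heat of air near room conditions
RHO_AIR = 1.2        # kg/m^3, approximate air density

def airflow_m3_per_s(heat_load_w: float, delta_t_c: float) -> float:
    """Volumetric airflow needed to remove heat_load_w at a given delta-T."""
    mass_flow = heat_load_w / (CP_AIR * delta_t_c)   # kg/s
    return mass_flow / RHO_AIR                        # m^3/s

for dt in (20, 10, 5, 2):
    print(f"10 kW rack, {dt}C rise: {airflow_m3_per_s(10_000, dt):.2f} m^3/s")
```

Halving the available temperature rise doubles the air you have to move, and fan power climbs faster than airflow, which is exactly where the savings evaporate.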
Which happens automatically if you let heat out the top and air in the bottom.
Yes, but much too slowly to be of use. Convection is also proportional to temperature difference. By the time your server room is enough hotter than the outside to create significant airflow, your servers are toast.
As for the first, I've always wondered why they don't use chimney-like devices to generate wind naturally and send it though server racks, instead of fans.
Go ahead and try it. A lot of cases already have ducting that funnels air directly from outside the case to the CPU. A few more pieces of cardboard, a hole and chimney in the top of your case, and you should be ready to remove the fan and see what convection can do for you. Sneak preview: unless you've specifically picked components that can run off passive cooling, you'll be in the market for a new one. Especially if you live in a hot place and turn off your AC for this experiment.
While it's conceivable to have an effective server room built entirely from low-power chips that require no active cooling, space is still a major concern in the server room. The desire for greater compute density directly fights against spreading out a large number of low-power chips. Thus performance/watt becomes a major metric for the server room, because they want the most performance for a fixed amount of space and thus cooling.
why not have AC vent into server racks?
That's actually a good idea, and a lot of places do it.
Re: (Score:3, Interesting)
Data centers would be much more efficient if blade servers had modular water cooling instead of fans. Water is much better at transferring heat than air. Then you could just remove all the fans from the data center and add a network of water pipes (alongside the spaghetti of network and power cabling) around the data center. Then just pump cold water in and dispose of the hot water (pretty cheap to do). Should be reasonably safe too really - the water should only be near low-voltage systems really (voltage
Re: (Score:2)
1) A failure causing coolant leakage could potentially destroy tens of servers.
2) Maintenance of these systems is quite expensive (mold and such growing in the lines that needs to be periodically cleaned out.)
3) Failure of a main pump could bring down the entire data center (although I assume there would be redundant systems in place)
Re:Quick solution (Score:4, Informative)
You mean like Crays used to have?
The problems with water are numerous: leaks, evaporation, rust/corrosion, dead/weak pumps, fungus/algae, even just the weight of all that water can cause big problems and complicate room layouts.
Air is easy. A fan is a simple device: it either spins, or it doesn't. A compressor is also rather simple. Having fewer failure modes in a system makes it easier to monitor and maintain.
You also can't just "dispose of the hot water". It's not like you can leave the cold faucet open, and piss the hot water out as waste. Water cooling systems are closed loops. You cool your own water via radiators, which themselves are either passively or actively cooled with fans and peltiers. You could recirculate the hot water through the building and recycle the heat, but for most datacenters you'd still have a huge thermal surplus that needs to be dissipated. Heat doesn't just vanish because you have water, it only allows you to move it faster.
Re: (Score:2)
Yeah, therein lies the problem. This "great insulation":
A) Doesn't exist.
B) Is horrendously expensive.
Yes, in an ideal environment this makes sense, but we're not working in one. You have energy leaking in from the outside. In addition to that, there's no device that can move energy ideally. There's inefficiencies in e
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The temperature gradient between the hot and cold side of the cooling system makes a big difference, just like it takes more work to pump 100 gallons of water to a height of 100 feet than it does to pump it 1 foot. Meanwhile, if the outside is cooler than the inside, the heat will flow out with no effort at all.
Move to Canada (Score:3)
I know it was meant as a joke, but moving to colder climates may not be such a bad idea. Moving to a northern country such as Canada or Norway, you would benefit from the colder outside temperature, in the winter, to keep the servers cool, and then any heat produced could be funnelled into keeping nearby buildings warm. The real challenge will be keeping humidity out, but considering how dry the air during the winters can get there, it may not be an issue.
All this said and done, trying to work out the swee
Re: (Score:3, Interesting)
I know it was meant as a joke, but moving to colder climates may not be such a bad idea. Moving to a northern country such as Canada or Norway, you would benefit from the colder outside temperature, in the winter, to keep the servers cool and then any heat produced could be funnelled to keeping nearby buildings warm.
There has been a fair bit of talk about building so-called "green" DCs in Iceland, where the lower overall temperatures reduce the need for cooling (meaning less energy used, lowering operational costs) and there is good potential for powering the things mainly with power obtained from geothermal sources.
There was also a study (I think it came out of Google) suggesting that load balancing over an international network, like Google's app engine or similar, be arranged so that when there is enough slack to make
Re: (Score:2)
Re: (Score:3, Funny)
Ah, that's why you never see Icelandic women working in data centres, they overload the air-con!!!
Re: (Score:2)
Re: (Score:2)
Locate the server farm in Antarctica!
Perhaps not quite Antarctica, but according to the BBC's Click program [bbc.co.uk], Iceland is bidding for server business based on the low temperatures and lots of cheap geothermal power.
Re: (Score:2)
Very high altitude, very cold, very low humidity -- you regularly lose hard drives from head crashes.
Re: (Score:3, Funny)
Re: (Score:2)
Antarctica would indeed not be a good choice, but afaict there are places with temperatures low enough that you could use outside air to cool stuff year round while not being so low as to cause major logistical problems.
Re: (Score:3, Funny)
So all you have to do is tighten those savings and you'll be fine.
Re: (Score:2)
Re: (Score:2)
Try ... offtopic. But you can't anyway since you already posted.
--Part-time grammar/spelling Nazi with no mod points right now.
Re: (Score:2)
I was actually thinking that the spelling/grammar Nazi tag might come with a Karma bonus ;-)
Re: (Score:2)
Re: (Score:2)
Just make the building well insulated and then have controlled fans to bring in just enough outside air to keep the temperature where you want it.
Re: (Score:2)
Re: (Score:2)
Allow me to speak for Slashdot grammar nazis and say: we're damn tired of seeing that lose/loose error, in particular. At least with the its/it's and their/there/they're errors, you could see how someone could become confused, but there's just no way anyone would pronounce "loose" as "lose". Seriously, the internet is great and all but ... read a book!
Re: (Score:2)
we're damn tired of seeing that lose/loose error, in particular
So am I, but I am very familiar with the correct way to spell lose; now and then we all make typos. I don't actually mind being called out on it because it bugs me too, but I don't appreciate the "read a book" comment.
Possible strategy (Score:4, Interesting)
1. Get a thermostat you can control with a computer
2. Give the computer inputs of temperature and energy use, and output of heating/cooling
3. Write a program to minimize energy use (genetic algorithm?)
4. Profit!!
Possible problem: do we need to factor in some increased wear & tear on the machines for higher temperatures? That would complicate things.
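A minimal sketch of step 3, using a plain setpoint-with-hysteresis controller instead of a genetic algorithm; the sensor and cooling hooks are hypothetical placeholders, not a real thermostat API.

```python
# Minimal sketch of step 3: a dumb setpoint-with-hysteresis controller rather
# than a genetic algorithm. read_room_temp_c() and set_cooling() are
# hypothetical stand-ins for whatever sensor/thermostat interface you have.
import random
import time

SETPOINT_C = 27.0     # run warm to save energy
HYSTERESIS_C = 1.0    # avoid rapid cycling of the chillers

def read_room_temp_c() -> float:
    return 26.0 + random.uniform(-2.0, 2.0)   # placeholder sensor

def set_cooling(on: bool) -> None:
    print("cooling", "ON" if on else "OFF")   # placeholder actuator

def control_loop(iterations: int = 5) -> None:
    cooling_on = False
    for _ in range(iterations):
        temp = read_room_temp_c()
        if temp > SETPOINT_C + HYSTERESIS_C:
            cooling_on = True
        elif temp < SETPOINT_C - HYSTERESIS_C:
            cooling_on = False
        set_cooling(cooling_on)
        time.sleep(1)     # poll once a second in this toy loop

control_loop()
```

The interesting work is choosing the setpoint and hysteresis band, which is where the energy-use input from step 2 (and the wear-and-tear question) would come in.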
Re: (Score:3, Funny)
Careful with that, there are numerous patents to that effect. You wouldn't want to be suggesting IP theft, now, would you?
Re: (Score:2)
Careful with that, there are numerous patents to that effect. You wouldn't want to be suggesting IP theft, now, would you?
Of course not! We don't steal IP here. In fact, that sheet of paper with their IP on it (the patent) will forever remain safely tucked away on file at the patent office, safe from all thieves.
We are, however, suggesting that you ignore the fact that the patent exists, and use that knowledge anyway.
Even if you want to be anti-capitalist and follow patent law, it is easy enough to use only the methods provided by IBM's expired patents and thus not run afoul of any laws.
Re: (Score:2)
Feedback loop to turn on the chillers above a certain temp and... Bob's your mother's brother.
Re: (Score:2)
Re: (Score:2)
Possible problem: do we need to factor in some increased wear & tear on the machines for higher temperatures? That would complicate things.
And the increased burnout rate of your sysadmins. But who cares about them, right?
Re:Possible strategy (Score:5, Funny)
Sadly, in an effort to save money, we hired some developers with little to no experience, and zero credentials. Turns out the program they wrote to control the thermostat eats up so many compute cycles that it visibly raises the temperature of whatever machine it's running on. So we ran it in the server room, because that's where temperature is most important. However, by the time it would adjust the temperature, the room would rise 1 degree. Then it would have to redo its analysis and adjustments.
Long story short, the building burned down and I'm now unemployed.
Re: (Score:3, Funny)
Re: (Score:2)
Re:Possible strategy (Score:4, Interesting)
Interestingly enough, I recently submitted an 'Ask Slashdot' (Pending) about this as my IT room is also the building's server room (just one rack and 5 servers) and we normally just keep the windows open during the day and turn on the aircon when we close up for the night, but sometimes we forget and the room's a bit warm when we come in the next day! We could just leave the aircon on all the time but that's not very eco-friendly.
I was asking for advice on USB/LAN-based temp sensors and also USB/LAN-based learning IR transmitters so we could have some code that sensed temperature and then signalled to the aircon to turn on by mimicking the remote control. Google turns up a wide range of kit from bareboard projects to 'professional' HVAC temperature modules costing stupid money so I was wondering if anyone had some practical experience of marrying the two requirements (temp sensor and IR transmitter) with sensibly-priced, off-the-shelf (in the UK) kit.
Anyone?
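For what it's worth, the glue logic is tiny once you have both pieces. A hedged sketch, assuming the sensor ships with some command-line reader (the "temper" command here is a placeholder) and that the IR side is driven by LIRC's irsend with a remote you've already recorded (the "aircon"/"POWER_ON" names are made up and would come from your own lircd.conf):

```python
# Hedged glue sketch for the temp-sensor + IR-blaster idea. The sensor-reading
# command and the remote/button names are placeholders, not real product APIs.
import subprocess

THRESHOLD_C = 28.0

def read_temp_c() -> float:
    # Placeholder: parse the output of whatever utility your sensor ships with.
    out = subprocess.run(["temper"], capture_output=True, text=True, check=True)
    return float(out.stdout.strip())

def aircon_on() -> None:
    # irsend is the standard LIRC client; remote/button names are hypothetical.
    subprocess.run(["irsend", "SEND_ONCE", "aircon", "POWER_ON"], check=True)

if read_temp_c() > THRESHOLD_C:
    aircon_on()
```

Run it from cron every few minutes and you have the "forgot to turn the aircon on" case covered, whatever specific kit you end up buying.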
Re: (Score:2)
Basically, you're looking at $300-$1000 in hardware, but it can interface with Nagios.
If we ever move our servers to the basement, I'll be setting these up to monitor for flooding or temperature issues.
Re: (Score:2)
Cheers - my starting point for pricing was a USB-based IR transmitter that only costs 67 UKP
http://www.redrat.co.uk/products/index.html [redrat.co.uk]
Anyone used this for a similar temp control project?
Re: (Score:2)
If you are going for cheap, you can get a USB thermometer for like 9 bucks. I'm sure you can find them locally for a bit more.
http://www.dealextreme.com/details.dx/sku.7003 [dealextreme.com]
The included software is win-only. Several people have coded some linux tools of varying usefulness. You'll probably want to do your own calibrations; mine is consistently off, but I've seen others complain of nonlinear responses.
http://err.no/personal/blog/tech/2008-07-22-10-17_kernel_patches_TEMPer_thermometer.html [err.no]
http://www.roaringpe [roaringpenguin.com]
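If the sensor is just consistently off, a quick least-squares linear calibration against any trusted thermometer goes a long way; the readings below are illustrative, not real measurements.

```python
# Quick linear calibration sketch for a cheap USB thermometer that reads
# consistently off: fit corrected = a * raw + b against a trusted reference.
raw = [18.0, 22.5, 27.0, 31.5]        # what the USB sensor reported
reference = [20.0, 24.0, 28.0, 32.0]  # what a trusted thermometer read

n = len(raw)
mean_x = sum(raw) / n
mean_y = sum(reference) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference)) / \
    sum((x - mean_x) ** 2 for x in raw)
b = mean_y - a * mean_x

def corrected(reading_c: float) -> float:
    return a * reading_c + b

print(f"corrected(25.0) = {corrected(25.0):.1f} C  (a={a:.3f}, b={b:.2f})")
```

A straight line won't help with the genuinely nonlinear sensors people complain about, but for a constant or scale offset it's enough.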
Re: (Score:2)
You don't need anything as complicated as a genetic algorithm. You have a defined control (thermostat), a defined state (temperature), some external but relatively predictable variables (outside temperature and server load), and a decent model (should be a straightforward ODE) that defines the relationship between those. Define a cost function, balancing the need to keep both temperatures and energy costs low, and you've got a very straightforward optimal control problem.
Because it's continuous, not partic
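A toy version of that setup: a first-order lumped thermal model plus a cost function trading cooling energy against an over-temperature penalty, evaluated for a few candidate cooling powers. Every constant here is invented for illustration, not taken from any real facility.

```python
# Toy version of the control problem above: a first-order lumped thermal model
# and a cost function balancing cooling energy against an over-temp penalty.

C_THERMAL = 2.0e6    # J/K, effective heat capacity of the room (made up)
K_LEAK = 500.0       # W/K, conduction between room and outside (made up)
P_IT = 50_000.0      # W, heat dumped by the servers (made up)
T_OUT = 30.0         # C, outdoor temperature
T_LIMIT = 32.0       # C, don't let the room exceed this
DT = 60.0            # s, simulation step

def simulate_cost(p_cool_w: float, hours: float = 8.0) -> float:
    """Integrate the room-temperature ODE and return energy cost + penalty."""
    temp = 25.0
    cost = 0.0
    for _ in range(int(hours * 3600 / DT)):
        dTdt = (P_IT + K_LEAK * (T_OUT - temp) - p_cool_w) / C_THERMAL
        temp += dTdt * DT
        cost += p_cool_w * DT / 3.6e6            # kWh of cooling "work"
        if temp > T_LIMIT:
            cost += 10.0                          # arbitrary over-temp penalty
    return cost

for p_cool in (40_000, 50_000, 60_000):
    print(f"cooling at {p_cool} W -> cost {simulate_cost(p_cool):.1f}")
```

Sweeping the candidate control instead of guessing is exactly the optimal-control framing the parent describes; a genetic algorithm is overkill for a problem this smooth.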
Re: (Score:2)
What? Do you live in some science-fictiony, futuristic world? That sort of thing would be great. But here's how our PHB management deals with such problems. Note that this is an office environment, not a server room:
For years, we've worked in an office that is usually maintained at about 80F, summer or winter. Management mandated that the thermostats all be set to this point so as to save energy consumed by the AC system. During the summer, the math was simple. It's an old building (1950s era), single story
A little bit unclear (Score:2)
The thermal modeling for all this isn't that difficult. You can get power consumption, fan speeds, temp, etc. and feed them into a pretty accurate plant model that should be able to adjust the temperature on the fly for optimal efficiency. Or I guess we can hire a company to form a bunch of committees to do a bunch of studies
Re: (Score:3, Insightful)
"Sure, the fans kick in and you aren't saving as much, but are you still saving? I suspect you still are, there is a reason you are told to run ceiling fans in your house even with the AC on."
If only someone would do a study based on real-world testing, we could be sure... Oh, wait...
There are several differences between ceiling fans and server fans. You can't use one to make predictions about the other. "Using one large fan to increase airflow in a room is a more efficient way for people to feel cooler
Re:A little bit unclear (Score:4, Insightful)
For starters, people sweat and computers do not. So, airflow helps cool people by increasing evaporation, in addition to direct thermal transfer. Even when you think you aren't sweating, your skin is still moist and evaporative cooling still works.
Unless someone invents a CPU swamp cooler, that's just not happening on a computer. You do need airflow to keep the hot air from remaining close to the hot component (this can be convection or forced), but you don't get that extra... let's call it "wind chill" effect that humans feel.
The internet's not free? (Score:2)
I thought the internet was free (or so people keep telling me). You mean it actually costs these companies money to maintain the connections??? Wow. I guess my $15/month bill actually serves a purpose after all.
UNITS? (Score:2, Interesting)
80 whats? Obviously they mean 80F (running a temperature at 80K, 80C or 80R would be insane), but you should always specify units (especially if you're using some backwards units like Fahrenheit!)
Re: (Score:3, Funny)
Fahrenheit backwards? That shit was metric before the Metric System even existed.
To wit:
0F is about as cold as it gets, and 100F is about as hot as it gets.
See? Metric.
Re: (Score:2)
lol not around here [google.ca] my friend.
Have you ever gone outside when it's -40 (C or F, it's the same)? The air is so cold that it hurts to breathe, but I love it. There is nothing like it. The humidity from your breath sticks to your eyelashes and they freeze together and you have to pick the ice off so you can open your eyes. It's amazing human beings even live here.
Re: (Score:2)
Re: (Score:2)
So true! At least we have something mildly interesting to look at, like a tree once in a while, maybe a lake.
Re: (Score:3, Informative)
Fahrenheit backwards? That shit was metric before the Metric System even existed.
To wit:
0F is about as cold as it gets, and 100F is about as hot as it gets.
You're right for the 40th parallel or so. But there are parts of the world that routinely dip below 0 deg F (-18 deg C) and other parts that routinely climb above 100 deg F (38 deg C). Things like that are why SI switched from Fahrenheit and Rankine to Celsius and Kelvin.
Re: (Score:2)
Things like that are why SI switched from Fahrenheit and Rankine to Celsius and Kelvin.
Sure, blame climate change. Everyone else does!
Re:UNITS? (Score:4, Interesting)
And yet the temperature here measured in F gets negative every winter. And where I previously lived it got above 100F every summer (and it also does where I am now, but only a day or three each year).
But in both those places a temperature of 0C was the freezing point of water, and 100C the boiling point. Yes, the 100C end isn't so useful in terms of daily temperature, but the 0C end is, since whether water will freeze or not is the main transition point in daily temperature.
Re: (Score:2)
in both those places a temperature of 0C was the freezing point of water, and 100C the boiling point
Wow...cool! I have always wanted to live at a location whose conditions matched the International Standard Atmosphere [wikipedia.org]: i.e. you lived at sea level with the temperature at +15 deg C and the pressure at 101,325 Pa?
Btw, if it had been said that those values are approximate, I would have let it go. ;)
Re: (Score:2)
How could the freezing point of water at exactly 0C require that the temperature is 15C? That makes no sense.
Ducted cabinets (Score:3, Interesting)
I realise that this is not something that could be done quickly; it would require co-operation from all major vendors, and then only if it would actually end up being more efficient overall. There would be lots of hurdles to overcome too... efficient ducting (no jagged edges or corners like in domestic HVAC ductwork), no leaks, easy interconnects, space requirements, rerouting away from inactive equipment, etc. You would still need some AC in the room as there is bound to be heat leakage from the duct-work, as well as heat given off from less critical components, but the level of cooling required would be much less if the bulk of the heat was ducted straight outside.
So I know the implementation of something like this would be monumental, requiring redesigning of servers, racks, cabinets and general DC layout. It would probably require standards to be laid out so that any server will work in any cab etc (like current rackmount equipment is fairly universally compatible), but after this conversion, could it be more efficient and pay off in the long run?
Just thinking out loud.
Tom...
Re: (Score:2, Informative)
THIS. I was going to post the same thing, but you beat me to it! APC makes exactly what you're talking about. They call it "InfraStruXure." Yeah, I know... Anywho, here's a link to their page for this stuff [apc.com].
Re: (Score:2)
Ah, erm, no. That's not what InfraStruXure is. And there is a good reason. What happens when you need to work in the front or back of the cabinet? All of a sudden your cooling mechanism is offline and you have precious few minutes without forced air before your servers roach themselves.
The reason this has never (and probably will never) been done is the amount of form factor standardization required from top to bottom in the vendor lineup. Even if the heavens parted and God himself handed down a standa
Re: (Score:2)
"asking every bit of equipment to conform to the same standard, and to stick to that standard for more than one product release cycle, is something of a pipe dream."
Yeah... I dream of the day they decide, well, I don't know, something like all server-grade equipment should fit into a cabinet 482.6mm wide, with height in multiples of 44.45mm
Re: (Score:3, Informative)
While I was looking at aircon stuff for our small room, I came across a company that sold floor-to-ceiling panels and door units that allowed you to 'box in' your racks and then divert your aircon into the construction rather than cooling the whole room. Seems like a sensible solution for smaller data centres or IT rooms with 1 or 2 racks in the corner of an otherwise normal office.
Re: (Score:2)
I know just the man to work on this -- Archibald 'Harry' Tuttle. [wikipedia.org]
Re: (Score:2)
Why even have individual cases? It seems to be rare nowadays that a full rack isn't just full of computers. Why not have one massive door and a bunch of naked computers on the racks? Set up the air flow in your building such that one side is high pressure, the other side is low, and blow air across the entire thing.
Cluster load balancing based on temp (Score:2, Interesting)
Well, if you have a large cluster, you can load balance based on CPU temp to maintain a uniform junction temp across the cluster. Then all you need to do is maintain just enough A/C to keep the CPU cooling fans running slow (so there is excess cooling capacity to handle a load spike since the A/C can only change the temp of the room so quickly)
Or, you can just bury your data center in the antarctic ice and melt some polar ice cap directly.
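A sketch of the dispatch side of that idea: send each new job to whichever node currently reports the lowest CPU temperature, so the cluster stays at a fairly uniform junction temp. Real temperatures would come from IPMI or lm-sensors; the values here are faked for illustration.

```python
# Sketch of temperature-aware dispatch: route each job to the coolest node.
from typing import Dict

def pick_node(cpu_temps_c: Dict[str, float]) -> str:
    """Return the name of the node reporting the lowest CPU temperature."""
    return min(cpu_temps_c, key=cpu_temps_c.get)

cluster = {"node01": 62.5, "node02": 58.0, "node03": 71.2, "node04": 65.9}

for job in ("render-42", "batch-17", "etl-nightly"):
    target = pick_node(cluster)
    print(f"dispatching {job} -> {target} ({cluster[target]:.1f}C)")
    cluster[target] += 3.0   # crude assumption: each job warms its node a bit
```

The headroom argument still applies: the scheduler only buys you time, so the A/C has to keep enough slack for a load spike it can't react to instantly.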
Turn fans down? (Score:2, Informative)
I used to have a Pentium 4 Prescott; the truth is processors can run significantly above spec (hell, the thing would go above the "max temp" just opening notepad). It's already been shown that higher temps don't break HDDs. Are the downsides of running the processor a few degrees hotter significant, or can they be ignored?
Re: (Score:2)
Hardware upgrades probably nullify the problems (Score:3, Interesting)
If you save energy by having warmer data centers, but it shortens the MTBF, is it really that big of a deal?
Let's say the hardware is rated for five years. Let's say that running it hotter than the recommended specifications shortens that to three years.
But in three years, new and more efficient hardware will probably replace it anyway, because it will require, let's say, 150 watts instead of 200 watts; the new hardware will cost less to run over those lost two years than the old hardware would have.
Re: (Score:3, Interesting)
But in three years, new and more efficient hardware will probably replace it anyway because it will require, let's say, 150 watts instead of 200 watts
That tends to be hard to get actually, at least if we're talking rack-mountable and if you want it from major vendors.
Rather you get something 4 times as powerful which still uses 200W. If you can then virtualize 4 of the old servers onto one of the new, you have won big. If you can't, you haven't won anything.
Longer Study (Score:4, Insightful)
For example, little consideration was given to:
1) Mechanical Part wear (increased fan wear, component wear due to heat)
2) Employee discomfort (80 degree server room?)
3) Part failure*
*If existing cooling solutions had issues, there would be less time between the issue appearing and additional problems, since you have cut your margin by ~15 degrees.
Re: (Score:2)
"Had we been warm in there we would have lost numerous server towers and Raid Racks."
Don't your servers have thermal protection?
Don't you have a power down temp and procedure?
I cannot imagine a professional allowing hardware to fry just because they lose AC. You must have a procedure for that situation, because it will happen if you run a data center long enough.
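Something as simple as the watchdog sketched below covers the basic case; the temperature source is a placeholder and the threshold is arbitrary, so treat it as an outline of the procedure rather than a drop-in script.

```python
# Outline of an over-temp shutdown procedure: poll the room temperature and
# start a graceful shutdown if it crosses the threshold. read_room_temp_f()
# is a placeholder for your real sensor; the shutdown command is the standard
# Linux one and needs root.
import subprocess
import time

SHUTDOWN_TEMP_F = 87.0   # arbitrary threshold for this sketch
POLL_SECONDS = 30

def read_room_temp_f() -> float:
    # Placeholder: wire this to your real environmental sensor.
    return 78.0

def graceful_shutdown() -> None:
    # In a real DC you'd walk an ordered list of hosts (least critical first)
    # over SSH or your orchestration tool instead of halting one box.
    subprocess.run(["shutdown", "-h", "+5", "server room over-temp"], check=False)

while True:
    if read_room_temp_f() >= SHUTDOWN_TEMP_F:
        graceful_shutdown()
        break
    time.sleep(POLL_SECONDS)
```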
Re: (Score:2)
You're lucky; here the outside air temp is often higher than your shutdown temp.
The thing is that having a higher standard temp in the server room does not mean that your server "fries" when the HVAC goes down. It means that you may have to take down the machines. When I worked in a place with a big machine room, our priorities were: protect the data, protect the machine, uptime. Our shutdown was 87 and we got to 85 once during a backup power test that failed to power the HVAC.
Time to recover (Score:2)
Another server room horror story (Score:5, Insightful)
I'm less concerned with the fine-tuning of the environment for servers than I am with getting the basics right. How many bad server room implementations have you seen?
I'm sitting in one. We used to have a half-dozen built-for-the-purpose Liebert units scattered around the periphery of the room. The space was properly designed and the hardware maintained whatever temp and humidity we chose to set. They were expensive to run and maintain but they did their job and did it right.
About seven years ago, the bean-counting powers-that-be pronounced them "too expensive" and had them ripped out. The replacement central system pumps cold air under the raised floor from one central point. Theoretically, it could work. In practice, it was too humid in here the first day.
And the first week, month, and year. We complained. We did simple things to demonstrate to upper management and building management that it was too humid in here, things like storing a box of envelopes in the middle of the room for a week and showing management that they had sealed themselves due to excessive humidity.
We were, in every case, rebuffed.
A few weeks ago, a contractor working on phone lines under the floor complained about the mold. *HE* got listened to. Preliminary studies show both penicillin (relatively harmless) and black (not so harmless) mold in high concentrations. Lift a floor tile near the air input and there's a nice thick coat of fluffy, fuzzy mold on everything. There's mold behind the sheetrock that sometimes bleeds through when the walls sweat. They brought in dehumidifiers that are pulling more than 30 gallons of water out of the air every day. The incoming air, depending on who's doing the measuring, is at 75% to 90% humidity. According to the first independent tester who came in, "Essentially, it's raining" under our floor at the intake.
And the areas where condensation is *supposed* to happen and drain away? Those areas are bone dry.
IOW, our whole system was designed and installed without our input and over our objections by idiots who had no idea what they were doing.
So, my fellow server room denizens, please keep this in mind - When people (especially management types) show up with studies that support the view that the way the environment is controlled in your server room can be altered to save money, be afraid. Be very afraid. It doesn't matter how good the basic research is or how artfully it could be employed to save money without causing problems, by the time the PHBs get ahold of it, it'll be perverted into an excuse to totally screw things up.
Re: (Score:3, Funny)
About a year ago, I worked on a project in a backwards location that was unfortunately within driving distance of the major city where I live. The rate was good though so I took the job. These people were dumb for a lot of reasons. (it takes a lot for me to call my customers dumb) But the one that really made me laugh was the server rack strategically placed in the server room so that the server room door would smack into it whenever someone came in the room.
Re: (Score:3, Insightful)
You left out what is usually the best part!
For his valiant efforts in preventing waste did the bean counter get promoted to VP level or directly to an officer of the company? Or did he quit (get pushed out) and get a higher paying job elsewhere. This kind of stupidity never goes unrewarded.
Warmer, better, faster... (Score:2, Informative)
Re: (Score:2)
Risk of AC failure (Score:5, Interesting)
If there is a failure of AC ... that is, either Air Conditioning OR Alternating Current, you can see a rapid rise in temperature. With all the systems powered off, the latent heat inside the equipment, which is much higher than the room temperature, emerges and raises the room temperature rapidly. And if the equipment is still powered (via UPS when the power fails), the rise is much faster.
In a large data center I once worked at, with 8 mainframes and 1800 servers, power to the entire building failed after several ups and downs in the first minute. The power company was able to tell us within 20 minutes that it looked like a "several hours" outage. We didn't have the UPS capacity for that long, so we started a massive shutdown. Fortunately it was all automated and the last servers finished their current jobs and powered off in another 20 minutes. In that 40 minutes, the server room, normally kept around 17C, was up to a whopping 33C. And even with everything powered off, it peaked at 38C after another 20 minutes. If it weren't so dark in there I think some people would have been starting a sauna.
We had about 40 hard drive failures and 12 power supply failures coming back up that evening. And one of the mainframes had some issues.
keep UPS separate and cooler (Score:3, Insightful)
Re: (Score:2)
As far as 22C or 23C goes, that is cooler than typically req'd now; 75F to 78F (24C to 27C) is usually perfectly fine for UPS rooms or even s
Be very afraid of a study that ... (Score:2)
describes temperatures using the Fahrenheit scale.
Re: (Score:2)
Re:What about HDDs? (Score:5, Informative)
Here's the link for your review: http://hardware.slashdot.org/story/07/02/18/0420247/Google-Releases-Paper-on-Disk-Reliability [slashdot.org]
Re: (Score:3, Insightful)
I am a little skeptical, since most hard drive failures I've had have been right after an air conditioning outage. The Google paper uses temperatures obtained from SMART, which are usually 10 to 15C higher than the ambient temperature in the room, and the tail of their sample falls off rapidly over 40C. What would the SMART temperature be if the ambient temperature was 40 or so? Probably 60 or above. Their graphs don't go that high.
But we're talking about raising the temperature of a data center only 2 or 3 deg. Meat
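For anyone wanting to reproduce that SMART-vs-ambient comparison on their own machines, a hedged sketch using smartctl from smartmontools; the column layout of the attribute table varies by drive, so the parsing here is only a starting point.

```python
# Hedged sketch: read the drive temperature SMART attribute via smartctl -A
# and compare it with room ambient. The raw value is usually the 10th column
# of the attribute table, but the exact format differs between drives.
import subprocess
from typing import Optional

def drive_temp_c(device: str) -> Optional[float]:
    out = subprocess.run(["smartctl", "-A", device],
                         capture_output=True, text=True, check=False)
    for line in out.stdout.splitlines():
        fields = line.split()
        if len(fields) >= 10 and fields[1] in ("Temperature_Celsius",
                                               "Airflow_Temperature_Cel"):
            return float(fields[9])
    return None

ambient_c = 24.0   # whatever your room sensor says
temp = drive_temp_c("/dev/sda")
if temp is not None:
    print(f"drive is running ~{temp - ambient_c:.0f}C above ambient")
```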
Re: (Score:3, Funny)
Oh really?
Let's see your proposal, your test criteria, your plan.
Let's see your budget... cut it in half
Now for risk analysis, what if you're right and the servers all fail sooner than expected (i.e. sooner than budgeted)?
Spend 3 weeks filling out red tape
Spend 2 weeks waiting.
OK, you can run your study. Set up two racks in a closet and take measurements every day for a year.
Now write up the review.
Alrigh
Re: (Score:2)
[Citation Needed]
Here's a few links to the contrary:
http://www.tomshardware.com/news/google-hard-drives,4347.html [tomshardware.com]
http://tech.blorge.com/Structure:%20/2007/02/20/googles-hard-disk-study-shows-temperature-is-not-as-important-as-once-thought/ [blorge.com]
Re: (Score:2, Informative)
Until what point? You can't consistently say "increase the temperature to decrease the MTBF".
You'll end up with molten slag.
Yes, you can. MTBF = mean time before/between failure. To decrease, reduce, lower, however you want to say it, it is going to fail SOONER meaning it is getting LESS reliable. That was the point, hotter temps = less reliability. Same goes for just about any physical/chemical process (fans, batteries, hard drive motors, etc.)
Re: (Score:2)
Except the google study didn't display any evidence of this happening - there was no correlation between higher temperatures and higher failure rates on mechanical drives.
http://hardware.slashdot.org/article.pl?sid=07/02/18/0420247 [slashdot.org]
http://www.engadget.com/2007/02/18/massive-google-hard-drive-survey-turns-up-very-interesting-thing/ [engadget.com]
http://labs.google.com/papers/disk_failures.pdf [google.com]
Even if it were, it'd be easy to remedy - boot all your servers off SSD and keep them in a "hot" room. Keep your SANs-full-o'-spinnin
Re: (Score:2)
After giving that paper a closer look (the best link is this one, btw, the engadget link is dead: http://research.google.com/archive/disk_failures.pdf [google.com] ), the failure rate went up with cold AND hot temperatures. How they got the disk temps that cold is beyond me, but their hot end seemed a little optimistic, since I have seen desktops in comfortably air conditioned rooms running disk temps of 50C or more, and have a strong set of anecdotal evidence that these are the disks that fail most often.
Re: (Score:2)
The google survey appears to show that drives are happiest around 35-40 Celsius, with failure rates increasing on both sides of that band.
Of course there are a couple of issues with that data
Firstly, the data comes from the drives' built-in sensors, so if a particular brand of drive has both an abnormal failure rate and an abnormal reported running temperature (either due to producing a different amount of heat or due to a bad sensor), it would skew the results.
The second problem is they simply don't have much da