A Closer Look At Immersion Cooling For the Data Center
1sockchuck writes "Want to save money on data center cooling? Tip your racks on their side, fill them with mineral oil, and submerge your servers. Austin startup Green Revolution Cooling (first profiled here) has a video demo of its immersion cooling solution, which it says can handle racks using up to 100kW of power. A photo gallery on the company web site shows some early installations."
Marketing that niche? (Score:2)
The thing here is they are commercializing a cooling technique usually reserved for the hobbyist. I don't know about the energy saving claims, but their setup looks fairly organized. Interesting turn for a still niche cooling solution.
What about the hard drives? (Score:3)
Well, on the other hand, if they're supposed to be air-tight, I guess they're baby oil-tight, too.
But there's got to be something or another that doesn't react well with mineral oil, right?
I guess this means they save on fans, and the power to run fans. That's additional power and heat savings right there.
OK, I've got it: what about the CD/DVD drives? Or is it all network IPL in data centers? I'm racking my brains trying to think of something this would mess up.
Re: (Score:2)
TFA says the hard drives have to be sprayed with a coating, presumably to make the housing oil-tight as well as airtight.
Blowing air around is a tremendously inefficient way of cooling, and this replaces that with pumping the oil through a heat exchanger / cooling tower.
Things with unsealed moving parts obviously are vulnerable, but not everything needs to be submerged. If you really want a DVD drive, have it outside the oil.
There may be some things that don't react well with mineral oil: avoid having these
Re: (Score:3)
TFA says the hard drives have to be sprayed with a coating, presumably to make the housing oil-tight as well as airtight.
This should IMMEDIATELY ring alarm bells. Hard drives are NOT airtight. They have a filtered air hole. They would never get away with such flimsy construction on an airtight product.
Plus, does this system REALLY offer that much advantage over conventional "waterblocks", which keep the cooling fluid separate from the electronics? I very much doubt it. The major heat generators in a PC are designed to pass out their heat by contact conduction anyway.
Re:What about the hard drives? (Score:4)
Re: (Score:2)
I am told, though, that for very harsh environment embedded systems, that is pretty much what they do. Conformal heatsink plate firmly attached to the board so that the whole thing is a solid block.
Re: (Score:2)
I am told, though, that for very harsh environment embedded systems, that is pretty much what they do. Conformal heatsink plate firmly attached to the board so that the whole thing is a solid block.
They used to do this on notebooks. I'm ripping apart an HP mobile P4 notebook for embedded use and I found out why it is so damned heavy: it has a single gigantic cast/milled heat sink which crosses the CPU and GPU (old Radeon.) My later, bigger HP used heat pipes, which actually made it lighter.
Re: (Score:2)
The other thing that rang my alarm bells is the idiocy of submerging hard drives at all. You DON'T DO IT. It doesn't help, it only causes problems. They should either do what immersion gaming cases do and use long SATA cables to run the hard drives to a dry compartment on the outside of the case, or network-boot these servers.
Re: (Score:2)
... or use SSD.
But if you're going to have HDD storage anywhere, that's going to be generating heat too. Possibly most of it, if you're doing non-CPU-intensive file serving. So the bean counters would probably *like* their fancy new cooling technology to work with HDDs.
Re: (Score:2)
Even if you want to use SSDs, immersion-cooling them is not worth it. The SSD is then at risk of succumbing to PCB softening like every other component, and for what? Do SSDs even get that hot?
Network booting is probably the best solution here.
Re: (Score:2)
Network booting is probably the best solution here.
Using SAN disks would work quite nicely too. Either way you still end up with disks somewhere that will need to be cooled outside the oil bath.
Plus this setup uses at least 4 times the floor space of vertical racks.
This is a neat idea, but I don't see it working in practice.
Re:What about the hard drives? (Score:5, Informative)
Actually, hard drives are *not* supposed to be air tight. They intentionally allow airflow into the HD, but through a filter to keep dust out. If you want a drive that is airtight, it'll cost more.
http://www.acsdata.com/how-a-hard-drive-works.htm [acsdata.com]
Re: (Score:2)
SSDs are too small for a data center, maybe for booting (Score:2)
SSDs are too small for a data center; maybe for booting, with the data on a SAN, but that will give off heat.
Re: (Score:3)
Nothing stopping you from PXE booting (or iSCSI or Fibre Channel
Re:What about the hard drives? (Score:4, Interesting)
The only other issue I can imagine might crop up would be discovering the hard way that some polymer used in one of the system's components doesn't handle oil exposure well in the long term. I suspect that most are fine; but if the plasticizer used to soften the insulator coating on some important bundle of wires leaches out over 18 months in a warm oil bath, and the embrittled insulator cracks and shorts the next time you mess with it, the joke would be on you...
Hard drives aren't the only thing designed with "vent holes".
Every single electrolytic capacitor has a tiny vent hole (to keep them from acting like a mini fragmentation grenade if they develop an internal short circuit, etc.) Over time, with thermal cycling, the oil might get pumped in and out of the vent holes, thus degrading the electrolyte (guessing), and one fine day...
And as you say, think of the insulation on the cables...
Re: (Score:3)
Capacitors do not "breathe" like you describe, otherwise humidity would get in and ruin them. Instead they have "vents" in the sense that part of the casing is weakened to rupture safely if the electrolyte starts to break down and build pressure, rather than have the whole can explode.
The irony here is that the electrolyte in the "beer can" style caps is a mineral oil not completely unlike what they use as a coolant. The other option is to pay extra for non electrolytic capacitors on your equipment.
As for t
Re: (Score:2)
The irony here is that the electrolyte in the "beer can" style caps is a mineral oil not completely unlike what they use as a coolant. The other option is to pay extra for non electrolytic capacitors on your equipment.
Is there a list of manufacturers using solid caps? My Gigabyte board proclaims loudly on the boot screen JAPANESE solid capacitor... whoops. Looks like it's time to find a new supplier. Having sucked up a lungful of blown cap smoke in the past, I'm glad to see solids...
Re: (Score:2)
My understanding is that if you cover a hard drive's vent hole, unusual atmospheric pressures could cause the disk head to either crash into the platter or float too high over it.
It certainly looks cool... (Score:2)
Re: (Score:2)
It seems pretty trivial to replace the fans with mock fans that either always report an OK fan speed, or do something with the measured oil temperature.
True enough about the horizontal mounting and the weight. I don't fancy dealing with a heavy unit dripping with baby oil -- but surely, since they have an installation, they've addressed these practicalities?
Re: (Score:3)
The one picture showing "easy serviceability" shows the operator having completely unracked a relatively small server. That doesn't scream 'easy' to me. Pull server up, hold with one hand, unclip a bunch
Re: (Score:2)
I have to wonder at their claim that it works well with standard OEM gear. Even most cheap consumer shit monitors the speed of at least the CPU fan and tends to freak out if a fan that is supposed to be there is either absent or performing substantially below expected speed
Enter the BIOS (hit DEL during the Power On Self Test) and go into the Power or PC Health section (depending on which BIOS you have). Alter the value of CPU FAN to "Not Monitored" or "Ignored". Hit F10 (or whatever your key is) to save settings and reboot.
SpeedFan (etc) will still give you a speed readout if a fan is connected, but your BIOS won't complain if one isn't.
This procedure should be similar for UEFI based systems.
Paging Dr. Freeze (Score:5, Informative)
1999 [slashdot.org] Have I been reading Slashdot that long?
Rack density (Score:2)
Re: (Score:3)
Re: (Score:2)
How do you keep their loose pants on in that case? I wouldn't think that the normal spec grubby suspenders would do the trick any more.
Re: (Score:2)
Re: (Score:2)
Normal racks are 15-20kW. If you can fit 100kW into a rack that uses up the floor space of 3 normal racks, you're still ahead of the curve in terms of power usage.
Then again, I'm getting 4*14*12=672 cores in a single rack using less than 20kW. Unless they're using 150W CPUs, I have no idea how they need to cool 1/10 of a megawatt in a single rack.
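A quick back-of-the-envelope check of those figures (the core count and rack wattages below are the parent's; nothing here comes from TFA beyond the 100 kW claim):

# Rough per-core power, a sketch using the parent's figures
cores = 4 * 14 * 12            # 672 cores, as the parent describes
conventional_rack_w = 20_000   # the parent's "less than 20kW" rack
immersion_rack_w = 100_000     # the 100 kW rack claimed in TFA

print(round(conventional_rack_w / cores))  # ~30 W per core
print(round(immersion_rack_w / cores))     # ~149 W per core -- hence the "150W CPUs" remark

Per unit of floor space the gap narrows if the tub really does occupy the footprint of three or four conventional racks, as noted elsewhere in the thread.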
Re: (Score:2)
Why not just use water cooling? Water is a lot more efficient at removing heat than oil. Just install water blocks in place of the heat sinks. You wouldn't need to worry about sealing the contacts and pins, which you would with oil, and if done correctly maintenance wouldn't be too bad.
I believe that Cray used immersion cooling on their Cray-2 line using Fluorinert.
Re: (Score:2)
Re: (Score:2)
But they are not using blades they are using 1Us. At least that is what it looks like. I find it odd that they are putting the HDs in the oil and in the server. To be as green as possible I would think that having the server boot from the network and having a SAN would be a better choice. And it would beat the daylights out of having to pull an HD out of an oil soaked server.
How about using heat pipes from the copper block on the chip, connected to an easy-to-fill water block/heat exchanger, for a blade sys
Four years of college.... (Score:5, Funny)
Aha (Score:2)
So this is what happens when you drool too much.
condensation problems... (Score:2)
When I experimented with mineral oil based cooling, the main issue I had was water droplets condensing on the surface of the cold mineral oil and then promptly sinking... towards the motherboard sitting in the bottom of the old aquarium I was using as a case. Of course there were solutions to this problem, but it was a quick and dirty (you can take that dirty word quite literally) test back when I was a student, so we gave up on the idea pretty quickly.
I wonder how they have managed to solve the condensati
Re: (Score:3)
They run their oil at 40C. If the dew point in your server room is that high you have other problems...
Re: (Score:2)
... and you needn't have. The aim isn't to get the CPU cold. It's just to keep it at operating temperature. Overclocking makes it churn out more heat, which means you need to draw more heat away -- but the point with oil is that it can conduct the heat away more readily. Elsewhere in this thread, it says the system we're talking about here has the oil at 40C. You could have done similar.
The only reason we have to use air con to keep server rooms so cold is that air is such a bad thermal conductor.
Re: (Score:2)
If you had condensation (water) that was dense enough to sink in your oil, you were using the wrong type of oil.
Mineral oil is way more dense than water. And condensation that occurs should sit at the top of the pool and never create an issue (and it should be pretty easy to skim off).
-Rick
Re: (Score:2)
I stand corrected. I was thinking that heavy mineral oil was ~1.1-1.2, but it's only .9. Whoops!
luckily, it would still be easy to skim, you'd just have to pull off the bottom instead of the top.
-Rick
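For anyone following along, a minimal sanity check with typical handbook densities (approximate textbook values, not measurements of this particular coolant):

# Approximate densities at room temperature, in g/cm^3
water = 1.00
light_mineral_oil = 0.85   # typical range is roughly 0.8-0.87

# Condensed water is denser than the oil, so droplets sink to the bottom
# of the tank -- exactly the failure mode the grandparent described.
print(water > light_mineral_oil)   # True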
There has to be a better way (Score:2)
Okay, oil is more efficient than air. But the problems with this are plainly obvious when it comes to anything that falls in the area of maintenance and upgrades.
I wonder what gases can be used instead of oil? Something that wouldn't likely leave a residue? Substituting a gas for a liquid might reduce some efficiency, but you are still containing the unit completely and entirely. A lot of efficiency can be added merely through the act of containment. There must be some sort of gas that can be pumped i
Re: (Score:2)
I wonder what gases can be used instead of oil?
Refrigerated air might be best. Cool it before it passes through the system, so it can remove more heat. Of course you either pay for the energy to pre-cool the air, or you locate your data centre somewhere very cold. If that's not good enough you could look at supercharging. Compress the air around the system. That increases density and heat capacity.
Fukucenter (Score:2)
Re: (Score:2)
Re: (Score:2)
If the tsunami wipes out the generators, there will be no power to run the data center itself.
But on the other hand, what if the heat exchange pump breaks down, and the liquid solution ends up boiling.....
Re: (Score:2)
French fries!
Not really (Score:2)
I don't see how this can be good for use w/ a server. It does nothing to increase cooling within the room itself. The server is going to emit the same amount of heat regardless of whether it's air cooled or mineral oil cooled. The mineral oil will transfer the heat from the components faster, but it will not transfer out of the mineral oil into the air as fast. On a hobbyist's computer, it will get shut down, or the load will decrease to almost nothing daily and allow the built up heat to dissipate. On a s
Re: (Score:2)
Well, I just went back and watched the video in TFA. They are pumping the oil out to a radiator, or are cooling it somehow. With the added cost of needing to pump mineral oil and cool it, I'm not sure where the savings in electricity is coming from. And all the other problems are still present.
Re: (Score:2)
definitely not for everyone (Score:2)
It's hard to tell, but they might have a special adapter for plugs. I have read that cables will wick the oil up into the shielding braid (USB, Cat5, etc.), which can cause a mess.
Also, there are a lot of server rooms out there that aren't that organized. I'd imagine in a working installation everything would end up oily. I'd also be a bit wary of installing hard drives in these things; I thought they had the pressure hole for a reason, and if your coating failed you could have a massive drive failure.
That said, I'm su
oil in between the card/memory/etc. contacts? (Score:2)
Re: (Score:2)
My guess about the contacts is that once things are plugged together, they're touching and oil won't break that contact.
I seriously suspect that the expectation is that a system won't ever be repaired -- if it breaks, it's binned. This is likely to be justifiable based on some sums to do with MTBF, depreciation etc. Of course that kind of thing only really works if your operation scales to hundreds or thousands of machines -- or, I suppose, if an insurance company takes on the spread of that risk.
Re: (Score:2)
Air is also not electrically conductive and seeps between things much more efficiently than oil.
Re: (Score:2)
Not a new idea... (Score:3)
Cray-2 used Fluorinert. In 1985. Related jokes and memes abounded until... dunno. Certainly they were still part of HPC culture when I started my career in 1994.
Re: (Score:2)
Everything old is new again. In other news, someone just discovered that distributing DC within a rack uses less power.
Mineral Oil is not exactly green (Score:3)
http://www.jtbaker.com/msds/englishhtml/m7700.htm [jtbaker.com]
Re: (Score:2)
It's quite interesting that your link mentions mineral oil as a skin irritant, yet it's sold as "baby oil".
Re: (Score:2)
Maybe if you're running servers in your basement.. (Score:2)
Wow... awful idea for 99% of datacenters... Especially those that have ceilings greater than 6ft high.
Let's see... in all of the pictures the submerged rack is placed on some sort of black grid. I'd bet that if you put this rack on a normal datacenter tile floor and 1 drop of oil got on those tiles, you'd have a nice slip n fall lawsuit on your hands. Besides, the thought of having to stock paper towels and a hazardous spill cleanup kit next to every rack doesn't excite me...
How many vendors actually supp
High density != Green data centers (Score:2)
It seems like a lot of people confuse the ability to cram this many servers into a "rack" with an energy efficient, "green" data center.
The thing is, even though it's about 5x the power density of a "normal" data center, all you're saving is space that more conventional servers would have taken, and maybe gaining a little efficiency in power at the cost of having to maintain all those mineral oil baths. You still have to supply those servers with network connections, potentially also external storage, p
Re: (Score:2)
Your link doesn't work, but I imagine it was a hobbyist. It looks as if this lot have built an industrial-strength product. It's clearly not practical in the home.
The entire computer is submerged in a bath of oil, and the oil is circulated through a cooling tower. I doubt there are any holes for IO below the oil level of the bath, so leakage isn't a concern.
I don't fancy the messy job of making hardware changes though.
Re: (Score:3)
> I don't fancy the messy job of making hardware changes though.
THIS. If it's made by man, it will eventually fail and will require service or replacement.
The cost in labor (and cleanup!!!) (and replacement oil!) (and trips to the emergency room for employees who slipped and fell in the oil on the floor!) (not to mention the lawsuits) make this a supremely dumb idea. Now add in the cost of the hermetically-sealed rack(s) and it would be difficult to imagine a dumber idea.
Google is pretty innovative about
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
So, heat capacity of oil is twice air's heat capacity.
Re: (Score:2)
There's also thermal conductivity to consider. Oil may have twice the heat capacity of air, but it will also spread that heat throughout itself much faster.
Air: 0.025 W/(mK)
Mineral oil 0.138 W/(mK)
Water: 0.6 W/(mK)
Of course the key property of mineral oil is that it has better thermal conductivity than air, while still being an electrical insulator.
Re: (Score:3)
1 kg of air takes up a lot more space than 1 kilogram of oil.
So the heat capacity of oil in a server is far more than twice the heat capacity of air in a server.
Re: (Score:2)
I would imagine -- and someone else has surely done the sums -- that the oil would have enough heat capacity, and be kept at a low enough standard temperature, that it could absorb enough heat without the pump running that a controlled shutdown could be done.
But it's not beyond the wit of man to have a standby pump.
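A rough version of those sums, with the tank volume, oil properties and shutdown threshold all assumed purely for illustration (the 40C operating temperature is the only figure taken from TFA):

# How long could a dead-pump oil bath absorb heat before a controlled shutdown is needed?
# All of these numbers are assumptions for the sake of the estimate.
oil_volume_l = 950            # assume roughly a 250 gallon tub per rack
oil_density = 0.85            # kg per litre, typical for light mineral oil
oil_specific_heat = 1.9       # kJ/(kg*K), typical for mineral oil
start_c, limit_c = 40, 60     # oil runs at 40C per TFA; assume 60C as a shutdown threshold

heat_budget_kj = oil_volume_l * oil_density * oil_specific_heat * (limit_c - start_c)

for load_kw in (20, 100):
    print(load_kw, "kW ->", round(heat_budget_kj / load_kw / 60, 1), "minutes")
# Roughly 25 minutes at 20 kW, about 5 minutes at 100 kW -- plenty of time for a
# controlled shutdown at moderate loads, but tight at the full 100 kW figure.

So the standby pump still looks like cheap insurance at the high end.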
Re: (Score:2)
The oil holds 1200 times as much heat as air so it would be MUCH slower to heat up...
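That 1200x factor is per unit volume rather than per kilogram. A quick check with typical handbook values (my assumptions, not numbers from TFA):

# Volumetric heat capacity = specific heat * density
air_kj_per_m3_k = 1.005 * 1.2   # ~1.2 kJ/(m^3*K) for air at room temperature
oil_kj_per_m3_k = 1.9 * 850     # ~1600 kJ/(m^3*K) for a typical mineral oil

print(round(oil_kj_per_m3_k / air_kj_per_m3_k))  # ~1300, in the same ballpark as the 1200x above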
Marine engines (Score:2)
Re: (Score:2)
Google is pretty innovative about stuff like this. They use their own in-house version of Linux on commodity hardware, thousands upon thousands of PCs in each data center. But they still use air cooling and air conditioning because, at the end of the day, it's the best bang for the buck.
Not any more. In the early days, as I understand it, Google bought the cheapest desktop PCs they could find, made them netboot, and filled warehouses with them. Their software would deal with hardware failures by routing around the broken server. It worked out cheaper to leave a broken server where it was, than to locate it and repair it or dispose of it. This made sense when Google was a certain size -- big enough to need a big cloud of servers, but too small to invest in custom hardware.
Nowadays, however,
Re: (Score:2)
AFAIK, Google doesn't replace their servers. If it fails, it fails. Hence the viability of this scheme.
This isn't for co-loc type data centers where you're buying a $20000 dream machine and lovingly installing it in a cage, and coming by to visit it once in a while with gifts (RAM, HDD) in hand.
It's for set it and forget it cloud-based, commodity data centers.
Re: (Score:2)
Actually since the mineral oil is so thin it's fairly trivial to deal with. It makes it more complex to pull the hardware and bring it to a test station because you need to account for dripping oil but other than that there isn't nearly as big a difference or problem as you seem to believe.
Re:Mineral oil = nightmare (Score:5, Informative)
I once drank half a bottle of mineral oil, and let me tell you, leakage was definitely a concern.
Re: (Score:2)
Re:Mineral oil = nightmare (Score:4, Interesting)
Additionally, it has the nasty tendency to dissolve some plastics over time.
From what I understand, this has been the main problem with immersion cooling. Mineral oil softens PCBs.
Re: (Score:3)
Sorry, no goatse.
Re: (Score:2)
And one other thing, for any of you hobbyists out there who plan to try this.
Heat generally rises, but remember that many computers are designed so that the cooling fans force the air *horizontally* across the components. Case in point: my Dell Poweredges. They have a bank of fans near the front that force air "sideways" across the CPU, RAM and other heat-producing parts.
Unless you put in some sort of coolant pump to circulate that oil (or water, or whatever you plan to use), or *replace* the heatsinks with
Re: (Score:2)
Oil's convection, thermal conductivity and heat capacity are the reasons to choose submersion over air for cooling. When it takes a thousand times more energy to raise a cubic centimeter of oil by a degree, you are in a much more forgiving situation than with air.
Re: (Score:2)
Next time folks, today isn't my day
Anyway, the post is actually correct; I did read such an article some time ago.
Don't remember the link, unfortunately
Re: (Score:2)
That's ok. I can live without it.
Re: (Score:2)
I honestly wonder if it's an April Fool's joke.
Re: (Score:2)
I would expect oil to be far more efficient than air. It has a hugely greater thermal capacity (hundreds of times), so it can extract much more heat from the chips and similarly hand it over better to the cooling vanes. You use thermal paste to connect the chip to heat sinks better than air - this is a larger scale version of the same thing, where the whole system is immersed in a sort of thermal paste.
Re: (Score:2)
Liquid to air heat exchangers can be made as big as you need pretty cheaply, it's easier to pump 800 cubic feet of air through a big radiator cooling one cubic foot of oil than it is to pump 800 cubic feet of air thro
Maintenance problems (Score:2)
maintenance is a nightmare
You bet it is! Imagine the mess when you need to replace anything. Not to mention finding the fault. The first thing you do is unplug and replug the cables just to see if it's a bad contact problem. Now try to do that when everything is in an oil bath.
Re: (Score:2)
You don't do maintenance. This is for when you have hundreds of servers of the same kind. If it fails, you start up another in its place.
Re:False (Score:4, Informative)
Having worked in a fair number of server rooms, I'd say that the frequency of needing to service equipment has been dropping significantly over the last 15 years. These days, it's almost a non-issue. I don't think I've pulled a single server for anything but replacement in the last 4 years.
Transferring heat to fluids is significantly more efficient, both on the receiving side (in the server) and the giving side (in the cooling tower). It requires less energy to transfer heat from components to the oil (i.e., no fans or heat sinks). And it requires less energy to transfer heat from the water in the cooling tower (i.e., a much smaller chiller/AC unit). So it is more efficient. According to the article, their solution consumes 50% less energy than traditional air conditioning and fans.
-Rick
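Purely as an illustration of what a 50% cut in cooling energy could mean for the bill -- the baseline cooling overhead and electricity price below are my assumptions, not figures from TFA:

# Hypothetical annual savings for one 100 kW rack if cooling overhead is halved.
it_load_kw = 100
baseline_cooling_fraction = 0.45                              # assume cooling draws ~45% of IT load
immersion_cooling_fraction = baseline_cooling_fraction / 2    # the claimed 50% reduction
price_per_kwh = 0.10                                          # assumed electricity price, USD

hours_per_year = 24 * 365
saved_kw = it_load_kw * (baseline_cooling_fraction - immersion_cooling_fraction)
print(round(saved_kw * hours_per_year * price_per_kwh))  # ~$19,700 per year under these assumptions

Small beer for a single rack, but it scales with every rack in the room.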
Re: (Score:2)
An oil cooled system could transfer heat to the hot water in your building via a heat exchanger and lower your gas/electric costs. And this would be a much more efficient process than using your computers to keep your building warm...
Re: (Score:2)
I wonder what the cost savings would be overall. TFA says it pays for itself in 1-3 years, but that's marketing and it's vague.
If it really saves a load of energy, I can imagine datacentre ops whining about the hassle and the greasy fingers and so forth, saying it's not been worthwhile -- while the suits who commissioned it look at the bottom line of their electricity bill, and deem it well worth a few inconvenienced staff.
Re: (Score:2)
(I feel a bit patronising spelling this out)
It's simple. Pick a component that emits heat. The CPU is a good example. Of course people are working on more efficient CPUs, but they will always generate heat. ... and if that heat stays near the CPU, the temperature will increase until the CPU stops working -- or melts -- or a safety cutoff kicks in.
So, you have to move that heat away from the CPU, and that in itself takes energy -- powering a fan; powering an air-con unit to keep the room cool enough for the
Re: (Score:2)
Liquids handle heat exchange better. It's the reason your car is water (technically anti-freeze, but you get the point) cooled instead of air cooled. I've never worked with anything this fancy, but we had water cooled racks in the data center of one place I worked. The way it worked was you had a water cooled "radiator" mounted as the back door of the rack. Servers sucked in cool data center air, heated it up normally, shot it out their back ends and forced it through the radiators. There were also fa
Re: (Score:2)
You are absolutely right, and this is a somewhat normal cause of failure for hard drives. If you use your laptop on a plane and the cabin pressure drops sufficiently, or perhaps you are at altitude in Colorado, the read/write head can crash into the platter due to the lack of dense air to ride upon.
Re: (Score:2)
The company that is pushing the oil-as-a-coolant solution may need to remake servers in a form that is conducive (conductive? :) to oil cooling.
Think "oil-cooled server appliance".
Re: (Score:3)
Yes, fans are removed. RTFA.
Re: (Score:2)
do cooling fans inside the servers need to be disabled? seems like churning that fluid would burn them out.
Since the mineral oil will dissipate the heat from the components much better, they will most likely be fine. But they will still use electricity unnecessarily, so I would guess they should be removed or disabled.
Re: (Score:2)
The video shows the tech unplugging and replugging RAM, so I'm assuming there was no need to seal peripherals that don't have moving parts.
Re: (Score:3)
Excellent brainstorming ... Farmville servers will replace nuclear plants for the purpose of boiling water to spin turbines. Ingenious. (Saying this half sarcastically, yet some marketroid will actually run with it.)
Re: (Score:2)
Well, it's not an entirely stupid idea, and big datacentre operators do like to put themselves in coldish places.
Even so, Iqaluit gets as high as 25C in summer, which is warmer than my server room, and -40C in winter might bring operational problems of its own.
Re: (Score:2)
Is that the real world any more? It seems to me that people spending real money on datacentres are virtualising everything -- so once commissioned, hardware won't get touched again until it breaks -- at which point it's disposed of, not repaired. Configuration changes and "new data connections" are managed in software -- VMs, VLANs, that kind of thing.
I am curious about disposing of equipment that's been used in this way. Is there some solvent bath to clean things up so they don't go to the recyclers coated
Re: (Score:2)
So have two pumps? I don't suppose they're expensive.