Raised Flooring Obsolete or Not?
mstansberry writes "In part three of a series on the price of power in the data center, experts debate the merits of raised flooring. It's been around for years, but the original raised floors weren't designed to handle the airflow people are trying to get out of them today. Some say it isn't practical to expect air to make several ninety-degree turns and actually get where it's supposed to go. Is cooling with raised floors the most efficient option?"
Turns? (Score:5, Insightful)
I wouldn't say they're going to become obsolete. (Score:5, Insightful)
One way to fight this -- the CHIP (Score:5, Insightful)
Inefficient architectures must be discarded to make way for more modern, smaller, COOLER processors.
Let's address the real problem here -- not the SYMPTOM of hot air.
We need to address the COMPUTERS.
Why do devices need to be cooled? (Score:3, Insightful)
Think about the lightbulb. A standard 60-watt incandescent bulb generates lots of heat. A better design is something like an LED bulb that produces the same number of lumens with much less power and, more importantly, far less heat.
Good design keeps these devices from generating excess heat in the first place, eliminating the need for the raised floor.
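To put rough numbers on that (assuming ~800 lumens from both bulbs -- at steady state essentially every watt a device draws ends up as heat in the room, so the draw itself is the heat load):

# Back-of-the-envelope heat-load comparison. Wattages and lumens are
# typical ballpark figures, not measurements of any particular bulb.
WATTS_TO_BTU_PER_HR = 3.412

for name, watts in [("60 W incandescent (~800 lm)", 60), ("10 W LED (~800 lm)", 10)]:
    print(f"{name}: ~{watts * WATTS_TO_BTU_PER_HR:.0f} BTU/hr of heat load")

Same light, roughly one sixth of the heat your A/C has to remove.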
No problems cooling (Score:2, Insightful)
Re:One way to fight this -- the CHIP (Score:5, Insightful)
Re:Not Just Cooling (Score:3, Insightful)
The real usefulness is the ability to run cabling from any point A to any point B in the floor space.
That's good to an extent, as long as the cable runs aren't too long. Go take a look at an enterprise-grade colocation hosting facility and you may change your mind. I've spent a lot of time at one of the top-tier MCI facilities. It has a raised floor that's used for cooling and power distribution, but all networking is done via 3 or 4 layers of overhead cable trays. It's much easier to climb on top of a cable ladder that can easily support your weight to run a cable the length of a datacenter than it is to crawl underneath a floor trying to fish a cable past supports, power lines, etc.
Airflow and cables not the only reason for raised (Score:1, Insightful)
Raised Flooring should be an SOP (Score:1, Insightful)
The best use is to have your power distribution run below the floor, to have specific tiles with cutouts that allow cool air to enter the bottom of the racks, and to prevent unexpected disasters like burst water pipes from flooding the room.
There are A/C units available for server rooms that are designed to send cold air out from the bottom into the crawl space, and they should be spec'd to supply far more cooling than is required. There should also be at least one primary unit and a backup.
At one site, we had both A/C units fail one night. The temperature inside one of the cabinets containing some of our network switches reached 155°F, according to the temp sensors inside the switches. Fortunately, we only lost one old server that was on the top shelf of one rack. The temperature in the room (roughly 40'x80') reached about 140°F within only 3-4 hours. The room housed about 25 servers, a phone switch and a UPS. The only thing that saved the rack-mounted servers was the chimney effect of the air flowing up through the racks.
Companies that don't see the value in something like raised flooring have never had to face the prospect of replacing all of their servers after an underpowered A/C supply fails. For the extra few thousand dollars it costs, it's a worthwhile ounce of prevention. To put things in perspective, the most valuable server at the site I was working in cost almost $250,000. I'd hate to be the Facilities or IT Manager who had to report the loss of an asset like that.
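For what it's worth, here's a rough heat-load sketch for a room like that one -- the per-device wattages are assumptions for illustration, not figures from that site:

# Back-of-the-envelope heat-load estimate for sizing server-room A/C.
# All wattages below are assumed ballpark values.
WATTS_TO_BTU_PER_HR = 3.412
BTU_PER_HR_PER_TON = 12000   # 1 ton of cooling = 12,000 BTU/hr

loads_watts = {
    "25 rack servers @ ~400 W": 25 * 400,
    "phone switch": 1500,
    "UPS losses": 1200,
    "lighting, people, building load": 2000,
}

total_watts = sum(loads_watts.values())
btu_per_hr = total_watts * WATTS_TO_BTU_PER_HR
print(f"Estimated load: {total_watts} W = {btu_per_hr:.0f} BTU/hr "
      f"= {btu_per_hr / BTU_PER_HR_PER_TON:.1f} tons of cooling")

Spec each of the two units to carry that full load on its own, so losing one doesn't cook the room.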
Re:Not Just Cooling (Score:3, Insightful)
I don't believe there should be a rat's nest of cabling _anywhere_ in a datacenter. I hate raised floors because they let techs get sloppy. Vertical wiring trays eliminate that possibility by exposing a hackish wiring job to everyone.
When your datacenter is new, you should pre-wire patch panels in each cabinet for SAN and Ethernet. Each cabinet should have a PDU.
Run all of the cables from all of the patch panels back to your main SAN and Network patch panels.
If you do that work ahead of time, all you will ever have to do is plug a server into a patch panel in the same cabinet.
For larger equipment (disk arrays, tape libraries, etc.), place the equipment and carefully measure the cable runs. Make sure you have only 3 feet of slack and run the cables cleanly.
It's a lot of work to keep a datacenter in order, but it's worth it in the long run. For one, you'll never have to spend two weeks tracing an ethernet cable around the datacenter to locate a phantom server.
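Not describing the parent's actual setup, just a toy sketch of the kind of port map that makes "never trace a cable again" hold up (cabinet and panel names are made up):

# Toy record of pre-wired patch-panel runs, so finding a server is a
# lookup instead of a physical trace. All names here are hypothetical.
runs = {
    ("C07", 1): ("NET-A", 13),   # cabinet C07, panel port 1 -> core panel NET-A, port 13
    ("C07", 2): ("NET-A", 14),
    ("C07", 3): ("SAN-B", 5),
}

def locate(cabinet, panel_port):
    core = runs.get((cabinet, panel_port))
    return f"{core[0]} port {core[1]}" if core else "undocumented -- time to crawl"

print(locate("C07", 3))   # SAN-B port 5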
Re:Turns? (Score:4, Insightful)
Overhead tray systems also suffer from a fairly rigid room layout, and I have yet to see a data center being used the way it was originally laid out after a few years. Raised flooring allows for a lot of flexibility for power runs, cabling runs and so on without having to install an overhead tray grid.
Raised flooring also offers some slight protection against water leaks. We had our last raised floor system installed with all the power and data runs enclosed in liquidtight conduit due to the tenant's unfortunate run-ins with the building's drain system and plumbing in the past.
I guess overhead tray makes sense if all you want to do is fill 10,000 sq ft with rack cabinets, but it's not really that flexible or even attractive, IMHO.
Bodies??? What about the booze? (Score:3, Insightful)
Re:Turns? (Score:2, Insightful)
-Peter
Re:No (Score:3, Insightful)
Being, literally, a grey-beard who remembers working on intelligent (3270-series) terminals and water-cooled mainframes, and Unix and DOS punks crowing about how "the mainframe is dead"... things like Citrix, LTSP, liquid-cooled racks, and IBM setting new records in "number of mainframe MIPS sold every year" really amuse me.
It's the physics, stoopid (Score:3, Insightful)
Note that I'm not calling the parent poster stoopid, but rather the design of forcing cold air through the *floor*. As the parent here notes, cold air falls. This is presumably why most home fridges have the freezer on top.
I was most surprised to read this article. I've never worked in a data center, but I have worked in semiconductor production cleanrooms, and given the photos I've seen of data centers with the grated flooring, I guess I always assumed the ventilation was handled the same way as in a cleanroom -- new air in from the ceiling, old air whisked away through the floor. (This ensures that any particles, which will naturally fall if heavier than air, will be sucked out of the room.) Note that this is obviously *not* a passive system designed to use convection, but rather an active system using lots of fans.
While a passive convection system with cold air coming up from below is a nice theory, you can run into the same problems others have pointed out -- what if the bottom units suck in all the cold air? The top units are left too warm.
Meanwhile, if you drop cold air from above, sure, the top units might suck a lot of that in -- but any cold air that isn't sucked in will naturally continue to drop relative to warmer air, ensuring that the lower units are not cooked. If you want to be especially careful about it, you could route all the cold air outputs towards the perimeter of the room and put the uptakes in the center of the ceiling to ensure a vortical flow.
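The physics behind "cold air falls" is just the ideal gas law: at the same pressure, colder air is denser. A quick sanity check (the two temperatures are arbitrary examples):

# Density of dry air at 1 atm: rho = P / (R_air * T)
P = 101325.0      # Pa
R_AIR = 287.05    # J/(kg*K), specific gas constant for dry air

def density(temp_c):
    return P / (R_AIR * (temp_c + 273.15))

cold, hot = density(15), density(35)
print(f"15 C supply air: {cold:.3f} kg/m^3")    # ~1.225
print(f"35 C exhaust air: {hot:.3f} kg/m^3")    # ~1.146
print(f"Cold air is ~{100 * (cold / hot - 1):.0f}% denser, so it sinks")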
Just my ¥2.
Re:Turns? (Score:2, Insightful)
I worked on consolidating several data operations from various older centers into a new building. Pulling up a tile and searching down through the archaeological strata of cables was amazing. Fiber on top, then a layer of UTP, then coax (getting ever thicker as you got closer to the concrete), then--finally--that moment of truth when you find the AC plug you were looking for, and the frayed wire next to it that knocked you back. There was no room for airflow in those places.
Fortunately, the new datacenter had cable trays under the floor, tiles at 30", and nicely labeled AC outlets that matched the rack and region names. I shudder to think what it will be like in 30 years.
Benchmarking (Score:2, Insightful)