Raised Flooring Obsolete or Not?

mstansberry writes "In part three of a series on the price of power in the data center, experts debate the merits of raised flooring. It's been around for years, but the original raised floors weren't designed to handle the air flow people are trying to get out of them today. Some say it isn't practical to expect air to make several ninety-degree turns and still get where it's supposed to go. Is cooling with raised floors the most efficient option?"
  • Turns? (Score:5, Insightful)

    by mboverload ( 657893 ) on Thursday November 03, 2005 @04:45PM (#13944534) Journal
    As long as the space under the floor is kept at positive (or negative) pressure, I can't see how some turns have anything to do with the air flow.
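
    A rough way to quantify that point: the airflow a perforated tile delivers is essentially orifice flow driven by the plenum's static pressure, not by how many bends the air took to get there. The Python sketch below uses the standard orifice approximation with assumed values for tile open area, plenum pressure, and discharge coefficient; none of these figures come from the post.

        from math import sqrt

        # Rough estimate of airflow through a perforated raised-floor tile, using the
        # standard orifice relation Q = Cd * A * sqrt(2 * dP / rho).
        # All inputs below are illustrative assumptions.
        def tile_airflow_m3s(open_area_m2, plenum_pa, discharge_coeff=0.6, air_density=1.2):
            """Volumetric flow (m^3/s) through a tile at a given plenum static pressure."""
            return discharge_coeff * open_area_m2 * sqrt(2.0 * plenum_pa / air_density)

        open_area = 0.6 * 0.6 * 0.25   # assumed 600 mm tile with ~25% open area (m^2)
        flow = tile_airflow_m3s(open_area, plenum_pa=12.0)   # assumed 12 Pa in the plenum
        print(f"{flow:.2f} m^3/s (~{flow * 2119:.0f} CFM)")  # 1 m^3/s is about 2119 CFM

    With those assumptions a single 25%-open tile moves on the order of 500 CFM; delivered airflow tracks plenum pressure and open area, so the real failure mode is a plenum that leaks or is choked with cable, rather than the ninety-degree turns themselves.
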
  • by wcrowe ( 94389 ) on Thursday November 03, 2005 @04:48PM (#13944555)
    Another big reason for raised floors is to handle wiring. I know companies where it was installed only for this reason. Cooling wasn't even on their minds.
  • by Work Account ( 900793 ) on Thursday November 03, 2005 @04:53PM (#13944611) Journal
    To paraphrase a popular saying: "It's the COMPUTERS, stupid!"

    Inefficient architectures must be discarded to make way for more modern, smaller, COOLER processors.

    Let's address the real problem here -- not the SYMPTOM of hot air.

    We need to address the COMPUTERS.
  • by WesG ( 589258 ) on Thursday November 03, 2005 @04:53PM (#13944612)
    I am waiting for the day when someone invents a computer that doesn't need to be cooled and doesn't generate excess heat.

    Think about the light bulb. A standard 60-watt incandescent bulb generates lots of heat. A better design is something like an LED bulb that produces the same number of lumens with much less power and, more importantly, little to no heat.

    Good design can keep these devices from generating excess heat, eliminating the need for the raised floor.
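
    To put rough numbers on the bulb analogy: essentially all of the electrical power a bulb draws ends up as heat in the room sooner or later, so the saving comes from drawing fewer watts for the same lumens. The efficacy figures in the sketch below are ballpark assumptions, not measurements.

        # Ballpark heat comparison for the bulb analogy.  Efficacy figures are rough
        # assumptions (incandescent ~15 lm/W, LED ~80 lm/W), not measurements.
        def power_drawn_watts(lumens, efficacy_lm_per_w):
            """Electrical power needed for a given light output; it all ends up as heat eventually."""
            return lumens / efficacy_lm_per_w

        target_lumens = 800  # roughly the output of a 60 W incandescent bulb
        for name, efficacy in [("incandescent", 15.0), ("LED", 80.0)]:
            print(f"{name}: ~{power_drawn_watts(target_lumens, efficacy):.0f} W for {target_lumens} lm")

    Under those assumed efficacies, matching a 60-watt incandescent's light output takes roughly 10 W from an LED, which is the same argument being made upthread about processors: do the same work with fewer watts and the cooling problem shrinks with it.
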
  • by zjeah ( 623944 ) on Thursday November 03, 2005 @04:58PM (#13944667)
    We have been using raised flooring in our data center for decades and have never had any cooling issues. Granted, we have 4 large air handlers for the room, but when running a raised floor one must have the proper system in place. Some hardware is designed to draw its air right from the floor and some is not. Our large server racks don't have floor openings, so we have vent tiles in the floor on the front side and the servers in turn suck the cool air through. A raised floor is also a great place to route cables, power, phones, you name it. Just make sure your air handlers are top notch (audible alarms, water detection, humidity & temp control).
  • by n0dalus ( 807994 ) on Thursday November 03, 2005 @05:03PM (#13944723) Journal
    Perhaps more importantly, better software can make large hardware deployments unnecessary. Instead of running and cooling 10 servers for a certain purpose, write better software that lets you do the same thing on just one or two servers. If you cut down the number of servers in the room by enough, you don't even need dedicated cooling.
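
    As a rough illustration of what that consolidation buys, the sketch below assumes a per-server draw and a facility overhead (PUE); neither figure comes from the post.

        # Rough annual energy savings from consolidating 10 servers down to 2.
        # Per-server draw and facility overhead (PUE) are assumptions, not from the post.
        WATTS_PER_SERVER = 400.0   # assumed average draw per server
        PUE = 2.0                  # assumed overhead: each watt of IT load costs ~2 W total
        HOURS_PER_YEAR = 8760

        def annual_kwh(server_count):
            """Total facility energy (kWh/year) to run and cool this many servers."""
            return server_count * WATTS_PER_SERVER * PUE * HOURS_PER_YEAR / 1000.0

        saved_kwh = annual_kwh(10) - annual_kwh(2)
        print(f"~{saved_kwh:,.0f} kWh/year saved (~${saved_kwh * 0.10:,.0f} at $0.10/kWh)")

    Under those assumptions, dropping from 10 servers to 2 saves on the order of 56,000 kWh a year once cooling overhead is counted, which is the kind of load reduction that can make dedicated cooling optional.
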
  • by Iphtashu Fitz ( 263795 ) on Thursday November 03, 2005 @05:08PM (#13944768)
    "Cooling, IMO, is a secondary use of raised floors. The real usefulness is the ability to run cabling from any point A to any point B in the floor space."


    That's good to an extent, as long as the cable runs aren't too long. Go take a look at an enterprise-grade colocation facility and you may change your mind. I've spent a lot of time at one of the top-tier MCI facilities. It has a raised floor that's used for cooling and power distribution, but all networking is done via 3 or 4 layers of overhead cable trays. It's much easier to climb on top of a cable ladder that can easily support your weight to run a cable the length of a datacenter than it is to crawl under a floor trying to fish a cable past supports, power lines, etc.

  • by Anonymous Coward on Thursday November 03, 2005 @05:12PM (#13944809)
    If you have a water-based cooling system (chillers) and you spring a leak, a raised floor with deep side channels will save your equipment while you figure out how to shut off the water.
  • by Anonymous Coward on Thursday November 03, 2005 @05:34PM (#13945047)
    Having worked in several sites both with and without raised flooring, I would prefer raised flooring for the server room every time, for both wire management and cooling.

    The best use is to run your power distribution below the floor, to have specific tiles with cutouts that let cool air enter the bottom of the racks, and to keep unexpected disasters like burst water pipes from flooding the room.

    There are A/C units for server rooms that are designed to send cold air out the bottom into the crawl space, and they should be spec'd to supply far more cooling than is required (rough sizing numbers are sketched below). There should also be at least one primary unit and a backup unit.

    At one site, we had both A/C units fail one night. The temperature inside one of the cabinets containing some of our network switches reached 155°F, according to the temp sensors inside the switches. Fortunately, we only lost one old server that was on the top shelf of one rack. The temperature in the room (roughly 40'x80') reached about 140°F within only 3-4 hours. The room housed about 25 servers, a phone switch and a UPS. The only thing that saved the rack-mounted servers was the chimney effect of the air flowing up through the racks.

    Companies that don't see the value in something like raised flooring have never had to face the prospect of replacing all of their servers after their underpowered A/C fails. For the extra few thousand dollars it costs, it's a worthwhile ounce of prevention. To put things in perspective, the most valuable server in the site I was working in cost almost $250,000. I'd hate to be the Facilities or IT Manager who had to report the loss of an asset like that.
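
    To put rough numbers on the sizing advice above: a small room like the one described needs only a few tons of cooling, so two generously sized units are cheap insurance. The per-server draw, miscellaneous load, and headroom in the sketch below are assumptions, not figures from the post.

        # Rough A/C sizing for a small server room, in the spirit of "spec far more
        # cooling than you need, plus a backup unit".  All loads are assumptions.
        BTU_PER_WATT_HR = 3.412
        BTU_PER_TON = 12000.0

        def cooling_tons(watts):
            """Convert a steady heat load in watts to tons of refrigeration."""
            return watts * BTU_PER_WATT_HR / BTU_PER_TON

        heat_load_w = 25 * 400 + 2000      # assumed: ~25 servers at 400 W plus switch/UPS losses
        required = cooling_tons(heat_load_w)
        per_unit = required * 1.5          # assumed ~50% headroom on each unit
        print(f"steady load ~{required:.1f} tons; two units of ~{per_unit:.1f} tons each (N+1)")

    With either unit alone able to carry the whole room, a single failure doesn't turn into the 140°F night described above.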

  • by TinyManCan ( 580322 ) on Thursday November 03, 2005 @05:44PM (#13945156) Homepage
    "Without the raised floor, you have to put your rat's nest of cabling somewhere else, which almost certainly means vertical."

    I don't believe there should be a rat's nest of cabling _anywhere_ in a datacenter. I hate raised floors because they allow techs to get sloppy. Vertical wiring trays eliminate that possibility by exposing a hackish wiring job to everyone.

    When your datacenter is new, you should pre-wire patch panels in each cabinet for SAN and Ethernet. Each cabinet should have a PDU.

    Run all of the cables from all of the patch panels back to your main SAN and Network patch panels.

    If you do that work ahead of time, all you will ever have to do is plug a server into a patch panel in the same cabinet.

    For larger equipment (disk arrays, tape libraries, etc.), place the equipment and carefully measure the cable runs. Make sure you only have 3 feet of 'slack' and run the cables cleanly.

    It's a lot of work to keep a datacenter in order, but it is worth it in the long run. For one, you'll never have to spend two weeks tracing an Ethernet cable around the datacenter to locate a phantom server.
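
    A tiny sketch of the record-keeping that makes such a pre-wired scheme pay off: a map from each cabinet's patch-panel ports to the core panel ports, so nobody ever traces a cable by hand. The cabinet, port, and panel names here are invented for illustration.

        # Minimal sketch of a patch-panel port map for a pre-wired datacenter.
        # Cabinet, port, and panel names are invented for illustration.
        from typing import NamedTuple, Optional

        class Link(NamedTuple):
            cabinet_port: str   # cabinet-local patch panel position, e.g. "C04-PP1-12"
            core_port: str      # corresponding port on the main SAN/network panel
            service: str        # "ethernet" or "san"

        LINKS = [
            Link("C04-PP1-12", "CORE-ETH-A-37", "ethernet"),
            Link("C04-PP2-03", "CORE-SAN-B-08", "san"),
        ]

        def locate(cabinet_port: str) -> Optional[Link]:
            """Answer where a given cabinet port lands without lifting a floor tile."""
            return next((link for link in LINKS if link.cabinet_port == cabinet_port), None)

        print(locate("C04-PP1-12"))

    Whether it lives in a spreadsheet or a script, the point is that the physical plant never changes after day one, so the map stays accurate.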

  • Re:Turns? (Score:4, Insightful)

    by swb ( 14022 ) on Thursday November 03, 2005 @05:55PM (#13945263)
    I still don't see why running cabling under the floor is worse than running it in overhead trays. Either the trays are too high to get at without a ladder (making them at least as inconvenient as floor tiles), or they're too low and you bash things into them.

    Overhead tray systems also lock you into a fairly rigid room layout, and I have yet to see a data center still being used the way it was originally laid out after a few years. Raised flooring allows a lot of flexibility for power runs, cabling runs and so on without having to install an overhead tray grid.

    Raised flooring also offers some slight protection against water leaks. We had our last raised-floor system installed with all the power and data runs enclosed in liquid-tight conduit, due to the tenant's unfortunate run-ins with the building's drain system and plumbing in the past.

    I guess overhead tray makes sense if all you want to do is fill 10,000 sq ft with rack cabinets, but it's not really that flexible or even attractive, IMHO.

  • by C0vardeAn0nim0 ( 232451 ) on Thursday November 03, 2005 @05:58PM (#13945286) Journal
    I'm more concerned about keeping my booze cool than hiding bodies. The bodies can be dissolved in caustic soda and flushed down the toilet.
  • Re:Turns? (Score:2, Insightful)

    by pete-classic ( 75983 ) <hutnick@gmail.com> on Thursday November 03, 2005 @06:12PM (#13945430) Homepage Journal
    I think that the biggest single reason is that cable ladders encourage neat and sane cabling. Raised flooring . . . doesn't.

    -Peter
  • Re:No (Score:3, Insightful)

    by Nutria ( 679911 ) on Thursday November 03, 2005 @06:42PM (#13945685)
    "liquid cooling"

    Being, literally, a greybeard who remembers working on intelligent (3270-series) terminals and water-cooled mainframes, and who remembers Unix and DOS punks crowing about how "the mainframe is dead"... things like Citrix, LTSP, liquid-cooled racks, and IBM setting new records for mainframe MIPS sold every year really amuse me.
  • by zooblethorpe ( 686757 ) on Thursday November 03, 2005 @07:19PM (#13946030)

    Note that I'm not calling the parent poster stoopid, but rather the design of forcing cold air through the *floor*. As the parent here notes, cold air falls. This is presumably why most home fridges have the freezer on top.

    I was most surprised to read this article. I've never worked in a data center, but I have worked in semiconductor production cleanrooms, and given the photos I've seen of data centers with the grated flooring, I guess I always assumed the ventilation was handled the same way as in a cleanroom -- new air in from the ceiling, old air whisked away through the floor. (This ensures that any particles, which will naturally fall if heavier than air, will be sucked out of the room.) Note that this is obviously *not* a passive system designed to use convection, but rather an active system using lots of fans.

    While a passive convection system with cold air supplied from below is a nice theory, you run into the same problems others have pointed out: what if the bottom units suck in all the cold air? The top units are left too warm.

    Meanwhile, if you drop cold air from above, sure, the top units might suck a lot of that in -- but any cold air that isn't sucked in will naturally continue to drop relative to warmer air, ensuring that the lower units are not cooked. If you want to be especially careful about it, you could route all the cold air outputs towards the perimeter of the room and put the uptakes in the center of the ceiling to ensure a vortical flow.

    Just my ¥2.
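
    For anyone wanting numbers behind "cold air falls": the ideal-gas densities in the sketch below use assumed supply and return temperatures (15°C and 35°C), not figures from the post.

        # Why cold air falls: ideal-gas density of dry air at two assumed temperatures.
        R_AIR = 287.05       # J/(kg*K), specific gas constant for dry air
        P_ATM = 101325.0     # Pa, standard atmospheric pressure

        def air_density(temp_c):
            """Density of dry air (kg/m^3) at atmospheric pressure."""
            return P_ATM / (R_AIR * (temp_c + 273.15))

        supply, room_return = air_density(15.0), air_density(35.0)
        print(f"15 C: {supply:.3f} kg/m^3, 35 C: {room_return:.3f} kg/m^3 "
              f"(supply air ~{(supply / room_return - 1) * 100:.0f}% denser)")

    A difference of a few percent in density is enough to drive the stratification several posters mention, which is why supplying cold air from above works with buoyancy instead of against it.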

  • Re:Turns? (Score:2, Insightful)

    by laughing rabbit ( 216615 ) on Thursday November 03, 2005 @07:36PM (#13946160)
    Aye...

    I worked on consolidating several data operations from various older centers into a new building. Pulling up a tile and searching down through the archeological strata of cables was amazing. Fiber on top, then a layer of UTP, then coax (getting ever thicker as you got closer to the concrete), then, finally, that moment of truth when you find the AC plug you were looking for, and the frayed wire next to it that knocked you back. There was no room for airflow in those places.

    Fortunately, the new datacenter had cable trays under the floor, tiles at 30", and nicely labeled AC outlets that matched the rack and region names. I shudder to think what it will be like in 30 years.

  • Benchmarking (Score:2, Insightful)

    by jay_adelson ( 928345 ) on Thursday November 03, 2005 @09:29PM (#13946922)
    Furthermore, if you speak to insiders at most of the modern equipment manufacturers, they will tell you that benchmarking is now done in solid, non-raised-floor environments. The assumption is that the tonnage of cooling is provided at the intake, which is not located at the bottom of the larger machines but at the front or back. The hot-aisle/cold-aisle methodology is still the only viable means of cooling high-power-density equipment in a large datacenter environment. The only remaining issue is how to get rid of the hot air, and clearly the simplest initial design criterion is high ceilings (hard to find in datacenters). Beyond that, high-velocity air, specially designed air returns, or compartmentalized racks with dedicated air returns are the alternatives. In most airflow studies, you find raised floors riddled with stratification and hot air being delivered back into the intakes of other gear, whereas in a non-raised hot-aisle/cold-aisle layout this problem magically goes away...
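
    To see why getting air to the intakes dominates the problem, here is the standard sensible-heat relation for air, with the rack loads and the intake-to-exhaust temperature rise assumed for illustration:

        # Airflow a rack needs, from the standard sensible-heat relation for air:
        # Q (BTU/hr) ~= 1.08 * CFM * dT(F).  Rack loads and the 20 F rise are assumptions.
        BTU_PER_WATT_HR = 3.412

        def required_cfm(rack_watts, delta_t_f):
            """CFM needed to carry away rack_watts at a given intake-to-exhaust temperature rise."""
            return rack_watts * BTU_PER_WATT_HR / (1.08 * delta_t_f)

        for kw in (2, 5, 10):
            print(f"{kw} kW rack at a 20 F rise: ~{required_cfm(kw * 1000, 20):.0f} CFM")

    At a 20°F rise, a 5 kW rack already wants roughly 800 CFM and a 10 kW rack roughly 1,600 CFM, more than the one or two perforated tiles in front of a rack usually deliver; that is the practical argument for focusing on aisle-level supply and return paths rather than on the floor.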
