Raised Flooring Obsolete or Not? 372
mstansberry writes "In part three of a series on the price of power in the data center, experts debate the merits of raised flooring. It's been around for years, but the original raised floors weren't designed to handle the air flow people are trying to get out of them today. Some say it isn't practical to expect air to make several ninety-degree turns and actually get to where it's supposed to go. Is cooling with raised floors the most efficient option?"
The future is not in raised flooring... (Score:2, Funny)
No (Score:4, Informative)
Re:No (Score:3, Insightful)
Being, literally, a grey-beard who remembers working on intelligent (3270-series) terminals and water-cooled mainframes, and Unix and DOS punks crowing about how "the mainframe is dead"... things like Citrix, LTSP, liquid-cooled racks, and IBM setting new records in "number of mainframe MIPS sold every year" really amuse me.
Re:No (Score:4, Interesting)
But remember, this is what happens when shit hits the fan [novell.com] and servers are on floor level.
So you don't care - who cares? (Score:5, Interesting)
Large organizations rely on server rooms for their computing environment. Having a cobbled-together environment where the file server is on the 3rd floor, the application server is in the janitor's closet, etc. is a recipe for disaster. Troubleshooting connectivity issues (among others) can end up costing more than the apparent simplicity of such a design saves.
Understanding ways to better cool the space that our servers occupy is important. And being able to do so in a cost effective manner is also important. The organization that I work in has one in-house server room (containing 60 racks of servers), and one 'co-located' server room (containing 72 racks of servers). Heat and power are the two killers. If we experience a 50% power loss (assume that one power grid is knocked out), do we have enough power to run AND cool the server room? If not, what percentage of my gear do I need to shut down in order to prevent overheating, without impacting critical business systems (like payroll).
If we can find a cheaper / better / more cost effective method for cooling that utilizes less power, or find a way to use the cooling systems that we have in a more efficient manner, is that not worth an article on slashdot?
IMHO, This is a valid topic.
Where else? (Score:5, Funny)
Re:Where else? (Score:2, Funny)
The problem is your boss. At a previous company my boss was the one that insisted we have a "beer fridge" hidden in the back of our server room, out of sight of the rest of the company.
Re:Where else? (Score:3, Funny)
Re:Where else? (Score:5, Funny)
Re:Where else? (Score:4, Informative)
http://www.canford.co.uk/commerce/resources/catde
or maybe even one of these...
http://www.canford.co.uk/commerce/resources/catde
Re:Where else? (Score:3, Funny)
Re:Where else? (Score:3, Interesting)
Some datacenters have very odd cooling systems... some even distribute cold air from the top and collect hot air at the floor, quite a questionable choice.
Re:Where else? (Score:3, Funny)
Sprinklers are as bad for chemical labs as for computers, but what the building had instead wasn't much better.
sub-floor (Score:5, Funny)
Re:sub-floor (Score:3, Funny)
Re:sub-floor (Score:4, Funny)
Downside: needs more reinforcement, especially if you need to hide an overweight PHB. Upside: if the odors go upwards, the bodies will remain undetected longer.
Or you could just use old enclosed racks as sarcophagi, hiding them in the back of the storage room behind stacks of obsolete boxen.
Re:sub-floor (Score:2)
You might get in trouble with PETA, but the last time I checked they only concerned themselves with cute animals and didn't care much about invertebrates.
Re:sub-floor (Score:3, Interesting)
Bodies ??? what about the booze ? (Score:3, Insightful)
Re:sub-floor (Score:3, Informative)
Turns? (Score:5, Insightful)
Re:Turns? (Score:5, Funny)
After reading this very insightful article summary, I was planning to completely replace all of the ductwork in my house on the assumption that air can't go around corners. You just saved me several thousand dollars.
Re:Turns? (Score:5, Interesting)
(Great way to keep your boss at bay, too. "Don't come in here! We've got tiles up and you may fall in a hole! thenthegruewilleatyouandnoonewillnoticebwhahaha")
With computers being designed as they are now, the raised floor no longer makes sense. For one, all your plugs tend to go to the same place, i.e. your power cords go to the power mains in one direction, your network cables go to the switch (and ultimately the patch panel) in another, and your KVM console is built into the rack itself. With the number of computers being managed, you'd be spending all day pulling up floor tiling and crawling around in tight spaces trying to find the right cable! With guided cables, you simply unhook the cable and drag it out. (Or for new cables, you simply loop them through the guides.)
So in short, times change and so do the datacenters.
Re:Turns? (Score:5, Interesting)
I don't know where you've worked, but every datacenter I've seen has had a raised floor, and all of them still had at least one mainframe structure still in use.
Re:Turns? (Score:4, Interesting)
With the advent of blades, the heat generated per rack space is now typically MUCH higher than it was back in the day. If anything, the raised flooring should be redesigned, as it can't cope with the airflow that is needed for higher density server rooms.
You'll find that a number of racks are being redesigned with built-in plenums for cooling... a cold feed on the bottom, and a hot return at the top, with individual ducts for various levels of the rack.
There are even liquid-cooled racks available for the BIG jobs.
I think that it's not so much that we're going to get rid of raised floors, but just redesign the materials and layout of them to be more effective with the needs of today.
yes, but... (Score:3)
Re:Turns? (Score:4, Insightful)
Overhead tray systems also suffer from a fairly rigid room layout, and I have yet to see a data center being used the way it was originally laid out after a few years. Raised flooring allows for a lot of flexibility for power runs, cabling runs and so on without having to install an overhead tray grid.
Raised flooring also offers some slight protection against water leaks. We had our last raised floor system installed with all the power and data runs enclosed in liquidtight conduit due to the tenant's unfortunate run-ins with the building's drain system and plumbing in the past.
I guess overhead tray makes sense if all you want to do is fill 10,000 sq ft with rack cabinets, but it's not really that flexible or even attractive, IMHO.
Re:Turns? (Score:5, Interesting)
A plenum will force air through no matter what. There are, however, two problems. The first is that turbulence underneath the floor can turn the directed kinetic energy of the air into heat, which can be a real drag. In circumstances where you need to move a lot of air, the channel may not even be sufficiently wide.
More importantly, the air ends up coming out where the resistance is least, leading to uneven distribution of air. If you're grossly over budget and just relying on the ambient temperature of the machine room, this isn't a problem. But when you get close to the edge it can totally push you over.
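To put a toy model behind the "air exits where resistance is least" point: parallel flow paths share air roughly in proportion to the inverse of their resistance, so low-resistance tiles near the CRAC units hog the flow while tiles behind cable dams starve. The sketch below is mine, not the poster's, and every number in it is invented for illustration.

# Toy model: flow split across perforated tiles in inverse proportion
# to each tile's relative flow resistance (parallel-path approximation).
total_cfm = 10000.0                  # total CFM from the CRAC units (assumed)
tile_resistance = {                  # arbitrary relative resistances, not measurements
    "tile_near_crac":  1.0,
    "tile_mid_room":   2.5,
    "tile_far_corner": 5.0,
}

inverse_sum = sum(1.0 / r for r in tile_resistance.values())
for name, r in tile_resistance.items():
    share = (1.0 / r) / inverse_sum
    print(f"{name}: {share * total_cfm:.0f} CFM ({share:.0%})")

With those made-up numbers the tile next to the CRAC gets over 60% of the air and the far corner gets about 12%, which is exactly the uneven distribution described above.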
Re:Turns? (Score:2)
That line blows.
Re:Turns? (Score:2)
Re:Turns? (Score:2)
Granted, this is 70mph wind stuff we're talking about, so it likely wouldn't apply in a datacenter environment. Although it'd be fun to imagine certain co-workers getting sucked into the hurricane-force winds. Tune in tonight at 7 for "When Datacenters Attack!"
Re:Turns? (Score:3, Interesting)
Yes, turns (Score:3, Informative)
Now admittedly, friction isn't as important to gases as it is to other states of matter, but it can have an effect, especially in high-flow cooling.
Re:Turns? (Score:2)
1) Resistance. Turns, right-angled plenums, or obstructions from cables/power cords would impede airflow, right?
2) While the atmospheric pressure differential is key, the magnitude of the differential would indicate how much resistance/inefficiency there is.
3) Even a perfectly working system can only deliver a certain amount of cool air flow. With these hotter and hotter computers, at some point the equipment exceeds your airflow budget (a rough budget calculation follows below).
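As a rough illustration of point 3, the usual sensible-heat rule of thumb is CFM = (watts x 3.412) / (1.08 x delta-T in F). The sketch below is my own back-of-the-envelope, not from the article, and the rack wattage and per-tile delivery figures are assumptions:

# Airflow needed to carry away a given heat load at a given temperature rise.
# Rule of thumb for sensible heat: BTU/hr = 1.08 * CFM * delta_T(F).
def required_cfm(load_watts, delta_t_f=20.0):
    btu_per_hr = load_watts * 3.412      # watts to BTU/hr
    return btu_per_hr / (1.08 * delta_t_f)

rack_kw = 9.0          # assumed load for a dense 1U/blade rack
tile_cfm = 500.0       # assumed delivery of one perforated tile

need = required_cfm(rack_kw * 1000)
print(f"{rack_kw} kW rack needs ~{need:.0f} CFM, "
      f"about {need / tile_cfm:.1f} perf tiles' worth of air.")

At those assumed numbers a single 9 kW rack wants roughly 1,400 CFM, nearly three tiles' worth, which is the "exceeds your airflow budget" problem in miniature.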
Re:Turns? (Score:2)
When the air is forced to turn a corner it creates more friction than if it is pushed/pulled in a straight line. This serves to both heat the air, and to cause the motors creating the negative/positive atmospheres to do that much more work.
I do wonder how much difference either effect really has. Doesn't seem like there should be much. Raised floors are optimal for taking advantage of convection c
Re:Turns? (Score:2)
It's complicated, but basically, yes and no (Score:5, Informative)
If you're bored, check out TileFlow [inres.com]. It's an underfloor airflow simulator. You put in your AC units, perf tiles, floor height, baffles, you name it. It will (roughly) work out how many CFM of cold air you're going to see on a given tile. It's near-realtime (takes a second to recalculate when you make changes), so you can quickly add/remove things and see the effect. I spent some time messing with this a couple of years ago, and it's very easy to set up a situation where you have areas in your underfloor with *negative* pressure.
The article basically summed it up for me:
McFarlane said raised floors should be at least 18 inches high, and preferably 24 to 30 inches, to hold the necessary cable bundles without impeding the high volumes of air flow. But he also said those levels aren't realistic for buildings that weren't designed with that extra height.
I'd go with 24 inches MINIMUM, myself. Also, proper cable placement (ie: not just willy-nilly) goes a long way towards helping airflow issues. Like they said though, you don't always have the space.
Of course, with the introduction of a blade chassis or 4, you suddenly need one HELL of a lot more AC
Re:It's complicated, but basically, yes and no (Score:3, Informative)
I'd go with 24 inches MINIMUM, myself.
Not bad; at about 1" per year, which is typical, it might last a career.
A layer each for:
Oh, and don't forget power, 2-phase and 3-phase, 240V and 120V. And those silly transceiver boxes and modems.
Floors end up being garbage pits...
Short Article. (Score:4, Interesting)
Why wouldn't raised floors be bad if you used them properly?
Re:Short Article. (Score:2)
If it wasn't my basement I'd just put outlets in the floor, and if I didn't want it also to serve as my theater room I'd consider outlets in the ceiling.
Re:Short Article. (Score:2)
Oh...so it's for practical reasons... (Score:5, Funny)
Turns? (Score:4, Funny)
I wouldn't say they're going to become obsolete. (Score:5, Insightful)
Re:I wouldn't say they're going to become obsolete (Score:3, Informative)
If cooling is not a concern, concrete slab with overhead runs is the best way. If cooling is an issue, use raised floor, for cooling only and overhead runs for cables.
Raised Floor Fun! (Score:5, Funny)
But it also eliminates the joy of making fun of coworkers who get lost in a raised floor, or closing them in when they go on a hunt for something...
Re:Raised Floor Fun! (Score:4, Funny)
Yikes...
Re:I wouldn't say they're going to become obsolete (Score:4, Interesting)
Or plumbing. I'm serious. (And a bit OT.)
When I was at IBM's Cottle Rd. facility, now (mostly) part of Hitachi, they had just finished rebuilding their main magnetoresistive head cleanroom (Taurus). They took the idea from the server techs, and dug out eight feet from under the existing cleanroom (without tearing down the building) and put in a false floor.
All of the chemicals were stored in tanks under the floor. Pipes ran vertically, and most spills (unless it was something noxious) wouldn't shut down much of the line. It was a big risk but, if what I hear is correct, people still say it's the best idea they had in a while.
Hey -- who's the experts anyways?!?! (Score:2)
Hey! You! get offa my cloud!
Re: (Score:2)
Re:Air can turn on a dime. (Score:2)
Laminar flow is more efficient at thermal transfer than turbulent flow.
Re:Air can turn on a dime. (Score:5, Interesting)
The problem lies with larger datacenter environments. Imagine a room the size of a football field. Along the walls are rows of air conditioners that blow cold air underneath the raised floor. Put a cabinet in the middle of the room and replace the tiles around it with perforated ones and you get a lot of cooling for that cabinet. Now start adding more rows & rows of cabinets along with perforated tiles in front of each of them. Eventually you get to a point where very little cold air makes it to those servers in the middle of the room because it's flowing up through other vents before it can get there. What's the solution? Removing servers in the middle of hotspots & adding more AC? Adding ducting under the floor to direct more air to those hotspots? Not very cheap & effective approaches...
Wuh? (Score:2)
Put a cabinet in the middle of the room and replace the tiles around it with perforated ones and you get a lot of cooling for that cabinet.
Maybe this is the problem. Every industrial datacenter I have been in places racks over either empty spaces, or tiles with a large vent in them. The rack has fans in it to force air through vertically (bottom to top). A few perforated tiles get scattered about for the humans, but I have been in some datacenters without them to maximize airflow to the racks. But t
Re:Wuh? (Score:2)
That works to an extent, but what if the cabinet is pretty much fully loaded? We loaded up 8-foot cabinets with 30+ 1U dual-CPU servers. The amount of air coming up through the holes underneath the cabinets was never enough to cool all that hardware down. Besides, my original example was just that - an example.
Re:Air can turn on a dime. (Score:3, Informative)
The blower moving the air only has a certain amount of power. Hook it up to a duct ten feet long, and output basically equals input. Hook it up to a duct ten *miles* long -- even a perfectly airtight one -- the power you put into one end will be lost by the other end, because the air molecules lose momentum (and gain heat) as they bounce off each other and the walls of the duct.
Every time a duct turns a right angle, the molecules lose a lot
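To put rough numbers on what a right angle costs, HVAC practice charges each fitting a loss of K times the velocity pressure, where velocity pressure in inches of water is (V/4005)^2 for V in ft/min. The sketch below is my own illustration, and the velocity and K factors are assumed, ballpark values:

# Pressure loss per fitting: delta_p = K * VP, with VP = (V/4005)**2 in. w.g.
velocity_fpm = 2000.0                  # assumed duct velocity
vp = (velocity_fpm / 4005.0) ** 2      # velocity pressure, about 0.25 in. w.g.

fittings = {
    "smooth-radius 90 elbow": 0.3,     # ballpark K factors (assumed)
    "sharp mitred 90 elbow":  1.2,
}
for name, k in fittings.items():
    print(f"{name}: ~{k * vp:.2f} in. w.g. lost")

# Straight duct friction at these velocities is often quoted near
# 0.1 in. w.g. per 100 ft, so one sharp elbow can cost as much as a
# few hundred feet of straight run.

In other words, with these assumed figures the corners, not the straight runs, dominate the losses, which is the point the parent is making.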
Re:Air can turn on a dime. (Score:2)
Re:Air can turn on a dime. (Score:4, Interesting)
Most server rooms aren't part of the duct. For example, the one here is large and rectangular, with enormous vents at either end. Not very well designed.
Airflow is a very complicated problem. My old employer had at least three AC engineers on full-time staff to work out how to keep the tents cold (I worked for a circus, hence the nick). The ducting we had to do in many cases was ridiculous.
Why do you think Apple engineering used to use a Cray to work out the air passage through the old Macs? Just dropping air-conditioning into a hot room isn't going to do jack if the airflow isn't properly designed and tuned. Air, like many things, doesn't like to turn 90 degrees; it needs to be steered.
wired grid (Score:2)
Shheesh,
Re:wired grid (Score:2)
No more zinc whiskers? (Score:2, Interesting)
Re:No more zinc whiskers? (Score:2)
Thanks to the EU, you'll be able to impress people with your knowledge of tin whiskers [empf.org] instead.
Army Research Labs solution... (Score:5, Interesting)
Re:Army Research Labs solution... (Score:3, Funny)
I bet that computer simulated the best cooling for itself.
Re:Army Research Labs solution... (Score:3, Funny)
Some call that planning and engineering.
An engineering firm was hired to do some upgrades to our two-room computer facility, which included a fan to circulate air between the two rooms. We asked what the CFM of the fans was and how often the air would be exchanged between the rooms. Their answer: dunno, never thought of that. Good thing we did.
One way to fight this -- the CHIP (Score:5, Insightful)
Inefficient architectures must be discarded to make way for more modern, smaller, COOLER processors.
Let's address the real problem here -- not the SYMPTOM of hot air.
We need to address the COMPUTERS.
Re:One way to fight this -- the CHIP (Score:5, Insightful)
Why do devices need to be cooled? (Score:3, Insightful)
Think about the light bulb... A standard 60-watt incandescent bulb generates lots of heat. A better design is something like the LED bulbs that generate the same amount of lumens with much less power and, more importantly, little to no heat.
Good design can allow these devices to not generate excess heat, hence eliminating the need for the raised floor.
Re:Why do devices need to be cooled? (Score:3, Informative)
See http://en.wikipedia.org/wiki/Reversible_computing [wikipedia.org]
Fast computing is made possible by destroying information (that's all computers really do: they destroy information). That destruction process entails an entropy cost that must be paid in heat.
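The bound being referred to is Landauer's principle: erasing one bit costs at least k_B * T * ln 2 of energy. A quick numerical sketch (my own, with an assumed figure for the server) shows real hardware is nowhere near that floor, so today's heat is an engineering problem rather than a fundamental one:

import math

k_B = 1.380649e-23       # Boltzmann constant, J/K
T = 300.0                # roughly room temperature, K

landauer = k_B * T * math.log(2)
print(f"Landauer limit: ~{landauer:.2e} J per erased bit")

# Assumed figure: a 100 W server performing ~1e18 bit operations per second.
per_op = 100.0 / 1e18
print(f"Assumed server: ~{per_op:.2e} J per operation, "
      f"roughly {per_op / landauer:.0e} times the limit")

With that assumed operation rate the server sits about four orders of magnitude above the Landauer floor.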
Have you looked at LED efficiency (Score:3, Informative)
LEDs are certainly better than flashlight bulbs.
But when a white LED delivers 15-19 lumens per watt, it's about the same as a 100W incandescent and five times worse than a fluorescent. LEDs appear bright because they put out a fairly focused beam - not because they put out lots of light.
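Rough numbers for that comparison (ballpark figures of my own, not measurements):

# Approximate luminous efficacy in lumens per watt.
sources = {
    "white LED (15-19 lm/W per the post)": 17,
    "100 W incandescent (~1700 lm)":       1700 / 100,
    "32 W fluorescent tube (~2800 lm)":    2800 / 32,
}
for name, lm_per_w in sources.items():
    print(f"{name}: ~{lm_per_w:.0f} lm/W")

That works out to roughly 17, 17, and 88 lm/W respectively, consistent with "about the same as incandescent, five times worse than fluorescent."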
I got a totally impracticable solution right here! (Score:5, Funny)
Most. Efficient. Cooling. Evar!
Call in the aliens (Score:5, Funny)
Now, if you're willing to host an alien spaceship at the bottom of your datacentre, maybe they could lend a hand...
Re:I got a totally impracticable solution right he (Score:3, Informative)
5th grade science (Score:2)
The article points out that overhead cooling requires additional fans, etc.
Racks need to be built more like refrigerators: foam-core/fiberglass insulated, with some nice weatherstripping to create a chamber of sorts. Since the system would be nearly sealed, convection currents from the warm air exhaust rising off the servers in the rack would pull cold air down. Cold air goes in through the bottom of the rack, heats up, gets pushed back through the top. This could pro
It's the physics, stoopid (Score:3, Insightful)
Note that I'm not calling the parent poster stoopid, but rather the design of forcing cold air through the *floor*. As the parent here notes, cold air falls. This is presumably why most home fridges have the freezer on top.
I was most surprised to read this article. I've never worked in a data center, but I have worked in semiconductor production cleanrooms, and given the photos I've seen of data centers with the grated flooring, I guess I always assumed the ventilation was handled the same way as in a
Time to invent standardized air-interconnects (Score:5, Interesting)
Re:Time to invent standardized air-interconnects (Score:3)
My thinking is a good rack system should have the airflow under control.
No Raised Floors? (Score:3, Interesting)
Now, we had to get building systems to maximize the air flow from the AC vent in the room to ensure maximum cooling and the temperature on the thermostat was set to the minimum (about 65 F I believe). One day, while trying to do some routine upgrades to the server, I noticed things not going so well. So I logged off the remote connection and made my way to the server room.
What do I find when I get there? The room temperature is approximately 95 F (the outside room was a normal 72) and the servers are burning up. I check the system logs and guess what, it had been like this for nearly 12 hrs (since sometime in the middle of the night). To make this worse, our system administrator was at home on vacation around X-Mas, so of course all sorts of hell was busting loose.
We wound up getting the room down after the people from building systems managed to get us more AC cooling in the room; however, the point is it was never really enough. Even on a good day it was anywhere from 75 F to 80 F in the room, and with nearly a full rack and another one to be moved in, it was never going to be enough. This is what happens, though, when administrations are apathetic about IT and the needs of the computer systems, particularly servers. Maybe we should bolt servers down and stick them in giant wind tunnels or something...
Re:No Raised Floors? (Score:3, Funny)
No problems cooling (Score:2, Insightful)
Use Moore's law, stupid (Score:2)
More likely the powers that be have overbought capacity, in order to expand the apparent size and importance of their empire. I've seen several computer rooms that could have been replaced with three laptops and a pocket fan.
Re:Use Moore's law, stupid (Score:2)
Thermal Dynamics... (Score:2, Informative)
Heat rises. Our original designs back in 2002 for our data center called for overhead cooling using a new gel-based radiator system. It would have been a great solution and caused us to go with a lower raised floor, just for cables and bracin
Not obsolete. (Score:2, Interesting)
So, no, I don't think they will be obsolete any time soon. But hey, I'm an old punchcard guy.
Wow, I never thought of it like that ;) (Score:3, Interesting)
I wonder how all those ducts throughout America (with tons of 90 degree turns) carry air that heats and cools houses and office buildings every day?
Yes, It Is The Best Option (Score:3, Interesting)
The only other option would be water cooling but that's viewed by my bosses as supercomputer territory.
Ed Almos
Obsolete or not... (Score:3, Informative)
Very difficult to track down random machine failures to bad interior decoration choices!
Raised floors for cooling=bad (Score:3, Interesting)
We had intended to use the raised floor to supply air, but Liebert's design analysis gave us a clear indication of why that wasn't going to work. We needed to generate air velocities in excess of 35 MPH under the floor. There were hotspots in the room where negative pressure was created and the air was actually being sucked into the floor rather than being blown out from it. So we happened to get lucky: Liebert was literally just rolling its Extreme Density cooling system off the production line. The system uses rack-mounted heat exchangers (air to refrigerant), each of which can dissipate 8 - 10 kW of heat, and can be tied to a building's chilled water system, or a compressor that can be mounted outside the building.
This system is extremely efficient as it puts the cooling at the rack, where it is needed most. It's far more efficient than the floor-based system, although we still use the floor units to manage the humidity levels in the room. The Liebert system has been a workhorse. Our racks are producing between 8 - 9 kW under load and we consistently have temperatures between 80 - 95 F in the hot aisle, and a nice 68 - 70 F in the cold aisles. No major failures in two years (two software-related things early on; one bad valve in a rack-mounted unit).
More about bad rack design (Score:2, Interesting)
Re:More about bad rack design (Score:3, Informative)
I used to do commercial HVAC work, and everybody in the business does the opposite from what you describe. The ducts are largest near the air handler, and they are smallest at the end of the line. Typically, the main trunk of the duct gets smaller in diameter after each branch comes off of it and goes to a diffuser.
One issue w
Can't push enough air (Score:3, Interesting)
Wiring is now usually ABOVE the equipment, and with 10Gigabit copper, you can't just put all of the cables in a bundle any more, you have to be very careful.
It's a brave new datacenter world. You need some serious engineering these days, guessing just isn't going to do it. Hire the pros, and save your career.
--Mike--
HVAC concerns (Score:3, Interesting)
When you are designing for a space (such as a room) you design for the shortest amount of ductwork for the greatest amount of distribution. Look up in the ceiling of an office complex sometime and count the number of supply and return diffusers that work to keep your air in reasonable shape. All of the ducts that supply this air are smooth, straight and designed for a minimal amount of losses.
All air flow is predicated on two important points within a given pipe (splits and branching within the ductwork are not covered here): pressure loss within the pipe and how much power you have to move the air. The higher the pressure losses, the more power you need to move the same amount of air. Every corner, turn, rough pipe, and longer pipe contributes to the amount of power needed to push the air through at the rate you need.
Where am I going with all of this? Well, underfloor/raised floor systems do not have a lot of space under them, and it is assumed that the entire space under them is flexible and can be used (i.e. no impediments or blockages). Ductwork is immobile and does not appreciate being banged around. Most big servers need immense amounts of cooling. A 10"x10" duct is good for roughly 200 CFM of air. That much air is good for 2-3 people (this is rough, since I do not have my HVAC cookbook in front of me... yes, that is what it is called). Servers need large volumes of air, and if that ductwork is put under the floor, pray you don't need any cables in that area of the room. Before you ask: why don't we just pump the air into the space under the floor and let it get there on its own? Air is like water; it leaves through the easiest path possible. Place a glass on the table and pour water on the table and see if any of the water ends up in the glass. Good chance it ends up spread out on the floor where it was easiest to leak out. Unless air is specifically ducted to exactly where you want it, it will go anywhere it can (always to the easiest exit).
Ductwork is a very space-consuming item. Main trunks for two and three story buildings can be on the order of four to five feet wide and three to four feet high. A server room by itself can require the same amount of cooling as the rest of the floor it is on (ignoring wet bulb/dry bulb issues, humidity generation and filtering; we are just talking about the number of BTUs generated). A good-size server room could easily require a separate trunk line and return to prevent the spreading of heated air throughout the building (some places do actually duct the warm air into the rest of the building during the winter). Allowing this air to return into the common plenum return will place an additional load on the rest of the building's AC system. Place the server room on a separate HVAC system to prevent overloading the rest of the building's AC system (which is designed on a per-square-foot basis, assuming a given number of people/computers/lights per square foot if the floor plan does not include a desk plan layout).
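A quick sketch (mine, with assumed numbers) of the 10"x10" / 200 CFM rule of thumb above, and of how pressure loss translates directly into fan power via P = Q * delta_p / efficiency:

# Velocity through a 10" x 10" duct at 200 CFM, and fan power vs. pressure loss.
area_ft2 = (10 / 12) * (10 / 12)        # duct cross-section, square feet
cfm = 200.0
print(f"Velocity: ~{cfm / area_ft2:.0f} ft/min")

# 1 CFM against 1 in. w.g. is about 0.1175 W of air power.
def fan_watts(cfm, dp_in_wg, efficiency=0.6):
    return cfm * dp_in_wg * 0.1175 / efficiency

for dp in (0.5, 1.0, 2.0):              # assumed total system pressure losses, in. w.g.
    print(f"10,000 CFM against {dp} in. w.g.: ~{fan_watts(10000, dp):.0f} W at the fan")

Doubling the pressure loss doubles the fan power for the same airflow, which is why every corner, rough run of pipe, and blocked underfloor channel shows up on the power bill.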
Raised flooring is useful for several reasons. (Score:5, Informative)
Raised flooring also provides significant storage for those large electrical "whips" where 30A (in most US DCs anyhow) circuits are terminated, as well as a place to hide miles of copper and fiber cable (preferably not too close to the electrical whips). Where else would you put this stuff? With high density switches and servers, we certainly aren't seeing less cable needed in the data centers. Cabinets that used to hold five or six servers now hold 40 or more. Each of these needs power (typically redundant) and network connectivity (again, typically redundant), so we actually have more cables to hide than ever before.
Cabinets are built with raised flooring in mind. Manufacturers expect your cabling will probably feed up through the floor into the bottom of the cabinet. Sure, there is some space in the top of the cabinets, but nothing like the wide open bottom!
Anyhow, there you have the ideas of someone who is quickly becoming a dinosaur (again) in the industry.
Hell no (Score:5, Interesting)
Telecom switching equipment still uses vertically mounted boards for the most part and still expects to intake air from the bottom and exhaust it out the top. Have any AT&T/Lucent/Avaya equipment in your computer room? Go look.
Now look at your rack mount computer case. Doesn't matter which one. Does it suck air in at the bottom and exhaust it out at the top? No. No, it doesn't. Most suck air in the front and exhaust it out the back. Some suck it in one side and exhaust it out the other. The bottom is a solid slab of metal which obstructs 100% of any airflow directed at it.
Gee, how's that going to work?
Well, the answer is: with some hacks. Now the holed tiles are in front of the cabinet instead of under it. But wait, that basically defeats the purpose of using the raised floor to move air in the first place. Worse, that mild draft of cold air competes with the rampaging hot air blown out of the next row of cabinets. So, for the most part, your machines get to suck someone else's hot air!
So what's the solution? A hot aisle / cold aisle approach. Duct cold air overhead to the even-numbered aisles. Have the front of the machines face that cold aisle in the cabinets to either side. Duct the hot air back from the odd-numbered aisles to the air conditioners. Doesn't matter that the hot aisles are 10-15 degrees hotter than the cold aisles because air from the hot aisles doesn't enter the machines.
Almost the right idea (Score:3, Informative)
Why? You must keep in mind, you're not trying to pump "cold" air in, you're trying to take heat out, and as Mother Nature knows, heat rises. So why not harness the natural convection of heat, allow it to flow up to the ceiling, and have some "perf" ceiling tiles and use the space over the ceiling t
It is not just for air flow (Score:3, Interesting)
Examples:
Wiring: Not everyone likes to use overhead ladders to carry cables around. In the Army we had less than 50% of our wiring overhead; the rest was routed through channels underneath the raised flooring.
HVAC spill protection: Many of our NOCs had huge AC units above the tile level, and these things could leak at any moment. With raised flooring the water will pool at the bottom instead of running over the tiles and causing an accident. We had water sensors installed, so we knew we had a problem as soon as the first drop hit the floor.
If the natural airflow patterns are not enough for a specific piece of equipment, it does not take a lot to build conduits to guarantee cold air delivery underneath a specific rack unit.
The one thing I did not like about the raised floors was when some dumbass moron (who did NOT work within a NOC) decided to replace our nice, white, easy-to-buff tiles with carpeted tiles. 10 years later and I still can't figure out why the hell he would approve that switch, since our NOC with its white tiles looked fricking gorgeous just by running a buffer and a clean mop through it. The tiles with carpeting were gray, so they darkened our pristine NOC.
I bet many of the people against raised flooring are landlords that don't want to get stuck with the cost of rebuilding flooring if the new tenant does not need a NOC area. I have been to a NOC in a conventional office suite; they basically crammed all of their racks into what seemed to be a former cubicle island. The air conditioning units were obviously a last-minute addition, and it looked like the smallest spill would immediately short the loose power strips on the first row of racks in front of them. Shoddy as hell.
Re:How about water cooling? (Score:2, Informative)
Check http://news.com.com/Photos+SGIs+Columbia+supercom
Re:How about water cooling? (Score:3, Interesting)
Re:How about water cooling? (Score:3, Informative)
Water (non-pure... which it will be as soon as it hits your computer) conducts electricity.
Antifreeze is not better and conducts electricity.
The liquid you're looking for is Fluorinert, but the price is on the order of hundreds of dollars per gallon.
When you consider the price, you'll see why many people just use water and high-quality plumbing. Why use $500 of flo
Re:Not Just Cooling (Score:3, Insightful)
The real usefulness is the ability to run cabling from any point A to any point B in the floor space.
That's good to an extent, as long as the cable runs aren't too long. Go take a look at an enterprise grade colocation hosting facility and you may change your mind. I've spent a lot of time at one of the top-tier MCI facilities. It has a raised floor that's used for cooling and power distribution, but all networking is done via 3 or 4 layers of overhead c
Re:Not Just Cooling (Score:3, Insightful)
I don't believe there should be a rat's nest of cabling _anywhere_ in a datacenter. I hate raised floors because they allow techs to get sloppy. Vertical wiring trays eliminate that possibility by showing their hackish wiring job to everyone.
When your datacenter is new, you should pre-wire patch panels in each cabinet for SAN and Ethernet. Each cabinet should have a PDU.
Run all