Hardware IT

Making Data Centers More People-Friendly 137

1sockchuck writes "Data centers are designed to house servers, not people. This has often meant trade-offs for data center staffers, who brave 100-degree hot aisles and perform their work at laptop carts. But some data center developers are rethinking this approach and designing people-friendly data centers with Class-A offices and amenities for staff and visitors. Is this the future of data center design?"
This discussion has been archived. No new comments can be posted.

  • Maybe it's just me, but doesn't it seem like 100 degree aisles wouldn't be particularly server-friendly either? Just my $.02
    • It's where the cooling exhaust goes, that's why it's hot.
      • by Anonymous Coward

        In the data centers I work in, the exhaust is sucked out the top of the racks, and every few tiles in the aisles there's a perforated tile acting as an air-conditioning vent.

        If anything, I freeze at times if I'm in shorts and flip-flops.

        • by Anonymous Coward
          Seriously. I've been in and out of datacenters, rack farms, etc. over the past decade, and every time I've seen the racks vented through the ceiling. Hot air rises; it's more efficient to just suck it up through there. Never been in a warm data center, only freezing-ass cold.
          • Re:First troll! (Score:4, Interesting)

            by EdIII ( 1114411 ) on Wednesday March 02, 2011 @07:24PM (#35363926)

            The data center I visit most right now has hot/cold aisles. It looks more like a meat processing center with all the heavy plastic drapes; they go from floor to ceiling every other aisle. On the front of the racks they even put in plastic placeholders for gaps where we don't have equipment installed yet, to maximize air flow through the equipment. They did it, too; we never even had to ask.

            Most of the time we work from the cold aisle with our laptop carts, and it is *cold*. The article is confusing because I can't see why you would need to sit with a cart in the hot aisle to work. You can install your equipment and cabling in such a way that you don't need access to the hot aisle for anything other than full server swap-outs and cabling swap-outs, and that's pretty much it. You can replace the hard drives from the front of the units, and maintain the server just by pulling it out the front after disconnecting the cables if you need to. Most big 4U servers come with cable management arms that let you keep "service loops" so that you don't need to disconnect anything to pull the server out on the rails.

            Heck, if you need to, just get a 15ft networking cable and thread it through into the cold aisle. You don't have to sit in the heat if you don't want to. Although I'm a big guy and I like the cold, it's funny as hell to see the skinny bastards walking over to the hot aisle to warm up.

            • by afidel ( 530433 )
              Cable management arms are the work of the devil. They do nothing but stress cables beyond the minimum bend radius and block airflow out of the back of the server. If you're messing with a server often enough that the time spent installing the cable management arm is less than the time you'd spend disconnecting and reattaching cables, you're doing something wrong. The only time I thought they were warranted was back when some higher-end x86 servers came with hot-replace PCI slots so a dead addon card didn't mean a s
              • by EdIII ( 1114411 )

                But Momma said Cable management arms are the work of the devil

                There, fixed that for you :)

                *could not help myself*

              • Cable management arms are a godsend.

                Each of my 2U servers has two power cables, four Ethernet, one serial, and three pairs of fiber coming out the back. Without CMAs, working on these things would be a freakin' PITA - never mind the increased risk that somebody plugs something back in wrong.

                If your CMAs are stressing your cables, they are either really crappy or not installed properly.

                • by afidel ( 530433 )
                  IBM, Dell, and HP's arms ALL fold at an angle that is less than the minimum bend radius for fiber and CAT6a cables.
                  • The spec is 4 times the cable radius, so about 1". Sorry, but all those manufacturers have more than a 1" bend area in their cable management arms.
                  • Well, I don't work [much] with IBM, Dell or HP -- but I thought these things were all pretty much the same.

                    When I load a Sun CMA, I make sure the power cables go in closest to the arm; those are the least important cables in terms of bend radius. Then I install serial, Ethernet, and fiber.

                    Here's what a loaded CMA on a Sunfire v240 (2U) server looks like: http://www.page.ca/~wes/v240-cma.png [www.page.ca]

                    Sorry for the crappy picture, it's all I had. Sun CMAs are made by King Slide.

          • Peer1, in downtown Vancouver, uses hot/cold aisles.
    • Re:First troll! (Score:4, Informative)

      by SuperQ ( 431 ) * on Wednesday March 02, 2011 @06:10PM (#35363194) Homepage

      100-degree hot aisles are too cold. Hot aisles should be at a temperature near the maximum component tolerance of the parts in the server. If a part has a maximum temperature of 150 degrees and runs happily at 120 degrees, the hot aisle should be 120 degrees. That way cooling efficiency is at its highest.

      See Google and SGI (Rackable) container designs.

      http://arstechnica.com/hardware/news/2009/04/the-beast-unveiled-inside-a-google-server.ars [arstechnica.com]

      As you can see from the photo there, all the cables are in the front. No need to get behind the rack, where the hot aisle is.
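      As a rough sketch of that efficiency argument (rule-of-thumb numbers of my own, not Google's or SGI's): the airflow you need scales inversely with the temperature rise across the servers, so a hotter exhaust moves the same heat with less fan work.

        # Back-of-envelope: why a hotter hot aisle means less fan work.
        # Common HVAC rule of thumb for air near sea level:
        #   CFM ~= 3.16 * watts / delta_T_F
        # where delta_T_F is the inlet-to-exhaust temperature rise in Fahrenheit.
        # All numbers here are assumptions, not figures from the article.
        def cfm_required(watts, delta_t_f):
            return 3.16 * watts / delta_t_f

        rack_watts = 10_000               # assumed 10 kW rack
        for hot_aisle_f in (100, 120):    # exhaust (hot-aisle) temperature
            delta_t = hot_aisle_f - 70    # assumed 70F cold-aisle supply
            print(hot_aisle_f, round(cfm_required(rack_watts, delta_t)), "CFM")
        # 100F exhaust -> ~1053 CFM; 120F exhaust -> ~632 CFM for the same 10 kW,
        # i.e. the wider delta-T moves the same heat with about 40% less air.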

      • by rtb61 ( 674572 )

        Data centres need to pay more attention to warehousing for objects of the size they're handling. All the racks should be open, and they should be accessed using automatic picking and packing systems.

        Once the system is built, only the automated picking system should access the racks, to pull defective units and return them to an air-locked workstation and to insert the new data unit. This allows much higher racks, and you can treat the whole data processing and storage area as an industrial process. Bringing in cond

        • by SuperQ ( 431 ) *

          Hell yes. Robot machine replacement would rock. Just like tape swapping. Just bring the busted machines to techs for repair. Or if it's just drive swaps, the robot could do that too.

          In theory you could use the waste heat as a warming plant for other housing/office use. Since most hardware is happy with 80-degree input air, you would have near-80-degree output water. That's more than sufficient to pipe out to warm up homes in cold parts of the world.
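          For a rough sense of scale (assumed figures, not anything from the thread):

            # Waste-heat reuse, back of the envelope (every number is an assumption).
            it_load_mw = 1.0              # 1 MW of IT load ~= 1 MW of low-grade heat
            avg_home_heat_kw = 5.0        # assumed average heating demand per home
            homes_heated = it_load_mw * 1000 / avg_home_heat_kw
            print(int(homes_heated), "homes per MW of IT load, before losses")
            # ~200 homes per MW, ignoring distribution losses and the fact that
            # 80F (~27C) water is low-grade heat that usually needs a heat-pump
            # boost before it's useful for district heating.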

          • To add:

            With all the advanced redundant power, networking, etc, you could even have the servers made or adapted to allow for seamless picking...

            So you go to your data center, ask for Server X1B7, the picker grabs it and brings it to the room/office you're in... all seamless and without losing network or power. The room could even be designed to look and feel like an old-school datacenter.

          • Oh good Lord, no... Given how often my tape robots break. They jam, they get confused... The last thing I need with a downed server is some robot trying to crimp it in two because a roller got worn and lacked grip. Plus, when the robot breaks, how hard will it be for a human to get in and do things manually? Especially when the mechanism is from the low bidder, because the execs will never pay for a good one...
            • by rtb61 ( 674572 )

              It really only works for major data centres. Mid-size ones should have the servers embedded in external walls behind double-glazed doors (an internal atrium), drawing in outside air, conditioning it, and exhausting it or making use of the air temperature in a heat exchanger.

              Another big improvement would be to go all-DC in the server room, with larger external transformers and redundant transformers (externalising a major heat source that can run at very high temperatures), which simplifies battery backup and also adding sol

              • by SuperQ ( 431 ) *

                The problem with DC in the server room is distribution. You would have to go to high-voltage DC to be able to distribute it... and then you would need to step it down a bunch at the rack. If you do a 48V rail, you're going to need huge distribution bars. Think of a typical server rack these days: a 500W 1U server x 40U. That's 20kW per rack. At 48V you're talking 416 amps. A minor install of, say, 10 racks is going to bring you up to 4160 amps. Just guessing, you'd need a bar of copper 2x2" in size to move t
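                Spelling out that arithmetic (a quick sketch; the ~1000 amps per square inch figure for copper bus bar is a common rule of thumb, not a spec):

                  # Redoing the 48V distribution math (assumed figures).
                  server_w, per_rack, racks = 500, 40, 10
                  rack_w = server_w * per_rack             # 20,000 W per rack
                  bus_v = 48.0
                  rack_amps = rack_w / bus_v               # ~417 A per rack
                  total_amps = rack_amps * racks           # ~4,167 A for 10 racks
                  busbar_sq_in = total_amps / 1000.0       # ~1000 A per sq in of copper (rule of thumb)
                  print(round(rack_amps), round(total_amps), round(busbar_sq_in, 1))
                  # -> 417 A per rack (416 if you round down), ~4167 A total,
                  #    ~4.2 sq in of copper, i.e. roughly the 2" x 2" bar guessed above.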

                • by rtb61 ( 674572 )

                  You're thinking of the wrong kind of server. Adjust your thinking for no sound card, no video card, just an energy-efficient CPU, hard disk drives and memory. Also no energy losses in a power supply, or its associated fan. In fact the only fan would be on the CPU, and even that should be eliminated with good design and an effective heat sink. Also adjust your DC conductivity for energy losses http://en.wikipedia.org/wiki/High_Voltage_DC [wikipedia.org]. Apparently there is only about a 30% difference between AC and DC for ampera

      • I was thinking, like most rational people would, that 100 degrees Celsius (the boiling point of water) _would_ be way too hot, and wondered how the servers could keep operating under such extreme conditions. In high-temperature environments like mine sites we use self-contained racks with their own AC unit. We occasionally use empty ones to chill beer.

        But then I read this and realised they were using the backwards Fahrenheit measurement.

        100-degree hot aisles are too cold.

        Now here I agree, 38 Degrees is no

        • Should've just ordered a Prescott and saved yourself the heater.

        • by SuperQ ( 431 ) *

          Sorry, I was just responding in the parent post's units. :(

          • by mjwx ( 966435 )
            No need to apologise, yours just happened to be the closest post, but my rant stands: 6.2 billion people use Celsius, if not more. Why are we being held hostage to an out-of-date, arbitrary temperature system?
  • Seriously! What company is going to pay an extra 10% (guessed figure) on top of the cost so they can have a nice comfy room for their data-rats?

    • by Anonymous Coward

      <smug>
      Any company that cares about the comfort and morale of their employees!
      </smug>

    • Rackforce, for one. Their newest data centre in Kelowna BC has some of the nicest offices, conference rooms, and bathrooms I've seen anywhere. The centre of the building is the "data centre", with the offices around the periphery. Works nicely, looks amazing, and keeps the geeks *and* the suits happy.

  • by Anonymous Coward on Wednesday March 02, 2011 @06:05PM (#35363136)

    I've never had a temp problem in a data center. Noise? Yes. Hot? No.

    • Agreed. I'll take sound deadening over temperature adjustment. I'm admittedly very sensitive, but for each hour I spend at the DC, my ears ring for three.

      • Go to Home Depot and buy a pair of ear plugs. Not the foam disposable ones, but the rubber in-ear type. They're just like some earbuds, and they're connected by a rubber cord.

        Just one thing: don't chew gum or clear your throat. It sounds just awful...

        • by 0123456 ( 636235 )

          Go to Home Depot and buy a pair of ear plugs.

          Nah, get a proper set of ear protectors; mine are probably the best $25 I ever spent.

          • by Anonymous Coward

            I got a pair of electronic headphones at Harbor Freight Tools for like 15 bucks; they're meant for a shooting range. Basically they block out loud noises and, through a microphone and speaker, re-create low-decibel sounds such as talking. So you can be firing your loud-ass gun and hear people talking at a normal volume without having to remove the headphones. This works wonders in any environment with loud noises.

        • by SuperQ ( 431 ) *

          Better than Home Depot, Etymotic full-frequency plugs are great:

          http://www.etymotic.com/ephp/er20.html [etymotic.com]

          I've also had some friends use things like these in datacenters:

          http://www.amazon.com/dp/B00009363P [amazon.com]

          They let you hear people talk (bandpass filter) without letting the low/high noise in.

          • I've got a pair of Peltor Tactical Sports. I use them while at the shooting range, but they're a godsend in the DC as well!

            • by h4rr4r ( 612664 )

              Are they worth it?
              I have a .300 Win Mag with a gun loudener (muzzle brake) and it seems no muffs I own are up to that challenge. I end up wearing plugs and muffs at the same time.

              • Are they worth it?
                I have a .300 Win Mag with a gun loudener (muzzle brake) and it seems no muffs I own are up to that challenge. I end up wearing plugs and muffs at the same time.

                "Holster. Bandoleer. Silencer. Loudener. Speed-cocker. And this thing's for shooting down police helicopters."

              • You'd still want those. The plus side: you can turn them on and crank the low-level sounds up, so you'll be able to hear more than your breathing when you're not firing.

              • Are they worth it? I have a .300 Win Mag with a gun loudener (muzzle brake) and it seems no muffs I own are up to that challenge. I end up wearing plugs and muffs at the same time.

                A "loudener"? Seriously? Muzzle brakes do not make the gun "louder" - they redirect exhaust gases in multiple directions for felt-recoil reduction, and to quieten the firearm somewhat.

        • That's great, except when I'm spending most of those hours on the phone with a vendor, trying to fix the overpriced hunk of junk they sold us.

    • Everything's remotely managed these days anyway, so who cares how hot / cold / loud the cabinets are? Scurry in to hot-replace whatever and scurry back out to the cubi.
    • I managed a data center. Temperatures like that radiating from servers are bad, bad news. That's an obvious airflow problem.

      Plus, there is emerging technology to use sound waves for refrigeration. I wonder when they'll deploy it for data centers?

  • by Anonymous Coward on Wednesday March 02, 2011 @06:06PM (#35363138)

    No.

  • by confused one ( 671304 ) on Wednesday March 02, 2011 @06:06PM (#35363140)
    This is a marketing ploy to attract customers to a new data center. Ultimately, cost will determine the layout. If a cube is cheaper, then cubes it will be. If a 100-degree hot aisle saves money versus an 85-degree hot aisle, then they'll run them hotter.
    • by Damek ( 515688 )

      Human costs are important. We're forgetting that, hence idiocy like Wisconsin. If you want to be "old-school economics" about it, all costs should be accounted for, including the ones that stakeholder humans bring up that you may not have realized (employees are stakeholders, even if not stockholders). If you want to be currently capitalist about it, don't bother accounting for any costs that aren't affecting today's golf game. Have fun watching the planet burn, then.

      The true "tragedy of the commons" is epitomized i

      • Nice sentiments and all. I'm living the "old school economics" tragedy where I'm stuck in an 8' x 5.5' cubicle reporting to a PHB who thinks the software I create doesn't add any value to the company. My only stake is the one that keeps getting driven into my back.
    • This is a marketing ploy to attract customers to a new data center. Ultimately, cost will determine the layout. If a cube is cheaper, then cubes it will be. If a 100-degree hot aisle saves money versus an 85-degree hot aisle, then they'll run them hotter.

      And who wants to invite visitors into a data center? What're they gonna see? Hundreds/thousands of das blinken lights?

  • Wimps (Score:5, Funny)

    by RobotRunAmok ( 595286 ) on Wednesday March 02, 2011 @06:08PM (#35363166)

    In my day Data Centers were at the top of snow mountains which you had to climb barefooted or be turned away. We built them to keep the machinery happy, not the people, whom we preferred behaved like machinery.

    We liked our Data Centers the way we liked our women: Bright, White, Antiseptic, and Bitterly Cold.

  • How about getting rid of hand scanners?

    I used to work in colo facilities for years, and the one thing that always concerned me was that some person who had gone through the mantrap before me had some awful bacteria/virus on their hands.

    If the handle on the toilet in the airport can claim that it has an anti-bacterial coating, do you think the hand scanner manufacturers could do the same?

    • by X0563511 ( 793323 ) on Wednesday March 02, 2011 @06:37PM (#35363414) Homepage Journal

      Unless you've lived in a bubble your whole life, you're probably going to be OK...

      • I hope you don't assume that everyone washes their hands after coming out of the bathroom.

        While I don't know about you, I personally would not want to shake the hands of some guy that just dropped a bomb and didn't wash his hands.

        • You would be OK; none of my coworkers have died of dysentery yet...

          Now, more seriously: relax. Look at how people have historically lived. How long have we had drinkable, treated water in our homes? Or been almost assured of a john (is that what it's called?) nearby when we need it? And you are probably better fed, have had more vaccines, and have access to more and better physicians than 99.9% of the rest of mankind that has ever been. And even with those disadvantages, most of these people did not d

    • Is there any reasonable difference between hand scanners and doorknobs that would warrant different treatment?
      You get the same risks just by using the same door as others without wearing surgical gloves and discarding them afterwards.

  • All the data centers I have worked at (USA and Singapore) had some kind of lounge/relaxing room with games, food vending machines, coffee, meeting rooms you can rent, showers, etc. Maybe they just forgot to mention it to their existing customers? Or maybe Equinix (not my company) is doing a better job of taking care of their customers? And they're still aiming at 30% yearly ROE, so I can't see how a few dedicated rooms would hurt the bottom line that much.

  • Haha! We tried this back in 2000 and it didn't work out. Company tanked, got sold for pennies on the dollar. Herakles (new name) is, however, still a really nice facility.
  • Remote Management (Score:4, Insightful)

    by eln ( 21727 ) on Wednesday March 02, 2011 @06:27PM (#35363336)
    If you've got remote management set up properly, the only reason you ever even need to go to the data center is due to some kind of hardware failure. There's no sense paying the extra money a place like this will have to charge (to recoup the cost of all those extra amenities) for colo space if you only need to physically visit your servers maybe once or twice a year.
    • Not just hardware failures but any sort of scheduled physical change as well. Among other things: device upgrades; server, switch and router installation and removal; cabling changes; backup media changes; UPS maintenance; rack moves.

      The last data center maintenance I did, we had to move "only" seven racks' worth of gear from one floor to another. It took place in four carefully planned phases spanning two months. We had eight people working at it for the first week, then the two senior guys p
    • TFA discussed a colo with 2x 3MW suites. If 6MW of space has only 1 or 2 hardware failures per year, they're overpaying for hardware reliability. I guarantee it would be cheaper to have on-site staff than to pay enough for hardware reliability to not require on-site staff, even if those staff get well-appointed offices.
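      A quick sketch of the scale involved (my assumptions, not TFA's numbers):

        # Why 6 MW of critical load implies frequent hands-on work.
        critical_load_w = 6_000_000
        watts_per_server = 500                  # assumed average draw per server
        servers = critical_load_w // watts_per_server        # ~12,000 servers
        annual_failure_rate = 0.04              # assumed ~4% of servers need a
                                                # hands-on fix (disk, PSU, DIMM) per year
        failures_per_year = servers * annual_failure_rate
        print(servers, "servers ->", int(failures_per_year), "hardware events/year")
        # ~12,000 servers -> ~480 events per year, i.e. more than one a day,
        # which is why on-site (or nearby) staff usually pencils out cheaper.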
      • by eln ( 21727 )
        The amenities, as I read it, seem to be geared toward the customers of a colo facility. While the actual data center company needs some on-site staff, the individual colo clients will each only be taking up a small fraction of the space, and will rarely need to visit the facility. Given that the data center company will need to charge more to pay for these amenities, their customers are paying for amenities that they (the customer) should only get any use out of once or twice a year. From the customer's
  • by NikeHerc ( 694644 ) on Wednesday March 02, 2011 @06:34PM (#35363386)
    The folks in India won't care how hot or cold it is in the data centers over here.
  • by Anonymous Coward

    I manage an R&D lab with a few hundred racks. We constantly swap out hardware for testing. This makes sitting in the lab the most efficient way of getting things done. Unfortunately my work space where I'm currently sitting reads 90F and I'm wearing noise canceling headphones, which squeeze my head into oblivion after an hour or two. We're working on redesigning a storage room into office space to improve our quality of life. Our goal is to work within a few yards of the gear. As long as an office space

  • Where I worked (Score:2, Interesting)

    by Anonymous Coward

    I've worked in several data centers. An IBM one had air-cooled servers (cold air pushed into the floor, with every rack having a massive fan pulling cold air up from the floor). It was about 20C day and night. The floor tiles would occasionally break, which caused problems when pulling skids of bills over them (it was a phone-bill processing and print facility). We would also go through 30 tons of paper per month (2x 1000lb skids of paper per day). There was a lot of paper dust in the place, and the paper was

    • Re:Where I worked (Score:4, Insightful)

      by Mr. Freeman ( 933986 ) on Wednesday March 02, 2011 @09:45PM (#35365024)
      "No emergency backup lights."
      You're aware this is illegal, yes? "My boss is cheap and doesn't care" isn't an excuse. Call the fire marshal and tell them about it. They'll come down and write the owner up a ticket and force him to install the safety equipment.

      It always surprises me how many idiots have the motivation and intelligence to bitch about unsafe working conditions on the internet, but not to the fire marshal or OSHA.
  • by rsilvergun ( 571051 ) on Wednesday March 02, 2011 @06:48PM (#35363530)
    Not the future. Didn't you get the memo? Capitalism says the cheapest is best, and amenities cost money. You might see some for the visitors (gotta keep the client happy), but you think they'll pay to keep the place cool for the admins? Not a chance.
    • by Damek ( 515688 )

      Not until admins unionize, anyway. Which, luckily, they won't, because they're all libertarians and don't value anything more than money.

  • Not using rent-a-cop security is good!

  • If you have some things that need certain conditions, and other things that need different conditions, then you have a problem.

    Fortunately, I have a solution. It's called putting a wall between them, you fucktards.

  • Only in those situations where it makes for additional income.

  • In my experience people keep the datacenter way too cold. If the equipment runs fine at 70F, then set the CRAC to that temperature. If the hot aisle is working right, everything will be cooled within specifications.
    • by afidel ( 530433 )
      We have our setpoint at 72 and, even without hot-aisle containment (but with proper hot/cold design), everything is fine. We only lose about 1.5% per year on HDDs, and basically no other components in statistically significant numbers (maybe 3 PSUs in 5 years).
  • by fahrbot-bot ( 874524 ) on Wednesday March 02, 2011 @07:39PM (#35364018)

    ... amenities for staff and visitors ...

    I want a pony.

    • by syousef ( 465911 )

      ... amenities for staff and visitors ...

      I want a pony.

      I want amenities for the staff. I hear machines don't work so well when you urinate on them.

  • by OgGreeb ( 35588 ) <og@digimark.net> on Wednesday March 02, 2011 @07:44PM (#35364044) Homepage

    I've been renting facility space in a number of data centers over the last fifteen years, including Exodus (remember them?), IBM, and Equinix. Equinix facilities in particular have provided meeting rooms, work areas, (seriously locked-down) access terminals, great bathrooms, and showers for visiting techs for at least 5-7 years. OK, the actual cage areas are pretty cold, but that's the nature of the beast -- I wouldn't want my equipment to overheat. Equinix also has tools you can check out if you forgot yours or are missing something critical, and racks of screws, bolts, optical wipes, common cable adapters, blue Cisco terminal cables... just in case. (Other than paying them for service, I'm not affiliated with Equinix, nor do I own their stock. But perhaps I should have.)

    I would always look forward to the free machine hot-chocolate when visiting for work assignments.

    • I'll second your opinion on Equinix. The data centers I have frequented of theirs even have arcade machines in a breakroom. I rarely see places as well-managed or designed as theirs.

  • by Anonymous Coward

    This troll was good, though my favorites are more like "My boss asked me to spend $5 million upgrading the machine room but I've never done this before, so do you have any advice? Should I include comfy chairs?" or "I'm considering upgrading my skills, do you think it would be worth it to learn Javascript or should I just go to grad school at MIT?" Or sometimes, "I'm having a big fight with my boss, can you give me some evidence that Erlang is really the programming language of the future?" I love slashdo

  • Heat: Data centers should be cool. Everyone wants to do things as cheaply as possible, so they spot cool the racks instead of circulating the air and cooling the entire room. Nothing short of abandoning this practice will remove the "it sucks to be in here" factor. The problem isn't so much that it's 100 degrees, but that it's 100 degrees on one side of the rack and 40 degrees on the other. Spend a bit more on cooling costs and get that to 80/60 or even 90/50 and workers will be much less miserable (an

  • If you have a datacenter large and serious enough that you've got a full hot/cold aisle setup, deafening fans, etc. rather than just a rack frame in a closet somewhere, people being in there is supposed to be unusual. Unless a piece of hardware is being swapped out, as fast as your screwdrivers will carry you, or somebody fucked up in a network-unrecoverable way, why are there humans inside at all?
  • If it does, then the answer is a resounding no. NO. NO!!! NoooOOOOOOooooOOOOOOooooOOOOOOooooo!!!!!!!!!!!!!!!!!!!!!!!
  • by Desmoden ( 221564 ) on Wednesday March 02, 2011 @09:50PM (#35365072) Homepage

    It's not that I don't like humans, hell I married one. However humans are unpredictable. Applications want and need predictable hardware to live on. Even in a "CLOUD" with floating VMs that fly around like Unicorns you want stable predictable hardware underneath.

    Humans trip on things, excrete fluids and gases, need oxygen and light, are temperature sensitive, and, depending on whose stats you believe, cause up to 70% of outages.

    I see convergence, virtualization etc as a chance to finally get humans OUT of the data center. Build it out, cable everything. Then seal it. Interaction does NOT require physical access. And a team of dedicated obsessive compulsive robots or humans can replace memory, drives etc.

    Data Centers need to be human FREE zones. Not the common room in a dorm.


    • Even in a "CLOUD" with floating VMs that fly around like Unicorns you want stable predictable hardware underneath.

      Simply untrue. When you have a "CLOUD" that's big enough, the failure rate itself is stabilized. But you still need humans to work it. Human techs are way cheaper than machines that are so reliable that they don't need them.
      • For some apps, data center RAID works well. Build 10 low-cost data centers, mirror 5->5, run like the wind.

        However, due to the transactional nature of some Applications this is not true.

        For example, Google's model works great for search.

        However, it didn't keep those 150,000 users from losing their data, did it?

        Once the transaction is done, you can sync the results, but a failure mid transaction can be VERY bad.

  • by lcllam ( 714572 )
    Data centers are utility rooms and serve a utility purpose. Aside from the showoff trips for the clients, they are probably factored as such and will be closer to a boiler room than an office. Ever see a nicely decked out boiler room?
  • Shouldn't it be Windows-friendly, since they have to be there the first Tuesday of the month?

  • That's boiling-water temperature! I can just imagine the piles of cooked sysadmins there.

  • Are people still using crash carts for routine maintenance, in this day of ubiquitous KVM? In our datacenter every other row has a desk with two terminals with access to the KVM for those two rows. Unfortunately it's in the cold aisle, which makes it a little chilly to work there if you didn't bring a coat. I have been known to take breaks in the nearest hot aisle to warm up.

    A crash cart should be used for diagnosis and installs. KVM for everything else.
