Making Data Centers More People-Friendly 137
1sockchuck writes "Data centers are designed to house servers, not people. This has often meant trade-offs for data center staffers, who brave 100-degree hot aisles and perform their work at laptop carts. But some data center developers are rethinking this approach and designing people-friendly data centers with Class-A offices and amenities for staff and visitors. Is this the future of data center design?"
First troll! (Score:1)
Re: (Score:2)
Re: (Score:1)
In the data centers I work in, the exhaust is sucked out the top of the racks, and every few tiles in the aisles there is a tile with a bunch of holes in it acting as an air-conditioning vent.
If anything, I freeze at times if I'm in shorts and flip-flops.
Re: (Score:1)
Re:First troll! (Score:4, Interesting)
The data center I visit most right now has hot/cold aisles. It looks more like a meat processing center with all the heavy plastic drapes. They go from floor to ceiling every other aisle. On the front of the racks they even put in plastic placeholders for gaps where we don't have equipment installed yet to maximize air flow through the equipment. They did it too, we never even had to ask.
Most of the time we work from the cold aisle with our laptop carts, and it is *cold*. The article is confusing because I can't see why you would need to sit with a cart in the hot aisle to work. You can install your equipment and cabling in such a way that you don't need access to the hot aisle for anything other than full server swap-outs and cabling swap-outs, and that's pretty much it. You can replace the hard drives from the front of the units, and maintain the server just by pulling it out the front after disconnecting the cables, if you need to do so. Most big 4U servers come with cable management arms that allow you to keep "service loops" so that you don't need to disconnect anything to pull the server out on the rails.
Heck, if you need to, just get a 15ft networking cable and thread it through into the cold aisle. You don't have to sit in the heat if you don't want to. Although I'm a big guy and I like the cold, it's funny as hell to see the skinny bastards walking over to the hot aisle to warm up.
Re: (Score:2)
Re: (Score:3)
But Momma said Cable management arms are the work of the devil
There fixed that for you :)
*could not help myself*
Re: (Score:2)
Cable management arms are a God Send.
Each of my 2U servers has two power, four ethernet, one serial, and three pairs of fiber coming out the back. Without CMAs working on these things would be a freakin' PITA - never mind increasing the risk that somebody plugs something back in wrong.
If your CMAs are stressing your cables, they are either really crappy or not installed properly.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Well, I don't work [much] with IBM, Dell or HP -- but I thought these things were all pretty much the same.
When I load a Sun CMA, I make sure the power cables go in closest to the arm; these are the least important cables in terms of bend radius. Then I install serial, ethernet, and fiber.
Here's what a loaded CMA on a Sunfire v240 (2U) server looks like: http://www.page.ca/~wes/v240-cma.png [www.page.ca]
Sorry for the crappy picture, it's all I had. Sun CMAs are made by King Slide.
Re: (Score:2)
Re:First troll! (Score:4, Informative)
100-degree hot aisles are too cold. Hot aisles should be at the temperature near the maximum component tolerance of the parts in the server. If a part has a maximum temperature of 150 degrees and runs happily at 120 degrees, the hot aisle should be 120 degrees. This way the cooling efficiency is the highest.
See Google and SGI (Rackable) container designs.
http://arstechnica.com/hardware/news/2009/04/the-beast-unveiled-inside-a-google-server.ars [arstechnica.com]
As you can see from the photo there, all the cables are in the front. No need to get behind it, where the hot aisle is.
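The parent's efficiency point can be made concrete with the standard sensible-cooling rule of thumb, CFM = BTU/hr / (1.08 × ΔT°F): the hotter you let the hot aisle run, the less air you have to move for the same heat load. A sketch (the 120-degree figure is the parent's example; the 20 kW rack and 70 °F inlet are illustrative assumptions):

```python
# Airflow needed to remove a given heat load at different hot-aisle temps.
# Rule of thumb for sensible cooling: CFM = BTU/hr / (1.08 * dT_F).
def cfm_required(load_watts, inlet_f, hot_aisle_f):
    btu_hr = load_watts * 3.412           # convert watts -> BTU/hr
    return btu_hr / (1.08 * (hot_aisle_f - inlet_f))

load = 20000                              # a 20 kW rack (illustrative)
for hot in (100, 120):                    # cooler vs. hotter hot aisle, deg F
    print(f"hot aisle {hot} F: {cfm_required(load, 70, hot):,.0f} CFM")
```

The wider the inlet/outlet spread, the less fan work per watt removed, which is what the container designs exploit.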
Re: (Score:2)
Data centres need to pay more attention to warehousing practice for objects of the size they are handling. All the racks should be open, and they should be accessed using automatic picking and packing systems.
Once the system is built, only the automated picking system should access the racks, to pick defective units and return them to an air-locked workstation and to insert the new data unit. This allows much higher racks, and you can treat the whole data processing and storage area as an industrial process. Bringing in cond
Re: (Score:2)
Hell yes. Robot machine replacement would rock. Just like tape swapping. Just bring the busted machines to techs for repair. Or if it's just drive swaps the robot could do that too.
In theory you could use the waste heat as a warming plant for other housing/office use. Since most hardware is happy with 80-degree input air, you would have near-80-degree output water. That's more than sufficient to pipe to warm up homes in cold areas of the world.
Re: (Score:2)
To add:
With all the advanced redundant power, networking, etc, you could even have the servers made or adapted to allow for seamless picking...
So you go to your data center, ask for Server X1B7, the picker grabs it and brings it to the room / office you are in... all seamless and without losing network or power. The room could even be designed to look and feel like an old-school datacenter.
Better be more reliable than my tape libraries (Score:2)
Re: (Score:2)
It really only works for major data centres. Mid-size ones should have the servers embedded in external walls behind double-glazed doors (internal atrium), drawing in outside air, conditioning it and exhausting it, or making use of the air temperature in a heat exchanger.
Another big improvement would be to go all-DC in the server room, with larger external transformers and redundant transformers (this externalises a major heat source that can run at very high temperatures); it also simplifies battery backup, and allows adding sol
Re: (Score:2)
The problem with DC in the server room is distribution. You would have to go high-voltage DC to be able to distribute it... and then you would need to step it down a bunch at the rack. If you do a 48V rail, you're going to need huge distribution bars. Think of a typical server rack these days: a 500W 1U server x 40U. That's 20kW per rack. At 48V you're talking 416 amps. A minor install of, say, 10 racks brings you up to 4,160 amps. Just guessing, you'd need a bar of copper 2x2" in size to move that much current.
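The arithmetic above is just Ohm's-law bookkeeping, and it's easy to see why the bus voltage dominates. A sketch (the 500 W x 40U rack and 10-rack row are the parent's figures; 380 V is one common HVDC distribution voltage, used here as an illustrative comparison):

```python
# Current on a DC bus for a given load (ignores conversion losses).
def bus_current(watts, volts):
    """I = P / V: current drawn from the bus at a given voltage."""
    return watts / volts

rack_w = 500 * 40        # 500 W per 1U server x 40U = 20 kW per rack
row_w = rack_w * 10      # a modest 10-rack row = 200 kW

for v in (48, 380):      # 48 V rail vs. a typical HVDC distribution voltage
    print(f"{v:>4} V bus: {bus_current(row_w, v):,.0f} A for the row")
```

At 48 V the row draws over 4,000 A; pushing the same power at a few hundred volts cuts the conductor cross-section by roughly the voltage ratio, which is why HVDC distribution (and per-rack step-down) keeps coming up.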
Re: (Score:2)
You're thinking of the wrong kind of server. Adjust your thinking for no sound card, no video card, just an energy-efficient CPU, hard disk drives and memory. Also no energy losses in a power supply, or its associated fan. In fact the only fan would be on the CPU, and even that should be eliminated with good design and an effective heat sink. Also adjust your DC conductivity for energy losses http://en.wikipedia.org/wiki/High_Voltage_DC [wikipedia.org]. Apparently there is only about a 30% difference between AC and DC for ampera
Fecking Fahrenheit (Score:2)
But then I read this and realised they were using the backwards Fahrenheit measurement.
Now here I agree, 38 Degrees is no
Re: (Score:2)
Should've just ordered a Prescott and saved yourself the heater.
Re: (Score:2)
Sorry, I was just responding in the parent post's units. :(
Re: (Score:2)
Yours just happened to be the closest post.
but my rant stands,
6.2 billion people use Celsius, if not more. Why are we being held hostage to an out-of-date, arbitrary temperature system?
Re: (Score:1)
Not to mention that people do not operate too well at 120 degrees.
Re: (Score:2)
Ever lived in Phoenix? :-)
Re:First troll! (Score:4, Funny)
Re: (Score:2)
He's also right that the cooler you keep electrical equipment running, the longer it will last, generally, though fl
Re: (Score:2)
This is why most data centers are kept so cool.
Keeping your servers hot (but not too hot) will enable you to reuse the heat from the datacenter for other purposes. Keeping your servers cool wastes energy cooling them, and wastes energy again when re-using the heat you extracted from the datacenter.
The tricky part is the temperature differential between 'inlet' and 'outlet'. If you pump a square yard of air through your server every second, the differential will be low, and you have generated low-quality heat. Slow down the flow and the differential wi
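The inverse relationship the parent describes falls out of the sensible-heat equation Q = ṁ·cp·ΔT: for a fixed heat load, halving the airflow doubles the outlet rise. A sketch (the 500 W load and the two flow rates are illustrative assumptions, not figures from the post):

```python
# Fixed server heat load Q = m_dot * cp * dT, so dT = Q / (m_dot * cp).
CP_AIR = 1005.0          # specific heat of air, J/(kg*K)
RHO_AIR = 1.2            # approximate density of air, kg/m^3

def outlet_rise(load_w, flow_m3s):
    """Temperature rise across the server for a given volumetric airflow."""
    m_dot = flow_m3s * RHO_AIR          # mass flow, kg/s
    return load_w / (m_dot * CP_AIR)

load = 500.0             # a 500 W server (illustrative)
for flow in (0.5, 0.1):  # m^3/s: brisk vs. throttled-back airflow
    print(f"{flow} m^3/s -> dT = {outlet_rise(load, flow):.1f} K")
```

Throttling the fans concentrates the same watts into less air, which is exactly what raises the "quality" (temperature) of the recoverable heat.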
Re: (Score:2)
Who's going to pay for this? (Score:2)
Seriously! What company is going to pay an extra 10% (guessed figure) on top of the cost so they can have a nice comfy room for their data-rats?
Re: (Score:1)
<smug>
Any company that cares about the comfort and morale of their employees!
</smug>
Re: (Score:2)
Rackforce, for one. Their newest data centre in Kelowna BC has some of the nicest offices, conference rooms, and bathrooms I've seen anywhere. The centre of the building is the "data centre", with the offices around the periphery. Works nicely, looks amazing, and keeps the geeks *and* the suits happy.
100-degree hot aisles? (Score:3, Insightful)
I've never had a temp problem in a data center. Noise? yes, hot? no.
Re: (Score:3)
Agreed. I'll take sound deadening over temperature adjustment. I'm admittedly very sensitive, but for each hour I spend at the DC, my ears ring for three.
Re: (Score:3)
Go to Home Depot and buy a pair of ear plugs. Not the foam disposable ones, but the rubber in-ear type. They're just like some earbuds, and they're connected by a rubber cord.
Just one thing: don't chew gum or clear your throat. It sounds just awful...
Re: (Score:2)
Go to home depot and buy a pair of ear plugs.
Nah, get a proper set of ear protectors; mine are probably the best $25 I ever spent.
Re: (Score:1)
I got a pair of electronic headphones at Harbor Freight Tools for like 15 bucks; they are meant for a shooting range. Basically they block out loud noises, and through a microphone and speaker they re-create low-decibel noises such as talking. So basically you can be firing your loud-ass gun and hear people talking at a normal voice without having to remove the headphones. This works wonders in any environment with loud noises.
Re: (Score:3)
Better than home depot, Etymotic full frequency plugs are great:
http://www.etymotic.com/ephp/er20.html [etymotic.com]
I've also had some friends use things like these in datacenters:
http://www.amazon.com/dp/B00009363P [amazon.com]
They let you hear people talk (bandpass filter) without letting the low/high noise in.
Re: (Score:2)
I've got a pair of Peltor Tactical Sports. I use them while at the shooting range, but they're a godsend in the DC as well!
Re: (Score:2)
Are they worth it?
I have a 300 Win Mag with a gun loudener (muzzle brake) and it seems no muffs I own are up to that challenge. I end up wearing plugs and muffs at the same time.
Re: (Score:1)
Are they worth it?
I have a 300 Win Mag with a gun loudener (muzzle brake) and it seems no muffs I own are up to that challenge. I end up wearing plugs and muffs at the same time.
"Holster. Bandoleer. Silencer. Loudener. Speed-cocker. And this thing's for shooting down police helicopters."
Re: (Score:2)
You'd still want those. The plus side, you can turn them on and crank the low level sounds up, so you'll be able to hear more than your breathing when you're not firing.
Re: (Score:2)
Are they worth it? I have a 300 Win Mag with a gun loudener (muzzle brake) and it seems no muffs I own are up to that challenge. I end up wearing plugs and muffs at the same time.
A "loudener"? Seriously? Muzzle brakes do not make the gun "louder" - they redirect exhaust gases in multiple directions for felt-recoil reduction, and to quieten the firearm somewhat.
Re: (Score:2)
It reduces recoil; it also sure as hell sounds louder. Do you own such a device? I do.
Here is the first google link to actual shooters discussing this issue.
http://thefiringline.com/forums/showthread.php?t=93453 [thefiringline.com]
Muzzle brakes do in fact make a gun louder.
Re: (Score:2)
That's great, except when I'm spending most of those hours on the phone with a vendor, trying to fix the overpriced hunk of junk they sold us.
Re: (Score:1)
Exactly. (Score:2)
I managed a data center. Temperatures like that radiating from servers are bad, bad news. That is an obvious bad-airflow problem.
Plus, there is emerging technology to use sound waves for refrigeration. I wonder when they'll deploy it for data centers?
Is this the future of data center design? (Score:5, Insightful)
No.
Not becoming the standard (Score:4, Insightful)
Re: (Score:3)
Human costs are important. We're forgetting that, hence idiocy like Wisconsin. If you want to be "old-school economics" about it, all costs should be accounted for, including those that stakeholder humans bring up that you may not have realized (employees are stakeholders, even if not stockholders). If you want to be currently capitalist about it, don't bother accounting for any costs that aren't affecting today's golf game. Have fun watching the planet burn, then.
The true "tragedy of the commons" is epitomized i
Re: (Score:2)
Re: (Score:2)
Yeah, cuz we're just in a glut of jobs at the moment. Screw this "everything's a market" crap. People should f'in organize already.
Re: (Score:2)
This is a marketing ploy to attract customers to a new data center. Ultimately cost will determine the layout. If a cube farm is cheaper, then cubes it will be. If 100-degree hot aisles save money vs. 85-degree hot aisles, then they'll run them hotter.
And who wants to invite visitors into a data center? What're they gonna see? Hundreds/thousands of das blinken lights?
Wimps (Score:5, Funny)
In my day Data Centers were at the top of snow mountains which you had to climb barefooted or be turned away. We built them to keep the machinery happy, not the people, whom we preferred behaved like machinery.
We liked our Data Centers the way we liked our women: Bright, White, Antiseptic, and Bitterly Cold.
Re:Wimps (Score:5, Funny)
We liked our Data Centers the way we liked our women:
Hot. And always going down.
Re: (Score:3)
We liked our Data Centers the way we liked our women:
Hot. And always going down.
Charlie Sheen, is that you?
Re: (Score:2)
Charlie Sheen, is that you?
I'm in better shape than Sheen. No booze or drugs. And I stay in good enough physical condition that I can handle three porn stars without getting a hernia.
Re: (Score:2)
With a big rack.
Hand Scanners... (Score:1)
How about getting rid of hand scanners?
I used to work in colo facilities for years, and the one thing that always concerned me was that some person who had gone through the mantrap before me might have had some awful bacteria/virus on their hands.
If the handle on the toilet in the airport can claim that it has an anti-bacterial coating, do you think the hand scanner manufacturers could do the same?
Re:Hand Scanners... (Score:4, Insightful)
Unless you've lived in a bubble your whole life, you're probably going to be OK...
Re: (Score:2)
I hope you don't assume that everyone washes their hands after coming out of the bathroom.
While I don't know about you, I personally would not want to shake the hands of some guy that just dropped a bomb and didn't wash his hands.
Re: (Score:3)
You would be OK; none of my coworkers have died of dysentery yet...
Now more seriously, relax. Look at how people have historically lived. How long have we had drinkable, controlled water in our homes? Or been almost assured of a john (is that what it's called?) nearby when we need it? And you are probably better fed, have had more vaccines, and have access to more and better physicians than 99.9% of the rest of mankind that has ever been. And even with those disadvantages, most of these people did not d
Re: (Score:2)
Is there any reasonable difference there between hand scanners and doorknobs that would warrant different treatment ?
You get the same risks just by using the same door as others w/o wearing surgical gloves and discarding them afterwards.
Confused here (Score:1)
All the data centers I have worked at (USA and Singapore) had some kind of lounge/relaxing room with games, food vending machines, coffee, meeting rooms you can rent, showers, etc. Maybe they just forgot to mention it to their existing customers? Or maybe Equinix (not my company) is doing a better job at taking care of their customers? And they're still aiming at 30% yearly ROE; I can't see where a few dedicated rooms would hurt the bottom line so much.
Yeah, we also did that 10 years ago (Score:2)
Our data centers also had customer-friendly space. I think it was mostly inside the "show ID to a guard" area, but it was as important a part of the design as the racks and cages.
BTDT (Score:1)
Remote Management (Score:4, Insightful)
Re: (Score:3)
The last data center maintenance I did, we had to move "only" seven racks' worth of gear from one floor to another. It took place in four carefully planned phases spanning two months. We had eight people working at it for the first week, then the two senior guys p
Re: (Score:2)
Re: (Score:2)
data center comfort (Score:3, Insightful)
R&D labs: yes. Classic DC? no (Score:1)
I manage an R&D lab with a few hundred racks. We constantly swap out hardware for testing. This makes sitting in the lab the most efficient way of getting things done. Unfortunately my work space where I'm currently sitting reads 90F and I'm wearing noise canceling headphones, which squeeze my head into oblivion after an hour or two. We're working on redesigning a storage room into office space to improve our quality of life. Our goal is to work within a few yards of the gear. As long as an office space
Where I worked (Score:2, Interesting)
I've worked in several data centers. An IBM one had air-cooled servers (push cold air into the floor, and every rack has a massive fan pulling cold air up from the floor). It was about 20C day and night. The floor tiles would occasionally break, which caused problems when pulling skids of bills over them (it was a phone bill processing and print facility). We would also go through 30 tons of paper per month (2x 1000lb skids of paper per day). There was a lot of paper dust in the place, and the paper was
Re:Where I worked (Score:4, Insightful)
You're aware this is illegal, yes? "My boss is cheap and doesn't care" isn't an excuse. Call the fire marshal and tell them about it. They'll come down and write the owner up a ticket and force him to install the safety equipment.
It always surprises me, the number of idiots who have the motivation and intelligence to bitch about unsafe working conditions on the internet, but not to the fire marshal or OSHA.
Re: (Score:2)
Every OSHA poster where I work makes it perfectly clear that retaliation of any kind is illegal. I doubt anyone would have trouble suing.
Nope (Score:3)
Re: (Score:2)
Not until admins unionize, anyway. Which, luckily, they won't, because they're all libertarians and don't value anything more than money.
not using rent-a-cop security is good! (Score:2)
Not using rent-a-cop security is good!
Re: (Score:2)
What kind of security do you prefer?
These things (Score:2)
If you have some things that need certain conditions, and other things that need different conditions, then you have a problem.
Fortunately, I have a solution. It's called putting a wall between them, you fucktards.
Of course not (Score:2)
Only in those situations where it makes for additional income.
Way too darn cold (Score:1)
Re: (Score:2)
amenities (Score:4, Funny)
I want a pony.
Re: (Score:2)
I want a pony.
I want amenities for the staff. I hear machines don't work so well when you urinate on them.
Centers have had tech-friendly amenities for years (Score:4, Informative)
I've been renting facility space in a number of data centers over the last fifteen years, including Exodus (remember them?), IBM and Equinix. In particular, Equinix facilities have always provided meeting rooms, work areas, (seriously locked-down) access terminals, great bathrooms and showers for visiting techs, for at least 5-7 years. OK, the actual cage areas are pretty cold, but that's the nature of the beast -- I wouldn't want my equipment to overheat. Equinix also has tools you can check out if you forgot yours or were missing something critical, and racks of screws, bolts, optical wipes, common cable adapters, blue Cisco terminal cables... just in case. (Other than paying them for service, I'm not affiliated with or owning stock in Equinix. But perhaps I should have.)
I would always look forward to the free machine hot-chocolate when visiting for work assignments.
Equinix (Score:2)
I'll second your opinion on Equinix. The data centers I have frequented of theirs even have arcade machines in a breakroom. I rarely see places as well-managed or designed as theirs.
admiring the skillfulness of slashdot articles (Score:2, Funny)
This troll was good, though my favorites are more like "My boss asked me to spend $5 million upgrading the machine room but I've never done this before, so do you have any advice? Should I include comfy chairs?" or "I'm considering upgrading my skills, do you think it would be worth it to learn Javascript or should I just go to grad school at MIT?" Or sometimes, "I'm having a big fight with my boss, can you give me some evidence that Erlang is really the programming language of the future?" I love Slashdot.
Heat and Noise (Score:1)
Heat: Data centers should be cool. Everyone wants to do things as cheaply as possible, so they spot-cool the racks instead of circulating the air and cooling the entire room. Nothing short of abandoning this practice will remove the "it sucks to be in here" factor. The problem isn't so much that it's 100 degrees, but that it's 100 degrees on one side of the rack and 40 degrees on the other. Spend a bit more on cooling costs and get that to 80/60 or even 90/50 and workers will be much less miserable (an
Missing the point much? (Score:2)
Does cost matter? (Score:2)
I'm moving toward human-free data centers..... (Score:4, Insightful)
It's not that I don't like humans, hell I married one. However humans are unpredictable. Applications want and need predictable hardware to live on. Even in a "CLOUD" with floating VMs that fly around like Unicorns you want stable predictable hardware underneath.
Humans trip on things, excrete fluids and gases, need oxygen and light, are temperature-sensitive, and, depending on whose stats you believe, cause up to 70% of outages.
I see convergence, virtualization, etc. as a chance to finally get humans OUT of the data center. Build it out, cable everything, then seal it. Interaction does NOT require physical access. And a team of dedicated obsessive-compulsive robots or humans can replace memory, drives, etc.
Data Centers need to be human FREE zones. Not the common room in a dorm.
Re: (Score:2)
Even in a "CLOUD" with floating VMs that fly around like Unicorns you want stable predictable hardware underneath.
Simply untrue. When you have a "CLOUD" that's big enough, the failure rate itself is stabilized. But you still need humans to work it. Human techs are way cheaper than machines that are so reliable that they don't need them.
That isn't always true.... (Score:2)
For some App, Data Center RAID works well. Build 10 low cost data centers, mirror 5->5, run like the wind.
However, due to the transactional nature of some Applications this is not true.
For example, Google's model works great for search.
However, it didn't keep those 150,000 users from losing their data, did it?
Once the transaction is done, you can sync the results, but a failure mid transaction can be VERY bad.
No. (Score:2)
Windows Friendly (Score:2)
Shouldn't it be Windows-friendly, since they have to be there the first Tuesday of the month?
100 degrees? (Score:2)
That's boiling-water temperature! I can just imagine the piles of cooked sysadmins there.
crash carts? (Score:2)
Are people still using crash carts for routine maintenance, in this day of ubiquitous KVM? In our datacenter every other row has a desk with two terminals with access to the KVM for those two rows. Unfortunately it's in the cold aisle, which makes it a little chilly to work there if you didn't bring a coat. I have been known to take breaks in the nearest hot aisle to warm up.
A crash cart should be used for diagnosis and installs. KVM for everything else.
the bandwidth is just not there for that right now (Score:2)
The bandwidth is just not there for that right now. Do you want to pay the costs to get a FiOS-like network in the areas that are still just on ADSL?