Hardware / IT

Making Data Centers More People-Friendly 137

1sockchuck writes "Data centers are designed to house servers, not people. This has often meant trade-offs for data center staffers, who brave 100-degree hot aisles and perform their work at laptop carts. But some data center developers are rethinking this approach and designing people-friendly data centers with Class-A offices and amenities for staff and visitors. Is this the future of data center design?"


  • Where I worked (Score:2, Interesting)

    by Anonymous Coward on Wednesday March 02, 2011 @07:41PM (#35363452)

    I've worked in several data centers. An IBM one had air-cooled servers: cold air was pushed into the raised floor, and every rack had a massive fan pulling it up from below. It was about 20C day and night. The floor tiles would occasionally break, which caused problems when pulling skids of bills over them (it was a phone-bill processing and print facility).

    We would also go through 30 tons of paper per month (2x 1000 lb skids of paper per day). There was a lot of paper dust in the place, and the paper was perforated on the long side (not on the short side like PC printer paper) because it would tear apart if it was run through on the short side. There were tractor holes too, but they weren't perforated; rotary cutters would trim them off. The paper went through some of the equipment at about 60 miles per hour. The printers were slower in general (IBM 3900 laser printers), as they could only print 229 pages per minute. A 2200-sheet, 35-pound box of paper would go from full to empty in about 9 1/2 minutes. Fire suppression was Halon. We were told that if the Halon went off, you probably wouldn't die from the Halon snuffing you out, but rather from the floor tiles flying up and severing body parts (they were about 2 1/2 feet square, made of aluminum about 1 inch thick, but only about 10 pounds each).

    I worked in another data center that had no windows. If the power went off (and it did once, though not when I was on shift), everything went black. No emergency backup lights. The room was about 80 feet wide and at least 150 feet long, with racks and servers galore (2 operators, more than 300 machines), including DEC Alpha boxen, HP HP-UX boxen, PCs, network archive servers, etc. Good luck feeling your way out of that one. While the company was very picky about losing data and running jobs at night, their main interest was making money, and if that involved cutting a power line (tech cable) to put in a road to move product temporarily, so be it.

    In general, data centers are built to house computers; operators are an afterthought. If there is a problem, bosses yell at operators: Is it up yet? How about now? When? And if bosses come in with guests for a dog and pony show, operators are chattel (it would be good if you went away somewhere). If there is a problem... what's the problem, what did you do?
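
As an aside, the throughput figures in the comment above hang together. Here is a quick back-of-the-envelope check in Python; it assumes one printed page per sheet, a 30-day month, and US short tons (2000 lb), none of which the poster states explicitly:

    # Back-of-the-envelope check of the figures quoted in the comment above.
    # Assumptions (not stated in the post): one printed page per sheet,
    # a 30-day month, and US short tons (2000 lb).
    pages_per_minute = 229      # quoted IBM 3900 print speed
    sheets_per_box = 2200       # one 35 lb box of fanfold paper
    skids_per_day = 2
    lbs_per_skid = 1000
    days_per_month = 30
    lbs_per_ton = 2000

    minutes_per_box = sheets_per_box / pages_per_minute
    tons_per_month = skids_per_day * lbs_per_skid * days_per_month / lbs_per_ton

    print(f"A box lasts about {minutes_per_box:.1f} minutes")        # ~9.6, i.e. roughly 9 1/2
    print(f"Paper consumed: about {tons_per_month:.0f} tons/month")  # 30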

  • Re:First troll! (Score:4, Interesting)

    by EdIII ( 1114411 ) on Wednesday March 02, 2011 @08:24PM (#35363926)

    The data center I visit most right now has hot/cold aisles. It looks more like a meat-processing plant with all the heavy plastic drapes, which go from floor to ceiling on every other aisle. On the front of the racks they even put in plastic placeholders over the gaps where we don't have equipment installed yet, to maximize air flow through the equipment. They did it, too; we never even had to ask.

    Most of the time we work from the cold aisle with our laptop carts, and it is *cold*. The article is confusing because I can't see why you would need to sit with a cart in the hot aisle to work. You can install your equipment and cabling in such a way that you don't need access to the hot aisle for anything other than full server swap-outs and cabling swap-outs, and that's pretty much it. You can replace the hard drives from the front of the units, and maintain the server just by pulling it out from the front after disconnecting the cables, if you need to. Most big 4U servers come with cable management arms that let you keep "service loops" so that you don't need to disconnect anything to pull the server out on the rails.

    Heck, if you need to, just get a 15 ft networking cable and thread it through into the cold aisle. You don't have to sit in the heat if you don't want to. Admittedly, I'm a big guy and I like the cold, but it's funny as hell to see the skinny bastards walking over to the hot aisle to warm up.
