Data Center Designers In High Demand

Hugh Pickens writes "For years, data center designers have toiled in obscurity in the engine rooms of the digital economy, amid the racks of servers and storage devices that power everything from online videos to corporate e-mail systems. But now people with the skills to design, build, and run a data center that does not endanger the power grid are suddenly in demand. 'The data center energy problem is growing fast, and it has an economic importance that far outweighs the electricity use,' said Jonathan G. Koomey of Stanford University. 'So that explains why these data center people, who haven't gotten a lot of glory in their careers, are in the spotlight now.' The pace of the data center build-up is the result of the surging use of servers, which in the United States rose to 11.8 million in 2007, from 2.6 million a decade earlier. 'For years and years, the attitude was just buy it, install it and don't worry about it,' says Vernon Turner, an analyst for IDC. 'That led to all sorts of inefficiencies. Now, we're paying for that behavior.'" On a related note, an anonymous reader contributes this link to an interesting look at how a data center gets built.
  • by aproposofwhat ( 1019098 ) on Tuesday June 17, 2008 @10:01AM (#23822671)
    So what does a data centre designer actually do?

    Do they constrain the end user to particular hardware, or is it just basic civil engineering?

    I can see that a well planned installation can reduce cooling costs, but if Customer A insists on having his Superdome rather than a more energy efficient alternative, what does the designer do then?

  • by Kohath ( 38547 ) on Tuesday June 17, 2008 @10:08AM (#23822739)
    This is only a problem because the power grid has become very fragile.

    Electricity generation hasn't grown ahead of demand due to government meddling, atom-ophobia, and environmentalist obstruction in the courts and on planning boards.

    The rolling blackouts will be coming soon. It'll start with small ones. Then everyone will buy battery backups that draw a lot of power to recharge once power is restored. This will cause the duration of the periodic blackouts to go from a few minutes to a few hours in about 2 years.

    Not long after that, we'll start building power generation capacity in the US again.
  • by apathy maybe ( 922212 ) on Tuesday June 17, 2008 @10:15AM (#23822833) Homepage Journal
    Before someone else does it, I direct you to BOFH, http://www.theregister.co.uk/2008/05/23/bofh_2008_episode_19/ [theregister.co.uk]
    The BOFH cares about important things:

    "There are seven pubs and two Indian joints within a one block radius, a tube station a couple of blocks away and a women's fitness centre across the road."
    Like service:

    "WE COULD CUT A HOLE IN THE FLOOR OF MISSION CONTROL, INSTALL A POLE, KEEP A CAR IN THE SERVICE BAYS AND CALL IT THE BATMOBILE!"


    So, besides electricity usage, what else should you care about? How about heat? Your room can't be too hot (you can send all the heat to the swimming pool in the fitness centre...). A rough cooling-load sketch is below.

    What about wires? They're both an OHS issue and a potential way to kill off half your servers if someone trips over an exposed power cord or network line. So you lay them under the floor?

    Complicated stuff this...
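    To put rough numbers on the heat question: essentially every watt the servers draw ends up as heat the room has to reject. A minimal sizing sketch, with made-up rack counts and a made-up 15% allowance for lighting, people and UPS losses (the 3.412 BTU/hr-per-watt and 12,000 BTU/hr-per-ton conversions are standard):

    ```python
    # Rough cooling-load estimate: IT watts -> heat -> tons of refrigeration.
    # All load figures below are illustrative assumptions.

    BTU_HR_PER_WATT = 3.412     # 1 W of electrical load ~= 3.412 BTU/hr of heat
    BTU_HR_PER_TON = 12_000     # 1 ton of refrigeration = 12,000 BTU/hr

    def cooling_tons(it_load_kw: float, overhead_factor: float = 1.15) -> float:
        """Approximate cooling needed for a given IT load.

        overhead_factor covers lighting, people and UPS losses (assumed 15%).
        """
        heat_btu_hr = it_load_kw * 1_000 * BTU_HR_PER_WATT * overhead_factor
        return heat_btu_hr / BTU_HR_PER_TON

    # Example: 20 racks at 5 kW each -> roughly 33 tons of cooling
    print(f"{cooling_tons(20 * 5):.1f} tons")
    ```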
  • by khallow ( 566160 ) on Tuesday June 17, 2008 @10:30AM (#23823019)

    I don't understand the peculiar emphasis the New York Times places on "endangering" the power grid. Even though a data center uses a lot of electricity, it's a high-value operation that needs a stable power supply. What's wrong with the idea of paying more to ensure that your power supply is sufficiently stable for your needs? The power company accepting those checks can then work on delivering that power. It's like saying that I'm somehow responsible for the stability of the oil production and distribution infrastructure because I drive a car. Perhaps, if I tweak my engine just so, I can engineer a democratic transformation of Saudi Arabia. I'll see if changing the oil does the trick.

    At some point, you have to realize that the consumer, no matter how big, isn't responsible for the supply of resources by another party. If there's a problem with how those resources are supplied, be it fixed price (regardless of demand) power transmission lines, pollution, or deforestation, then that problem should appear as an increase in cost to the consumer. If it isn't, then it's a problem with how the resource is distributed, not a problem with the consumer.

  • by bsDaemon ( 87307 ) on Tuesday June 17, 2008 @10:32AM (#23823035)
    On the other hand, it might be the final push that people need to start making their homes and businesses as energy efficient as possible, up to and including home solar and/or wind, more energy-efficient appliances, low-power-consumption electronics, etc.

    I would dare say that the future looks good for ARM and Via on that last account, at least.
  • by aaarrrgggh ( 9205 ) on Tuesday June 17, 2008 @10:42AM (#23823155)
    There has been a shortage of architectural engineers for the past two decades. I say architectural engineers because very few mechanical engineers go into HVAC, and very few electrical engineers do power systems. It doesn't seem quite as bad structurally.

    It is a shame, because the field really has a lot of great career opportunities.

    Data center work is just a subset of that-- it is hard to find people with the experience, but not impossible to train.
  • by Collective 0-0009 ( 1294662 ) on Tuesday June 17, 2008 @10:49AM (#23823257)
    We just thought about doing this for a slightly different reason. Trailers are a funny loophole in US regulations. If you pull a trailer inside a building, the building inspectors come (enviro, electric, whatever) and you just tell them it is a trailer that is currently stored inside. They assume it is regulated by the DMV or some other travel safety org. But if you never actually drive the trailer on the roads, you never incur those regulations/inspections. Seriously, trailers are a sneaky way to avoid some regulations - legally.
  • by Sobrique ( 543255 ) on Tuesday June 17, 2008 @11:03AM (#23823457) Homepage
    Well, between uninterruptible power and air conditioning, datacentres are probably one of the highest 'power overhead' applications (see the PUE sketch below). There's a hell of a lot of 'waste' there, which you can design out, in some measure.

    But yes, it's a priority application too - datacentres score as 'business critical' in most companies, so no matter how much it costs to run, it's cheaper than it 'not running'.

    Part of the point of DC design is resiliency, and therefore you _do_ have to consider available services and supplies - like the local powergrid, and how screwed you'll be if it does hit the breaking point.
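    The usual way to quantify that 'power overhead' is PUE (Power Usage Effectiveness): total facility power divided by the power that actually reaches the IT gear. A quick sketch with invented figures:

    ```python
    # PUE = total facility power / IT load. 1.0 would be a perfect facility;
    # 2.0 means every watt of compute drags another watt of cooling, UPS loss
    # and lighting along with it. Figures below are invented for illustration.

    def pue(total_facility_kw: float, it_load_kw: float) -> float:
        return total_facility_kw / it_load_kw

    it_load_kw = 800       # servers, storage, network
    facility_kw = 1_440    # measured at the utility meter

    print(f"PUE = {pue(facility_kw, it_load_kw):.2f}")          # 1.80
    print(f"Non-IT overhead = {facility_kw - it_load_kw} kW")   # 640 kW
    ```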

  • by miller60 ( 554835 ) * on Tuesday June 17, 2008 @11:26AM (#23823747) Homepage
    On the "how a data center gets built" front, last week I had a tour of a new $250 million data center facility in Virginia that is getting ready to open later this month. The facility manager provided a walk-through of the power and cooling infrastructure, explaining the company's approach to designing these systems for energy efficiency and scale. I shot video, which is now posted online [datacenterknowledge.com]. The data center operator, Terremark, separated most of the electrical infrastructure from the IT equipment, putting them on separate floors and housing the generators in a separate facility. They have 11 generators now, but will have 55 Caterpillar 2.25-megawatt units when the entire complex is finished.
  • by aaarrrgggh ( 9205 ) on Tuesday June 17, 2008 @11:37AM (#23823919)
    All companies face those same challenges, including Microsoft and Google. I work mostly with banks, and we are always faced with micro and macro change and growth planning. With help, some banks can go from a one-quarter projection to a reasonable three year projection. Five to six years is harder, but sometimes possible.

    The biggest secret is in providing enough space to allow for growth, changing needs, and eventually equipment replacement.

    As for efficiency, I have to tell an aspiring co-lo that they will pay more for power than for their OC-192s, and that the cost of a server is less than the power it consumes (a rough comparison is sketched below). It is easy for growing companies to ignore it at first, but it eventually catches up with you.

    The old solution was to move the servers to a place with cheap electricity. That will backfire soon; you really need to shift focus to plan for energy efficiency, even if it means your fiber runs are longer (segregate by density rather than system or function).
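    A back-of-the-envelope version of that server-versus-power comparison, with every price and draw figure invented for illustration rather than taken from the parent post:

    ```python
    # Does a server cost less than the electricity it consumes over its life?
    # All numbers here are illustrative assumptions.

    server_price_usd = 2_500     # purchase price of a commodity 1U box
    avg_draw_kw = 0.40           # average draw at the plug
    facility_pue = 1.8           # cooling/UPS/lighting overhead multiplier
    price_per_kwh_usd = 0.10
    lifetime_years = 4

    hours = lifetime_years * 365 * 24
    lifetime_power_cost = avg_draw_kw * facility_pue * hours * price_per_kwh_usd

    print(f"Electricity over {lifetime_years} years: ${lifetime_power_cost:,.0f}")  # ~$2,523
    print(f"Hardware: ${server_price_usd:,}")
    ```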
  • by jsailor ( 255868 ) on Tuesday June 17, 2008 @01:28PM (#23826079)
    While it may appear that you don't have to work hard to cool the data centers, you will have to work hard to humidify them if you do not want your equipment to die. This is a non-trivial cost, and it is the reason that "free cooling" (taking in outside air to cool a data center) is often not free -- see the sketch below.
    One answer may be heat wheels, but they are fairly new and unproven in the data center space. Take a look at http://www.kyotocooling.com/ [kyotocooling.com]
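    A sketch of why "free cooling" hours aren't simply "hours when it's cold outside": very cold air is usually also very dry, so you end up paying for humidification instead of chilling. The temperature and humidity thresholds here are illustrative, not a standards citation; real designs work from dew point and the equipment vendors' allowable envelopes.

    ```python
    # Decide, hour by hour, whether outside air can be used directly.
    # Thresholds below are illustrative assumptions.

    def can_free_cool(outside_temp_c: float, outside_rh_pct: float) -> bool:
        cool_enough = outside_temp_c <= 24          # at or below supply-air target
        humid_enough = 30 <= outside_rh_pct <= 60   # avoid static and condensation
        return cool_enough and humid_enough

    hourly_weather = [(2, 20), (18, 45), (30, 50)]  # (deg C, %RH) samples
    free_hours = sum(can_free_cool(t, rh) for t, rh in hourly_weather)
    print(f"{free_hours} of {len(hourly_weather)} sample hours allow free cooling")
    ```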

  • by Cramer ( 69040 ) on Tuesday June 17, 2008 @04:21PM (#23829271) Homepage
    Oh yes! Carpet in a server room. I wouldn't even put "NSA carpet" in there -- it has conductive filaments to ground out any EMI.

    I had the same months-long arguments in planning our new office. It's expensive. We cannot raise the ceiling (the building HVAC systems are in the plenum). Do we really need 5-ton air handlers? Do we have to have two of them? Etc., etc. Well, my 12" floor became a 10" floor -- a compromise to make the ramp 2ft shorter -- and 2 Lieberts became one because no one listened to my original specs and the landlords wouldn't buy the second one. (Those things are expen$ive.)

    Btw, that single point of failure failed within *4* months, requiring basically the entire office to be shut down all day to get it fixed. It was over 100F in there in less than an hour (the back-of-the-envelope math below shows why).
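    For a sense of how fast a room heats up once the single Liebert dies, here is the air-only arithmetic with an assumed room size and IT load; racks, walls and slab add thermal mass, so real rooms climb more slowly, but not by enough to save you:

    ```python
    # How quickly does room air heat up with no cooling? Air has very little
    # thermal mass, which is why a failed CRAC is an emergency within minutes.
    # Room size and IT load below are assumptions.

    AIR_DENSITY_KG_M3 = 1.2
    AIR_CP_J_PER_KG_K = 1005     # specific heat of air

    def minutes_per_10c_rise(room_volume_m3: float, it_load_kw: float) -> float:
        air_mass_kg = room_volume_m3 * AIR_DENSITY_KG_M3
        joules_per_10c = air_mass_kg * AIR_CP_J_PER_KG_K * 10
        return joules_per_10c / (it_load_kw * 1_000) / 60

    # 8m x 6m room with a 3m ceiling and 15 kW of gear -> ~2 minutes per 10 C
    print(f"{minutes_per_10c_rise(8 * 6 * 3, 15):.1f} min per 10 deg C rise (air only)")
    ```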
  • Re:IT = Volatile (Score:3, Interesting)

    by Gilmoure ( 18428 ) on Tuesday June 17, 2008 @04:41PM (#23829687) Journal
    I've been a low level tech grunt for almost 20 years. Nice thing is, I'll always have work. Am the 21st C. equivalent of a general auto mechanic. Nice thing is, I've been able to make a decent living and pretty much pick my work environment. Have never been afraid to pick up and leave, even with family. What's funny is, there are several younger guys in my shop and they are chafing at the work. They don't like the hands on stuff and are just bitching and moaning, hoping to get a team lead position. Only one guy is doing anything about it, going to school for his MBA. Hopefully, he'll retain some sense of just what can and can't be accomplished via IT and won't become a bone headed manager.
