Data Center Designers In High Demand
Hugh Pickens writes "For years, data center designers have toiled in obscurity in the engine rooms of the digital economy, amid the racks of servers and storage devices that power everything from online videos to corporate e-mail systems. But now people with the skills to design, build, and run a data center that does not endanger the power grid are suddenly in demand. 'The data center energy problem is growing fast, and it has an economic importance that far outweighs the electricity use,' said Jonathan G. Koomey of Stanford University. 'So that explains why these data center people, who haven't gotten a lot of glory in their careers, are in the spotlight now.' The pace of the data center build-up is the result of the surging use of servers, which in the United States rose to 11.8 million in 2007, from 2.6 million a decade earlier. 'For years and years, the attitude was just buy it, install it and don't worry about it,' says Vernon Turner, an analyst for IDC. 'That led to all sorts of inefficiencies. Now, we're paying for that behavior.'" On a related note, an anonymous reader contributes this link to an interesting look at how a data center gets built.
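For a sense of scale, those server numbers imply a compound growth rate you can back out in a couple of lines of Python; the ~16%/year figure below is derived arithmetic, not a number from the article:

servers_1997 = 2.6e6   # US servers a decade earlier, per the summary
servers_2007 = 11.8e6  # US servers in 2007, per the summary
years = 10
cagr = (servers_2007 / servers_1997) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 16% per year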
Re:OK - what do they do? (Score:5, Informative)
Those of you who have been in data centers have seen forced-air cooling used incorrectly: cabinets not placed over vent tiles, vent tiles sitting in the middle of the open floor, cabinets over vent tiles but with solid bottoms so no air flows.
When equipment nears end of life and is hardly being used, it sits there turning electricity into heat while doing nothing. There is often a grand mix of cabinet types that do not all make the best use of the cooling system, and the cooling systems themselves are frequently undersized. Replacing sparse cabinets with very dense blade-style cabinets unbalances the heat/cooling process in the whole data center, not to mention what it does to the backup power system when that is needed.
There are hundreds of 'mistakes' made in data centers all over the country. Correcting them and pushing the efficiency of the data center is a big job that not many people were interested in paying for in years gone by.
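As a rough illustration of that heat/cooling imbalance, here's a minimal back-of-envelope sketch in Python. Every figure in it is an invented assumption for illustration, not a measurement from any real facility; the underlying point is just that essentially all the electrical power a cabinet draws comes back out as heat the cooling system has to remove:

# Back-of-envelope check: does a row of dense blade cabinets overwhelm
# cooling that was sized for a sparser layout? All numbers are assumptions.
cabinets = 10
kw_per_sparse_cabinet = 3.0   # assumed draw of an older, half-empty cabinet
kw_per_blade_cabinet = 15.0   # assumed draw of a dense blade cabinet
crac_capacity_kw = 50.0       # assumed cooling capacity, sized for the old layout

for label, kw_each in (("sparse layout", kw_per_sparse_cabinet),
                       ("blade layout", kw_per_blade_cabinet)):
    heat_kw = cabinets * kw_each   # power in is heat out
    status = "OK" if heat_kw <= crac_capacity_kw else "OVER CAPACITY"
    print(f"{label}: {heat_kw:.0f} kW heat vs {crac_capacity_kw:.0f} kW cooling -> {status}")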
If you are interested in what you can do for your small data center, try looking at what APC or any other cabinet manufacturer does. They have lots of glossy marketing materials and websites and stuff. There is plenty of information available. Here's a first link for you: http://www.datacenterknowledge.com/archives/apc-index.html [datacenterknowledge.com]
Re:OK - what do they do? (Score:5, Informative)
- Power - redundancy and resiliency as much as 'just having enough' (see the sketch after this list)
- Cooling - air conditioning is a BIG deal for a data centre - you need good air flow, and it probably doubles your electric bill.
- Specialist equipment - datacentres are _mostly_ modular, based around 19" racks. But there are exceptions, such as stuff that is 'multi-rack', like tape silos.
- Equipment accessibility - you'll need to add and remove servers, and possibly some really quite big and scary bits of big iron - IIRC a Symmetrix is 1.8 tonnes. You'll need a way to get that into a datacentre that doesn't involve '10 big blokes', and the spacing of your racks might not help.
- Putting new stuff in - a rack is 42U high, and installing kit right at the top of that rack is going to require overhead lifting.
- Cabling - servers use a lot of cables: some for power, some for networking, some for serial terminals. You've got a mix of power cables, copper cables, and fiber cables. They need to fit, you need to be able to manipulate them on the fly, and laying them must not break the fibers. You also need to be aware that a massive bundle of copper cables is not perfectly shielded, so you'll get crosstalk and interference. And every machine in your datacentre will have 4 or more cables running into it, probably from different sources, so you need to 'deal' with that.
- Operator access - if that server over there blows up, how do I get on the console to fix it? And while I'm on the console, how do you ensure I'm not twiddling that red button over there that I shouldn't be?
- Remote/DR facilities - most datacentres have some concept of disaster planning, covering everything from 'farmer Joe dug up the cable to the ISP' all the way to 'plane flew into the primary data centre'. These things are relatively cheap and easy to deal with on day one, and utter nightmares to retrofit onto a live data centre.
- Expansion - power needs change, space needs change, technology changes and... well, demand for servers increases steadily. Sooner or later you will run out of space or have to swap out assets.
That's what springs to mind off the top of my head. There are probably a few more things. So yes, civil engineering, but with a smattering of IT constraints and difficulties.
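Here's a hedged sketch of the sizing arithmetic behind the power and cooling points above. The load and UPS figures are invented for illustration, and pue = 2.0 just encodes the rough 'cooling doubles your electric bill' rule of thumb from the cooling bullet:

# N+1 redundancy: enough UPS modules to carry the load with one module failed.
import math

it_load_kw = 400.0      # assumed total IT load
ups_module_kw = 150.0   # assumed rating of a single UPS module
pue = 2.0               # rule of thumb: cooling/overhead roughly doubles the bill

modules_for_load = math.ceil(it_load_kw / ups_module_kw)  # N = 3
modules_n_plus_1 = modules_for_load + 1                   # N+1 = 4, survives one failure
facility_draw_kw = it_load_kw * pue                       # 800 kW pulled from the grid

print(f"UPS modules (N+1): {modules_n_plus_1}")
print(f"Facility draw including cooling: {facility_draw_kw:.0f} kW")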
Re:endanger the power grid? (Score:3, Informative)
We are building high-power devices on a power grid that was designed 20 years ago, before such concepts were even thought of.
Environmentalists won't let new plants be built. Solar and wind are not yet generating enough to even begin to offset the demand.
It isn't distribution, it is availability. By 2015 I expect rolling blackouts during hot summers, simply to keep the air conditioners going.
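To put that availability worry in grid terms, here's a crude estimate using the 11.8 million servers from the summary; the per-server draw and the overhead multiplier are assumptions, not measured figures:

servers = 11.8e6           # US servers in 2007, per the summary
watts_per_server = 250.0   # assumed average draw per server
overhead = 2.0             # assume cooling and power conversion double it
grid_load_gw = servers * watts_per_server * overhead / 1e9
print(f"Rough national server load: {grid_load_gw:.1f} GW")  # ~5.9 GW, several large power plants' worth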