Power IT

Price of Power in a Data Center 384

mstansberry writes "Much like the rest of the country, IT is facing an energy crisis. The utilities are bracing companies for price spikes this winter, and according to experts and IT pros, those prices aren't going to come down any time soon. This is the first article in a four-part series investigating the impact of energy issues on IT."
This discussion has been archived. No new comments can be posted.

  • by Hulkster ( 722642 ) on Thursday October 27, 2005 @05:21PM (#13892483) Homepage
    I think "crisis" is a bit sensational, but yea, power is a concern and it ain't getting any cheaper. This is certainly not helped by the power consuming (and heat generating) hot chips from Intel. Note that you have to pay for that "twice" since for every BTU they consume in electricity, you have to cool it in a data center. Ironically, Part 1 does not even talk about how the CPU itself is a big issue here ... maybe they'll cover it in the rest of the series. Speaking of which, wouldn't it be better for stuff like this to wait until the series is over before posting on Slashdot?

    P.S. The submitter has a nice fishing web site and is holding about a 12" trout on his main page. Nice catch ... but I'd recommend he go on a fishing charter in Seward Alaska [komar.org] if he wants to catch some mongo fish. This trip was a major slayfest and my brother was Captain Crudd [komar.org] who knows how to fish with a beer in his hand.

    • by Tavor ( 845700 ) on Thursday October 27, 2005 @05:25PM (#13892516)
      Run the heat in the winter with Intel chips! Just do batch-processing, or some intense rendering work.
    • by Jeff DeMaagd ( 2015 ) on Thursday October 27, 2005 @05:29PM (#13892544) Homepage Journal
      The cooling expense isn't as bad as the heating. I think the theoretical energy cost of cooling is about 10% of the heat to be removed, i.e. it would take 100W to remove 1000W of heat. In practice it's about 30%, so it's not as bad as some people think.

      One thing I am skeptical of is the need to cool to like 60 degrees F, which I've heard of (and felt in one room). Good cooling is nice, but I know one guy who says they never see problems until the temperature is above 80F, so businesses can save a lot by not being so freaking cold.
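
      To put numbers on the cooling-efficiency point above, a quick sketch (the COPs of 10 and ~3.3 are just the 10% and 30% figures restated):

        # Electrical power needed to pump heat out, given a coefficient
        # of performance (COP = heat removed / electrical work input).
        def cooling_power(heat_watts, cop):
            return heat_watts / cop

        print(cooling_power(1000, 10))   # ~100 W at the theoretical 10% figure
        print(cooling_power(1000, 3.3))  # ~300 W at the practical 30% figure
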
      • by demigod ( 20497 ) on Thursday October 27, 2005 @05:54PM (#13892743)
        One thing I am skeptical of is the need to cool to like 60 degrees F, which I've heard of (and felt in one room). Good cooling is nice, but I know one guy who says they never see problems until the temperature is above 80F, so businesses can save a lot by not being so freaking cold.

        I always considered that a buffer for when you lose one of the AC units. That way, if it takes all day to get it fixed, you're only up to 80F and still OK.

        • I always considered that a buffer for when you lose one of the AC units. That way, if it takes all day to get it fixed, you're only up to 80F and still OK.

          Or, have a redundant AC unit that only kicks in when temp > 75. That way, you're not paying to keep it at 60, and you're covered if you lose a unit.
        • by Tiger4 ( 840741 ) on Thursday October 27, 2005 @06:38PM (#13893047)
          That depends on the room and the equipment. A single small box in a large room might take all day, or all week, to heat it to 80F. On the other hand, a lot of boxes in a small room might jump to 80F within minutes of losing the cooling. There is no substitute for good engineering. Do the calcs and set it up right.
      • I believe it's not so much a high temperature that affects electronic equipment, but fluctuations in temperature.

        If your PCB traces are expanding and contracting between, say, 20C at night and 40C during the day, you're going to get fatigue. It's also not so good for the mechanisms inside hard disk drives.

        So the HVAC guy at the local television studio tells me anyway.

    • by Rei ( 128717 ) on Thursday October 27, 2005 @05:34PM (#13892593) Homepage
      I saw a speech recently from the director of NCAR, Tim Killeen. NCAR does advanced climate change modelling. According to his speech, they now know pretty much every factor that has a relevant effect on global climate; the only limitation they are aware of is processing power and data storage. As such, their computing requirements are growing notably faster than Moore's law.

      Their current power bill is $40,000/mo. At their new facility (you can see a design of it in this document [ucar.edu]), it will be far more. Most of the building will be for computers and associated equipment; the building is largely being designed around dissipating all of the heat. I recall he said it was to consume about 3 MW, so at 8 cents/kWh, that would be about $175k/mo.
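
      As a sanity check on that figure, a quick sketch (assuming a 30-day month; the 8 cents/kWh rate is the one cited above):

        # 3 MW around the clock at 8 cents/kWh
        power_kw = 3000
        rate = 0.08                       # $/kWh
        print(power_kw * rate * 24 * 30)  # -> 172800.0, about $173k/month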

      As an aside, it was a really fascinating presentation. They showed *their* model of Katrina (which was presented to the White House as an "experimental product"); it was spot on. Very impressive stuff indeed. At one point I asked him about proposed methods to induce global cooling such as dumping iron into iron-deficient waters. He stated that while he hadn't modelled that, their models already take into account natural mineral influxes and their effects on bacteria populations (and thus, the effects of those bacteria on the environment), so they could model that if they needed to. He also pointed me to some newer Vostok core data :)
    • by c0l0 ( 826165 ) on Thursday October 27, 2005 @05:58PM (#13892789) Homepage
      Believe it or not, I moved into the apartment I'm writing from right now last year, and winter was quite harsh here in Austria 10 months ago or so. I used to heat my ~35m^2 flat with my Pentium-4-Northwood@3.5GHz-powered PC, and I still haven't figured out how to operate the flat's heating... which will turn into an annoying problem soon, because I swapped the Intel beast for a low-power AMD box, which dissipates a whole lot less heat; I'm actually already freezing a little right now :-)
    • http://en.wikipedia.org/wiki/Subthreshold_leakage [wikipedia.org]

      "Subthreshold leakage is the current that flows from the drain to source of a MOSFET when the transistor is supposed to be off.

      In the past the subthreshold leakage of transistors has been very small, but as transistors have been scaled down, subthreshold leakage can compose nearly 50% of total power consumption."

      Perhaps the government should have imposed restrictions on the energy consumption of CPUs earlier. All we've done is feed the CPUs more power…
  • by gtrubetskoy ( 734033 ) * on Thursday October 27, 2005 @05:24PM (#13892508)

    We have a page [openhosting.com] on our site with some calculations on how much energy is being saved because we're using Linux VServer and why dedicated servers are not environmentally-friendly (at least not with the current technology - this may change). The numbers are probably off a bit, but they give you some idea.

    Also, the street price for a 20A circuit in a datacenter is $200-$300 a month, while the cost of a megabit is $100 or less. So a rack of servers that requires two power circuits and pushes 3Mbps (not an unusual scenario) costs roughly twice as much in power as in bandwidth.
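
    In numbers, a quick sketch (assuming those street prices are monthly, which the comparison implies):

      # Power vs. bandwidth cost for one rack, per the figures above
      circuits, per_circuit = 2, 250   # two 20A circuits at ~$250/mo each
      mbps, per_mbps = 3, 100          # 3 Mbps at $100/Mbps/mo
      print(circuits * per_circuit)    # -> 500 ($/mo for power)
      print(mbps * per_mbps)           # -> 300 ($/mo for bandwidth)
      # 500/300 is ~1.7x, i.e. the "roughly twice" above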

    And here's another article [eweek.com] on this issue. And another [geek.com].

    • by grqb ( 410789 ) on Thursday October 27, 2005 @05:44PM (#13892658) Homepage Journal
      I bought myself a watt meter to measure the power draw of some of my home electronics. I tested my friend's laptop, a Dell with a 15-inch screen and a P4. Under Linux the laptop was drawing 50-100 watts (which is very high for a laptop); under Windows it was drawing 30-50 watts. Linux on desktops has the same power management as Windows on desktops, though.
      • It sounds like Linux was running the laptop at a higher clock rate. Many laptops have a configurable clock rate, and will turn the rate down when power savings are needed (for example, when AC power disappears and the laptop switches to battery power).

        A little fiddling with the power controls of Linux would probably get it to the same power consumption as Windows. While you measured something real, it's probably a configuration issue more than a built-in Linux vs. Windows difference.
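
        For example, on a Linux kernel with cpufreq support the governor is exposed through sysfs (a sketch; paths vary by kernel and distribution, and writing requires root):

          # Show the current CPU frequency governor for each core.
          import glob

          for path in glob.glob(
                  "/sys/devices/system/cpu/cpu*/cpufreq/scaling_governor"):
              with open(path) as f:
                  print(path, "->", f.read().strip())
              # To favor battery life over clock speed (as root):
              # with open(path, "w") as f:
              #     f.write("powersave")
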
    • If you're drawing 20A and only serving up 3 megabits, you're doing something seriously wrong.
  • by grqb ( 410789 ) on Thursday October 27, 2005 @05:25PM (#13892513) Homepage Journal
    Energy prices are going to hurt everybody.

    From here: [thewatt.com]
    "EIA expects energy expenditures will be 18% higher this winter compared to last winter, which will be 8.3% of the annual gross domestic product, a record since 1987 when it was 8.4%."

    And for those of you who want to find a way to save energy: Here's 60 Tips To Save Energy This Winter [thewatt.com]
    • I own a retail store. Distributors that used to distribute on their own trucks for free are now charging upwards of $20/trip. This is getting passed on to the consumer. Inflation due to energy prices is quite real. I've been "inflating" prices to compensate for it all day, in fact.
  • Solution? (Score:4, Funny)

    by exi1ed0ne ( 647852 ) <exile@pessim[ ]s.net ['ist' in gap]> on Thursday October 27, 2005 @05:27PM (#13892530) Homepage
    Pedal faster!!
  • Moore's law? (Score:4, Insightful)

    by ChrisGilliard ( 913445 ) <christopher.gilliard@nOSpAM.gmail.com> on Thursday October 27, 2005 @05:27PM (#13892533) Homepage
    "It's the other side of Moore's Law," Sneider said. "As the cost of [buying] these machines decreases, the cost of powering and cooling them increases."

    I don't agree with this. How power-efficient was ENIAC? Also, my laptop lasts much longer than the one I had a few years back. I think we're making progress on the power front, but the demand for computing power is attracting more and more dollars, and the power cost is largely insignificant relative to the return on investment.
  • How much power loss is due to ludicrous numbers of layers of processing that go on in modern OSes and applications?

    Time for the OS vendors to realize that smaller, efficient code footprints will save money in real world terms.

    Then again, I code in java for a living (Ducks)
  • by Black-Man ( 198831 ) on Thursday October 27, 2005 @05:29PM (#13892543)
    And the cost of extracting a ton of coal hasn't changed much from 1995 to 2005. But it shows what a sham commodity trading is - the price of a ton of coal (because it is 'energy' related) is traded relative to the price of a barrel of oil or the cost of a cubic foot of natural gas.

    All this does is further underline the boom/bust cycles of the energy business and how it negatively affects the economy.

    • When demand increases, price will increase regardless of supply cost. Commodity trading isn't a sham, it's just the way the economy works. If oil and coal were mandated to be sold at a constant price regardless of demand, the supply would run out quickly as people would have no incentive to conserve or to explore for new sources.
    • by davidwr ( 791652 )
      Capitalism is all about supply and demand and the cost of buying A vs B vs doing without.

      A barrel of oil may cost $x to pump out of the ground, deliver, process, and burn, and coal may cost a fraction of that for the same energy equivalent.

      But it doesn't matter. As long as the demand at either of those prices exceeds supply, the open-market price of both will be about the same and will be higher than the "production" costs.

      When the demand puts the price between the two "production costs," the cheaper one will be heavily favored…
  • This is the first article in a four-part series investigating the impact of energy issues on IT.

    Does this mean three slashdot dupes forthcoming?
  • by n3umh ( 876572 ) on Thursday October 27, 2005 @05:31PM (#13892564) Homepage
    Materials needed: Fans. Flexible duct. Duct tape (of course).
    Procedure: Place fans in datacenter. Tape duct to fans. Route duct to office spaces.
    Results: Save money on heating and cooling bills.....
  • Just on the cusp of hydrogen fuel cell technology becoming available, we're about to be hit hard with spikes in both gas and electricity. The Saskatchewan crown corporation SaskEnergy asked the rate review panel for a 41% increase, but the review panel recommended "only" 27%. Auto gas prices have soared as high as $1.20/litre but have settled back to about $1 CDN. Natural gas, though, is what scares Canadians, since most heat their homes with either that or electricity.

    Sask Power is running advertising imploring…
    • Here's a cut-and-paste from a different forum where I was measuring my electric usage, trying to track down why my bill was so high.

      Initial measurements of the PC are interesting:

      The monitor draws 3 watts when turned off.
      It draws 60 watts when turned on and in text mode.
      It draws 88 watts when in graphical mode.
      Effectively the monitor is turned off for 17 hours a day and in graphical mode the remainder, so that's 0.667 kWh/day, or about 4 cents. This is 20 kWh/month, or $1.20/month.

      The cable modem draws 13 watts when on…
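
      The monitor arithmetic checks out; a quick sketch (the ~6 cents/kWh rate is back-solved from the $1.20/month figure):

        # 3 W off for 17 h/day, 88 W in graphical mode for 7 h/day
        kwh_per_day = (3 * 17 + 88 * 7) / 1000
        print(kwh_per_day)               # -> 0.667 kWh/day
        print(kwh_per_day * 30)          # -> ~20 kWh/month
        print(kwh_per_day * 30 * 0.06)   # -> ~$1.20/month at 6 c/kWh
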
  • HP just came out with multicore half-height blades. Their latest requirements are 30-amp, three-phase power per PDU for a blade rack, with 4 PDUs per rack for redundancy. That's enough power service to cover 3 modern 3000-square-foot homes when you factor the energy back to 240 volts.

    Getting the power to something this silly isn't the pain. COOLING something that consumes 14KW in a 4-square-foot space is the challenge anyone in data center management faces. Both HP and IBM have come out with the "innovation…
  • by suitti ( 447395 )
    Much of the energy used is for air conditioning. One might think that this would be easy for data centers in Michigan, but I've worked for places that heat the building and air-condition the data center in January. One place had the data center A/C die, and a box fan in a window allowed everything to keep running. A box fan has to be cheaper to run than A/C. So what we might see is smarter environmental control. At least in the winter, it makes sense to run outside air in, and use the waste heat to heat the rest…
    • Unfortunately, you can't reliably use outside air. While moisture is not typically a factor during winter, pollution is. Many data centers have particulate sensors in their fire system which will go crazy if a bus goes by. And it just dirties up the place, possibly voiding the customer's contracts.

      Second, you need the pressurized cooling system. Yeah, your window AC may keep the room at 60F, but if the cabinets are expecting cool air to be pumped up through the floor and vented out the top, you can write…
  • FTA:"Historically, I haven't managed the electric bill. But now we're aware and interested in it," Doherty said. "If I told my boss that my staff wanted a 27% increase [in pay], I'd be downstairs on the carpet."

    First, it's good that he's paying attention to the electric bill now. But he should have been paying attention to it in the past (last year saw a spike in prices, too). TCO and all that. Of course, electricity may be negligible compared to other costs, depending on their setup.

    Second, it's high…
  • One question (Score:4, Insightful)

    by Spy der Mann ( 805235 ) <`moc.liamg' `ta' `todhsals.nnamredyps'> on Thursday October 27, 2005 @05:39PM (#13892623) Homepage Journal
    Shouldn't there be an initiative to certify computer systems as "low energy," i.e. ones that use low-power processors, come with LCD monitors, etc.?

    Just as the state of Massachusetts chose to use F/OSS to save on office software, why not ask government offices to replace CRTs with LCD monitors?
    • It's called Energy Star, but it doesn't (yet) apply to servers.
      • Servers can't exactly hibernate when there's limited activity. The last thing I want is my server shutting down. At my last job, we never brought down our Solaris boxes because we were always afraid they wouldn't boot up again. 3 years' uptime on them.
    • My company is a quasi-governmental agency. We actually have been pushing out most of our CRTs in favor of LCDs, especially in departments with 20"+ monitors (engineering, etc).

      20" CRT: 180W
      20" LCD: 60W

      20" CRT cost: $400
      20" LCD cost: $700

      At 8c/kWh, for a 10-hr day, it saves 9.6c/day. So to pay back the $300 it'll take 3,125 days. At 250 working days/yr, that's over 12 years to pay back the cost.

      In other words, people like to act all "green," but it's just an excuse to get the new toys. The monitor probably won't…
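
      For what it's worth, the payback math above is easy to reproduce (a sketch using only the figures given):

        # CRT -> LCD payback period
        watts_saved = 180 - 60                  # 120 W
        daily = watts_saved / 1000 * 10 * 0.08  # 10 h/day at 8 c/kWh
        days = (700 - 400) / daily              # $300 LCD premium
        print(daily, days, days / 250)          # ~$0.096/day, 3125 days, ~12.5 yr
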
  • This isn't really a new problem, as you can see from this article [infoworld.com] dated January 5, 2001. From the article: Amazingly, a large hosted-server operation can average the same power usage as a steel manufacturing plant.
  • …mention that much of the power loss and heat generation is due to the thousands of power supplies in each data center. If data center racked computers used DC power, the power conversion would take place in one area, and only the heat from actual power usage would be generated in the data center. This reduces power losses due to multiple AC/DC conversions, as well as the heat generated in those conversions. Less heat means less AirCon is needed, so less power there too. This is such a simple thing to do as well. Most huge te…
    • Check the power numbers on Intel CPUs - they exceed 100W for all of the enterprise stuff. In a dense rack using blade servers the number of processors can exceed 100 (and there are configurations of around 200). Do the math and you see that this is 10KW. If you look at the specs for a typical setup, the input for this kind of kit is around 20 to 24KW - so around 40-50% of the heat is the CPUs.
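
      A quick sketch of that estimate (taking the midpoint of the quoted input power):

        # Fraction of rack heat attributable to CPUs
        cpu_kw = 100 * 100 / 1000   # 100 CPUs at 100 W -> 10 kW
        rack_kw = 22                # input quoted as 20-24 kW; midpoint
        print(cpu_kw / rack_kw)     # -> ~0.45, i.e. the 40-50% above
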
    • Be very careful when you talk about reducing the number of power supplies. DC distribution has much higher losses in the wire since it is at much lower voltage (thus higher current for the same power), so if consolidating the power supplies leads to long DC cables running around your machine room, you'll end up adding heat and wasted power instead of reducing it.

      In addition, computer motherboards need pretty clean DC power. Longer cables, and more devices connected to them, will lead to less accuracy in the…
    • I made a typo a few minutes ago and said "LDC", but LED is the wrong technology. It's the kind of display used in clock radios with red blocky displays, if you aren't sure about the difference.

      My power company is advertising a power savings of 66% by switching from a CRT to an LCD monitor. And they are telling people that a screen saver does not power down a CRT, so they are still paying for power while it's on.
    • Power supplies generate very little heat compared to the rest of the system, and the impedance issues are much smaller with AC than with DC, meaning you can use smaller, lighter wire. Also, I think you meant LCD, not LED, displays. LED displays aren't here yet :)
    • Your observation about power supplies is a good one. Are there any safety issues with DC? Can you recommend a centralized DC power source that could be used for, say, a cluster of 100 PCs? Ideally it would come with wiring harnesses for CPU mainboards and hard drives. Also, it would be nice if the centralized converter could be placed outdoors.
  • Any time soon? (Score:3, Interesting)

    by MirrororriM ( 801308 ) on Thursday October 27, 2005 @05:45PM (#13892674) Homepage Journal
    and according to experts and IT pros, those prices aren't going to come down any time soon.

    Let's be realistic, they won't come down...ever. If they can get another 20% (example) out of you this year, do you think they're going to drop it 20% next year after the "crisis"? 10% even? No way. Just like any other energy business that is at a near-monopoly level (gasoline), they can raise prices whenever they feel like it and blame it on whatever they want. What are you going to do, go to the competition? In the area I live in (Midland, Michigan) and the surrounding cities (Saginaw, Bay City, Flint, etc.) we get ONE choice for gas and electricity - Consumers Energy. That's it. You don't like their service or prices? Tough shit. You're stuck. There have been "alternative companies" in the past, but all they do is resell energy for Consumers Energy - it's all going through the same pipes and wires.

    It sucks, but that's the way it is.

  • Blame XML and Java (Score:2, Insightful)

    by fuzzy12345 ( 745891 )
    I've been in the coding game a long time, and a lot of the new technologies of the last few years are just insanely wasteful of computer power. Computers are, of course, supposed to relieve humans of drudgery, and this includes programmers just as much as the clerk/minions that are our end-users.

    Nonetheless, having the computer repetitively recompute the exact same answers (parse that huge XML config file! JIT-compile that Java app, AGAIN AND AGAIN AND AGAIN!) is an exercise in keeping your hardware vendor…

    • by TopSpin ( 753 ) *
      Who among us doubts that one AMD64 with a few gigs of RAM could, if programmed properly, calculate the payroll for the entire USA every night?

      Interesting question. Let us consider a simplified universal payroll system and see where this goes. I'll stipulate roughly 200 million US payroll employees and 52 pay periods. Let's say individuals require 200 KiB of storage (historical deductions, contributions, etc., necessary for YTD results) and generate 1 KiB of storage each period. The necessary software doe…
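
      The storage side of that stipulation is easy to total up (a sketch using only the figures stipulated above):

        # 200M employees, 200 KiB base each, 1 KiB per weekly pay period
        employees = 200_000_000
        base = employees * 200 * 1024
        per_year = employees * 1024 * 52
        tib = 1024 ** 4
        print(base / tib, per_year / tib)  # -> ~37 TiB base, ~10 TiB/year
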
  • Hmm, I submitted a question about low-power servers for the home just yesterday (no yea/nay just yet), so I'm glad to see that I'm not the only one looking at this.

    If I look at my own guilty fact sheet, I can see that I'm guilty of the following...

    1. Getting high-powered servers all of the time.
    2. Not consolidating services enough.
    3. Not encouraging enough of an emphasis on power consumption.

    On the plus side, I have converted to LCDs whenever I have needed to replace a monitor. It's a start, but a small one.

    Sin…

  • There are a couple of interesting articles at PBS.ORG on this. There is one about private vs. public utility companies [pbs.org].

    My own personal opinion is that this is something we've brought upon ourselves - both citizens and corporations who are not in the electricity business.

    I've never understood why some industries allow things to happen that cause them to suffer while a small set of industries makes enormous profits.

  • Some have predicted that the Arabian peninsula will be covered in solar panels by the time we are completely out of oil, so I'd put solar-powered hosting in the nearer future.

    I can imagine a huge installation in the middle of the desert, running all the IT buried below ground level in the cold, using minimal cooling energy; one level up, the staff and offices, using natural light channeled in through tricky optics, air-conditioned in natural ways, and using solar power....

    OK, I am an idiot, but it would j…
  • by fejikso ( 567395 ) on Thursday October 27, 2005 @06:03PM (#13892827) Homepage
    As an international student currently living in the US, it is quite shocking to see how Americans waste electric power. It is simply not logical that people have to bring sweaters to be comfortable during the scorching summer (because the thermostat is set too low) whereas in winter, buildings become furnaces.

    I won't even get started on the obscene generation of trash.

    Hopefully these crises will force Americans to find ways of making themselves more efficient.
    • As a US citizen, I agree completely, but you must understand the cultural forces at work here.

      Since the mid-80's, when energy conservation/innovation became passé, something to be ridiculed like an 8-track tape or bell-bottom jeans, the American marketing and advertising paradigm has increasingly encouraged waste and consumption. Since the 90's, when the explosion of consumer electronics really took off, American consumers have been on a binge of gadgetry, all of which requires electricity. Yes, I k…
  • Hey, we all have interns, don't we? Give them an exercycle with a generator attached... Problem solved.
  • Virtual Servers (Score:3, Interesting)

    by LiquidAvatar ( 772805 ) on Thursday October 27, 2005 @06:37PM (#13893041) Journal
    Recently, I've been working on virtualizing [vmware.com] data centers. One of our big selling points is the savings on ongoing power costs associated with a large data-center. If you're running 900 logical servers on 25 physical boxes, you're saving a LOT of energy (both in powering the systems and in cooling the center).

    More and more players are entering the virtual market (look at the success of Citrix over the past decade, a technology that comes from a similar paradigm) - and that means that more and more datacenters are converting. While the cost per kWh might be rising, the costs of running a data-center are coming back under control.

  • What to do (Score:3, Insightful)

    by burnin1965 ( 535071 ) on Thursday October 27, 2005 @07:46PM (#13893461) Homepage
    While killing services and cutting back on powered equipment is an option, people should consider efficiency improvements.

    Speaking from experience, a large number of x86 boxes out there are running on power supplies in the 60 to 70 percent efficiency range. By replacing old low-efficiency power supplies with some of the newer 80 Plus supplies, you will save on electricity for the box and for cooling.

    I did some tests replacing a cheap 250-watt low-efficiency power supply with Seasonic 250-watt 80 Plus supplies and found a 20%+ reduction in power consumption at the AC outlet. When I ran the numbers, the savings in electricity for the power supply alone would pay for the new supply in one year. And that does not include the savings in air conditioning costs.
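
    That 20%+ figure is about what the efficiency jump predicts; a quick sketch (the 150 W DC load is an assumption for illustration):

      # Wall draw = DC load / PSU efficiency
      dc_load = 150                    # watts the components actually use
      old_eff, new_eff = 0.65, 0.83    # ~60-70% vs. 80 Plus class
      print(dc_load / old_eff)         # ~231 W at the wall
      print(dc_load / new_eff)         # ~181 W at the wall
      print(1 - old_eff / new_eff)     # -> ~22% less power drawn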

    http://www.seasonic.com/ [seasonic.com]

    And no, I don't work for them or own stock. :) And there are other 80 Plus manufacturers; it's just that this is the only one I tested.

    burnin
  • by shapr ( 723522 ) on Thursday October 27, 2005 @08:46PM (#13893798) Homepage Journal
    Perfectly reversible computing [wikipedia.org] does not produce heat.
    Ever wondered what happens to bits that you erase out of memory or a register? They get dumped out of the chip and turn into heat.
    Reversible logic reuses the electrical charge for your next computation, or for storing the next 1 that comes along.
    On the downside, reversible hardware is much harder to design, but any addition of reversible logic to today's CPUs would decrease the amount of electricity needed and heat produced.
    Electricity bills would be lower, and heat output would be smaller.
    Laptops would last much longer, desktops wouldn't need a CPU cooler.
    Even better, we could continue increasing the speed and die size of CPUs.
    One problem right now is that AMD, Intel, IBM, etc. are perfectly able to produce a CPU that they have no hope of cooling. If reversible logic were used instead, you could have a 6GHz chip with the heat output of a 4.77 MHz 8086.
    • Erasing a bit at 300 K (27C or 80F) theoretically costs a minimum energy of 2.87e-21 J. That is, an ideal non-reversible computer at 3 GHz erasing 1 GByte at each cycle would need only about 70 milliwatts. It's obvious that our current computers are very far from that limit.

      Also note that current power supplies are anything but efficient. That is, a lot of the energy your computer draws from the grid doesn't even reach the CPU.

      Since our main losses are obviously not the inevitable cost of non-reversible computing,…
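
      For reference, that minimum is Landauer's bound, kT ln 2; a quick sketch reproducing the parent's numbers:

        import math
        k = 1.380649e-23                 # Boltzmann constant, J/K
        e_bit = k * 300 * math.log(2)    # -> ~2.87e-21 J per bit at 300 K
        bits = 8 * 2**30                 # one GByte erased per cycle
        print(e_bit)
        print(e_bit * bits * 3e9)        # -> ~0.074 W, i.e. ~70 mW at 3 GHz
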
  • Big Money... (Score:3, Interesting)

    by aaarrrgggh ( 9205 ) on Thursday October 27, 2005 @11:50PM (#13894512)
    For a financial data center ("Tier-4" class, 2(N+1)), the cost per kW is $10-15,000 for infrastructure alone. That quickly matches the cost of the Wintel boxes themselves (although the depreciation cycle is considerably longer).

    Energy consumption works out close to 3x server power consumption, so 1kW of load is equal to 3kW total energy input. That comes close to $2,700 per year in energy costs for 1kW of server power.
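
    A quick sketch of that estimate (the ~10.3 cents/kWh rate is back-solved from the $2,700 figure, not stated above):

      # 1 kW of server load at a 3x total-energy multiplier, 24x7
      total_kw = 1 * 3
      rate = 0.103                     # $/kWh, assumed
      print(total_kw * rate * 8760)    # -> ~$2,707/year
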
  • by rbrander ( 73222 ) on Friday October 28, 2005 @12:25AM (#13894621) Homepage
    This is just the kind of screwed-up priorities that cause companies to lose all competitive edge.

    Good (?) accounting tends to highlight grand total costs of small things. Good Lord, we spend $27,000 per year on paper clips! Better control them under lock & key. The lost-opportunity cost of the contract bid missed because somebody was hunting for paperclips does not, of course, appear on any ledger.

    Now somebody has summed up the electrical costs of a really large server room and come up with a sum close to a human salary. That always impresses people. (Man is the measure of all things.)

    But what is it as a fraction of total operations and capital?

    At 11 cents per kilowatt-hour (a common residential cost except in badly-gouged locales, but high for major consumers, at least until lately and those 27% increases), your rule-of-thumb for 7x24 consumption is:

    a buck per watt per year.

    A 500-watt average constant consumption from a basic 3U rack server = $500/year. Easy, no?
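
    That rule of thumb checks out (a one-line sketch):

      # 1 W for a year at 11 c/kWh
      print(0.001 * 8760 * 0.11)   # -> ~$0.96, call it a buck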

    But that's a pretty serious machine; home machines don't commonly have over 400W power supplies - and certainly don't use the 400W all the time. So we're allowing for air conditioning power in the estimate.

    But a serious server starts at $10,000 and you won't get five years out of it, so the capital cost alone is $2000/year and up.

    All but the most automated shops surely have a salaried sysadmin (and/or DBA, backup specialty guy...) for every ten machines. And those guys all cost $50,000 dead minimum. So that's another $5,000 per machine per year for care & feeding.

    So that's $7000/year, plus power at $500. Maybe skyrocketing to $700 and a full 10% of costs.

    And of course I had to assume that the $10,000 included 5 years of vendor support to keep it that low. Never mind insurance, rent on the space, huge UPS's, fire systems, air conditioning (not the power for it, the machinery). In truth, I can hardly imagine power reaching 10% of the operations cost.

    Also, I'm taking some place like NCAR as my site: gargantuan computing power at the service of a dozen professors and their retinue of grad students. Totally running their own programs, not million-dollar software packages like SAP on Oracle. In short, the normal "IT" costs of programmers, analysts, support techs, software vendors, don't exist.

    Because when they do, they dwarf the cost of running the server room and power dwindles down to being 10% of 10% of your total IT budget. Which in most companies is 5%-9% of total operating expenditure.

    Wailing about this cost - which springs out on the accounting spreadsheet because it is up a large percentage from last year - leads to classic penny-wise, pound-foolish decisions.

    Perhaps: "we'll use less power if we consolidate a dozen servers down into one big one". A lot of this has been done by IT departments, whom I swear are pining for the days of the mainframe.

    But at least where I work, business didn't move off the mainframe because it was such a high cost per compute cycle - often enough we were increasing our total computing costs to go PC and small server. We did it for the flexibility.

    And loss of flexibility could cost a business big - for want of a paperclip.
