Data Storage Earth Hardware News

Why Waste Servers' Heat? 204

mikejuk writes "A new paper from Microsoft Research (PDF) suggests a radical but slightly mad scheme for dealing with some of the more basic problems of the data center. Rather than build server farms that produce a lot of waste heat, why not have distributed Data Furnaces that heat homes and offices while providing cloud computing? This is a serious suggestion, and they provide facts and figures to make it all seem viable. So when it gets cold, all you have to do is turn up the number crunching ..."
This discussion has been archived. No new comments can be posted.


  • Not new. (Score:5, Interesting)

    by forgotten_my_nick ( 802929 ) on Saturday July 23, 2011 @08:33AM (#36856162)

    This isn't a new idea. Some buildings already do this, and IIRC IBM also marked this as one of their next 5 in 5.

    • And the problem with it in most cases is that servers shrink over time. Every new generation is both smaller and more power-efficient, or they just plain get moved.

      Which means that in 15~20 years you are barely supplying enough heat to overcome heat losses in the system. And the homes and offices have no heating.

      • And the homes and offices have no heating.

        In my office, that would be an improvement.

        In my personal experience, one of the biggest downsides to working in an office of predominantly women (like I am now) is that the vast majority are freezing once the temperature drops below 75F. There are people running space heaters under their desks currently, because we recently got a male office manager who, in his first managerial decision, turned the air down so that all the men weren't sweating down their backs and through their shirts all day long.

        Of cours

        • by Midnight Thunder ( 17205 ) on Saturday July 23, 2011 @09:25AM (#36856430) Homepage Journal

          Some of them are cold in spite of the actual temperature ;)

        • If the men stripped down to thongs, perhaps that would make a case for lower temperatures in the office.

        • by dbIII ( 701233 )
          Fashion does that to an extent - some women want to show off a lot of skin even in winter (nothing wrong with that), or unknowingly follow a trend inspired by that, while a lot of the men are happy to just wear more stuff when it's cold. Then of course there are differences in circulation and body hair, and the people that sit still in a corner all day are more likely to feel the cold.
          Most of the women where I work wear jeans and everyone seems to react the same to whatever the office temperature is. Upstairs
        • by Khyber ( 864651 )

          Plants wilting at 87F is an indicator of being rootbound or poor balance of O2 at the root zone (again caused by root binding limiting the available surface area for oxygen absorption by the roots.)

          Just FYI - put them in bigger pots with more soil.

      • They get more efficient... for the same amount of computing power. However, we ain't stopped needing more/faster computing power recently, quite the opposite.
        I would have expected Microsoft to make a proposal about power plants with computing power, not mere "smarter" buildings though.

      • Re: (Score:2, Funny)

        by ozmanjusri ( 601766 )

        servers shrink over time. Every new generation is both smaller and more power-efficient,

        This is Microsoft research remember.

        Their bloated OSs have kept chip designers busy building faster, more complex CPUs for decades.

      • by fatphil ( 181876 )
        But 20 years ago, we just alt-tabbed between windows, and they just drew themselves as quickly as possible. Nowadays, we (not me, it's a complete abomination, IMHO) want high resolution alphablended 3D wibbly-wobbly animations in order to switch between programs. Pulling a figure out of my arse, that must be about 100x as much work. (The folk interpretation of Moore's Law supports a 57x increase in that period.)

        Likewise, some browsers are now doing web searches in the background with every character you typ
      • And the problem with it in most cases is that servers shrink over time. Every new generation is both smaller and more power-efficient, or they just plain get moved.

        Which means that in 15~20 years you are barely supplying enough heat to overcome heat losses in the system. And the homes and offices have no heating.

        Sigh... Let's think about this, shall we? The reason we currently discard our servers every 3 years is that the savings in operating costs per compute cycle exceed the capital costs of new servers. But these servers have negative operating costs. They will never go obsolete in terms of operating costs per compute cycle. They will only go obsolete when the wall clock time for a calculation becomes undesirably long. For certain kinds of servers (such as ones that are bandwidth starved) the machine will fa

        • Two points. First, it's not certain that the machines will have negative operating costs - reduced operating costs, sure, but using electricity for heating has always been a silly-expensive way to do it (exclusive of certain areas with ridiculously good hydro power). Secondly - 403(b)s? I'm pretty sure those are for educational institutions, non-profits, and the like... not businesses and utilities.
      • by watanabe ( 27967 )

        Nope.

        Energy use per cm^3 has risen dramatically over the last 20 years. By your stated measures it should be dropping. Datacenters no longer have space budgets, they have power budgets; waste heat is one of THE big problems with computers and datacenters right now.

      • by Bengie ( 1121981 )

        Transistor density is doubling every 18 months, while power consumed per transistor is decreasing about 10% every 18 months. See a problem?

        While computers are getting more efficient at doing the same amount of work, their ability to do more work outpaces their ability to use less peak power.

        nVidia said their next-gen GPUs, due out in ~2013, will offer 16 times the gflops at one-eighth the power per gflop. All I see is 2x as much total power. Expect GPUs capable of ~500 watts.

        Intel's trigate tech will temporaril
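
        A rough sketch of the arithmetic behind that "problem" (the doubling-per-18-months density figure and the 10% power improvement are the parent's numbers; the loop is only an illustration):

        # If transistor count doubles every 18 months while power per transistor
        # only falls ~10% per 18 months, total chip power for a fixed die area
        # grows by ~1.8x per generation.
        transistors = 1.0      # relative transistor count
        power_per_tr = 1.0     # relative power per transistor
        for gen in range(1, 5):            # four 18-month generations (~6 years)
            transistors *= 2.0             # density doubling (parent's assumption)
            power_per_tr *= 0.90           # 10% improvement (parent's assumption)
            print(f"after {gen * 18} months: relative chip power = {transistors * power_per_tr:.2f}x")
        # After 72 months the same-sized chip would draw ~10.5x the power, which is
        # why peak power keeps rising even as efficiency per transistor improves.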

      • by afidel ( 530433 )
        That's not true at all for datacenters. Heat density in the datacenter is increasing, so fast in fact that datacenters built just 5 years ago are largely insufficient to power and cool bladecenters. In my datacenter we ran out of power and cooling capacity long before we ran out of physical space to fit everything; for us the saving grace has been virtualization, but in the commercial datacenter sector there is a lot of floorspace going unused because of increasing power density. Some facilities are expandin
      • by Khyber ( 864651 )

        "Which means that in 15~20 years you are barely supplying enough heat to overcome heat losses in the system."

        Thermodynamics is going to set a hard limit on that. There's only so much you can put on one electron; eventually you will have wasted electrons and thus waste heat.

        And, as we get smaller and smaller, we fit more and more in the same space, effectively nullifying the advantage.

        In short, actually reusing the wasted heat will help. Sure, you're not likely to get MUCH back out of it, but any little bit

    • It is not new, but maybe getting a company like Microsoft talking about it will mean people will actually take notice?

      This is also probably why, in colder climates, the server farms should be downtown, where the excess heat can be taken advantage of with the least loss due to distance.

    • ...has done this for decades.
    • by N3Bruce ( 154308 )

      On a seasonal basis, this automatically happens in most homes and small businesses anyway. Heat generated by the servers helps contribute to keeping things toasty in the winter. It is not a reason in itself to not increase their efficiency, though. Fuel such as natural gas or oil burned in an onsite furnace results in 85-95 percent usable heat. The typical electricity generation cycle using a coal, oil, or natural gas boiler is about 33 percent efficient [cleantechnica.com]. Since this heat would be generated anyway, you mi
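
      Putting the parent's comparison into numbers (the 33% and 85-95% figures are the parent's, via the cleantechnica link; Python is used only as a calculator):

      # Heat delivered per unit of fuel burned, using the parent's figures.
      onsite_furnace_eff = 0.90   # midpoint of the 85-95% range for an onsite furnace
      grid_electric_eff = 0.33    # typical thermal power plant, before line losses
      print(f"onsite furnace: {onsite_furnace_eff:.0%} of fuel energy as useful heat")
      print(f"resistive electric heat via the grid: {grid_electric_eff:.0%}")
      print(f"ratio: ~{onsite_furnace_eff / grid_electric_eff:.1f}x in favour of the furnace")
      # ...which is the parent's point: server heat is only "free" because the
      # electricity was going to be spent on computing anyway.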

    • by rcpitt ( 711863 )
      Posted about this in 2008: Electricly Heated Home??? How about Computer Heated? [digital-rag.com]
  • What a novel idea (Score:5, Interesting)

    by suso ( 153703 ) * on Saturday July 23, 2011 @08:34AM (#36856164) Journal

    Nobody's ever thought of that before. I thought this "paper" was going to have some kind of design for a way to do it or something. Actually, recently I've been thinking about the way some barns are constructed, where they have windows at the apex of the roof. I guess that channels the heat up and lets it out, right? Is it possible to put turbines up there that are driven by heat?

  • by AngryDeuce ( 2205124 ) on Saturday July 23, 2011 @08:41AM (#36856194)
    My PC has been doing double duty as a space heater for years.
    • During the winter I don't bother heating most of the house on average days because my computer does a good job. It gets toasty during the summer, but not until the afternoon and it just gives me an excuse to go outside and risk the evil day star's menacing photons.

      But, I think the real problem with doing this on a scale substantial enough to make a difference is really that you have them on all the time and you don't want to have to go running around to a million different server rooms monitoring that they'

      • >Most large buildings don't use central heating for a reason, they've pretty much all got heat exchangers, hydronic pumps

        Isn't hydronic central? It's certainly not a space heater.

    • by akpoff ( 683177 ) *
      No doubt. Geeks have known this for years. During its heyday, one slashdotter referred to the Pentium 4 as a space heater that emits computational products as a side effect.
    • by dodongo ( 412749 )

      Right there with ya. I used to have a P4 furnace but have upgraded to something a little newer and cooler. The GPU and TV Tuner cards have picked up the slack, however.

  • The main issues are efficiency and temperature.

    Sure, when you have something "for free" efficiency is moot. But you would still have to have pumps to transport the heat.

    Hence the 2nd point: temperature. I'm thinking you can have the water around 50C/120F tops by that method. So if you get the water at that temp, pump it out to the offices, how much of your heating needs can be fulfilled there? How much heat will be lost in transport?

    • It might be possible to use a heat pump sourcing heat from the 50C air. That would make it fantastically efficient.
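
      A minimal sketch of why a warm source helps so much, using the ideal (Carnot) heating COP; the 60C delivery temperature and the note that real machines reach maybe 40-50% of Carnot are illustrative assumptions, not measurements:

      def carnot_cop_heating(t_source_c: float, t_delivery_c: float) -> float:
          # Ideal heating COP = T_hot / (T_hot - T_cold), temperatures in kelvin.
          t_hot = t_delivery_c + 273.15
          t_cold = t_source_c + 273.15
          return t_hot / (t_hot - t_cold)

      print(round(carnot_cop_heating(0, 60), 1))    # outdoor winter air  -> ~5.6
      print(round(carnot_cop_heating(50, 60), 1))   # 50C server exhaust  -> ~33.3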

  • Hardly radical. Power stations have done it for years, some other food processing factories have used the heat to warm up greenhouses to grow tomatoes.

    A radical idea would be putting data centers in a cooler climate so they can be cooled more with ambient temperatures.

    • by bz386 ( 1424109 )

      Hardly radical. Power stations have done it for years, some other food processing factories have used the heat to warm up greenhouses to grow tomatoes.

      A radical idea would be putting data centers in a cooler climate so they can be cooled more with ambient temperatures.

      Like Google's Hamina data center, for example? http://www.google.com/datacenter/hamina/ [google.com]

      • by cynyr ( 703126 )

        Most new data centers are trying to do everything they can to use "free" cooling methods: indirect evaporative cooling, air-side economizers, pre-warming the incoming city water for the heaters. Literally everything. They also simulate the cooling systems based on outdoor temps for every hour of every day in an ASHRAE [ashrae.org] standard year for the location of the data center.
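
        As a rough illustration of that hour-by-hour screening (the 18C changeover setpoint and the synthetic temperature series below are invented for illustration, not ASHRAE data):

        import math

        # Count how many hours of a year an air-side economizer could carry the
        # load, given an hourly outdoor dry-bulb series (e.g. from a TMY file).
        def economizer_hours(hourly_temps_c, changeover_c=18.0):
            free = sum(1 for t in hourly_temps_c if t <= changeover_c)
            return free, free / len(hourly_temps_c)

        # crude synthetic year: 15C mean with a 12C swing
        temps = [15 + 12 * math.sin(2 * math.pi * h / 8760) for h in range(8760)]
        hours, fraction = economizer_hours(temps)
        print(f"{hours} free-cooling hours ({fraction:.0%} of the year)")   # ~5085 hours, ~58%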

    • by Lennie ( 16154 )

      In the Netherlands there is already a project with datacenters and greenhouses.

    • by Lennie ( 16154 )

      Building datacenters in different climates doesn't really help.

      The reason companies like Google have so many datacenters isn't just redundancy. The biggest reason is the latency between the user and the server. Datacenters need to be close to the user to get the data to the user quickly.

      That is why many companies use CDNs, and one of the reasons why Google started the SPDY project (because of TCP slow-start, SPDY tries to make HTTP faster by re-using TCP connections).

  • We used to have a minimal heating bill in the winter back when we kept a few racks of servers on-site. Our gas bill has gone up substantially as we've moved to virtualization.

  • by pfafrich ( 647460 ) <rich&singsurf,org> on Saturday July 23, 2011 @08:51AM (#36856242) Homepage
    Combined heat and power [wikipedia.org] (CHP) schemes are increasingly common, using the waste heat from some process to provide district heating. Temperatures from a server farm might be a bit on the low side, but it changes the situation when you look at the heat as a resource to be used rather than a waste item.
    • As good as all this is, I would like to see smarter server farms. For a load-balanced system there would be hot, warm and hibernate states. All this would be linked to a master controller that would rev the machines up and down as needed. Also, each server would be mini-ITX form factor, to pack more in, and designed to reach a certain maximum threshold before bringing on supporting systems.

      I am also wondering whether, in a hosted environment, there would be a way to give each customer a virtual machine

      • by h4rr4r ( 612664 )

        VMware does this already with power management, which is the same as heat in the end. It is something Citrix Xen also does, and it should not be very hard to get KVM to do it.
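
        A toy sketch of the hot/warm/hibernate idea from the grandparent (DPM-style consolidation); the per-host capacity and headroom figures are made-up illustration values, not anyone's product defaults:

        import math

        def hosts_needed(load_units: float, per_host_capacity: float = 10.0,
                         headroom: float = 0.25) -> int:
            # Keep enough hosts "hot" to cover current load plus headroom.
            return max(1, math.ceil(load_units * (1.0 + headroom) / per_host_capacity))

        def plan(total_hosts: int, load_units: float) -> dict:
            awake = min(total_hosts, hosts_needed(load_units))
            return {"hot": awake, "hibernating": total_hosts - awake}

        print(plan(20, 35.0))   # {'hot': 5, 'hibernating': 15}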

  • Damn it, NONE of us in the USA are in any mood to talk about heating our damn houses or buildings. It's 45 degrees centigrade. Can't we just save this discussion for a couple of months? It's not like it's a new idea or anything.
    • by cynyr ( 703126 )

      HEAR, HEAR! And that is in the great white north that is Minnesota!

      • You guys have to move to Alaska. It's a nice, comfortable 55 degrees F. And my rendering cluster (a pair of old dual xeons) is making the basement nice and comfy. The Lab is currently sleeping under the rack that the computers are on because the heat is deflected downward.
        • You guys have to move to Alaska. It's a nice, comfortable 55 degrees F.

          You and I may be polar opposites... I've just been thinking that the Southern California coastline is too damn cold. Sure, the beach is close by, but when the temperature is usually 70 degrees, who wants to get in the water? Sure, there are the occasional 90F days, but even that's just warm.

          I think I need to move back out to the desert. Going for a hike in 120F temperatures is just more my speed. The fact that land an

    • "It's 45 degrees centigrade."

      Visiting foreigner?

    • Liar, no American would ever refer to it as 45 degrees centigrade, you impostor. For one thing it's 45 degrees Celsius, and for another thing nobody in the US knows what that number means.

  • The least expensive DFs will use the existing home broadband connection

    We have caps around here. And so will the rest of you, when the telecoms companies get their way.

    • by tgd ( 2822 )

      How much data do you really think something like that will move around?

      Netflix and bittorrent, and an American subculture of people who seem to need 10 hours of video entertainment a day are why caps are a problem.

      Business servers and people who go outside tend to not run into caps.

  • by julf ( 323835 ) on Saturday July 23, 2011 @09:13AM (#36856356)

    http://www.guardian.co.uk/environment/2010/jul/20/helsinki-data-centre-heat-homes

  • by bradgoodman ( 964302 ) on Saturday July 23, 2011 @09:19AM (#36856386) Homepage
    Why limit the conversation to just servers, when this occurs everywhere in common life?

    Why does my refrigerator take heat out of the inside, and dump it into my house - requiring my A/C to then take it and again put it outside?

    Why does my A/C in a house take all the heat and discharge it outside into the atmosphere, while a pool heater is running 5 feet away, using energy to generate more heat for the pool?

    Why do people call incandescent light bulbs "energy wasters", when they can (in the cooler months) defray the work needed to be done by a household heating unit?

    Why does the pizza place down the street run its heater in the winter yet have these giant metal exhaust ducts running from its pizza ovens, venting heat to the outside world? (Why no fins/blowers on these ducts to disperse heat into the pizza joint?)

    The point is - people think of heating and cooling on a "unit" basis - and not on a systemic basis of an overall building - or even area. HVAC systems in buildings get this - sort of - they are not single machines but a system of different, interconnected machines, each performing different tasks - sort of like organs in a human body. This approach needs to be thought of everywhere cooling is required and/or heat is generated.

    • by kanweg ( 771128 )

      And why does the A/C try to release heat into the atmosphere at a moment when it is already hot (instead of dumping it into a cold buffer it prepared during the night), i.e. at a time when it is the hardest to get rid of that heat?

      And why does the A/C try to release heat at a time when electricity demand is already at peak level (instead of during the night)? It would save money on building power plants (and lower electricity bills) if they didn't.

      For a fraction of the defense budget, Americans could have

      • And why does the A/C try to release heat into the atmosphere at a moment when it is already hot (instead of dumping it into a cold buffer it prepared during the night), i.e. at a time when it is the hardest to get rid of that heat?

        A "cold buffer"? You know of a practical way to produce this "cold buffer"? Freeze a ton of water, then efficiently release the heat from same? Easier said than done.

        And why does the A/C try to release heat at a time when electricity demand is already at peak level (instead o

    • by fph il quozientatore ( 971015 ) on Saturday July 23, 2011 @09:50AM (#36856606)
      Still, it has always struck me as peculiar that in the wintertime people spend energy to heat the kitchen up to 20-25 C, and inside it there is a little fridge working as hard as it can to bring the temperature back to exactly the same value as outside.
      Not to mention that this refrigerator is typically located just next to the electric cooker...
      • Yes, exactly! I'd thought of this - forgot to mention it in my post!
      • by ColdWetDog ( 752185 ) on Saturday July 23, 2011 @10:06AM (#36856742) Homepage

        Still, it has always struck me as peculiar that in the wintertime people spend energy to heat the kitchen up to 20-25 C, and inside it there is a little fridge working as hard as it can to bring the temperature back to exactly the same value as outside. Not to mention that this refrigerator is typically located just next to the electric cooker...

        Convenience and cheap energy. For residential buildings, the money saved generally doesn't amount to enough to support the infrastructure required to transfer and control heat. However, in larger buildings, this sort of thing is rather normal. In theory, you could make smaller units for the house that would take hot air from the refrigerator and dump it into the living room in the winter or preheat the water for the hot water heater, but the ducting involved would either be rather ugly or have to be built in to the house. Wait until heating / cooling gets really expensive, then the savings might justify the hassle.

        The other big problem is that we're not talking about a lot of heat. Put your hands on the back of a modern refrigerator - it's warm, not hot. To move energy with low heat values gets harder (read bigger ducts / fans) and less worthwhile. Put your hands on the exhaust of a city sized natural gas fired thermal power plant and you've got some significant BTUs pumping out - it then becomes worth your while to do something with it.

    • Why limit the conversation to just servers, when this occurs everywhere in common life?
      Why does my refrigerator take heat out of the inside, and dump it into my house - requiring my A/C to then take it and again put it outside?

      Why does my A/C in a house take all the heat and discharge it outside into the atmosphere, while a pool heater is running 5 feet away, using energy to generate more heat for the pool?

      Mostly because trying to use small temperature differences is difficult; solutions tend to be

    • Why do people call incandescent light bulbs "energy wasters", when they can (in the cooler months) defray the work needed to be done by a household heating unit?

      Because they are energy wasters. The furnace in your house is FAR more efficient at producing heat than the incandescent bulbs. It's not even a close comparison. Your basic point about how we should be using (and re-using) waste heat as much as possible is a good one but that isn't a reason to use energy wasting technologies for their by-products.

      I do love the idea of using waste heat in useful ways but let's not generate waste heat on purpose.

      • So, silly question. Where does the energy "go"?

        My impression was that a 100W bulb uses X watts to generate light, and the rest is "wasted" as heat. So, in fact - whereas my furnace wastes some energy turning the blower motor, some energy running the thermostat (albeit very little), some energy as heat which is lost in the chimney ductwork in my basement, and even more as the heated exhaust is expelled through the roof - couldn't it be said that a light bulb expends *all* of its energy in either

        • by hackertourist ( 2202674 ) on Saturday July 23, 2011 @01:39PM (#36858178)

          The 'aside' you mention is actually the main point. Even the most efficient power plants top out at 60% efficiency. Assuming your house is heated with gas, not electricity, this means that the light bulb is slightly over half as efficient as your gas furnace.
          For a recent (less than 15 yo) gas furnace over here (.nl) efficiency is in the 95%+ range, thanks to government incentives towards more efficient systems. Dunno about the US situation.

          Also, heat and light needs don't overlap: you'll be running those same lights in the summer, when your AC will be running overtime to pump out the excess heat from the damn bulbs.

          Finally, the effect of light bulbs on heating is negligible. My central heating is rated at 25 kW, and I have 13 light fixtures. If I installed 100W incandescent lights everywhere I would generate ~1300W in heat, or 5% of the peak capacity I need.
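
          Spelling that arithmetic out (every figure here is from the posts above: 60% best-case plant, 95% condensing furnace, 13 fixtures at 100 W, 25 kW central heating; Python is just the calculator):

          plant_eff = 0.60        # best-case electricity generation efficiency
          furnace_eff = 0.95      # modern condensing gas furnace
          print(f"bulb heat per unit of fuel vs. furnace: {plant_eff / furnace_eff:.0%}")   # ~63%
          # With a more typical plant and grid transmission losses included, that
          # ratio drops toward the "slightly over half" quoted above.

          bulb_heat_w = 13 * 100  # every fixture fitted with a 100 W incandescent
          heating_w = 25_000      # rated central heating capacity
          print(f"bulbs cover {bulb_heat_w / heating_w:.1%} of peak heating demand")        # ~5%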

    • Why does my refrigerator take heat out of the inside, and dump it into my house

      Because venting a refrigerator limits where you can place a refrigerator and still keep the costs within reason. It adds weight, bulk and complexity to the refrigerator itself.

      Your central heating and A/C have efficient ductwork, fans, etc., for use year-round. How much does a 12 to 16 cubic foot refrigerator or freezer add to the load?

      Why does my A/C in a house take all the heat and discharge it outside into the atmosphere, while a pool heater is running 5 feet away, using energy to generate more heat for the pool?

      Chances are good the pool won't be a bare five feet away and was sited for precisely the opposite reasons that you sited the A/C: You wanted the A/C shaded and the pool in th

    • Why does my A/C in a house take all the heat and discharge it outside into the atmosphere, while a pool heater is running 5 feet away, using energy to generate more heat for the pool?

      Not everyone has a pool to heat. I'd say less than 10% of the North American population has one. If it's hot enough for the A/C to be turned on, turn it off and go hang out in the cold pool instead of trying to heat the pool.

      Why does the Pizza place down the street run their heater in the winter yet has these giant metal exhaust ducts running from their pizza ovens, venting heat to the outside world? (Why no fins/blowers on these ducts to disperse heat into the pizza-joint?)

      Trust me, you don't need to blow it into the dining area. They are hot enough on their own. The ventilation makes it bearable.

      Why do people call incandescent light bulbs "energy wasters", when they can (in the cooler months) defray the work needed to be done by a household heating unit?

      I live in an apartment building which provides hot water/steam baseboard heat included in my rent. CFLs save me money because I pay for my power. Every

    • Why do people call incandescent light bulbs "energy wasters", when they can (in the cooler months) defray the work needed to be done by a household heating unit?

      Because the power source for a lightbulb is electricity. That means heat from a light bulb can never be more efficient than that from an electric heater. And electric heaters are much more costly than something like natural gas.

      It's probably more cost efficient to get CFLs and let your gas furnace produce that little extra heat that your incandescen

    • Why do people call incandescent light bulbs "energy wasters", when they can (in the cooler months) defray the work needed to be done by a household heating unit?

      Because resistive electric heating is the least-efficient method in common use. A natural gas, propane, or diesel furnace is much more efficient and economical. For fully-electric heating, a heat-pump is much more efficient, is just about a free add-on if you need an air conditioner anyhow, etc. And throwing geothermal in, along with a heat-pump

  • by 1sockchuck ( 826398 ) on Saturday July 23, 2011 @09:21AM (#36856402) Homepage
    As others have noted, there are many good examples of data centers reusing waste heat. Here's a list of examples [datacenterknowledge.com] of server heat being recaptured to warm homes, offices, greenhouses and even swimming pools. This is common enough that The Green Grid recently released guidelines on the best way to integrate heat recapture into key efficiency metrics like PUE (Power Usage Effectiveness).
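
    For reference, a minimal sketch of the two metrics in play: PUE is total facility energy over IT energy, and The Green Grid's companion reuse metric ERE (Energy Reuse Effectiveness) credits heat exported for reuse. The kWh values below are invented for illustration:

    it_energy = 1000.0      # kWh consumed by the servers themselves
    total_energy = 1500.0   # kWh for the whole facility (IT + cooling + power + lights)
    reused_energy = 300.0   # kWh of waste heat exported to heat a building

    pue = total_energy / it_energy
    ere = (total_energy - reused_energy) / it_energy
    print(f"PUE = {pue:.2f}")   # 1.50
    print(f"ERE = {ere:.2f}")   # 1.20 -- reuse lowers ERE but, by definition, not PUE
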
  • by airfoobar ( 1853132 ) on Saturday July 23, 2011 @09:28AM (#36856450)
    It was a sofa. [google.co.uk]
  • nothing more to say about this.

  • I realized this at the end of winter when I had 8 high-power GPUs running in my condo mining Bitcoins, and my central heating was not running anymore. You put your hand behind one of the quad-GPU computers on full load, and it feels like a blowdryer, running 24 hours per day. Seemed to have no problem heating 1200 sqft. This seems to apply to GPUs more than anything, though. I don't know how many CPU servers can produce 1.5 kW of heat...
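
    A rough answer to that closing question (the ~300 W average draw per loaded dual-socket server is an assumption, not a measurement):

    gpu_rig_heat_w = 1500   # the 1.5 kW of GPU heat mentioned above
    per_server_w = 300      # assumed average draw of a loaded dual-socket server
    print(f"~{gpu_rig_heat_w / per_server_w:.0f} CPU servers to match 1.5 kW")   # ~5
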
  • By increasing the number-crunching we increase the power requirements of the CPU and the energy in, and heat out.
    This much is self-evident.

    However, is number-crunching work? If so, how much work?
    Feeding n watts into a system and crunching math at m mips. Does that leave us with n-k*m power out where k is some magic constant not necessarily unrelated to m already and varying with the machine?
    Do you then want a low k-factor or will they simply increase the power in so that you don't freeze?
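
    To the question above: for all practical purposes, yes, all n watts come back out as heat, because the thermodynamic floor on the "work" of computing is absurdly small. A minimal sketch comparing that floor (Landauer's limit) with an assumed ~1 nJ per instruction for real hardware (an order-of-magnitude assumption, not a benchmark):

    import math

    k_B = 1.380649e-23          # Boltzmann constant, J/K
    T = 300.0                   # room temperature, K
    landauer_j = k_B * T * math.log(2)   # minimum energy per irreversible bit operation
    cpu_j_per_op = 1e-9                  # assumed energy per instruction on real hardware

    print(f"Landauer limit: {landauer_j:.2e} J per bit")            # ~2.87e-21 J
    print(f"assumed CPU op: {cpu_j_per_op:.0e} J")
    print(f"ratio: ~{cpu_j_per_op / landauer_j:.0e}x above the floor")
    # So the k*m term is negligible: to a very good approximation the server is a
    # 100%-efficient electric heater that happens to compute as a side effect.
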
  • The idea, I think, is that these servers are your cloud infrastructure, rather than being used for any local purpose.
    Imagine a future where computer technology is a bit more stable than it is now, so a server has a 10 year or so useful life before becoming obsolete. Also you have fibre to every apartment building or office block.

    Now, you want to convert some electricity into heat for whatever reason. So you buy/rent a "brick" of servers of suitable size, probably an all-solid state affair with no moving part

  • Just the way the house got laid out, unfortunately. I had to switch to a more thermally efficient system just so the heater would actually run in the winter time.
  • This is similar to power plants using waste heat, in the form of steam exiting the turbine(s), to heat local neighborhoods. It makes sense to recycle the heat, but in the summer you're back to trying to get rid of it.

    I think it's possible that the cooling water from such a system would be hot enough to operate a turbo expander [wikipedia.org] type power generation system. You could attempt to turn some of the waste heat back into electricity. That would be useful year-round, and in hotter climates where heating is unnecessary.
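
    A quick sense of why that only recovers a little (the 55C coolant and 25C ambient temperatures are illustrative assumptions; real organic-Rankine/turbo-expander systems capture only a fraction of the Carnot ceiling):

    def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
        # Maximum fraction of heat convertible to work between two temperatures.
        t_hot = t_hot_c + 273.15
        t_cold = t_cold_c + 273.15
        return 1.0 - t_cold / t_hot

    print(f"{carnot_efficiency(55, 25):.1%}")    # ~9% ceiling for warm server coolant
    print(f"{carnot_efficiency(500, 25):.1%}")   # ~61% ceiling for power-plant-grade heat
    # Which is why, at these temperatures, direct heating is usually the better use.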

  • Last year, I entered a Microsoft-sponsored contest that promoted home energy efficiency. You had to demonstrate how you had improved your home's efficiency, and explain what you would like to improve. (The prize was a few thousand dollars to Home Depot, IIRC.)

    Among other things, I demonstrated that in my basement, where I had recently added insulation and replaced the old windows with new energy efficient ones, I have a home office with no heating in it. Yet just leaving my server on and at full processo

  • "Your honor, since even on Windows the latest crop of computers has become way too energy-efficient to heat the dorm halls, that clustered brute-forcing of the **AA master key by our CS class must have been a completely unintended by-product of their attempts to survive the winter term without freezing..."
  • Thanks to my 2 computers with i7 CPUs, my 2 monitors, and my big-ass 1080p TV, I never turn the heat on in my apartment (studio, live in Seattle). In fact, I usually have my windows open all year round. I don't pay for heat. Sure, during the summer it can suck a bit, mainly if we hit over 90, and if no air is blowing it can stay too hot at night, but hell, I don't pay for heating.

    I do though, seem to have a nice high electric bill.

    • You probably can't do this in a studio, but I got a Power monitor [amazon.com]. Literally paid for itself in the first week. It is hot as shit this year in VA; however, my electric bill is only about $200, down from $300 last year. I adjusted the settings based on the meter. And guess what, the A/C works better at these new settings. For your studio, you can get a Kill-a-Watt and see which of your computers is drawing the most juice. You have those nice cool summers in Seattle, so buy a window fan to get some that
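
      A quick way to turn a Kill-a-Watt reading into a monthly dollar figure (the 400 W draw and $0.12/kWh rate below are illustrative assumptions, not anyone's actual bill):

      def monthly_cost_usd(watts: float, usd_per_kwh: float = 0.12, hours: float = 24 * 30) -> float:
          return watts / 1000.0 * hours * usd_per_kwh

      print(f"${monthly_cost_usd(400):.2f}/month")   # ~$34.56 for a 400 W box running 24/7
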
  • In summer of 1981, I worked for Charter Information in Austin TX.

    They ran a Xerox Sigma 6 (nice machine for the day). They'd moved down from Woburn MA a few years earlier. When they'd set up their offices in Woburn, they'd run a duct from the computer cooling air exhaust to the building HVAC ducts. They reported that they didn't have to start their oil burners at all during the Massachusetts winter: the waste heat from the Sigma was enough to heat their entire office suite.

    That was over thirty years ago.

    You

  • http://groups.google.com/group/openmanufacturing/msg/4298e48e35b7efc0?hl=en [google.com]
    "Now, there are probably lots of ways you could do this [grow cell cultures as agricultural liquids like orange juice] that you know more about that I. But here is what I envision for home use (as opposed to big industrial use).
    You have two versions. One is for outdoors, and is a big machine you set up in you yard with a glass top that has photosynthesizing algae that either produce the liquids directly or produce some

  • My boiler broke this past winter and I had to run 2 space heaters for a week until it was fixed.

    Let me tell you, electrical heat is super expensive. Those space heaters for that week cost more than the 1955 natural gas boiler did in a month.

  • by AlejoHausner ( 1047558 ) on Saturday July 23, 2011 @08:10PM (#36860068) Homepage
    Ingredients:
    1 lb of beef chuck, chopped into 1-inch pieces
    2 cloves garlic, minced
    1 onion, chopped
    2 tbsp oil for frying
    one bay leaf
    salt and pepper to taste
    water
    Directions:
    1. Attach a large pan directly to the server CPU with heatsink compound, and brown the beef, a few pieces at a time, to avoid steaming them. Set aside.
    2. Detach the pan from the CPU by about 5 mm, and sauté the onions until golden brown, about 5 minutes.
    3. Add the garlic, sauté 1 minute.
    4. Add beef, salt and pepper, bay leaf, and water to cover.
    5. Place pan over 1kW multi-GPU exhaust, and simmer two hours, or until meat is tender.
  • At two of our locations we have Mitsubishi R2 heat pump systems that are capable of running both heat/cool modes simultaneously.

    The beauty? The waste heat that is removed from my server room in the winter via cooling mode is exchanged within the system and makes the heat side more efficient, transferring that heat energy to the rest of the building.

    So instead of paying for the electricity to turn it into bits and bytes, and then paying AGAIN for even MORE electricity to move that waste heat to the roof vi

  • There's a company out there selling 100kW gas turbines where the waste heat is used to power absorption chillers, a complete datacenter solution without reliance on grid power.
  • My room upstairs can go up to 90F in the Los Angeles area. :(

  • The summary says "this is a serious...", which always strikes me as odd. Despite the ubiquity of laugh tracks, jokes shouldn't need an introduction; if you don't get it, you probably don't know you don't get it. In the absence of something absolutely ridiculous, phrases like 'this is serious' stand out as an indicator of 'this is quite stupid'.

    Imposing the word 'work' has a similar implication. The telephony system didn't need to superimpose 'work' into its terminology to assert that telephones a
