Data Storage

The Data Dome: A Server Farm In a Geodesic Dome

1sockchuck writes In a unique approach to data center design, the new high-performance computing center in Oregon is housed in a geodesic dome. The new facility at the Oregon Health and Science University requires no mechanical air conditioning, using outside air to cool racks of servers reaching densities of 25kW per cabinet. The design uses an aisle containment system to separate hot and cold air, and can recirculate server exhaust heat to adjust cold aisle temperatures in the winter. It's a very cool integration of many recent advances in data center design, combining elements of the Yahoo Chicken Coop and the server silo in Quebec. The school has posted a virtual tour that provides a deep technical dive.
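The winter recirculation described above is essentially a mixing-damper calculation: blend enough hot-aisle exhaust into the incoming outside air to keep the cold aisle at its setpoint. Here is a minimal sketch of that idea in Python; the function name, setpoints, and temperatures are illustrative assumptions, not details of OHSU's actual controls.

```python
# Minimal sketch of the recirculation idea: mix hot-aisle exhaust back into
# cold outside air until the supply (cold-aisle) air reaches a target
# temperature. All names and setpoints here are illustrative assumptions.

def recirculation_fraction(outside_c: float, exhaust_c: float, target_c: float) -> float:
    """Fraction of recirculated exhaust air needed so the mixed supply air
    reaches target_c, using simple linear mixing; clamped to [0, 1]."""
    if exhaust_c <= outside_c:
        return 0.0  # recirculation can't warm the supply if exhaust isn't warmer
    fraction = (target_c - outside_c) / (exhaust_c - outside_c)
    return max(0.0, min(1.0, fraction))

if __name__ == "__main__":
    # e.g. a 5 C winter morning, 35 C hot-aisle exhaust, 18 C cold-aisle target
    f = recirculation_fraction(outside_c=5.0, exhaust_c=35.0, target_c=18.0)
    print(f"recirculate about {f:.0%} exhaust air")  # ~43%
```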
  • by Mr D from 63 ( 3395377 ) on Monday August 18, 2014 @03:55PM (#47698341)
    I wonder if they have any issues with moisture from constantly cycling in outside air? It's being heated, so I guess it won't condense, but it still seems like it could be a concern over the long term. Is the air filtered? Particulates would be another concern, or would they just do some sort of cleaning?
    • Re:Moisture? (Score:5, Informative)

      by Dan East ( 318230 ) on Monday August 18, 2014 @04:01PM (#47698367) Journal

      In the video the narrator specifically states that the incoming air is filtered.

    • Re:Moisture? (Score:4, Informative)

      by JWSmythe ( 446288 ) <jwsmythe@@@jwsmythe...com> on Monday August 18, 2014 @04:22PM (#47698507) Homepage Journal

      You can take a look at their official page. http://www.ohsu.edu/xd/about/initiatives/data-center-west.cfm [ohsu.edu]

      The tour video and text talk about plants outside providing some filtering. The video, around the 3-minute mark, shows additional filtering inside.

      I suspect prevailing winds will really screw with the site cooling.

      The "Virtual tour" has more details than the rest. Nothing about humidity.

      Their security seems odd. They talk about the security being very strict. The video shows the inside of each "pod" to be open to the common hot air area in the upper part of the roof. So they have security, but you can get around it by not going through the doors. {sigh}

      I never got the idea of sticking square boxes in a round hole. They're wasting a lot of good real estate by leaving all that extra space between the servers.

      It seems like it was drawn up with an ideal world in mind, which usually doesn't translate well to the real world.

      • I never got the idea of sticking square boxes in a round hole. They're wasting a lot of good real estate by leaving all that extra space between the servers.

        What you call "wasted space" I call "ventilation".

        Also not factored in is how much space traditional HVAC equipment takes up in a normal data center.

        Just the fact that this kind of building doesn't have the same power drain as HVAC facilities means you could have one in more places than a "normal" data center.

        • As described, after looking at their materials, I don't see an advantage to the radial design over a grid design. There is nothing about it that would improve airflow, and it leaves huge underutilized areas.

          On the other hand, a traditional grid design optimizes the space, and it would still allow for the same airflow.

          It's not a matter of being round, or having dead space, it's simple things we teach children. Square boxes don't fit through round holes. Round objects don't stack optimally.

          One of the Equi

          • I guess if they have money to burn and real estate to waste

            Domes are cheaper to build and use less material than a traditional building, since you need no load-bearing walls (although the design they had was not a full dome).

            You are simply handwaving away the significant energy savings this design brings to the table.

            I've lived in a dome before; the round walls do not waste THAT much space. And you need room to move around equipment anyway.

            • Did you look at their floorplan? There are huge wedge shaped gaps.

              Or let's do the math. For the sake of argument, let's say that the diagram in their virtual tour is to scale. We're also going to say that each rack is a standard 19" rack, taking up 22" each. That can be wrong, but it's what I'm using for measurement.

              The entire circular structure has an area of 24,052 sq ft.
              A square structure on the same property would be 30,625 sq ft.
              The circular structure wastes 6,573 sq ft.
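For what it's worth, those figures are self-consistent: a 24,052 sq ft circle corresponds to roughly a 175 ft diameter, and a square covering the same footprint comes out near 30,600 sq ft. A quick sanity check in Python (the circle area is taken at face value from the comment; the rest is geometry):

```python
import math

# Sanity check of the footprint figures quoted above.
circle_area = 24_052.0                            # sq ft, as stated above
diameter = math.sqrt(4 * circle_area / math.pi)   # ~175 ft
square_area = diameter ** 2                       # square lot of the same width
print(round(diameter), round(square_area), round(square_area - circle_area))
# -> roughly 175, 30624, 6572 -- matching the 30,625 / 6,573 figures above
#    to within rounding of the diameter
```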

              Each pod, with a 3' buffer on ea

    • Beaverton, Oregon, is near Portland, and it can get very humid there, so adding the extra humidity inherent in evaporative cooling could be a problem. Of course, since there is no air conditioning, you don't have a condensation problem. I looked at some other sites that use evaporative cooling to cool servers, and the systems only need to cycle on for ten minutes in thirty, even in 100F weather, so you could use the fans to mitigate humidity to some extent.

      However, that being said, I don'
      • since there is no air conditioning, you don't have a condensation problem

        No, it's not HVAC-induced condensation. Meteorologists call it the dew point.

        Right at this moment, the temp is 53.3F with a relative humidity of 78%. The dew point is 47F.

        You're supposed to run a datacenter between 40% and 60% relative humidity. Without a system in place to dry the air, they're asking for corrosion on parts.
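For anyone who wants to reproduce that dew point figure, the Magnus approximation gets you from the quoted temperature and relative humidity to about 47F. A rough sketch (the constants are the common Magnus-Tetens values; this is an estimate, not a calibrated psychrometric calculation):

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point via the Magnus formula."""
    a, b = 17.62, 243.12  # common Magnus-Tetens constants
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

temp_c = (53.3 - 32) * 5 / 9          # 53.3 F -> ~11.8 C
dp_f = dew_point_c(temp_c, 78.0) * 9 / 5 + 32
print(round(dp_f))                    # ~47 F, matching the figure quoted above
```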

        You can't say computers are corrosion proof either. When I worked in a computer store, we had computers come

        • Now there's a justification for me to keep the air conditioning on in my house all summer long. Wouldn't want to corrode the laptops . . . Thanks for the info.
          • It's good for your house too. I've seen houses where the homeowner never ran their A/C and they were proud that they saved money. They also had problems with mold, paint peeling, drywall falling apart, and various wood things in their house warping.

            At one place I lived, there were ceiling fans throughout the house, which was nice. There were also some on our back porch. The ones inside stayed in almost original condition. The ones outside had rust on the metal parts, and the blades warped.

            But this wa

  • Every time the video showed a door, the narrator had to say that the door is locked. I get it. Doors can be locked. It just seemed there was an agenda in the video to point out to some specific audience the trivial and standard physical security involved, as blatantly obvious as that should be to anyone.

    • Did you notice that he talked about the doors to the warm side? Controlled and logged access. And just a couple seconds later he says the top of the pods are all open to the common upper area. I'd hope they'll have something in the way, but I doubt it would be anything that bolt cutters (or just tin snips) and a few minutes would have a problem with.

        • What he didn't say: this data center is actually in a primate research lab, and the entire campus is surrounded by a 20-foot-high electrified fence with mesh so tight it is difficult to scale.

          Plus the entire place is covered in surveillance cameras (every fence pole has a cluster of several, that sort of thing). I suspect you could leave the doors unlocked and it would probably still be more secure than many data centers you read about.

        No I don't work for OHSU, but I live close to this campus.

        • I should make an obligatory reference to Jurassic Park. :)

          I was guessing, from the fact that they have employees accessing the building and parking lots, that it was a facility with some sort of access control.

  • Sounds like the data center of the future, circa 1975. I wouldn't mind working in it, though I do wonder how they control humidity. Lack of cooling may work well in their area, but I see problems in hotter/more humid places.
    • by hey! ( 33014 )

      Oh, come on; everything's more futuristic in a geodesic dome.

      • by jd ( 1658 )

        What about a TARDIS?

        • TARDIS racks are great! I can get 932U of equipment in the space of a normal 42U rack, and I get the results before I enter my data!
          • TARDIS racks are great! I can get 932U of equipment in the space of a normal 42U rack, and I get the results before I enter my data!

            I don't need a rack of equipment, I just use the spare cycles in my screwdriver and let it cogitate for 400 years on the problem. Sometimes linear programming works better than parallel.

      • Oh, come on; everything's more futuristic in a geodesic dome.

        Pyramids used to be the future.

    • 1955. The Manchester Computing Centre was designed to be one gigantic heat sink for their computers in the basement, using simple convection currents, ultra-large corridors and strategically-placed doors to regulate the temperature. It worked ok. Not great, but well enough. The computers generated enormous heat all year round, reducing the need for heating in winter. (Manchester winters can be bitingly cold, as the Romans discovered. Less so, now that Global Warming has screwed the weather systems up.)

      The design that Oregon is using is several steps up, yes, but it is basically designed on the same principles and uses essentially the same set of tools to achieve the results. Nobody quite knows the thermal properties of the location where Alan Turing built the Manchester Baby; the laboratory was demolished a long time ago. Bastards. However, we know where his successors worked, because that's the location of the MCC/NCC. A very unpleasant building, ugly as hell, but "functional" for the purpose for which it was designed. Nobody is saying the building never got hot - it did - but the computers didn't generally burst into flames, which they would have done if there had been no cooling at all.

      • http://en.wikipedia.org/wiki/D... [wikipedia.org]
        "The Dymaxion was completed in 1930 after two years of development, and redesigned in 1945. Buckminster Fuller wanted to mass-produce a bathroom and a house. His first "Dymaxion" design was based on the design of a grain bin. ... The Siberian grain-silo house was the first system in which Fuller noted the "dome effect." Many installations have reported that a dome induces a local vertical heat-driven vortex that sucks cooler air downward into a dome if the dome is vented pro

  • In OEM specs? (Score:4, Interesting)

    by mlts ( 1038732 ) on Monday August 18, 2014 @04:07PM (#47698415)

    Where the rubber meets the road is whether the machines stay within the temperature and humidity specifications for the equipment, so warranties are not voided.

    If this is workable, even during the winter or when it is extremely rainy/humid, this might be a useful idea. However, there is only a limited set of climates this would work in. The PNW, with its moderate temperatures, makes sense for this. If I attempted to do the same thing in Texas, come summertime, I'd have a building full of BBQ-ed servers.

    • by raymorris ( 2726007 ) on Monday August 18, 2014 @04:20PM (#47698499) Journal

      In Portland, it's reasonably cool MOST OF THE TIME.
      Temperatures reach or exceed 90 F (32 C) on 14 days per year and reach or exceed 100 F (38 C) on 1.4 days per year on average.

      I'm thinking this project will last about 350 days.

      • You then need an identical data center in South America, and switch which one you use every half year. The cloud.
      • by Anonymous Coward

        From the article:

        "When outside air temperature is too warm for free cooling, the data center’s adiabatic cooling system kicks in automatically to help out. Beaverton, Oregon (where the facility is located), experienced some 100 F days recently, and the evaporative-cooling system cycled for about 10 minutes at a time at 30-minute intervals, which was more than enough to keep supply-air temperature within ASHRAE’s current limits. Gleissman said he expects the adiabatic cooling system to kick in se

        • > From the article:
          > "When outside air temperature is too warm for free cooling, the data center’s adiabatic cooling system

          Which is funny, because the word adiabatic describes a process that neither gives up heat to, nor draws heat from, its surroundings.
          An adiabatic system would cool the building by drawing a vacuum, sucking all of the air out of the building. The decreasing air pressure would lower the temperature for a few minutes. Since you can't keep lowering the air pressure below absolute vacuum
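To put rough numbers on that thought experiment: for a reversible adiabatic expansion of an ideal gas, T2 = T1 * (P2/P1)^((γ-1)/γ). The pressure ratio below is an arbitrary illustrative value, just to show the size (and short-lived nature) of the effect:

```python
# Textbook adiabatic expansion of dry air (treated as an ideal diatomic gas).
gamma = 1.4                      # heat capacity ratio for dry air
t1_k = 295.0                     # ~22 C room air, in kelvin
pressure_ratio = 0.5             # drop to half an atmosphere (arbitrary example)
t2_k = t1_k * pressure_ratio ** ((gamma - 1) / gamma)
print(f"{t2_k - 273.15:.1f} C")  # roughly -31 C, until heat leaks back in
```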

      • The article explicitly states that when the temperatures REACHED (as in, already happened) 100+, the water cooling units designed for that very purpose cycled on for 10 minutes out of every thirty to keep the incoming air within tolerance for cooling the servers.

      • by jd ( 1658 )

        Portland is cool, yes. But that's mostly down to the bookshops and tea shops. Temperature-wise, it doesn't get "hot" per se, but it does get very humid. And the air is horribly polluted. Spent the time moving up there reading about dangerously high levels of mercury in the air, the deadly pollutants in the river, the partially dismantled nuclear reactor and highly toxic soil (the reactor has since gone, the soil remains drenched in contaminants), the extremely high levels of acid rain due to excessive cars

        • When you say 'high levels of mercury' are you sure you weren't confusing air pollution with printed page pollution? :) Not sure which is more dangerous to consume, best to avoid both

          What part of NE Portland were you living in anyhow?

          • by jd ( 1658 )

            Southwest. Above Jake's Seafood (which, incidentally, serves terrible food at absurd prices for the benefit of the ambiance of an old brick building).

      • There are 8760 hours in a year. If, as you say, you exceed 32C on 14 days per year, you don't exceed 32C for the full 24-hour block; instead you may be over 32C from late morning through to late afternoon - call that 11AM to 5PM, or 6 hours. That means a total of 84 hours out of the year where you have to run active cooling systems, which is approximately 1% of the year. If you can also specify that any hardware installed is certified at the upper limit of the ASHRAE standards then your thermal window increases
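Spelled out, the arithmetic in the comment above (the 6-hour window is the commenter's assumption, not measured data):

```python
hours_per_year = 365 * 24        # 8760
hot_days = 14                    # days per year at or above 32 C (90 F)
hot_hours_per_day = 6            # assumed late-morning to late-afternoon window
hot_hours = hot_days * hot_hours_per_day
print(hot_hours, f"{hot_hours / hours_per_year:.1%}")  # 84 hours, ~1.0% of the year
```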

        • You realize datacenters normally run at 23-25C, right? In the middle of the DC. The incoming air is a couple degrees cooler.
          You're thinking of the maximum allowable temperature inside the case, in a rack, and at the back of the case, by the hot aisle. The cold aisle needs to be cooler than the hot aisle. Those days when your cold aisle hits 90F are the days you're GUARANTEED to destroy hardware if you don't take action. Most of the rest of June - August you'll need cooling to stay within SAFE temps.

          • Basic ASHRAE standards have a recommended range of 18C to 27C, but a maximum allowable range of 15C to 32C. If you specify an A2 or A3 ASHRAE compliance when buying your hardware you can stretch that allowable range all the way up to 35C (A2) or even 40C (A3),

            Most datacentres these days are looking closely at the ASHRAE limits and at monitoring to raise the average cold aisle temperature and make major savings. There are a lot of steps on this path if you're bringing an older datacentre up to the modern way
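As a toy illustration of how those class limits get applied in monitoring, here is a small check against the upper bounds quoted above (the 32C figure is the basic allowable ceiling the comment mentions; a real check would also cover the lower bounds and humidity):

```python
# Allowable cold-aisle (inlet) ceilings, in C, as quoted in the comment above.
ALLOWABLE_MAX_C = {"A1": 32.0, "A2": 35.0, "A3": 40.0}

def within_allowable(inlet_c: float, ashrae_class: str = "A1") -> bool:
    """True if a cold-aisle reading is at or below the class ceiling."""
    return inlet_c <= ALLOWABLE_MAX_C[ashrae_class]

print(within_allowable(33.0, "A1"), within_allowable(33.0, "A2"))  # False True
```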

  • by PopeRatzo ( 965947 ) on Monday August 18, 2014 @04:07PM (#47698417) Journal

    They should put the data center in a pyramid. Then, the servers would last forever!

  • Chicken coop? (Score:5, Insightful)

    by TWX ( 665546 ) on Monday August 18, 2014 @04:39PM (#47698607)
    Why did the chicken coop have two doors?


    .


    .


    Because if it had four doors, it'd be a chicken sedan!
  • The article lists the requirements for the structure, which include things like massive air flow, high heat density, high electrical power density, etc. Constraints like that tend to point toward structures with high surface area to volume ratios. A sphere (or section of a sphere in this case) has the MINIMUM surface area to volume ratio. So why would you want to put this structure into a dome rather than a long, low building?

    (And if you really insisted on getting all fancy, architecturally, you could st
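The geometry behind the point above, with made-up dimensions (the 10,000 m^3 volume and the box proportions are arbitrary): for the same enclosed volume, a long low box exposes far more surface than a sphere or dome does.

```python
import math

volume = 10_000.0                                  # m^3, arbitrary example

# Sphere enclosing that volume (a dome is roughly half of this picture).
radius = (3 * volume / (4 * math.pi)) ** (1 / 3)
sphere_area = 4 * math.pi * radius ** 2            # ~2,245 m^2

# Long, low box with the same volume.
length, width, height = 100.0, 20.0, 5.0           # 100 * 20 * 5 = 10,000 m^3
box_area = 2 * (length * width + length * height + width * height)  # 5,200 m^2

print(round(sphere_area), round(box_area))
```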

    • A dome doesn't care which way the wind blows, because it's round. Your long low building might have issues with that.
      • by tomhath ( 637240 )
        Looking at the picture (maybe an artist's drawing) I see a roundish 2 1/2 story structure sitting behind some trees. So I doubt the wind is a factor. Plus the article talks about fans pulling in outside air.
  • by bobbied ( 2522392 ) on Monday August 18, 2014 @05:06PM (#47698819)

    Not because I would object, though, but because it gets pretty hot here from time to time.

    So, if you move it north, why not? Heck, the south pole is pretty cold most of the year.

    I have a better idea, how about we just put server farms out at sea, then just use seawater from a few hundred feet down for cooling. That works great, even in the tropics.

  • I mean, come on, we're having such a hard time getting frickin' lasers for our sharks! Let me guess, an army of accountants to figure out how much we're going to save on our taxes?
