Data Storage

The Data Dome: A Server Farm In a Geodesic Dome 62

Posted by samzenpus
from the keeping-it-cool dept.
1sockchuck writes: In a unique approach to data center design, the new high-performance computing center in Oregon is housed in a geodesic dome. The new facility at the Oregon Health and Science University requires no mechanical air conditioning, using outside air to cool racks of servers reaching densities of 25kW per cabinet. The design uses an aisle containment system to separate hot and cold air, and can recirculate server exhaust heat to adjust cold aisle temperatures in the winter. It's a very cool integration of many recent advances in data center design, combining elements of the Yahoo Chicken Coop and the server silo in Quebec. The school has posted a virtual tour that provides a deep technical dive.

  • by Mr D from 63 (3395377) on Monday August 18, 2014 @03:55PM (#47698341)
    I wonder if they have any issues with moisture from constantly cycling in outside air? It's being heated, so I guess it won't condense, but it still seems like it could be a concern over the long term. Is the air filtered? Particulates would be another concern, or would they just do some sort of cleaning?
    • Re:Moisture? (Score:5, Informative)

      by Dan East (318230) on Monday August 18, 2014 @04:01PM (#47698367) Homepage Journal

      In the video the narrator specifically states that the incoming air is filtered.

    • Re:Moisture? (Score:4, Informative)

      by JWSmythe (446288) <jwsmythe AT jwsmythe DOT com> on Monday August 18, 2014 @04:22PM (#47698507) Homepage Journal

      You can take a look at their official page. http://www.ohsu.edu/xd/about/initiatives/data-center-west.cfm [ohsu.edu]

      The tour video and text talk about plantings outside providing some filtering. The video, around the 3 minute mark, shows additional filtering inside.

      I suspect prevailing winds will really screw with the site cooling.

      The "Virtual tour" has more details than the rest. Nothing about humidity.

      Their security seems odd. They talk about the security being very strict. The video shows the inside of each "pod" to be open to the common hot air area in the upper part of the roof. So they have security, but you can get around it by not going through the doors. {sigh}

      I never got the idea of sticking square boxes in a round hole. They're wasting a lot of good real estate by leaving all that extra space between the servers.

      It seems like it was drawn up with an ideal world in mind, which usually doesn't translate well to the real world.

      • by SuperKendall (25149) on Monday August 18, 2014 @04:46PM (#47698669)

        I never got the idea of sticking square boxes in a round hole. They're wasting a lot of good real estate by leaving all that extra space between the servers.

        What you call "wasted space" I call "ventilation".

        Also not factored in is how much space traditional HVAC equipment takes up in a normal data center.

        Just the fact that this kind of building doesn't have the same power drain as HVAC-cooled facilities means you could have one in more places than a "normal" data center.

        • by JWSmythe (446288) <jwsmythe AT jwsmythe DOT com> on Monday August 18, 2014 @05:21PM (#47698937) Homepage Journal

          As described, after looking at their materials, I don't see an advantage to the radial design over a grid design. There is nothing about it that would improve airflow, and it leaves huge underutilized areas.

          On the other hand, a traditional grid design optimizes the space, and it would still allow for the same airflow.

          It's not a matter of being round, or having dead space, it's simple things we teach children. Square boxes don't fit through round holes. Round objects don't stack optimally.

          One of the Equinix datacenters in Los Angeles (previously Pihana Pacific) has all of its cooling on one side of the room, and returns on the other side. Each row is basically a wind tunnel. There is no appreciable temperature difference between the two sides. Both the front and back of the cabinets have the same airflow, and maintain roughly the same temperature.

          As far as the total power load goes, they could keep the load the same and have almost half of the building for just storage.

          Of course, a square building, the industry standard for this kind of work, would not make the news. No one would be talking about it.

          I guess if they have money to burn and real estate to waste, it doesn't matter what shape they make it or how much space is underutilized.

          • by SuperKendall (25149) on Monday August 18, 2014 @08:55PM (#47700267)

            I guess if they have money to burn and real estate to waste

            Domes are cheaper to build and use less material than a traditional building since you need no load bearing walls (although the design they had was not a full dome).

            You are simply handwaving away the significant energy savings this design brings to the table.

            I've lived in a dome before; the round walls do not waste THAT much space. And you need room to move equipment around anyway.

            • by JWSmythe (446288) <jwsmythe AT jwsmythe DOT com> on Wednesday August 20, 2014 @01:47AM (#47710081) Homepage Journal

              Did you look at their floorplan? There are huge wedge-shaped gaps.

              Or let's do math. For the sake of argument, let's say that the diagram in their virtual tour was to scale. We're also going to say that each rack is a standard 19" rack, taking up 22" each. That can be wrong, but it's what I'm using for measurement.

              The entire circular structure has an area of 24,052 sq ft.
              A square structure on the same property would be 30,625 sq ft.
              The circular structure wastes 6,573 sq ft.

              Each pod, with a 3' buffer on each end and a 3' buffer between rows, would have a footprint of 768.4 sq ft. Since I only included one aisle buffer on each (since they share a common aisle), add one more aisle at 102 sq ft.

              The total datacenter rack space is really 3,944 sq ft.

              In the difference between the round and square structures, you could put all the racks and aisles, and still have 26,681 sq ft.

              Or about the size of two Olympic size swimming pools.

              Or 0.017 LoC.

              Or 53,362 bread boxes one layer deep.

              Or you could tile the floor of the wasted space with approximately 106,724 AOL CDs, which coincidentally is about half of the total number of AOL CDs received in Centennial, Colorado in one bulk mailing. Unfortunately, it would be very ugly, because you're trying to tile a square floor with round objects, which has lots of wasted space.

              I could dazzle you with more numbers, but you've already started cursing me, and I really don't care.
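              For what it's worth, the back-of-the-envelope arithmetic above is easy to reproduce. A minimal sketch in Python, using only the commenter's figures; the 175 ft side and the count of five pods are inferred from those figures, not stated in OHSU's material:

                  import math

                  # All figures come from the comment above, not from OHSU's plans.
                  # The 175 ft side is inferred from the 30,625 sq ft square (175 * 175);
                  # five pods is inferred from the 3,944 sq ft total (5 * 768.4 + 102).
                  side_ft = 175.0
                  circle_area = math.pi * (side_ft / 2) ** 2      # ~24,053 sq ft
                  square_area = side_ft ** 2                      # 30,625 sq ft
                  wedge_waste = square_area - circle_area         # ~6,572 sq ft

                  pod_area = 768.4                                # one pod plus its 3 ft buffers
                  rack_area = 5 * pod_area + 102.0                # 3,944 sq ft incl. the extra aisle
                  leftover = square_area - rack_area              # 26,681 sq ft

                  for label, value in [("circle", circle_area), ("square", square_area),
                                       ("wasted wedges", wedge_waste), ("racks + aisles", rack_area),
                                       ("left over in a square", leftover)]:
                      print(f"{label:>22}: {value:,.0f} sq ft")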

      • by Anonymous Coward on Tuesday August 19, 2014 @06:21AM (#47702027)

        It seems like it was drawn up with an ideal world in mind, which usually doesn't translate well to the real world.

        You think?

        "and can recirculate server exhaust heat to adjust cold aisle temperatures in the winter."

        That sentence alone from the summary tells me they know nothing about computer thermals. You don't ever pump hot air back into the cold aisle, EVER! If the cold aisle drops below 68 F, that's a good thing, not a bad thing. You get more cooling for less cost. Last time I checked, unless we're talking about temperatures below 40 F, there's nothing wrong with the cold aisles getting colder than human comfort levels. With 25kW per cabinet (17 hair dryers on full blast in a phone-booth-sized space), I don't think they have to worry about below-40 F temps. Relative humidity is a major issue, but it sounds like they have that covered, although with the hot-air-in-the-cold-aisle thing I'd be skeptical that they did that correctly too!
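        The hair-dryer comparison is a quick sanity check; a trivial sketch, where the 1,500 W rating for a dryer on high is an assumption rather than anything from the comment or the article:

            cabinet_watts = 25_000      # 25 kW per cabinet, from the summary
            hair_dryer_watts = 1_500    # assumed rating of a typical dryer on high
            print(round(cabinet_watts / hair_dryer_watts))   # -> 17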

    • by turning in circles (2882659) on Monday August 18, 2014 @10:47PM (#47700751)
      Beaverton, Oregon, is near Portland, and it can get very humid there, so the additional humidity inherent in evaporative cooling could be a problem. Of course, since there is no air conditioning, you don't have a condensation problem. I looked at some other sites that use evaporative cooling to cool servers, and the systems only need to cycle on for ten minutes in thirty, even in 100F weather, so you could use the fans to mitigate humidity to some extent.

      However, that being said, I don't see this idea working in really hot and humid climates, such as Alabama.
      • by Anonymous Coward on Tuesday August 19, 2014 @08:14AM (#47702565)

        In summer, prevailing winds are from the west and temperatures are mild. High pressure dropping down from Canada causes hot, dry wind from the east, but that's the perfect situation for evaporative cooling.

      • by JWSmythe (446288) <jwsmythe AT jwsmythe DOT com> on Thursday August 21, 2014 @09:01AM (#47720067) Homepage Journal

        since there is no air conditioning, you don't have a condensation problem

        No, it's not HVAC-induced condensation. Meteorologists call it the dew point.

        Right at this moment, the temp is 53.3F with a relative humidity of 78%. The dew point is 47F.

        You're supposed to run a datacenter at 40% to 60% relative humidity. Without a system in place to dry the air, they're asking for corrosion on parts.
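        As an aside, the dew-point figure quoted a couple of paragraphs up (53.3F at 78% relative humidity giving roughly 47F) is easy to reproduce with the Magnus approximation; a minimal sketch, where the 17.62 / 243.12 coefficients are the usual textbook values rather than anything taken from the comment:

            import math

            def dew_point_f(temp_f, rh_percent):
                """Dew point via the Magnus approximation (good to ~0.1 C in this range)."""
                a, b = 17.62, 243.12                    # Magnus coefficients over water, deg C
                t_c = (temp_f - 32.0) * 5.0 / 9.0
                gamma = math.log(rh_percent / 100.0) + a * t_c / (b + t_c)
                dp_c = b * gamma / (a - gamma)
                return dp_c * 9.0 / 5.0 + 32.0

            print(round(dew_point_f(53.3, 78)))         # -> 47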

        You can't say computers are corrosion proof either. When I worked in a computer store, we had computers come in all the time that were in houses with no HVAC, so they were exposed to outdoor humidity.

        I left some old gear in a friend's garage for a while. One of the units was a used Catalyst 5000, with cards I didn't really care about. When I put it in the garage, it was in functional condition.

        I decided to bring it back up to play with. There was corrosion on the line card handles, and I'm sure corrosion inside. Nothing looked bright and clean. There was visible corrosion on the cat5 pins (for the cat5 ports). When I took it out, it barely worked, with lots of errors. Reseating the cards didn't help at all. I don't know (or care) which parts went bad; I sent it off for electronic scrap recycling.

        Someone's going to be really pissed off when they've spent a fortune on servers that have to be trashed because they stopped working properly.

        There are other parts machines in the garage too. I only go to them for fans, power supplies, etc. I had already pulled out all the memory and CPUs. Sometimes they still work. Sometimes they don't.

        Specs have some wild numbers on them. Some say they operate in 10% to 90% humidity. Sure, they *can* run in it for a while, but they aren't expected to survive in that kind of environment indefinitely. I've seen some specs that say they'll operate over 120F. Sure, for a very short time. I had one place argue with me because the spec showed wild numbers, but they were already experiencing hardware failures from operating servers in an uncooled server room (the HVAC broke, and they didn't want to fix it).

        • by turning in circles (2882659) on Thursday August 21, 2014 @10:07PM (#47725993)
          Now there's a justification for me to keep the air conditioning on in my house all summer long. Wouldn't want to corrode the laptops... Thanks for the info.
          • by JWSmythe (446288) <jwsmythe AT jwsmythe DOT com> on Friday August 22, 2014 @12:04AM (#47726409) Homepage Journal

            It's good for your house too. I've seen houses where the homeowner never ran their A/C and they were proud that they saved money. They also had problems with mold, paint peeling, drywall falling apart, and various wood things in their house warping.

            At one place I lived, there were ceiling fans throughout the house, which was nice. There were also some on our back porch. The ones inside stayed in almost original condition. The ones outside had rust on the metal parts, and the blades warped.

            But this was a discussion about datacenters, so I talked about the corrosion problems with IT equipment.

  • by Dan East (318230) on Monday August 18, 2014 @04:00PM (#47698359) Homepage Journal

    Every time the video showed a door, the narrator had to say that the door is locked. I get it. Doors can be locked. It just seemed the video had an agenda of pointing out, for some specific audience, the trivial and standard physical security involved, as blatantly obvious as that should be to anyone.

  • Sounds like the data center of the future, circa 1975. I wouldn't mind working in it, though I do wonder how they control humidity. Lack of cooling may work well in their area, but I see problems in hotter/more humid places.
  • In OEM specs? (Score:4, Interesting)

    by mlts (1038732) on Monday August 18, 2014 @04:07PM (#47698415)

    Where the rubber meets the road is whether the machines stay within the temperature and humidity specifications for the equipment, so warranties are not voided.

    If this is workable, even during the winter or when it is extremely rainy/humid, it might be a useful idea. However, there is only a limited set of climates this would work in. The PNW with its moderate temperatures makes sense for this. If I attempted to do the same thing in Texas, come summertime, I'd have a building full of BBQ-ed servers.

    • by raymorris (2726007) on Monday August 18, 2014 @04:20PM (#47698499)

      In Portland, it's reasonably cool MOST OF THE TIME.
      Temperatures reach or exceed 90 F (32 C) on 14 days per year and reach or exceed 100 F (38 C) on 1.4 days per year on average.

      I'm thinking this project will last about 350 days.

      • by stewsters (1406737) on Monday August 18, 2014 @04:40PM (#47698617)
        You then need an identical data center in South America, and switch which one you use every half year. The cloud.
      • by Anonymous Coward on Monday August 18, 2014 @04:43PM (#47698635)

        From the article:

        "When outside air temperature is too warm for free cooling, the data center’s adiabatic cooling system kicks in automatically to help out. Beaverton, Oregon (where the facility is located), experienced some 100 F days recently, and the evaporative-cooling system cycled for about 10 minutes at a time at 30-minute intervals, which was more than enough to keep supply-air temperature within ASHRAE’s current limits. Gleissman said he expects the adiabatic cooling system to kick in several weeks a year."

        • by Anonymous Coward on Monday August 18, 2014 @04:52PM (#47698721)

          How do swamp coolers keep things operable if the humidity is 100%? There is a reason they are used in only dry climates.

        • by raymorris (2726007) on Monday August 18, 2014 @05:12PM (#47698869)

          > From the article:
          > "When outside air temperature is too warm for free cooling, the data center’s adiabatic cooling system

          Which is funny, because the word adiabatic means something that does not get rid of heat, or draw in heat, from the outside.
          An adiabatic system would cool the building by drawing a vacuum, sucking all of the air out of the building. The decreasing air pressure would lower the temperature for a few minutes. Since you can't keep lowering the air pressure below absolute vacuum, the servers would melt after a few minutes.

          Perhaps they meant to say "diabatic cooling system". A diabatic system is one that gets rid of heat (or draws in heat). Of course, that's also the definition of "cooling", so if that's what they meant, it's a snobbish way of saying "cooling cooling system". With the a- prefix, it's "non-cooling cooling system", which is gibberish. Unless of course by "adiabatic (non-cooling) cooling system" they mean "a cooling system that doesn't cool, one that doesn't work". If on 100F days they are relying on an adiabatic, aka non-cooling, aka broken cooling system, I don't think I want my servers there. I had a taste of that when I had servers at Alphared.

      • by SuperKendall (25149) on Monday August 18, 2014 @04:51PM (#47698711)

        The article explicitly states that when the temperatures REACHED (as in, already happened) 100+, the water cooling units designed for that very purpose cycled on for 10 minutes out of every thirty to keep the incoming air within tolerance to cool the servers.

      • by Anonymous Coward on Monday August 18, 2014 @04:51PM (#47698713)

        100F is no problem if the dew point is low.
        Then a water mist will bring the air temp down toward the wet-bulb temperature (close to the dew point when the air is dry).
        The technology is called a 'swamp cooler'.

        Works great in the S/W, but in the S/E, I'm skeptical.

        Presumably, they have compared historical climate data to their equipment specs.

      • Portland is cool, yes. But that's mostly down to the bookshops and tea shops. Temperature-wise, it doesn't get "hot" per se, but it does get very humid. And the air is horribly polluted. I spent the time moving up there reading about dangerously high levels of mercury in the air, the deadly pollutants in the river, the partially dismantled nuclear reactor and highly toxic soil (the reactor has since gone, the soil remains drenched in contaminants), the extremely high levels of acid rain due to excessive cars (which are driven by maniacs), and the lethal toxins flowing through the rivers that have been built over to level out the ground.

        In short, I landed there a nervous wreck.

        Things didn't improve. I saw more dead bodies (yes, dead bodies) in Portland and heard more gunfire in my five years there than I heard in the suburbs of Manchester, England, in 27 years. You will find, if the archives let you get back that far, that I was almost normal before that time.

      • by SpankyDaMonkey (1692874) on Monday August 18, 2014 @10:42PM (#47700723)

        There are 8,760 hours in a year. If, as you say, you exceed 32C on 14 days per year, you don't exceed 32C for the full 24-hour block; instead you may be over 32C from late morning through late afternoon - call that 11AM to 5PM, or 6 hours. That means a total of 84 hours per year where you have to run active cooling systems, which is approximately 1% of the year. If you can also specify that any hardware installed is certified at the upper limit of the ASHRAE standards, then your thermal window increases and you can drop that 1% even lower.

        Free air cooling just makes sense, provided your filtering and air handling systems are up to it. Keeping within the limits for humidity is a little harder to manage, as you don't want to see condensation in your cold aisle in the summer, or to pick up static shocks when you touch anything in the winter.
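        The duty-cycle arithmetic in the comment above checks out; a minimal sketch using the same assumptions (14 hot days a year, roughly six hot hours per day):

            hours_per_year = 365 * 24                   # 8,760
            hot_days = 14                               # days/year at or above 32 C (90 F)
            hot_hours_per_day = 6                       # assumed: late morning to late afternoon
            active_cooling_hours = hot_days * hot_hours_per_day
            print(active_cooling_hours,
                  f"hours, or {active_cooling_hours / hours_per_year:.1%} of the year")
            # -> 84 hours, or 1.0% of the year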

        • by raymorris (2726007) on Monday August 18, 2014 @11:25PM (#47700869)

          You realize datacenters normally run at 23-25C, right? In the middle of the DC. The incoming air is a couple degrees cooler.
          You're thinking of the maximum allowable temperature inside the case, in a rack, and at the back of the case, by the hot aisle. The cold aisle needs to be cooler than the hot aisle. Those days when your cold aisle hits 90F are the days you're GUARANTEED to destroy hardware if you don't take action. Most of the rest of June - August you'll need cooling to stay within SAFE temps.

          • by SpankyDaMonkey (1692874) on Tuesday August 19, 2014 @12:43PM (#47704867)

            Basic ASHRAE standards have a recommended range of 18C to 27C, but a maximum allowable range of 15C to 32C. If you specify A2 or A3 ASHRAE compliance when buying your hardware, you can stretch that allowable range all the way up to 35C (A2) or even 40C (A3).
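            Those class limits are easy to keep as a small lookup; a minimal sketch where the A1 range and the A2/A3 upper limits are as quoted above, and the A2/A3 lower bounds (10 C and 5 C) are the commonly published allowable minimums, included only to make the ranges complete:

                # Allowable cold-aisle dry-bulb ranges in deg C (illustrative only,
                # not a substitute for the actual ASHRAE thermal guideline tables).
                ASHRAE_ALLOWABLE_C = {
                    "A1": (15.0, 32.0),
                    "A2": (10.0, 35.0),
                    "A3": (5.0, 40.0),
                }

                def within_allowable(temp_c, ashrae_class="A1"):
                    """True if a cold-aisle temperature sits inside the allowable range."""
                    low, high = ASHRAE_ALLOWABLE_C[ashrae_class]
                    return low <= temp_c <= high

                print(within_allowable(33.0, "A1"))   # False -- over the basic 32 C limit
                print(within_allowable(33.0, "A2"))   # True  -- inside the 35 C A2 allowable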

            Most datacentres these days are looking closely at the ASHRAE limits and at monitoring to raise the average cold aisle temperature and make major savings. There are a lot of steps on this path if you're bringing an older datacentre up to the modern ways of thinking, including strict hot/cold aisle separation, re-alignment of hot aisles to match CRAC / CRAH units, implementation of live temperature, pressure and humidity monitoring, all the way up to a fluid dynamics analysis of airflow. You only have one chance to get it right and a huge number of ways to get it wrong, so it's a very conservative approach. On the other hand, being able to make a $500K annual saving by raising the overall cold aisle temperature by 2C and still deliver the same service is the sort of number that makes a lot of sense.

            In addition, the point I was making is that it's only during daylight hours that the external air temperature will mean you need additional cooling; at night the temperature drops, so you only need to run your cooling plant for a fraction of the day.

            Disclaimer: I'm a Certified DC Designer and a Certified DC Management Professional with 8 years' experience running a blue-chip datacentre, so I live this stuff every day.

    • by Anonymous Coward on Monday August 18, 2014 @05:35PM (#47699021)

      Ummm. He said BBQ. I must be from Texas, because I love BBQ. Oh, wait, I am.

  • by PopeRatzo (965947) on Monday August 18, 2014 @04:07PM (#47698417) Homepage Journal

    They should put the data center in a pyramid. Then, the servers would last forever!

  • by Anonymous Coward on Monday August 18, 2014 @04:10PM (#47698435)

    The exit is a wormhole at the bottom of a cliff at the end of a tunnel under the school.

  • by Anonymous Coward on Monday August 18, 2014 @04:14PM (#47698459)

    they should put servers in space. Space is cold, right? And private space means you can launch a 3D printer for peanuts and 3D print all the computers you need in orbit from asteroids, right?

    • Re:Nonsense (Score:4, Informative)

      by bobbied (2522392) on Monday August 18, 2014 @04:58PM (#47698761)

      It's only cold in space when you are in the shade. Direct sunlight is pretty hot stuff, but if you use reflective surfaces it limits the absorbed energy.

      The problem with space, though, is that it's a vacuum and usually weightless. No convective cooling, only radiative cooling. Which is why they put a huge ammonia-based cooling system on the ISS that drives external hot plates they keep in the shade when they can. So apparently, cooling stuff in space isn't all that easy or cost-effective.

  • Chicken coop? (Score:5, Insightful)

    by TWX (665546) on Monday August 18, 2014 @04:39PM (#47698607)
    Why did the chicken coop have two doors?

    Because if it had four doors, it'd be a chicken sedan!
  • by pz (113803) on Monday August 18, 2014 @05:01PM (#47698787) Journal

    The article lists the requirements for the structure, which include things like massive air flow, high heat density, high electrical power density, etc. Constraints like that tend to point toward structures with high surface area to volume ratios. A sphere (or section of a sphere in this case) has the MINIMUM surface area to volume ratio. So why would you want to put this structure into a dome rather than a long, low building?

    (And if you really insisted on getting all fancy, architecturally, you could still make the long low building into a ring and retain most of the advantages.)

  • by bobbied (2522392) on Monday August 18, 2014 @05:06PM (#47698819)

    Not because I would object, though... but because it gets pretty hot here from time to time.

    So, if you move it north, why not? Heck, the South Pole is pretty cold most of the year...

    I have a better idea: how about we just put server farms out at sea, and use seawater from a few hundred feet down for cooling? That works great, even in the tropics.

  • I mean, come on, we're having such a hard time getting frickin' lasers for our sharks! Let me guess, an army of accountants to figure out how much we're going to save on our taxes?
