Data Storage Hardware

How Data Center Operator IPR Survived Sandy

Nerval's Lobster writes "At the end of October, Hurricane Sandy struck the eastern seaboard of the United States, leaving massive amounts of property damage in its wake. Data center operators in Sandy's path were forced to take extreme measures to keep their systems up and running. While flooding and winds knocked some of them out of commission, others managed to keep their infrastructure online until the crisis passed. In our previous interview, we spoke with CoreSite, a Manhattan-based data center that endured even as much of New York City went without power. For this installment, Slashdot Datacenter sat down with executives from IPR, which operates two data centers—in Wilmington, Delaware and Reading, Pennsylvania—close to Sandy's track as it made landfall over New Jersey and pushed northwest."
  • PPPPP (Score:5, Insightful)

    by Anonymous Coward on Monday November 19, 2012 @02:30PM (#42030277)

    Proper Planning Prevents Poor Performance

  • by crazyjj ( 2598719 ) * on Monday November 19, 2012 @02:34PM (#42030325)

    My friend Rahul and Sameer will take care of your needs, and are to be speaking excellent English, most also.

  • by joeflies ( 529536 ) on Monday November 19, 2012 @02:50PM (#42030559)
    In his response to the question "So you suffered no downtime at all?", the business development manager provided a non-answer to a yes/no question. The interviewer should have followed that question up to clarify.
  • by kiite ( 1700846 ) on Monday November 19, 2012 @02:53PM (#42030613)

    "We didn't do anything special; our power never went out."

    • by mcgrew ( 92797 ) *

      Of course their power never went out: they had three separate electric companies wired in; if one went down, a second kicked in, and if that went down, one of their two generators kicked in.

      • Re:tl;dr version (Score:4, Informative)

        by kiite ( 1700846 ) on Monday November 19, 2012 @03:20PM (#42030931)

        Right. And, according to TFA, none of their supplies ever went out. I live in NYC. A lot of the city lost power, sure. The transit system was knocked out, sure. There was a lot of flooding in fringe areas, where most data centers weren't. This guy is talking about NYC like it got demolished by the storm. Slashdot already did a piece on how a NYC data center mitigated power loss; reading TFS, I was hoping for a perspective from somewhere that took a heavier beating. Instead, I got, "We had back-ups, and we think they work because we test them regularly, but we didn't actually have to do anything."

      • Still... whole sections of the grid went down in the NYC area, and different electrical companies own portions of it. It sounds like where they were, they still had access to power; most didn't, so it was generator or shut down, and for most it was the latter. I'm not trying to bash them by any means, though; 2N is hella impressive in scope and investment. The question now is whether other datacenters will follow suit, given that a hurricane up there is a once-in-half-a-century type of event.
      • by hawguy ( 1600213 )

        Of course their power never went out: they had three separate electric companies wired in; if one went down, a second kicked in, and if that went down, one of their two generators kicked in.

        I wonder if that's true - I can believe that they bring in power from several different substations, but if there were a widespread grid outage, it seems like that would have taken out all of their substations.

        I don't understand this comment:

        We potentially have two power grids. We actually have three. The third we would never go to, I don’t think.

        Why would they connect to a power source that they'd never go to?

        • Re:tl;dr version (Score:5, Informative)

          by Rich0 ( 548339 ) on Monday November 19, 2012 @03:49PM (#42031241) Homepage

          Yeah, Reading PA. Go look it up on a map. They weren't going to be having multiple substation outages that far inland. My own workplace didn't lose power and is about 20 miles farther east. It doesn't hurt that they're about 200 yards from a substation and that both the substation and the plant site are fed by transmission lines on steel towers that stand WAY above the height of nearby trees.

          Most of the outages for Sandy were due to flooding or downed trees. The former was only a problem along the coast or near rivers, and really a big problem for NYC where they have transmission equipment underground. Trees are horrible for the last mile of power delivery, but aren't an issue for the major substations, since if you drive by one of them you'll note that the transmission lines are WAY up in the air, and the trees are trimmed back a huge distance on either side of them anyway. The towers themselves are steel and on concrete foundations - they're not going to fall unless they're hit by something like a tornado.

          The reason so many lost power wasn't that transmission was cut, but that a bazillion downed trees took out every other telephone pole in the region. If you want an IT analogy, imagine that all your big network feeds and your datacenter are intact, but some vandal walks around your building and sticks a firecracker next to every single network port.

          For an inland location like Reading PA, this was just a matter of having either good power connectivity, or generators. Wilmington is next to the Delaware Bay and would be at more risk, but as long as you're at reasonable elevation and above-ground you'd be fine.

          • by hawguy ( 1600213 )

            Yeah, Reading PA. Go look it up on a map. They weren't going to be having multiple substation outages that far inland. My own workplace didn't lose power and is about 20 miles farther east. It doesn't hurt that they're about 200 yards from a substation and that both the substation and the plant site are fed by transmission lines on steel towers that stand WAY above the height of nearby trees.

            Sure, not for this disaster, but how about the next one? It might be a tornado that goes through Reading. Or an earthquake. Or an East Coast ice storm that downs power lines (including steel-towered transmission lines) throughout the region.

            Having the datacenter located well outside of the disaster zone just shows that they were lucky for this particular disaster.

            • While you are right, the big selling point for a lot of data centers is physical location. IO Data here in Scottsdale, for instance, prides itself on the fact that there really is no severe weather in the area. Historically the area is geologically stable, not prone to flooding, and nowhere near any forest fires. So their location is their first line of defense against disaster, with N+3 redundancy as an additional defense.

              Disaster planning is hard; some things you take for granted during normal times simply aren't availa

            • by Rich0 ( 548339 )

              I think the most likely disaster scenario for Reading PA would be a meteor impact. That area of the country just doesn't get much in the way of natural catastrophes unless you happen to be right next to a river or creek that can flood. I think a tornado makes the news about once every three years and is generally confirmed by the lawn furniture being dispersed in a non-linear pattern.

              Sure, it can happen, but it is about as uneventful an area as you'll find.

              Oh, ice storm is another failure mode for sure -

      • In fact the standard commercial power in much of the area didn't go out. There were presumably the usual power lines hit by trees or other local outages, but the power grid stayed up. It's too far from the ocean for tide and storm surge flooding, and much of the storm energy either didn't head their direction or got expended on New Jersey.

  • by Anonymous Coward

    generators + diesel, it's not rocket science.

    • Oops, the basement is completely underwater and the fuel tanks are flooded. What do you do now?

      Oh, you put the fuel tanks up high? No you didn't, that's against the fire code (bad idea having flammable liquid above people's heads in a fire).

      It's OK, your tanks didn't leak, and you were clever enough to put your generators up high. But the fuel pumps shorted out.

      Alright, you got lucky and the pumps were fine. But now you're out of fuel, as is everyone else, and travel is difficult since tunnels are flooded s

      • by hawguy ( 1600213 )

        What does Mr. Anonymous Coward, Site Reliability Engineer Extraordinaire, do now? More importantly, did you think of it before this hundred-year storm?

        I think he'd do exactly what this ISP did -- locate their main facility outside the city so they aren't constrained by urban high-rise fire codes and high real estate costs.

        Note that their Wilmington facility is in a high-rise building and only has around 10,000 gallons of fuel. At 3 MW, that gives them around 48 hours before they need to refuel, so if they had experienced flooding and power loss at that site, they would have had the same problem as the NYC datacenters. Their suburban Reading facility has o
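        A quick sanity check of that 48-hour figure, as a minimal sketch. The roughly 0.07 gal/kWh full-load burn rate is an assumed typical figure for large diesel gensets, not a number from the article:

        ```python
        # Back-of-envelope check of the ~48-hour runtime claim above.
        # Assumption (not from the article): a large diesel genset burns roughly
        # 0.07 gallons of fuel per kWh generated at full load.
        GAL_PER_KWH = 0.07      # assumed full-load burn rate
        TANK_GALLONS = 10_000   # on-site fuel cited above
        LOAD_KW = 3_000         # 3 MW critical load

        burn_rate_gph = LOAD_KW * GAL_PER_KWH          # gallons per hour
        runtime_hours = TANK_GALLONS / burn_rate_gph   # hours until the tank runs dry

        print(f"Burn rate: {burn_rate_gph:.0f} gal/h")                    # ~210 gal/h
        print(f"Runtime on {TANK_GALLONS:,} gal: {runtime_hours:.0f} h")  # ~48 h
        ```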

        • Sure, but then most of the transit is in cities, and it's either more expensive or slower to build outside. It's a tradeoff, to be sure.

          Nobody's suggesting that it's impossible for a DC in NYC to weather the scenario that ended up happening, but it's into diminishing-returns territory, so it's a lot more expensive. Even if the claim could be made that the DC made mistakes, they weren't trivially stupid mistakes - which is what the GP was implying.

      • by mlts ( 1038732 ) *

        This is why business-critical stuff needs to work across more than one data center. There is only so much that can be done at one location.

        Yes, the generator may fire up, but even when the diesel tank is full, assuming no trucks are available to refill it, how long will it last, especially if power is out for weeks? There is always the option of running a generator on natural gas, but at DC scale it would take some large pipes to handle the gas coming in, and that assumes the lines stay pressurized.

        Having mult

      • by Hobadee ( 787558 )
        A well planned data center will have a fuel-delivery contract that says something along the lines of: "After X days, you must be able to deliver Y fuel every Z days. If you don't or can't deliver Y fuel every Z days, you pay us for the downtime we incur."

        As long as they have enough fuel onsite to last X days, they are fine; the fuel delivery company is on the hook if they go down. (Assuming everything is regularly tested and in good working order.)
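        As a toy sketch of how those contract terms tie into tank sizing (the burn rate and the X/Y/Z values below are hypothetical illustration numbers, not from any real contract):

        ```python
        # Hypothetical illustration of the X/Y/Z delivery contract described above.
        GAL_PER_DAY = 5_000     # assumed facility burn rate on generator
        X_DAYS = 2              # contract: on-site fuel must cover the first X days
        Y_GALLONS = 18_000      # contract: supplier delivers Y gallons...
        Z_DAYS = 3              # ...every Z days after that, or pays for the downtime

        onsite_needed = GAL_PER_DAY * X_DAYS   # minimum tank size to honor X
        supplier_rate = Y_GALLONS / Z_DAYS     # gal/day the supplier must sustain

        print(f"Tank must hold at least {onsite_needed:,} gal to cover the first {X_DAYS} days")
        if supplier_rate >= GAL_PER_DAY:
            print("Contracted deliveries keep the tank topped up after that")
        else:
            print("Contracted deliveries can't keep up -> supplier eats the downtime cost")
        ```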
        • I can't believe any sane fuel delivery company would sign a contract as simplistic as that. I'd at the very least expect a real contract to have "Force majeure" clauses and a set price for each day of downtime.

    • by hawguy ( 1600213 )

      generators + diesel, it's not rocket science.

      I'd say that locating outside of the hurricane's path was the better choice - having backup power does no good if the carriers that serve you can't power their equipment (like the earlier anecdote from the other datacenter about one of their carriers having their generator confiscated by the NYPD)

      • by ATestR ( 1060586 )

        I'd say that locating outside of the hurricane's path was the better choice

        If it isn't a hurricane, it's an earthquake. If not that, then a nasty blizzard. Or a tornado. You can't avoid them all, and it's best to prepare as well as you can for the events possible at the location you choose. No one can prevent all disasters, but you can mitigate the risk.

        • by hawguy ( 1600213 )

          I'd say that locating outside of the hurricane's path was the better choice

          If it isn't a hurricane, it's an earthquake. If not that, then a nasty blizzard. Or a tornado. You can't avoid them all, and it's best to prepare as well as you can for the events possible at the location you choose. No one can prevent all disasters, but you can mitigate the risk.

          And the best way to mitigate risk is to have your DR site in a completely different geographical area. Relying on a single datacenter to keep your company running during a large scale disaster is foolish.

    • Re:generators (Score:4, Interesting)

      by LoRdTAW ( 99712 ) on Monday November 19, 2012 @04:48PM (#42032077)

      I work next to a Verizon data center out here in Farmingdale, Long Island (supposedly it handles all the Verizon cell phone traffic for Long Island). They recently built extensions to the building and installed two large diesel generators and a 15,000-gallon fuel tank, along with two large cooling systems. Turns out they needed it.

      During the aftermath they didn't run out of diesel because they brought in an additional on-site 15,000-gallon fuel tank (in a 40-foot container). Plus they had semi trucks with sleeper tractors from out of state, with trailers full of diesel, ready to fill the tanks back up on site 24/7. Armed guards manned the premises 24/7 and lived out of a mobile home. They also brought in generator-powered floodlights to keep the surrounding property lit up like it was daylight. Those generators sounded like a pair of locomotives running, probably because they use engines of a similar size. My manager found out they burn 4,000 gallons of diesel a day keeping the building going. They ran the generators until the 7th or 8th, so they burned something like 40,000+ gallons of fuel in that time frame.

      If you have the money and the right infrastructure, you can keep the power going as long as you need.
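      Those figures hang together, as a rough sketch shows; the 0.07 gal/kWh burn rate and the exact number of days on generator are assumptions, not numbers from the post:

      ```python
      # Rough check of the Verizon numbers above. The 0.07 gal/kWh burn rate and the
      # ~10-day generator run (roughly Oct 29/30 through Nov 7/8) are assumptions.
      GAL_PER_DAY = 4_000
      DAYS_ON_GENERATOR = 10
      GAL_PER_KWH = 0.07

      total_gallons = GAL_PER_DAY * DAYS_ON_GENERATOR
      implied_load_kw = GAL_PER_DAY / 24 / GAL_PER_KWH

      print(f"Total fuel burned: ~{total_gallons:,} gal")        # ~40,000 gal
      print(f"Implied average load: ~{implied_load_kw:.0f} kW")  # roughly 2.4 MW
      ```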

  • by Anonymous Coward on Monday November 19, 2012 @03:20PM (#42030929)
    I had perfect 100% uptime during Sandy. No packet loss, no adverse effects, no fuss, no muss.

    Please write a Slashdot article about me. Also, please conveniently ignore the fact that my datacenter is in Kansas City, MO.
    • by Bigby ( 659157 )

      I had no packet loss either. It's kind of hard to lose something that was never sent.

      I live just south of Hoboken.

  • Not being in the path of the worst of the storm. When everything around you stays up and running, your DC probably will too. These guys are genius. Who interviewed these tools and why?
  • They're built for redundancy; if any integral systems had gone down in both locations, THAT would be news. They could have lost one complete physical location and their clients would have been upset, but not out cash....

    This isn’t a sales pitch, but if you do it yourself, you only have yourself to yell at, to complain to.

    Then why does it read exactly like that with no real substance and mediocre answers?

  • If a generator's sitting idle in a data center, and the power never goes out, is it working?

    You know, I heard data centers in Bangalore also had perfect uptime during Hurricane Sandy. Well, at least the ones that weren't suffering from brownouts.
    • by mcgrew ( 92797 ) *

      If a generator's sitting idle in a data center, and the power never goes out, is it working?

      Yes, they test them regularly, especially if they're expecting a big storm.

  • by sl4shd0rk ( 755837 ) on Monday November 19, 2012 @03:25PM (#42030995)

    1) On-site diesel to power ops for 48 hours
    2) Tanker of diesel pumped to the doorstep within 12 hours
    3) Generators
    4) Backup generators
    5) 48 hours' worth of food for staff + repair guys
    6) Nearby lodging reservations for staff + repair guys

    • by Keruo ( 771880 )
      Or you could

      1) place physically similar datacenters around the world
      2) make your datacenter virtual, so you can keep the applications running at any place, and verify that hot-migrate works
      3) ignore localized storms, since you have capacity and uptime on a global scale

      Sure, you notice that the datacenter goes down, but you don't have to waste diesel on generators, since the services have already been handed over to the next datacenter to handle.
      Your crew can stay at home sleeping in their own beds rather
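      A minimal sketch of that failover idea, with hypothetical hostnames; a real setup would use DNS/GSLB or an orchestrator to hot-migrate workloads rather than an ad-hoc script:

      ```python
      # Minimal sketch: probe each site and serve from the first healthy one.
      # Hostnames are hypothetical placeholders.
      import socket

      SITES = [
          ("dc-us-east.example.com", 443),   # primary
          ("dc-eu-west.example.com", 443),   # warm standby
          ("dc-ap-south.example.com", 443),  # warm standby
      ]

      def healthy(host: str, port: int, timeout: float = 2.0) -> bool:
          """Crude health check: can we open a TCP connection at all?"""
          try:
              with socket.create_connection((host, port), timeout=timeout):
                  return True
          except OSError:
              return False

      def pick_active_site():
          """Return the first site that answers, or None if everything is down."""
          for host, port in SITES:
              if healthy(host, port):
                  return host
          return None  # every site is down; time to page a human

      print("Serving from:", pick_active_site())
      ```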

    • by funkboy ( 71672 )

      Weeelll, the problem with #2 in NYC was that the city wouldn't let fuel trucks for the datacenters in lower Manhattan into the area until the debris there was cleared (which makes sense, as they didn't want to have to deal with stuck fuel trucks too). Most of the NYC DCs that ran out of fuel ran out for exactly this reason.

      Which brings up a few common rules for ultra-high availability datacenters:

      - don't build them in the middle of a city (riots, strikes, traffic, WTC being kamikazied, etc)

  • by sjames ( 1099 ) on Monday November 19, 2012 @04:36PM (#42031913) Homepage Journal

    I did just fine during Sandy as well. I have a laptop with a good battery and I can always run it from a cigarette lighter adapter in the car, but I never lost grid power. Of course, I live in Georgia, but that's beside the point.

  • This is nothing new. I built this 10 years ago: http://patentscope.wipo.int/search/en/WO2003090106
