Microsoft Mulling Portable Data Centers

1sockchuck writes "An architect of the Windows Live team has published a presentation advocating portable container-based data centers as the future of data center infrastructure. James Hamilton, who previously was GM of Microsoft Exchange Hosted Services, contends that a distributed network of unmanned modular units 'transforms data centers from static and costly behemoths into inexpensive and portable lightweights. ... Multiple smaller data centers, regionally located, could prove to be a competitive advantage.' Both Sun and Rackable have rolled out prototypes of container-based 'data center in a box' products, and Hamilton notes that large generators are also available in trailers."
  • by RobertM1968 ( 951074 ) on Thursday April 05, 2007 @06:36PM (#18628205) Homepage Journal
    How do they plan on making that easy on an OS that needs regular attention? This isn't a Linux, OS/2, SPARC, AIX, or BSD machine that you can dump in a closet (or container) for months at a time...
    • HEY! Don't forget Netware.

      • Net....what?

        Is that anything like NetBEUI?

        (as /me ducks and runz lak hell from the gathering mob of angry CNA/CNEs...)


    • by JordanL ( 886154 )
      What about Sarbanes-Oxley requirements for data security and integrity? Call me crazy, but being portable is somewhat at odds with the text of this law.
      • Not really.
        But rather than maintaining 3 or 4 big-ass DCs (or even moderate ones), you can maintain one central DC that mirrors and backs everything up, while the satellite units provide local access to reduce latency and support branch-office operations.
        • Re: (Score:3, Interesting)

          by tomhudson ( 43916 )

          It will certainly make it easier to hijack someone's "web browsing experience" - just hook a semi up to the trailer and drive away with it.

          Adds a whole new take to "never underestimate the bandwidth of a station wagon full of backup tapes."
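That line invites some back-of-the-envelope math. A quick sketch of the "trailer as bandwidth" idea, with every number an assumption for illustration (LTO-3-era tapes at roughly 400 GB each, a trailer holding 10,000 of them, an 8-hour getaway drive):

```python
# Effective "sneakernet" bandwidth of a hijacked trailer full of tapes.
# All figures below are illustrative assumptions, not real specs.

TAPE_CAPACITY_GB = 400   # assumed native capacity per tape (LTO-3 class)
TAPE_COUNT = 10_000      # assumed tapes per trailer
DRIVE_HOURS = 8          # assumed length of the drive

total_bits = TAPE_CAPACITY_GB * 1e9 * 8 * TAPE_COUNT
seconds = DRIVE_HOURS * 3600
gbps = total_bits / seconds / 1e9

print(f"Effective bandwidth: {gbps:,.0f} Gbit/s")
```

Even with conservative numbers, the truck beats any 2007-era WAN link by orders of magnitude, which is the joke's whole point.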

      • Re: (Score:3, Funny)

        by tinkertim ( 918832 ) *

        What about Sarbanes-Oxley requirements for data security and integrity? Call me crazy, but being portable is somewhat at odds with the text of this law.

        I got the same weird vision of a bunch of sea containers on wheels, too. So, the 'meet-me' room would be in the back of the semi's cab where the driver sleeps? Or will that be put in the passenger seat instead?

        # ping
        PING (x.x.x.x) 56(84) bytes of data.
        Reply from y.y.y.y: datacenter got a flat tire en route
        Reply from z.z.z.z: driver

    • Re: (Score:2, Insightful)

      by figleaf ( 672550 )
      That's a myth. Microsoft is averaging 10 employees per 50,000 machines for Live.
    • Well, strictly speaking, you can leave a Windows box unattended for that long, provided you have a trustworthy third-party management tool installed.

      Not that this matters. This is just Microsoft trying to find another way to stay ahead by taking other people's ideas. I suspect it'll fail. Why? Cost, if nothing else; their offerings always end up more expensive.
    • How do they plan on making that easy on an OS that needs regular attention?
      There are remotely managed power strips for the rackmounts. Also, a blade chassis has this capability on its own. With PXE and storage arrays, it becomes a matter of hardware just dying -- which can happen to any OS.
      • remotely managed power strips?

        I think IPMI is more of an industry standard.
        • I haven't seen IPMI yet on HP server hardware. They use iLO. And iLO can lock up too. BTW, iLO isn't scriptable via a terminal session, and it didn't look like IPMI provides this either when I skimmed the website. Don't knock a remotely managed power strip until you try it.

          As far as IPMI becoming an industry standard, I haven't seen IPMI on Sun or IBM either.
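For boxes that do speak IPMI, the remote workflow being debated here looks roughly like this with the stock ipmitool client; the hostname and credentials are placeholders, and how much of it a given vendor's BMC actually supports varies:

```
# Query and cycle chassis power on a remote BMC over the LAN
ipmitool -I lanplus -H bmc.example.com -U admin -P secret chassis power status
ipmitool -I lanplus -H bmc.example.com -U admin -P secret chassis power cycle

# Pull the hardware event log and sensor readings
ipmitool -I lanplus -H bmc.example.com -U admin -P secret sel list
ipmitool -I lanplus -H bmc.example.com -U admin -P secret sensor

# IPMI 2.0 serial-over-LAN: a scriptable text console
ipmitool -I lanplus -H bmc.example.com -U admin -P secret sol activate
```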

    • Re: (Score:2, Offtopic)

      by bendodge ( 998616 )
      A properly configured and firewalled XP Pro machine that isn't used for email or downloading anything will sit for a long, long time.

      I do not download from untrusted sources, use a Kaspersky-based anti-virus, a hardware firewall and Windows Defender, and I have never had a virus on my XP Pro machine (I manually check logs and the registry to be sure.)

      I think most of the Windows "security holes" people complain about stem from porn downloads and shady websites (esp on admin accounts), where malware is to be
      • That sounds like a lot of security measures; you could have done with much less. I think a hardware firewall is sufficient in most cases (if you are using your computer as a client machine). As for people encountering "security" problems, I think it comes down to a lack of understanding of computer systems and security as a whole.
    • by fm6 ( 162816 ) on Thursday April 05, 2007 @08:40PM (#18629525) Homepage Journal
      If you have the right hardware, you don't need to be on-site. Serious servers come with something called lights-out management []. This utilizes a self-contained ROM-based system that's always running, even when the main system is shut down (or displaying a BSOD). As long as the system is getting power and there's an Ethernet cable connected to its management port, a remote user can do anything that an onsite user can do, provided it doesn't require opening the cover of the system. You can even re-install the operating system using remote ISO and floppy images.

      I'm the documentation lead for a server [] with a LOM [] that's very fancy indeed. There's a graphic terminal service that supports things like interacting with the BIOS, or logging into the server's GUI. There's a LOM command line you can access using a serial connection or over SSH. The LOM also supports IPMI [], which is kind of a basic necessity when you have a lot of servers, even if they're all down the hall.

      This server is certified for Windows 2003 (and I understand a lot of our customers buy it for that fell purpose), so it would be ideal for Microsoft's container. However, we have our own competing container product [].

      And yes, the company I work for is Sun, and yes, we're selling Windows-based systems now. Shocking, isn't it?
      • And yes, the company I work for is Sun, and yes, we're selling Windows-based systems now. Shocking, isn't it?
        Not really, it's simply economics. Oh, and the beginning of the end for the company. In 5 years, Sun will be just another Intel box shifter.

        • by fm6 ( 162816 )
          If it were simple economics, it would have happened 10 years ago. There was (and is) a lot of resistance to admitting that Windows is an OS you have to support. This is true both inside and outside Sun. It's particularly true on Slashdot!

          Also, building systems based on commodity chips is not the same thing as being an "Intel box shifter". (Or in this case, an AMD box shifter.) There's a lot more to designing a high-end server than picking the CPU!
    • by kjart ( 941720 )

      How do they plan on making that easy on an OS that needs regular attention? This isn't a Linux, OS/2, SPARC, AIX, or BSD machine that you can dump in a closet (or container) for months at a time...

      As a member of the NOC for a network of hundreds of Linux servers I can say that you are likely exaggerating on both counts (and based on experience, very much so on the latter).

      • Well, I don't have that much experience with the Linux server world. But AIX and OS/2 I do (over a decade each), and I have seen servers of both varieties on stable hardware (RS/6000s and Netfinitys respectively) go years without any intervention or maintenance. The years-old (i.e., 5- to 12-year-old) RS/6000s where I work never have any *needed* human interaction - the company, though, chose to do both off-site automated backup and on-site tape backup. The only human interaction with the mac
  • Google? (Score:5, Insightful)

    by jkonrad ( 318894 ) on Thursday April 05, 2007 @06:37PM (#18628215)

    Hasn't Google already been doing this for a couple years now?
  • ummmm... (Score:1, Offtopic)

    by sulphurlad ( 772436 )
    Didn't I read about a year or so ago, google doing this, part of the whole dark-fiber-purchase-thingy....
    • by navtal ( 943711 )
      Yep, you're right, Google is already doing this... kinda like the new Vista widgets... and a Windows-based GUI...
  • Hmmm... didn't Microsoft once propose small, decentralized computing clients only to come back to more centralized computing via Windows Terminal Server?

    It's deja vu all over again.

  • Borg (Score:5, Funny)

    by jeevesbond ( 1066726 ) on Thursday April 05, 2007 @06:45PM (#18628319) Homepage

    This takes Microsoft one step closer to becoming the Borg. Just wait until one of these mobile data-centre 'cubes' appears outside a rival software company, the voice of Ballmer comes booming out of a loudspeaker: 'We are Microsoft. Open your doors and surrender your intellectual property. We will take your technological innovation and call it our own. Your culture will adapt to service ours. Resistance is futile.'

    In fact, didn't I see one parked-up outside Novell HQ recently?

    • I was thinking that if there was ever a story needing the tag itsatrap....
    • Re:Borg (Score:5, Funny)

      by MillionthMonkey ( 240664 ) on Thursday April 05, 2007 @07:48PM (#18629017)
      Mobile data centers are nothing new for Microsoft. I know a guy who drove a Luxury Car (forget what kind) and this car was sooooo wonderful it needed an operating system: Windows CE.

      It didn't broadcast Bill Gates speeches on the road, but it had the same problem as all Microsoft software- features you didn't ask for, that don't work, that can't easily be removed or disabled. He would park this thing in his garage, and once a month some process would turn on at 3 AM to condition the battery or something silly. It would crash midway through and he kept waking up in the morning to a BSOD and a dead battery powering the dim blue glow of the pixels with its last gasp.

      He kept having to take his car to the shop for patches. We loved hearing about this stuff at work, because the car always crashed for something different, but he was getting sick of it, like everyone else at the dealership. Finally one day it screwed something up again- left his windshield washer pump going all night or something- and he took it in for the last patch. The ride home was Linux powered and the fun stories came to an end.
  • by hxnwix ( 652290 ) on Thursday April 05, 2007 @06:46PM (#18628331) Journal
    Portable data centers? They can't even get portable music players right!
    • Re: (Score:1, Troll)

      by adamruck ( 638131 )
      Honestly, I would imagine that portable music players are a hell of a lot harder to get "right" than portable data centers.
  • by FudRucker ( 866063 ) on Thursday April 05, 2007 @06:46PM (#18628345)
    Sun had a shipping container that was painted black. I work for a construction company that has dozens of those shipping containers, and they get hot as hell inside during the summer. Whoever implements these things in a shipping container (especially a black one) had better get a badass air conditioner to keep them cool...
    • by MichaelSmith ( 789609 ) on Thursday April 05, 2007 @06:52PM (#18628421) Homepage Journal

      Sun had a shipping container that was painted black. I work for a construction company that has dozens of those shipping containers, and they get hot as hell inside during the summer. Whoever implements these things in a shipping container (especially a black one) had better get a badass air conditioner to keep them cool...

      The first job I had was building a portable data centre for the Australian air force. When operating in a remote area they needed a way to analyse all the engineering data from their aircraft.

      Now for me, that made sense. The shipping container is a bad environment to work in but the military know how to cope with problems like that, and they have a genuine need for mobility.

      These days for civilian applications it should almost always be easier to get a fast line to your site and use a fixed data centre somewhere, or a combination of systems.

      • I would imagine that having your data centers in physical locations with:

        a) cheap energy
        b) cheap land
        c) cheap connections

        is *REALLY* important to running a successful business. So much so that building data centers that can chase these requirements as they move around is worth it.

        Also the ability to say "We have three extra data centers parked out back" is pretty awesome.
        • i'd imagine that cheap comms and cheap energy are probably more important than cheap land. and these containers sound like they will be far from an efficient use of land anyway, unless they are stacked really high.

          i'd also imagine that the cost of fast comms has a rather large upfront component, offsetting the cost of moving.

        • by Znork ( 31774 )
          Cheap land tends to have neither cheap connections nor easily accessible energy. There's a reason why it's cheap.

          Really tho, I'll betcha the portable datacenter fad will last 'til the first datacenter theft ring starts up. I mean, sure, you can chain them down, but considering the guys taking datacenters on would probably be the same kind of guys that take armoured transports on, they'd just bring some big-ass construction equipment along and demolish any anchoring. They could get it and be out of there befor
          • I'll betcha the portable datacenter fad will last 'til the first datacenter theft ring starts up.

            the military has already solved this problem to a certain extent... just about all of their big systems are wired in some fashion with thermite []. you hit the panic button (or pull the panic pin) and the whole thing becomes a ball of white-hot sparks and molten iron. i'm not sure if this would be an effective civilian theft deterrent or not... on the one hand no one in their right mind would steal something that

      • i was with military intelligence units when i was in the army, and i have seen some pretty cool systems built into truck trailers. the trouble with a portable data center is that, as a general rule, really stable, powerful equipment isn't portable, and really portable equipment isn't very powerful or stable... just ask most laptop users. like an operating room at a hospital, your typical datacenter is a very clean, controlled, and monitored environment. like a mobile OR, you are going to sacrifice contol o

    • In an article (I can't remember where it's from), I saw that it's black simply for marketing 'coolness'.
      The actual products that will be delivered to customers will be white.
    • I took some decent shots of the event, here: 7594333338018/ []

    • by jvagner ( 104817 )
      Uh, yeah, Sun thought of that.

      The actual units come painted white. The black is just a marketing detail, given the product's name.

      The units also devote a significant portion of the space inside to a chilled water cooling system. In fact, the water has to be chilled to 55 degrees.

      There's a tour online somewhere that has more details. Google it if you're interested.
  • Wow, Microsoft states that the future is something someone has already invented! Sweet. How novel.
    • by haruchai ( 17472 )

        You, sir, are clearly incapable of appreciating the YAMI (Yet Another Micro$oft Innovation) paradigm.
  • by joesilicon ( 213295 ) on Thursday April 05, 2007 @06:50PM (#18628389) []

    A Novel Datacenter Concept

    Project Blackbox packages compute, storage, and network infrastructure capabilities into scalable, modular units outfitted with state-of-the-art cooling, monitoring, and power distribution systems. Customers will be able to order a variety of standard and custom configurations of systems, storage, networking, and software. Housed in a standard 20-foot shipping container for maximum flexibility, Project Blackbox will be easily transported using common shipping methods. Simple hookups for water, AC power, and networking will enable customers to quickly deploy Project Blackbox upon delivery.

    • A Novel Datacenter Concept

      Which is apparently based on a Cringely article [] from 2005, which may or may not have been lucidly based on a Google project.

      Innovation at its finest.
    • by kjart ( 941720 )

      Yes, exactly like that - it's the first link in the article (nice research). That is the kind of thing the person from Microsoft is advocating - I doubt they are seriously considering making a competing product.

    • Back in about '95 we put 200+ servers and workstations in huge steel trailers that looked like shipping containers on wheels. We then removed the sides and daisy-chained them in the desert in Saudi Arabia; after that, HVAC and aircraft-style PDUs were attached and a satellite link was set up to the US. We would process and store data from the U2 sensors. In the US, intelligence agencies could get that data. It was called CARS (Contingency Airborne Reconnaissance System). It was a mobile data cent
  • Wonder if MS is going to patent this "new" technology. Oh, wait... Prior art. A Google container project [] was mentioned about a year and a half ago. From TFA []:

    The probable answer lies in one of Google's underground parking garages in Mountain View. There, in a secret area off-limits even to regular GoogleFolk, is a shipping container. But it isn't just any shipping container. This shipping container is a prototype data center. Google hired a pair of very bright industrial designers to figure out how to cra

  • *snore* (Score:4, Informative)

    by Tom ( 822 ) on Thursday April 05, 2007 @06:56PM (#18628465) Homepage Journal
    Sun has one on the market, and it's been around for a couple of months: "Project Blackbox" [].

    As usual, the "visionaries" at MS simply feed us what others have invented as their great ideas.
    • by kjart ( 941720 )

      As usual, the "visionaries" at MS simply feed us what others have invented as their great ideas.

      He's advocating the use of such technologies, not claiming to have invented them.

      By the way, there have been ill-founded MS-bashing posts on the market for years - maybe you should try innovating a new kind of post?

  • It's one thing to administer a low-maintenance UNIX box with SSH over a long-distance, laggy, crappy connection (it's same old same old, and works almost as well as being there). It's another very, very different thing to hold a Remote Desktop session under those conditions (you'll want to stab yourself with MSDN CDs after a few minutes).
    • Re: (Score:3, Informative)

      by afidel ( 530433 )
      Uh, I administered a Windows server in Puerto Rico that had a bad T1 line; we were measuring up to 40% packet loss at times, and while RDP dropped, it auto-resumed once the packet loss went back down. I can't imagine trying to use SSH or X to do the same. RDP also works acceptably over 28.8 dialup; I haven't seen any flavor of X do that. You can bash MS for many things, but RDP is not one of them; of course, it's a good technology that they stole from Citrix, but.....
      • by guruevi ( 827432 ) <> on Thursday April 05, 2007 @07:54PM (#18629071) Homepage
        I don't know where you get that, but RDP DOES NOT work acceptably over 56k. I've done it in the past (a cell phone in the laptop made a dial-up connection) and it is laggy and crappy. Right now I work from home and remote into my machine at work over WiFi, and I have to use a VPN solution; I can't imagine doing that over anything slower than 128k.

        X isn't a good solution for that either; there is something out there that is comparable to X and lightweight, but I forget the exact name. SSH works great over 28.8k... if you don't have too much stuff scrolling through the window (cat /var/log/messages, for example). SSH can stand quite a few seconds of packet loss unless the whole connection breaks down, but if you've got that much packet loss, then RDP is not going to help either. That is why we have utilities like screen. Still, whether on Windows or Unix, SSH or something comparable (a terminal) always works better on low bandwidth than anything VNC-like.
        • by afidel ( 530433 )
          Force 8-bit color at 800x600 with the cache enabled and themes and sound off, and turn off drive and printer mapping. As long as you aren't using IE, it works fine over 28.8. I did it for several years before cable became available out where I live (too far for DSL).
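For reference, those tweaks map onto a saved connection file roughly like this (key names are from the standard .rdp file format; the server address is a placeholder):

```
full address:s:server.example.com
session bpp:i:8
desktopwidth:i:800
desktopheight:i:600
bitmapcachepersistenable:i:1
disable themes:i:1
audiomode:i:2
redirectdrives:i:0
redirectprinters:i:0
compression:i:1
```

(audiomode:i:2 leaves sound at the remote machine, i.e. off locally.)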
        • Generally speaking, cell phones have much, much higher latency. The first-hop latency on my phone is close to half a second, and while it has over 100 kbit of raw bandwidth, you never get to actually use it because the latency sucks ass.

          RDP is significantly faster than VNC and X when used over a remote connection, and bear in mind that these servers are going to be installed in a DATACENTER. You aren't going to hook this container up to a 56k modem.
  • With Project Blackbox [], it's obvious that Sun is paying attention to their customers. Need to expand the datacenter, but don't have the space? Use their portable container setup. It's sheer genius, esp. for emergency contingencies/disaster situations. If I were a CIO/CTO, I would be taking a SERIOUS look at Sun's product as part of my data/computing landscape.

    (And no jokes about hijacking the container with a forklift or breaking into it... That's why you hire 24/7 security if the data is important to you
    • I know numerous Windows server administrators who'd be surprised at the claim that you'd need to visit such a datacenter 'every few days', or even 'every few weeks'. You might try sticking to the facts rather than FUD before you accuse Microsoft of spin. Glass houses and all that.
  • More of the same, "Oh yeah? Ours will be better AND cheaper!" talk from Microsoft.

    Someone needs to explain to me who is rushing to buy these things.

    High-voltage lines into the box and having air-conditioning running 24-7 just sitting in a parking lot will probably inspire a visit from the local city inspector.

    Certainly after the neighbors complain.
    • I doubt our neighbors would complain (like most other business, the only thing around our parking lot is other businesses and parking lots), but last week a phone company truck managed to hit not one, but two, parked cars in our parking lot. Happily, mine was not one of them.

      That's all I need to know to 86 the idea of putting a portable data center in my parking lot. Oh, and there are all those 50+ foot tall eucalyptus trees, too. Hate to have one of those fall on my portable data center.

      I know Hamilton, he
  • by greg_barton ( 5551 ) * <> on Thursday April 05, 2007 @07:00PM (#18628511) Homepage Journal
  • White elephant (Score:3, Insightful)

    by SuperBanana ( 662181 ) on Thursday April 05, 2007 @07:02PM (#18628521)

    Both Sun and Rackable have rolled out prototypes of container-based 'data center in a box' products, and Hamilton notes that large generators are also available in trailers."

    This strikes me as an awful lot like a white elephant - it's not terribly hard to stuff a bunch of computers and an air conditioner/heating system into a shipping container with (physical) shock isolation. For Sun, it sounds like they didn't do much more than install water blocks in their servers ("cyclonic cooling", my ass.)

    More laughs:

    It's not completely plug-n-play, however. The "data center in a box" requires chilled water to support the cooling system, in addition to Internet connectivity and appropriate power infrastructure. Markoff's story notes that the prototype "sits in a container case adjacent to a Sun office building here (Menlo Park, Calif.), connected to two large fire hoses for water cooling and 500 kilowatts of redundant power."

    500kW (which at 220V is over 2,000 amps- which is a HUGE hookup) of power is probably just for the computers. Figure at least some sizable chunk of that for cooling...
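The arithmetic behind that amperage figure, for a couple of plausible service voltages (the 220 V single-phase assumption is the parent's; a real hookup would likely be higher-voltage three-phase):

```python
# Current draw implied by a 500 kW feed at different service voltages.
POWER_W = 500_000

for volts in (220, 480):
    amps = POWER_W / volts
    print(f"500 kW at {volts} V -> {amps:,.0f} A")
```

At 220 V that works out to roughly 2,273 A, consistent with the "over 2,000 amps" above; at 480 V it is still over 1,000 A.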

    Power, cooling, security...this seems rife with problems...

    • You're forgetting: Sun offers CoolThreads technology []. Do the math: the power savings and heat reduction on AMD's Opteron servers are considerable, esp. when running Solaris. Check the benchmarks and claims - Sun is the solution to use.

      Not bad for x86 and x64 hardware that's rated by Sun on over a dozen OSes. Sun is covering their bases well. (Even if you want to run Windows within that portable datacenter.)
    • by jbengt ( 874751 )
      FTFA: ". . . and Hamilton notes that large generators are also available in trailers."

      Has he ever priced out the cost for generator trailers or A/C trailers? Good for emergencies, but not a long term strategy.
  • From TFA:

    But incorporating liquid cooling into high-density racks can address cooling challenges, and Hamilton argues that removing support personnel from the data centers will improve reliability, noting that "human administrative error causes 20% to 50% of system outages."
    What, humans with remote access won't screw up? Maybe they don't trust MCSEs with screwdrivers... This initiative doesn't address the real problems in large-scale installations: energy density and power conditioning.
  • Didn't Sun already do this? [].

    Embrace and extend, indeed.
  • Unmanned? (Score:4, Funny)

    by cxreg ( 44671 ) on Thursday April 05, 2007 @07:36PM (#18628889) Homepage Journal
    Who's going to reboot the machines every other day?
  • How interesting that this article comes out today, of all days. Sun's Project Blackbox is on display across the street in San Diego right now. []

    I took the tour today, got a neat tee shirt and free lunch too!

  • Wow, what an original and innovative idea! It's too bad that Google didn't think of this []. Or I'd really expect that Sun would have come up with something like this by now []. But it took those geniuses from Redmond to deliver true innovation!
  • James Hamilton, who previously was GM of Microsoft Exchange Hosted Services,

    Am I the only one that read that last bit as "Microsoft Hostage Exchange Services"? I mean, I know MSFT likes to lock up your data in proprietary formats, but that's going a little too far....
  • some photos I shot of the sun blackbox thing: 7594333338018/ []

    it's really cool to see in person. the equipment is very tightly packed inside. the water input and output valves are equally impressive.
  • This concept has a strong point, and that's the ability to transport a pre-configured data center by truck, train, ship, or cargo plane. If you suffer a fire, flood, or other accident that knocks your site out of commission, just write a check and have your business running again tomorrow from a temporary data center parked out back (you do have up-to-date off-site backups for your data, right?). With the incredible lost-business cost of down-time there may even be a market for "insurance plans", where a po
  • Rackables is crap (Score:4, Informative)

    by twigles ( 756194 ) on Thursday April 05, 2007 @11:57PM (#18630893)
    My company almost bought a TON of Rackables. We're growing really fast and are building out multiple big DCs (>1k square feet) in the next year. These guys came in saying they could not only deliver a rack of servers on wheels, negating our data center operations team's need to rack everything, but also that they could double the number of servers we could fit in a rack.

    The number of servers per rack is constrained by electricity. For a while we couldn't figure out how they fit 48 servers into the same amount of electricity that our current server vendor used to power 24 + 1 switch. That is, until we pulled a server apart and saw that they were using LAPTOP CPUs. The servers don't perform nearly on par with normal ones. They were, and are, selling snake oil.
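The constraint described above is easy to sketch: with a fixed power budget per rack, server count is just integer division by per-server draw. The wattages below are illustrative assumptions, not Rackable's actual numbers:

```python
# Servers per rack under a fixed electrical budget (assumed figures).
RACK_BUDGET_W = 8_000       # assumed usable power per rack

standard_server_w = 250     # assumed draw of a typical 1U server
laptop_cpu_server_w = 125   # assumed draw with a mobile (laptop) CPU

print(RACK_BUDGET_W // standard_server_w)    # 32 servers per rack
print(RACK_BUDGET_W // laptop_cpu_server_w)  # 64 servers per rack
```

Halving per-server draw is the only way to double density under the same feed, which is exactly the laptop-CPU trick, and also why per-server performance drops.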
  • So the original XBOX doesn't count?

    There must be a size limit... I for one wondered how something so massive as the original XBOX could even exist. Shouldn't it exceed the Chandrasekhar limit? Perhaps that's it. Perhaps XXBBOOXX don't exist in a form that we can understand. The bloody things may very well have collapsed into singularities that float about the universe, consuming hapless victims, tearing them from reality with their merciless, Stygian flows. Oh noes, I'm feeling the most peculiar dra
  • Woohoo, just what I need. Keeping a baby like that running will be a bitch. Permanent employment in a box...
  • And the reason to regularly move these ("portable") data centers is...?
    And Microsoft's experience with unmanned, yet critical path, infrastructure, that actually works without operator intervention is...?

    Next they'll be selling us flying cars.
  • I for one don't want my servers in a data center that is easy to move and hence easy to steal. Think about it: they just steal the datacenter, and then they can take their time getting, say, CC numbers and other personal data of your customers off the systems at their leisure.
  • This thing seems to be another copy of someone else's strategy. Namely Google. I didn't know about Sun's version. However the one advantage that Google has over Sun and MS is that they already have the infrastructure in place to execute this better. Since they have all that dark fiber, they can put one into place in the field a lot easier. Sure Sun and MS could purchase the bandwidth they need but Google seems to have thought of the whole concept not just one piece of it.
