
Web Servers Getting Naked, For Weight Savings

1sockchuck writes "Cloud computing is causing servers to get naked. HP today announced a 'skinless' server optimized for customers packing thousands of servers into cloud or HPC environments. This follows the lead of SGI/Rackable, which ditched the cover when it introduced bare-bones servers for its CloudRack (previously discussed here). HP says the skinless design makes servers far lighter, which is apparently an issue when shipping them by the rackload."
This discussion has been archived. No new comments can be posted.

  • Hottttt! (Score:5, Funny)

    by The_mad_linguist ( 1019680 ) on Thursday June 11, 2009 @07:08PM (#28302585)

    Servers getting naked - IN YOUR EMAIL.

    Just sign up for our newsletter, and all of those from our affiliates, co-conspirators, third party hordes, and lawyers...

    Sure, you can find naked servers at google, but don't you prefer the personal touch?

  • Wait... (Score:3, Interesting)

    by gbarules2999 ( 1440265 ) on Thursday June 11, 2009 @07:09PM (#28302595)
    Naked doesn't quite equal skinless, unless you're saying that you're naked of skin. Or...?
    • Re: (Score:3, Funny)

      by Rog-Mahal ( 1164607 )
      Yeah, I'm thinking skinless porn appeals to quite a different subset of the public than naked porn. And yes, this did degenerate to porn already.
      • Yeah, I'm thinking skinless porn appeals to quite a different subset of the public than naked porn.

        Like zombies or something?

      • "There's a place down the street; Seven Xs. What does that mean? Maybe it's... girls without skin."
        --Tom Waits

      • Re: (Score:1, Funny)

        by Anonymous Coward

        Yeah, I'm thinking skinless porn appeals to quite a different subset of the public than naked porn.

        Wait a minute, I used to work for a guy like that. "It puts the server in the rack or else it gets the hose again."

        Worst job ever.

      • And yes, this did degenerate to porn already.

        Flynt's Law: All articles with the words "getting naked," "virtual reality," or "tactile response" in the title will quickly degenerate into discussions of porn.

    • Re:Wait... (Score:4, Funny)

      by QuantumG ( 50515 ) * <qg@biodome.org> on Thursday June 11, 2009 @07:40PM (#28302881) Homepage Journal

      Damn you Robbie Williams!

  • Blade? (Score:5, Insightful)

    by TopSpin ( 753 ) * on Thursday June 11, 2009 @07:12PM (#28302629) Journal

    The new 'blade'; 19" wide and 1.75" tall.

    I see discrete Ethernet PHYs, VGA, USB, etc.; all the horrible stuff blades are supposed to consolidate away. Turns out all the proprietary silicon, software, and exotic backplanes necessary to make that real cost too much and are creepy.

    And you can quit calling it "cloud" now... they're just hosting providers and you know it.

    • Everybody's coming out with half-width 1U servers now. HP has quite a few of them. They're even denser than HP's blades, and they use standard interconnects.

      The recently announced DL1000 [hp.com] is another one, with the connections in the back as is traditional, and up to 16 SAS drives.

    • And you can quit calling it "cloud" now... they're just hosting providers and you know it.

      Hmmmm.....actually that reminds me. While eating breakfast this morning, I had this crazy idea: if you were doing compute/web hosting with whitebox servers, would that be "White Cloud computing"? Just wondering.

  • Is the Airflow OK? (Score:4, Interesting)

    by Philip_the_physicist ( 1536015 ) on Thursday June 11, 2009 @07:16PM (#28302653)

    This makes sense, since the dust should already be filtered, which removes a large part of the need for a case. However, I do wonder about the airflow, since an ordinary case helps to direct the airflow through the kit rather than over the top, which might be a problem. On the other hand, without a case, the ventilation will be much better, so what is lost on the swings may be gained on the roundabouts.

    This is a nice idea though, and would make sense for rackmount routers/switches, since these usually sit in an enclosed cupboard anyway.

    BTW: first?

    • by Yvan256 ( 722131 ) on Thursday June 11, 2009 @07:25PM (#28302749) Homepage Journal

      It's for THE CLOUD COMPUTING, there's not going to be any airflow problems!

    • by fuzzyfuzzyfungus ( 1223518 ) on Thursday June 11, 2009 @07:33PM (#28302819) Journal
      Given the fact that there don't seem to be any fans pictured in the chassis itself, and given that this is targeted at very large scale customers, I'd assume that the use case for these things is "inside a specially designed rack with power and cooling, with that specially designed rack completely full" which would allow the OEM to just validate against that case.

      If you just put one of these on a table, it'd probably overheat; but, if you want to do that, HP wants to sell you a pedestal server instead.
      • Re: (Score:2, Insightful)

        by SpudB0y ( 617458 )

        Apparently you and everyone else who modded you up didn't read enough of TFA to see the second picture showing the fans.

        • by fuzzyfuzzyfungus ( 1223518 ) on Thursday June 11, 2009 @08:19PM (#28303139) Journal
          You might want to RTFA a little more closely: the fans in the second picture are embedded in a larger chassis, into which the module in the first picture is inserted. There are no fans in the server modules; fans and power are provided by the larger enclosure they slot into. I was wrong to speculate that it was a full rack (the z6000 enclosure is only 2U, for reasons I can't quite fathom), but the thermal performance of the bare server does, indeed, depend quite closely on the enclosure into which it is inserted, just as I speculated.
          • Additionally, fans without some kind of enclosure wouldn't provide proper airflow. AMD's and Intel's system design recommendations both state that a proper enclosure is necessary, and they even give details as to how that enclosure should work with regard to airflow.

          • by SpudB0y ( 617458 )

            You said if you put it on a table it would overheat. I don't think it would do much of anything without a power supply.

      • Re: (Score:3, Insightful)

        So essentially, the whole data center or the rack itself becomes the cooling case. I like it.

    • The product page is here [hp.com]. There are pictures. And specifications. And video. I think we can skip the rest of the blogofrenzy and get our info from the source.

      BTW, this looks pretty much like a DL1000 installed backward. It's probably different in some other meaningful way. I didn't look too close.

    • Re: (Score:2, Insightful)

      by karbonKid ( 902236 )

      However, I do wonder about the airflow

      The floor of the case above forms the roof of the case below when mounted in a rack. Airflow problem solved. IIRC, Google was the first to implement this.

    • I wonder about the structural integrity and the electrical grounding.

  • by Optic7 ( 688717 ) on Thursday June 11, 2009 @07:20PM (#28302699)

    My friend used to run a BBS way back when, and he told me he would just hang the motherboards and other components on a pegboard on the wall. Similar idea, but I think he was doing it to save money on cases and possibly to save space as well.

    • It is a cool look when your datacenter doubles as a social gathering place. I've done it too. Not so cool when somebody falls into a server, the cat decides to play with the CPU fan, that sort of thing. Definitely want to go with the non-conductive remote-control helicopter.

      I heard Google started with restaurant racks or something like that.

      Oh, and the direct-from-the-source link is go extremescale [hp.com].

    • My server boards and other pieces of kit are cable tied to Canadian Tire plastic shoe racks.

      Now *that* is cheap...

    • Re: (Score:3, Funny)

      A chum of mine swore by running his computer "naked" in a somewhat modified Walmart mini-fridge for some unconventional cooling. Well, until he spilled a 1L bottle of Pepsi on his motherboard.
    • I'm kinda doing the same thing right now: I've got my mobo sitting in its box, and the HD, PSU, and DVD drive are all sitting on the desk.
  • Big deal (Score:3, Funny)

    by Anonymous Coward on Thursday June 11, 2009 @07:32PM (#28302807)

    Company charges more for servers with less steel - film at eleven.

    • Reminds me of those "diskless workstations" which cost more than an equivalent computer. It always seemed like a scheme to sell computers with no hard drive and make a killing.

  • by selven ( 1556643 ) on Thursday June 11, 2009 @07:44PM (#28302901)
    Makes the servers more serviceable, and in a server closet there isn't much that would require a skin to protect against.
  • by Datamonstar ( 845886 ) on Thursday June 11, 2009 @07:48PM (#28302931)
    ... time. It's cooler, faster, lighter, cheaper, and better for the environment, and it looks a hell of a lot more badass when you open up a system that's got its guts exposed and just start hot-swappin' like a mofo. The sad thing is that it's driven by $$$$ and the need for companies to shave even a few pennies off their TCO, when I've been doing it to my systems for years now for the above-stated reasons.
    • ...it looks a hell of a lot more badass when you open up a system that's got its guts exposed and just start hot-swappin' like a mofo

      A mechanic was removing a cylinder-head from the motor of a Harley motorcycle when he spotted a well-known cardiologist in his shop. The cardiologist was there waiting for the service manager to come take a look at his bike when the mechanic shouted across the garage "Hey Doc, want to take a look at this?"

      The cardiologist walked over to where the mechanic was working on the motorcycle. The mechanic straightened up, wiped his hands on a rag and said, "So Doc, look at this engine. I open its heart, take the valves out, repair any damage, and then put them back in, and when I finish, it works just like new. So how come I make $25,000 a year and you get $160,000 when you and I are doing basically the same work?"

      The cardiologist paused, smiled and leaned over, then whispered to the mechanic...

      "Try doing it with the engine running!"

      • Re: (Score:1, Informative)

        by Anonymous Coward

        Actually, cardiologists DO stop your heart with a carefully timed defibrillator (or should I say fibrillator). Not the big paddles like you see on the ER shows. First they crack open your chest, then they use these tiny metal paddles, which don't need much juice to do the trick because there's much less resistance.

        Annnd my captcha word is Autopsy. How apt!

        • Re: (Score:3, Interesting)

          by adolf ( 21054 )

          Right. But in cases where the heart needs to be stopped, there's a heart-lung machine [wikipedia.org] plumbed into place to take over for it. And if anything stops for any real length of time, the patient dies.

          It's like rebuilding a Harley motor, with no battery, without losing the radio[1] presets, and while maintaining a functional and running (if substitute) driveline the entire time, while ensuring that nothing ever stops, because if it does, the bike will die. And then, all the king's horses and all the king's men,

      • by Shatrat ( 855151 )
        Is the joke the fact that the probability of an H-D motor running is close to 0?
  • Naked? (Score:3, Insightful)

    by gmuslera ( 3436 ) on Thursday June 11, 2009 @07:52PM (#28302967) Homepage Journal
    I would say skinless instead. Probably, for servers, "naked" should mean without an OS installed, which looks especially attractive since you can then choose to dress them in open clothes.
  • Nice rack (Score:5, Funny)

    by Anonymous Coward on Thursday June 11, 2009 @07:56PM (#28302987)
    Nice rack
    • A completely new meaning for "let me see you stripped".

      Has anyone mentioned Rule 34? A whole new world of geek porn...

  • by egcagrac0 ( 1410377 ) on Thursday June 11, 2009 @08:19PM (#28303133)

    They added a 12V only power supply and a 12V battery, integrating the UPS as well. All the 12V stepdown can happen on the mainboard!

    Totally OK if the battery is an optional replacement for the second hard drive.

    • Re: (Score:3, Funny)

      by elronxenu ( 117773 )

      +1 Google.

    • Re: (Score:2, Interesting)

      by metallurge ( 693631 )
      It has seemed to me for some time that there would be a market for somebody to manufacture a little VRM module that plugged into the ATX motherboard connector and had two screw-terminal inputs to hook up unregulated +12/24/48V, as from solar panels or batteries. A more deluxe model could also have auxiliary outputs for CDROM/HDD/FDD/VIDPWR connectors.
    • The concept you describe is implemented in some data centres: the rack contains not only shared cooling (a smaller number of much larger fans) but also the power supply, where AC goes in and DC goes out to all the machines in the rack.

      On a smaller scale, I've done this at home: at the telephone demarc point, where the electrical panel is, I mount my DSL modem, a switch, and an old FlexATX board to perform router/firewall functions. Instead of a PC power supply and multiple "wall warts" there is one power supply.

  • Not really new (Score:4, Informative)

    by chuckymonkey ( 1059244 ) <charles@d@burton.gmail@com> on Thursday June 11, 2009 @08:28PM (#28303195) Journal
    Origin 3000 series servers did this a long time ago. The bricks in the system were just fans on the front and a base plate to mount the hardware onto. They were pretty easy to work on in this configuration; you could pull a brick out and replace anything inside within a couple of minutes. IRIX, on the other hand.........
    • by stox ( 131684 )

      IRIX had its problems, but it still did things that Linux does not. I hope that Rackable decides to open source it.

  • by johncandale ( 1430587 ) on Thursday June 11, 2009 @08:40PM (#28303295)

    "Cloud computing is causing servers to..."

    What's with calling everything by near-meaningless terms like 'cloud computing' all the time now? The coverless servers are not due to 'cloud computing'; they're just a different technique for server farms. It could be for databases, large analysis, supercomputing, regular network hosting, etc. There is nothing about this that makes it exclusively meant for 'cloud computing'; it's just an idea for large arrays. Unless you are a marketing tool, stop saying 'cloud computing' just because it's the hot new phrase. Save it for when it's relevant.

    • It's the buzzword quotient, you know, the BWQ. A tech article is measured in how "cool" it is by its BWQ, especially for the droids working in IT management -- you know, the ones with no computer skills whatsoever who used to work in marketing and now head whole IT departments because they thought they could get a cool title like CIO and a matching paycheck? The more buzzwords, the higher the BWQ; the higher the BWQ, the more likely it will be read by those former marketing people with the title "CIO".


  • by Lord Byron II ( 671689 ) on Thursday June 11, 2009 @08:43PM (#28303313)

    Why does a computer have an external skin anyway? It's helpful for desktops, to prevent damage from spills, but in a rack-mounted environment, unless the skin somehow increases cooling, it's actually worse than useless.

    • Re: (Score:1, Informative)

      by Anonymous Coward

      Cases (skins to you, sonny) do increase cooling by channeling airflow. Without proper channeling, moving air is like herding cats.

    • Re: (Score:1, Informative)

      by Anonymous Coward

      RF shielding.

    • Protection (Score:3, Insightful)

      by TheLink ( 130905 )
      If someone comes by to fix just one server on a live rack, it helps prevent stuff like screwdrivers from falling into the other servers, or cables from tangling with the wrong stuff...

      Skinless is fine when you can treat each server/blade as a "card" in the "computer" (rack). Or you're running one of those massive sites that only changes stuff "by the rack". Then you just wheel out the entire rack and replace it with a new one :).

      It's not so good in "messier" and more heterogeneous server rooms - where s
      • by mlts ( 1038732 ) *

        Maybe the answer for this would be getting PC makers to agree on a standard rack enclosure setup, and a standard form factor for inserted blades, regardless of whether the blade is a general-purpose PC, a router, a switch, or another appliance. The enclosure would handle the power supply, RF protection, and signal grounding, and the blade makers would make sure that their machines adhere to the usual engineering specs (dimensions, airflow, power requirements, tolerances, noise, etc.).

        This way, enclosure makers can make frames not

    • It probably depends upon the cooling strategy. An external skin (i.e. case) can act as a sort of fan shroud to direct air over certain components within the chassis or force a general flow of air through the enclosed space that might not be as efficient or effective without the shrouding provided by the enclosing case.
  • by DeadS0ul ( 784783 ) on Thursday June 11, 2009 @09:02PM (#28303435)
    ...You can see their bare circuits!
  • Naked servers have been running for ages: motherboard, disk, etc. plugged together without any sort of case. Do not forget to correctly ground every component, though.

    Google even had their first stacks of hard drives running naked; they kept them in racks built of Lego blocks, to allow air circulation I would presume. The motherboards were probably naked too, although I can't tell for sure.

  • Things usually have covers for a reason.

    I can see admins trying to install a new server and accidentally dropping one of the mounting screws into the rack. Poof! Admins would probably just try to RMA the unit. I see this costing HP some money because of that.

    • Not to be picky, but if you're buying these, you're probably buying a rack at a time, and from what I can judge, it all but gets rid of the screws.

      Also:
      On the last HP server I installed, the rails didn't need screws; they just snapped in.

      The last HP server I built (yes, I built about 12 HP servers, because they decided to send us all the parts SEPARATELY and we had to assemble them in-house) needed just the wrench they include on the top of the case. None of the screws were loose such that you could drop them.
  • by dilvish_the_damned ( 167205 ) on Thursday June 11, 2009 @10:22PM (#28304015) Journal

    Except these are designed to arrive with the hoods "pre-lost", saving us the trouble.

  • Running servers naked is likely to make them a bigger spewer of RFI (radio frequency interference). All those computers are nice until the FCC comes along and says you have to shut them down because you're interfering with the neighbors.
    • Re: (Score:3, Insightful)

      by Kaboom13 ( 235759 )

      The special enclosure (which also supplies cooling) shields it to FCC standards, I'm sure. It's not like one of the biggest computer hardware manufacturers doesn't know about FCC regulations.

    • I would think that, properly mounted in a rack, the rack itself would provide sufficient RFI shielding, with the fronts and backs acting almost like a Faraday cage and the sides being pure metal.
  • RTFA (Score:4, Insightful)

    by adolf ( 21054 ) <flodadolf@gmail.com> on Thursday June 11, 2009 @11:32PM (#28304403) Journal

    I see a few confused posts here about "WTF? Cooling?"

    Just RTFA, folks. It's a blade server arrangement, not a standalone computer. These "naked" computers are nothing more than a pair of dual-proc computers, in a 1U-ish chassis without a lid, which needs to slide into the appropriate rack-mounted housing in order to work. This housing includes all of the cooling and power supply goodness one would expect, and (of course) includes a top panel to promote useful airflow and limit RFI.

    I don't see much "new" about these things at all, since AFAICT most/all "blade servers" have been naked since their inception.

    Color me unimpressed.

    • by afidel ( 530433 )
      Actually, HP's b- and c-class blade servers had/have a full enclosure for each blade.
    • Ewwww, look at you! All trying to live up to this "News for Nerds" thing. It's Friday, dammit; let us have our Gossip for Nerds!
  • The blade itself is called an ExSO.

    Is the frame it lives in called an ExSO Skeleton?

  • Is this thing running on DC?

  • Back in the good old days we first had computers that were made out of cased modules. Then we made computers out of PC boards that fitted into racks with backplanes. Then (in our case) we combined three CPU/memory boards into one chassis, with common fans and PSU; only we were too young and ignorant back in 1981 to know that Marketing had to call it a "blade system".
  • It's a bit silly: for scalable, cheap, replaceable systems it makes sense, especially with the I/O in the front for both systems in that box. And it avoids some of the single points of failure and high add-on expenses of blade servers, especially the cost of the multiple internal switches and remote KVM capabilities. (Has anyone else run into the "turn off splashimage in grub" problem to get access to the serial-port boot console? That is a royal pain; a sketch of the fix follows below.)

    But I'm concerned that people will install
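
    For anyone bitten by that splashimage problem, here is a minimal sketch of the relevant GRUB legacy menu.lst changes. This is an illustration under assumptions: the disk device, root partition, kernel path, and baud rate below are placeholders to adapt, not values from the article.

        # Comment out the splash image so it cannot grab the boot console:
        #splashimage=(hd0,0)/boot/grub/splash.xpm.gz

        # Let GRUB talk to the first serial port as well as the local console:
        serial --unit=0 --speed=9600
        terminal --timeout=10 serial console

        # Hand the kernel a serial console too (device and root are assumed):
        kernel /boot/vmlinuz root=/dev/sda1 ro console=tty0 console=ttyS0,9600n8

    With that stanza, GRUB prompts on both the serial line and the attached monitor, and whichever console you press a key on first gets the menu.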

  • But of more importance to trolls: is the server also petrified?
