Supercomputers - Does the Cabling Matter?
papaia asks: "Having watched, for a while, the development of high-density server hardware (i.e. blade servers), like IBM's 'top gun', and their increased presence in Data Centers, I have been wondering if anybody has experience with (and thus comments on) how important - in such highly priced solutions - the [always neglected] cabling connecting the servers is, or could be. One such comment caught my attention in this regard. Slashdot, how important is the server cabling infrastructure in your Data Centers, and how do you resolve the cable management aspect of it?"
Cray (Score:2, Interesting)
The ol' Cray-X supercomputer was round (its cabinets were placed in a circle) so that the length of the cabling could be kept down. Back then, synchronisation between pulses in different cables was a problem. And there really was a *snakepit* of cables between the cabinets.
Crosstalk (Score:3, Interesting)
If nothing else, in an extremely complex environment, using quality cable and quality connectors (skillfully attached) lets you eliminate the bus as "one more thing to check" when you are getting unexplained slowdowns. It is a nice way to shorten the troubleshooting to-do list when you are up to your eyeballs in alligators and the pager won't stop buzzing.
Use high-quality cabling, but don't overspend. (Score:3, Interesting)
Cable quality affects digital and analog signals alike. A poor-quality cable will generate its share of dropped packets or corrupted data, causing more resends or less accurate data. Also, take care if crimping your own cables: untwist the wires as little as possible, and break the insulation and shielding as little as possible.
With that said, don't be like a crazy audiophile (key word here: crazy) and spend thousands of dollars just on cabling. (I know an audiophile who spent $500 on a 6" cable when a $25 Monster cable has the exact same specs. He claims to hear a difference, but I call b.s. on him.) Spending more means getting better, but only to a point.
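The "dropped packets cause resends" point above can be sketched numerically. This is a hypothetical back-of-the-envelope model (the function name and numbers are invented for illustration), assuming independent bit errors and simple retransmission of any corrupted frame:

```python
# Rough goodput estimate for a noisy link, assuming independent bit
# errors and retransmission of every corrupted frame.
def goodput(link_rate_mbps, frame_bits, bit_error_rate):
    # Probability a frame arrives with zero bit errors.
    frame_ok = (1 - bit_error_rate) ** frame_bits
    # Corrupted frames are resent, so effective throughput scales
    # with the per-frame success probability.
    return link_rate_mbps * frame_ok

# A clean cable (BER ~1e-12) barely dents a 1 Gbps link:
print(round(goodput(1000, 12000, 1e-12), 3))  # -> 1000.0
# A marginal cable (BER ~1e-5) on 1500-byte frames costs ~11%:
print(round(goodput(1000, 12000, 1e-5), 1))   # -> 886.9
```

The point being: error rates compound per frame, so even a "slightly" bad cable shows up as a very real throughput loss long before the link dies outright.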
You don't want to raise the ire... (Score:4, Interesting)
People don't seem to want to realize that digital implies lossless or error-corrected. They don't understand that the "premium quality sound" transmitted between devices can be done using the cheapest electronics equipment available.
Digital, maybe, but you don't want to raise the ire of the analog stereophiles: You'll get everything from Stereo cables make a difference [thesafetyvalve.com] to Debunking the Myth of Speaker Cable Resonance [audioholics.com], not to mention forests worth of dead tree sacrifices for Speaker Cable Face Offs [audioreview.com].
And please, please, please, please: Don't get them started on Solid State -vs- Vacuum Tube...
Re:literally speaking, no (Score:3, Interesting)
First, there is no such thing as a digital signal. You can't send numbers through a wire, you can only send voltage levels. This is an example of an analog signal. Poor quality cabling or interference can and will cause errors in transmission.
Second, nobody uses 0/+5V signaling for anything modern. This is not compatible with high bitrates. For example, USB 2 high speed uses 400mV differential signaling at 480Mbps. Cable and connector quality is critical, and poor quality cable will not work at all.
Third, most digital interfaces have no error correction capability. Digital audio (SPDIF, which is what you were talking about) has no error correction OR detection capability. If you have a bad cable, it will cause sound glitches, crackling, and other nastiness. Also, SPDIF transmits the master clock over that cable; if the cable is of poor quality, it will cause excessive clock jitter, which reduces sound quality and causes distortion.
By the way, you can't use audio cable for SPDIF. SPDIF requires coaxial video cable (75 ohm impedance). It will not work well with anything else.
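The impedance point can be made concrete with the standard transmission-line reflection-coefficient formula. A simplified sketch (the ~35-ohm figure for a generic audio interconnect is an illustrative assumption, since such cables have no controlled impedance):

```python
# Fraction of an incident signal reflected at an impedance mismatch:
#   gamma = (Z_load - Z_line) / (Z_load + Z_line)
def reflection_coefficient(z_line, z_load):
    return (z_load - z_line) / (z_load + z_line)

# 75-ohm SPDIF source into proper 75-ohm video coax: no reflection.
print(reflection_coefficient(75, 75))            # -> 0.0
# Into a hypothetical ~35-ohm audio interconnect: ~36% of each
# edge bounces back and smears the signal transitions.
print(round(reflection_coefficient(75, 35), 2))  # -> -0.36
```

Reflected edges arrive back at the receiver as jitter on the embedded clock, which is exactly the degradation described above.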
Cabling is a critical component (Score:2, Interesting)
If your datacenter is 24/7, handling financial, life-critical (healthcare), or corporate production workloads, then cabling ranks right up there with A/C and power. In fact, all three of these are more important than apps or server platforms.
I mean, most signal cabling is now part of a network (IP, FC, ESCON, Token Ring, etc.). A single cable failure can lead to a network failure which, like an A/C or power failure, affects a good portion of the datacenter.
I've seen poor cabling take out a datacenter on a couple of occasions. In one case, the engineers had loosely laid fiber cable for their network backbone under the computer floor. The cable draped over metallic power conduit. A year later, the datacenter contracted to have the power upgraded. The electrician pulled out the old conduit, taking about ten fiber pairs with it. The company lost a good portion of their IP connectivity for several hours. Cabling is critical.
Cabling should be well thought out and properly run. The best systems I've witnessed use separate trays under the computer floor for copper signal, fiber signal, and power. Cable runs go down the rows under the backs of the racks. All trays have proper feeds for each rack. All new cabling is quoted and contracted before installation. Any equipment removal entails cable removal.
The best cable management system I've ever seen was at a TV station. The chief engineer kept several different cable lists, one per cable function. Each cable was given a number. Once a cable was run, he printed cable labels on his inkjet printer using a Brady label sheet. The label identified the use, local connection, remote connection, and number. There were never any problems disconnecting or reconnecting equipment.
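A minimal version of that kind of cable register is easy to script. A hypothetical sketch (the field names and endpoint strings are invented for illustration):

```python
# Hypothetical cable register: one record per cable, with a unique
# number, its function, and both endpoints.
cables = [
    {"num": 101, "use": "net", "local": "rack3/sw1 p12", "remote": "rack7/srv4 eth0"},
    {"num": 102, "use": "kvm", "local": "rack3/kvm p03", "remote": "rack7/srv4 kvm"},
]

def label(cable):
    # The same text goes on both ends of the run, so either end
    # alone identifies the cable, its purpose, and its far side.
    return f"#{cable['num']:04d} [{cable['use']}] {cable['local']} <-> {cable['remote']}"

for c in cables:
    print(label(c))
```

Printing the identical label for both ends is the key trick: whichever end you're holding under the floor tells you exactly where the other one terminates.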
Cables tell the story. If you are ever going to contract a datacenter for rack space, a visual check of the cabling will tell you more about the establishment than any brochure or spec sheet. If the cables are well run, you can bet the power and A/C are properly spec'ed and redundant, the bandwidth adequate, and the building environmentally and physically secure.
colors and numbers (Score:3, Interesting)
One thing I wish we did was put unique serial numbers on both ends of each (and every!) cable. While it's possible to trace cables using the tried-and-true tug-and-feel method, in reality that sucks, and printed documentation is difficult to keep in sync with reality.
I've also seen cables color-coded by purpose - one color for network, another for KVM, another for switch uplinks, etc. - but that hasn't worked as well. It works until you need a KVM cable, don't have the right length in the right color, and substitute "temporarily", blowing the scheme completely, since 'temporary' is a synonym for 'permanent' in most datacenters. Another example: use every color available, randomly, in the hope that there are only so many hot-pink cables with a green stripe in your datacenter, making things easier to trace. In reality this last approach doesn't scale well and makes patch panels look really untidy.
As far as what I *think* you were asking - whether there is some qualitative difference between cables - there is. Make sure you get 'certified' cables from a trusted vendor, preferably each one individually tested with the results pasted on a sticker on the (sealed) bag each cable comes in. Also make sure you get 'plenum' cables where necessary to comply with fire codes and just plain common sense. I'd say any permanent infrastructure cables (not patch cables) should be plenum whether they are legally required to be or not - if you have a fire, you'd be better off without a few hundred extra pounds of fuel to keep it going. Beyond plenum/PVC and tested cables there isn't much else to stress over - thank god "Monster" doesn't make patch cables with 24k gold connectors to hoodwink unsuspecting people - if the cable tests good, the rest doesn't matter.