
Rethinking Computer Design For an Optical World

Posted by timothy
from the optical-floptical dept.
holy_calamity writes "Technology Review looks at how some traditions of computer architecture are up for grabs with the arrival of optical interconnects like Intel's 50Gbps link unveiled last week. The extra speed makes it possible to consider moving a server's RAM a few feet from its CPUs to aid cooling and moving memory and computational power to peripherals like laptop docks and monitors."
This discussion has been archived. No new comments can be posted.


  • dumb monitor (Score:3, Insightful)

    by demonbug (309515) on Wednesday August 04, 2010 @02:48PM (#33141578) Journal

    The extra speed makes it possible to consider moving a server's RAM a few feet from its CPUs to aid cooling and moving memory and computational power to peripherals like laptop docks and monitors

    Why would I want to pay for computational power in my monitor? When I buy a monitor I want it to do its job: show the best quality images at the cheapest possible cost. A good monitor should last much longer than the computer driving it (unless we suddenly have a huge increase in the rate of development of display technology). Why would I want added cost in my monitor that will only make it go out of date more quickly?

  • Re:dumb monitor (Score:4, Insightful)

    by jack2000 (1178961) on Wednesday August 04, 2010 @02:50PM (#33141618)
    So you can buy a new monitor again, and again, and again. I bet this is what went through Steve Jobs' head when they made Macs hard to upgrade: that and a huge thunder of Ka-ching Ka-ching Ka-ching Ka-ching Ka-ching Ka-ching ...
  • by idontgno (624372) on Wednesday August 04, 2010 @02:59PM (#33141740) Journal

    because this appears to be another aspect of Wheel of Reincarnation [catb.org].

    I'm old enough to remember a time where a computer was a series of bitty boxes tied together with cables. Then someone decided to integrate a lot of the stuff onto a motherboard, with just loosely-related stuff connected by cables to the motherboard. Then the loosely-related stuff got put into cards that plugged into the motherboard. Then that stuff just got integrated into the motherboard.

    And now it's being reborn as stuff in bitty boxes connected together with cables.

    I wonder what enlightenment will be like, because karma appears to have been a bitch.

  • Speed of whatever (Score:5, Insightful)

    by overshoot (39700) on Wednesday August 04, 2010 @03:08PM (#33141866)

    I don't think that light travels that much more quickly than electrons do.

    Yes and no. In a vacuum, electrons aren't terribly useful unless you're driving them with a particle accelerator. In wires, electrons aren't really doing the work anyway: electrical signals effectively travel as waves in the dielectric surrounding the wires and in particular between signal pairs. In that case, the signal travels at around half the speed of light in a vacuum (faster if you use expensive insulation like Teflon, slower for other plastics.)

    Light in optical fiber is also slowed by the refractive index of the material and by path-length extension in multimode fiber. However, on balance it's a bit faster.

    The real gotcha is that electrical signals at outrageous bandwidths suffer from some really horrible losses due to both skin effects on the wires and dielectric losses in the insulation. At 50 Gb/s and 30 cm, you're doing well to detect the resulting signal, never mind decode it. Worse, the losses are highly frequency-dependent, so you have to do all sorts of ugly things to pre- and post-condition the signal to make it usable. Some of this can be overcome by cranking up the transmit power, but then you get into that property of wires known as "antenna." All of that processing at both ends takes time, too.

    Just not worth doing, generally.

    Likewise, putting a bunch of streams out in parallel requires all sorts of cleverness to put the separate lanes together again after transmission skew. A single optical stream is much easier to use, sort of like the communications equivalent of Amdahl's Law.
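The propagation-speed point above can be put in numbers. A minimal Python sketch, assuming typical ballpark velocity factors (roughly 0.5c for FR-4 traces, 0.7c for Teflon-insulated cable, 0.67c for silica fiber with n ≈ 1.5; these are illustrative figures, not measurements):

```python
# One-way propagation delay over a short link, for a few media.
# Velocity factors below are assumed ballpark values.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def one_way_delay_ns(distance_m, velocity_factor):
    """Time for a signal edge to cross `distance_m` at v = velocity_factor * c."""
    return distance_m / (velocity_factor * C) * 1e9

for medium, vf in [("FR-4 trace", 0.5),
                   ("Teflon cable", 0.7),
                   ("silica fiber", 0.67)]:
    print(f"{medium:>12}: {one_way_delay_ns(0.3, vf):.2f} ns over 30 cm")
```

The differences over 30 cm are fractions of a nanosecond, which supports the point that raw propagation speed is not where optical links win; the loss and equalization problems at high bandwidth are.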

  • by feepness (543479) on Wednesday August 04, 2010 @03:11PM (#33141906) Homepage

    Same goes for optical interconnect to memory: the flood may be Biblical when it arrives, but while waiting for it to arrive the processor isn't doing anything useful.

    That's the thing though, isn't it? There isn't a "the processor", there's 8, 16, 32, 128 processors. So stalling one may not be that great a loss.

  • by mhajicek (1582795) on Wednesday August 04, 2010 @03:15PM (#33141940)
    I like the mention of putting memory and such in a dock. So you have 8GB RAM in your laptop on the go, but when you get home or to the office and dock, you have 32GB. You could also have your hot and power-hungry CAD / gaming GPU in the dock and a lesser one built in.
  • by Mordok-DestroyerOfWo (1000167) on Wednesday August 04, 2010 @03:16PM (#33141958)

    Note that I am not a physicist, and not much of an electrical engineer. I may be entirely wrong.

    I'm not qualified enough to say whether you're right or wrong, but you stated your case eloquently and if there's one thing that Hollywood, politics, and Star Trek have taught me, sounding right is more important than being right.

  • Re:dumb monitor (Score:3, Insightful)

    by ceoyoyo (59147) on Wednesday August 04, 2010 @03:21PM (#33142014)

    For ages I avoided Macs and built my own machines with upgrades specifically in mind. Turns out I rarely ever actually upgraded any of them anyway, except occasionally the video card and, more often, hard drives and memory. It was usually more economical to sell the old machine to someone and buy or build another.

    When I started grad school the lab used all Macs. I've never missed the ability to upgrade.

  • by chrb (1083577) on Wednesday August 04, 2010 @03:21PM (#33142020)

    Same goes for optical interconnect to memory: the flood may be Biblical when it arrives

    But it won't be - the system is fundamentally limited by all of the rest of the components. A top-end front-side bus can already push 80 Gb/s; scaling that up to the 400 Gb/s that this optical link promises will probably be practical within a few years, but the latency of encoding and decoding a laser signal and pushing it over several meters is going to be a killer for computational applications. It will be great for USBX, and for high-end networking it will challenge InfiniBand (which currently tops out at around 300 Gb/s). InfiniBand is already used for networking high-performance computational clusters, but nobody is using it for the CPU-to-memory bus because of the high latency. Even with high bandwidth, computation still has to be carried out on the data, and so it still makes sense to put the data and processor as close together as possible.

    In the last decade there were many research papers proposing that co-processors be placed on DRAM cards, or that embedded DRAM would allow CPU and memory to be fabricated on a single die (e.g. 1 [psu.edu], 2 [stanford.edu]). But if you have a processor and DRAM connected to similar units via an optical interconnect, guess what - the architecture begins to look awfully similar to a regular network with optical Ethernet. So it looks likely that this will be just another incremental improvement in architecture rather than the radical shift that TFA envisions.
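The latency-vs-bandwidth argument is easy to quantify. A back-of-the-envelope sketch in Python; every latency figure here is an illustrative assumption (roughly 60 ns for local DRAM, 200 ns for laser encode/decode plus meters of fiber), not a number from the article:

```python
# Why latency, not bandwidth, dominates a CPU-to-memory fetch.
# All fixed-latency figures are assumed for illustration.
def fetch_time_ns(payload_bytes, link_gbps, fixed_latency_ns):
    # At N Gb/s, one bit takes 1/N ns on the wire, so a payload of
    # B bytes serializes in B*8/N nanoseconds.
    serialization_ns = payload_bytes * 8 / link_gbps
    return fixed_latency_ns + serialization_ns

CACHE_LINE = 64  # bytes, a typical CPU cache-line fetch

local_dram  = fetch_time_ns(CACHE_LINE, 80, 60)    # assumed ~60 ns DRAM latency
optical_mem = fetch_time_ns(CACHE_LINE, 400, 200)  # assumed ~200 ns link latency

print(f"local DRAM:     {local_dram:.1f} ns")
print(f"optical memory: {optical_mem:.1f} ns")
```

Even a 5x bandwidth advantage is irrelevant for a 64-byte fetch: serialization takes a nanosecond or two either way, and the fixed latency swamps it. That is the sense in which the flood won't be Biblical for CPU-to-memory traffic.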

  • by 0100010001010011 (652467) on Wednesday August 04, 2010 @03:31PM (#33142172)

    Or made like LEGO Blocks. Need a quad core CPU? Go buy one and snap it onto your others.

  • Re:dumb monitor (Score:3, Insightful)

    by derGoldstein (1494129) on Wednesday August 04, 2010 @03:34PM (#33142204) Homepage
    What about the ability to re-use a good power supply and case? I've had my PSU/Case combo for 3 computers now. When I say that I've "upgraded my computer", I often mean that I've replaced the motherboard, CPU, and RAM to a new architecture. Many/most of the other components remain the same -- I often have no reason to upgrade the storage, video card, optical drives, and, as mentioned above, the PSU/case. It's more flexible and modular, even if it does take some more work.
  • Re:dumb monitor (Score:1, Insightful)

    by Anonymous Coward on Wednesday August 04, 2010 @03:39PM (#33142314)

    I on the other hand bought a really nice 21-inch LCD in the year 2000. I still have the LCD, but where is the 350MHz K6-2? Or the other 7 machines I have owned since then?

    Monitors do not need to be smart, and they do not need to be tied to the computer, unless you're in a situation where an all-in-one appliance (not a computer) makes sense, such as a university, where you have some for students to type papers, do research, and whatnot.

  • Re:Speed limit (Score:4, Insightful)

    by vlm (69642) on Wednesday August 04, 2010 @03:40PM (#33142328)

    Now admittedly electricity usually only travels at about 0.5c, IIRC, but I think she was giving the speed-of-light delay, not the speed-of-electrons delay.

    Don't confuse propagation velocity of electromagnetic waves, which depends on dielectric constant and is around 0.8c in normal conductors, with drift velocity of electrons which is maybe a meter per hour.

    http://en.wikipedia.org/wiki/Speed_of_electricity [wikipedia.org]

    http://en.wikipedia.org/wiki/Drift_velocity [wikipedia.org]

    http://en.wikipedia.org/wiki/Velocity_of_propagation [wikipedia.org]

    Electrons really move slowly in metal. In a vacuum tube like a CRT, pretty quick.
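The "meter per hour" claim checks out against textbook numbers. A quick sketch using v = I / (n·q·A), assuming copper's free-electron density of about 8.5e28 per cubic meter (one conduction electron per atom) and a 1 mm² wire carrying 1 A:

```python
# Drift velocity of conduction electrons in a copper wire: v = I / (n * q * A).
# n and the wire geometry are assumed textbook ballpark values.
Q = 1.602e-19   # elementary charge, C
N = 8.5e28      # free-electron density of copper, electrons per m^3

def drift_velocity(current_a, area_m2):
    """Average electron drift speed in m/s for a given current and cross-section."""
    return current_a / (N * Q * area_m2)

v = drift_velocity(1.0, 1e-6)  # 1 A through a 1 mm^2 wire
print(f"{v:.2e} m/s = {v * 3600:.2f} m/h")
```

This comes out to a fraction of a meter per hour, while the signal itself (the electromagnetic wave) crosses the same wire at an appreciable fraction of c.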

  • by tantrum (261762) on Wednesday August 04, 2010 @03:53PM (#33142554)

    We might split things up into something resembling onboard RAM and external swap, though.

    I don't need my 24gb swap space much at the moment, but it would be sweet to have it respond in something like 20ns anyways :)

  • by Jah-Wren Ryel (80510) on Wednesday August 04, 2010 @03:54PM (#33142578)

    Uh yeah, this isn't the first time around. The computer industry is constantly rediscovering previous designs: timesharing, batch jobs, client-server, integrated/distributed processing, etc. Nothing new under the sun, just smaller and faster is all.

    I wonder what enlightenment will be like, because karma appears to have been a bitch.

    It's called retirement - you get out of the loop and eventually you go out like the flame of a candle.

  • Re:DRM (Score:3, Insightful)

    by Jesus_666 (702802) on Wednesday August 04, 2010 @05:09PM (#33143710)
    That would kill Blu-Ray. People flocked from VHS to DVD in droves because it didn't just offer higher quality, it offered greatly improved convenience as well. Look at the DVD-to-Blu-Ray switch: Many people are still happily using their DVDs, content with what they have. Blu-Ray only offers a modest increase in quality with no convenience increase and isn't quite as universally loved as DVD.

    Of course, Blu-Ray requires you to have compatible equipment. That's a bother (and another reason why some people don't want to upgrade) but once you have your equipment and have figured out how to make HDMI work the way you want it, you're set.

    Now imagine they arbitrarily invalidated parts of your equipment. "Sorry, but to see this movie you'll need to replace the TV set you paid 3,000 Euros for last year with one compatible with HDCP 4.1 Home Cinema or higher." The only things many people would replace would be their Blu-Ray player with a PC and the physical media with The Pirate Bay. If they don't know how to work it they'll get their children to do it.

    People expect a certain level of convenience. If you ask them to become home cinema technicians in order to watch a movie they simply won't bother to purchase any more movies from you.

"Gotcha, you snot-necked weenies!" -- Post Bros. Comics

Working...