Rethinking Computer Design For an Optical World
holy_calamity writes "Technology Review looks at how some traditions of computer architecture are up for grabs with the arrival of optical interconnects like Intel's 50Gbps link unveiled last week. The extra speed makes it possible to consider moving a server's RAM a few feet from its CPUs to aid cooling and moving memory and computational power to peripherals like laptop docks and monitors."
dumb monitor (Score:3, Insightful)
The extra speed makes it possible to consider moving a server's RAM a few feet from its CPUs to aid cooling and moving memory and computational power to peripherals like laptop docks and monitors
Why would I want to pay for computational power in my monitor? When I buy a monitor I want it to do its job - show the best quality images for the cheapest cost possible. A good monitor should last much longer than the associated computer driving it (unless we suddenly have a huge increase in the rate of development of display technology). Why would I want added cost in my monitor that will only make it out of date more quickly?
Re:dumb monitor (Score:4, Insightful)
Computer architecture must have the Buddha-nature (Score:5, Insightful)
because this appears to be another turn of the Wheel of Reincarnation [catb.org].
I'm old enough to remember a time where a computer was a series of bitty boxes tied together with cables. Then someone decided to integrate a lot of the stuff onto a motherboard, with just loosely-related stuff connected by cables to the motherboard. Then the loosely-related stuff got put into cards that plugged into the motherboard. Then that stuff just got integrated into the motherboard.
And now it's being reborn as stuff in bitty boxes connected together with cables.
I wonder what enlightenment will be like, because karma appears to have been a bitch.
Speed of whatever (Score:5, Insightful)
Yes and no. In a vacuum, electrons aren't terribly useful unless you're driving them with a particle accelerator. In wires, electrons aren't really doing the work anyway: electrical signals effectively travel as waves in the dielectric surrounding the wires and in particular between signal pairs. In that case, the signal travels at around half the speed of light in a vacuum (faster if you use expensive insulation like Teflon, slower for other plastics.)
Light in optical fiber is also slowed by the refractive index of the material and by path-length extension in multimode fiber. However, on balance it's a bit faster.
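A quick back-of-the-envelope sketch of the point above. The velocity factors here are rough assumptions (typical FR-4 around 0.5c, PTFE around 0.7c, silica fiber around c/1.47), not measured values for any particular link:

```python
# Back-of-the-envelope one-way propagation delays over a 30 cm link,
# comparing copper traces with optical fiber. Velocity factors are
# rough assumptions, not measured values.

C = 299_792_458.0          # speed of light in vacuum, m/s
LENGTH_M = 0.30            # 30 cm link

def delay_ns(velocity_factor: float) -> float:
    """One-way propagation delay in nanoseconds."""
    return LENGTH_M / (C * velocity_factor) * 1e9

copper = delay_ns(0.50)     # typical FR-4 stripline
teflon = delay_ns(0.70)     # low-loss PTFE dielectric
fiber  = delay_ns(1 / 1.47) # silica core, n ~ 1.47

print(f"copper ~0.5c : {copper:.2f} ns")
print(f"teflon ~0.7c : {teflon:.2f} ns")
print(f"fiber  c/1.47: {fiber:.2f} ns")
```

So over short distances the raw flight-time difference is a fraction of a nanosecond either way; as the next paragraph argues, the losses are the real problem, not the speed.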
The real gotcha is that electrical signals at outrageous bandwidths suffer from some really horrible losses due to both skin effects on the wires and dielectric losses in the insulation. At 50 Gb/s and 30 cm, you're doing well to detect the resulting signal, never mind decode it. Worse, the losses are highly frequency-dependent, so you have to do all sorts of ugly things to pre- and post-condition the signal to make it usable. Some of this can be overcome by cranking up the transmit power, but then you get into that property of wires known as "antenna." All of that processing at both ends takes time, too.
Just not worth doing, generally.
Likewise, putting a bunch of streams out in parallel requires all sorts of cleverness to put the separate lanes together again after transmission skew. A single optical stream is much easier to use, sort of like the communications equivalent of Amdahl's Law.
Re:Here we go again (Score:3, Insightful)
Same goes for optical interconnect to memory: the flood may be Biblical when it arrives, but while waiting for it to arrive the processor isn't doing anything useful.
That's the thing though, isn't it? There isn't a "the processor", there's 8, 16, 32, 128 processors. So stalling one may not be that great a loss.
Re:Interesting, but... (Score:3, Insightful)
Re:a few extra feet (Score:5, Insightful)
Note that I am not a physicist, and not much of an electrical engineer. I may be entirely wrong.
I'm not qualified enough to say whether you're right or wrong, but you stated your case eloquently, and if there's one thing that Hollywood, politics, and Star Trek have taught me, it's that sounding right is more important than being right.
Re:dumb monitor (Score:3, Insightful)
For ages I avoided Macs and built my own machines with upgrades specifically in mind. Turns out I rarely ever actually upgraded any of them anyway, except occasionally the video card and, more often, hard drives and memory. It was usually more economical to sell the old machine to someone and buy or build another.
When I started grad school the lab used all Macs. I've never missed the ability to upgrade.
Re:Here we go again (Score:4, Insightful)
Same goes for optical interconnect to memory: the flood may be Biblical when it arrives
But it won't be - the system is fundamentally limited by all of the rest of the components. A top-end front-side bus can already push 80 Gbit/s; scaling that up to the 400 Gbit/s that this optical link promises will probably be practical within a few years, but the latency of encoding and decoding a laser signal and pushing it over several meters is going to be a killer for computational applications. It will be great for USBX, and for high-end networking it will challenge Infiniband (which currently tops out at around 300 Gbit/s). Infiniband is already used for networking high-performance computational clusters, but nobody is using it for the CPU-to-memory bus because of the high latency. Even with high bandwidth, computation still has to be carried out on the data, and so it still makes sense to put the data and processor as close together as possible.
In the last decade there were many research papers proposing that co-processors would be placed on DRAM cards, or that embedded DRAM would allow CPU and memory to be fabricated on a single die (e.g. 1 [psu.edu], 2 [stanford.edu]). But if you have a processor and DRAM connected to similar units via an optical interconnect, guess what - the architecture begins to look awfully similar to a regular network with optical ethernet. So, it looks likely that this will be just another incremental improvement in architecture rather than the radical shift that TFA envisions.
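A rough illustration of the latency argument above. The serdes overhead and the DRAM access figure are illustrative assumptions picked for order of magnitude, not vendor specs:

```python
# Rough sketch of why a multi-meter optical memory bus hurts latency.
# The serdes overhead and DRAM latency below are assumed, illustrative
# figures, not measurements of any real hardware.

C = 299_792_458.0       # speed of light in vacuum, m/s
N_FIBER = 1.47          # approximate refractive index of silica fiber
SERDES_NS = 10.0        # assumed encode+decode overhead per direction, ns

def round_trip_ns(distance_m: float) -> float:
    """Request/response flight time plus serdes overhead, in ns."""
    flight = 2 * distance_m * N_FIBER / C * 1e9
    return flight + 2 * SERDES_NS

LOCAL_DRAM_NS = 60.0    # order-of-magnitude DRAM access latency
for meters in (0.1, 2, 5):
    extra = round_trip_ns(meters)
    print(f"{meters:>4} m of fiber adds ~{extra:.0f} ns "
          f"on top of {LOCAL_DRAM_NS:.0f} ns DRAM access")
```

Even ignoring serdes entirely, at 5 m the round-trip flight time alone is comparable to a whole local DRAM access, which is why bandwidth headlines don't help a CPU stalled on a load.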
Re:Interesting, but... (Score:3, Insightful)
Or made like LEGO Blocks. Need a quad core CPU? Go buy one and snap it onto your others.
Re:dumb monitor (Score:3, Insightful)
Re:dumb monitor (Score:1, Insightful)
I, on the other hand, bought a really nice 21-inch LCD in the year 2000. I still have the LCD, but where is the 350MHz K6-2? Or the other 7 machines I have owned since then?
Monitors do not need to be smart, and they do not need to be tied to the computer, unless you're in a situation where an all-in-one appliance (not computer) makes sense, such as a university, where you have some for students to type papers, do research, and whatnot.
Re:Speed limit (Score:4, Insightful)
Now admittedly electricity usually only travels at about 0.5c, IIRC, but I think she was giving the speed-of-light delay, not the speed-of-electrons delay.
Don't confuse propagation velocity of electromagnetic waves, which depends on dielectric constant and is around 0.8c in normal conductors, with drift velocity of electrons which is maybe a meter per hour.
http://en.wikipedia.org/wiki/Speed_of_electricity [wikipedia.org]
http://en.wikipedia.org/wiki/Drift_velocity [wikipedia.org]
http://en.wikipedia.org/wiki/Velocity_of_propagation [wikipedia.org]
Electrons really do move slowly in metal. In a vacuum tube like a CRT, they're pretty quick.
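The classic drift-velocity estimate, v = I / (n q A), makes the point concrete. The carrier density for copper is a textbook approximation, and the 1 A / 1 mm^2 scenario is just an assumed example:

```python
# Drift velocity of conduction electrons in a copper wire,
# using v = I / (n * q * A). The 1 A through 1 mm^2 scenario and the
# carrier density are assumed, textbook-style figures.

Q = 1.602e-19        # electron charge, C
N_CU = 8.5e28        # free electrons per m^3 in copper (approx.)

def drift_velocity(current_a: float, area_m2: float) -> float:
    """Average electron drift speed in m/s."""
    return current_a / (N_CU * Q * area_m2)

v = drift_velocity(1.0, 1e-6)   # 1 A in a 1 mm^2 conductor
print(f"{v:.2e} m/s  (~{v * 3600:.2f} m per hour)")
```

That works out to a fraction of a meter per hour - consistent with the "maybe a meter per hour" figure above, and a long way from the ~0.5-0.8c at which the signal itself propagates.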
Re:Here we go again (Score:3, Insightful)
might split things up into something resembling onboard RAM and external swap, though.
I don't need my 24GB swap space much at the moment, but it would be sweet to have it respond in something like 20ns anyways :)
Re:Computer architecture must have the Buddha-natu (Score:3, Insightful)
Uh yeah, this isn't the first time around. The computer industry is constantly rediscovering previous designs. Timesharing, batch jobs, client-server, integrated/distributed processing, etc, etc. Nothing new under the sun, just smaller and faster is all.
I wonder what enlightenment will be like, because karma appears to have been a bitch.
It's called retirement - you get out of the loop and eventually you go out like the flame of a candle.
Re:DRM (Score:3, Insightful)
Of course, Blu-Ray requires you to have compatible equipment. That's a bother (and another reason why some people don't want to upgrade) but once you have your equipment and have figured out how to make HDMI work the way you want it, you're set.
Now imagine they arbitrarily invalidated parts of your equipment. "Sorry, but to see this movie you'll need to replace the TV set you paid 3,000 Euros for last year with one compatible with HDCP 4.1 Home Cinema or higher." The only things many people would replace would be their Blu-Ray player with a PC and the physical media with The Pirate Bay. If they don't know how to work it they'll get their children to do it.
People expect a certain level of convenience. If you ask them to become home cinema technicians in order to watch a movie they simply won't bother to purchase any more movies from you.