Intel Devises Chip Speed Breakthrough
Chad Wood writes "According to the New York Times (free reg. req.), Intel has demonstrated a research breakthrough, making silicon chips that can switch light like electricity. The article explains: 'This opens up whole new areas for Intel,' said Mario Paniccia, an Intel physicist who started the previously secret Intel research program to explore the possibility of using standard semiconductor parts to build optical networks. 'We're trying to siliconize photonics.' The invention demonstrates for the first time, Intel researchers said, that ultrahigh-speed fiber-optic equipment can be produced at personal computer industry prices. As the cost of communicating between computers and chips falls, fundamentally new kinds of computers not limited by physical distance should become a reality, experts say."
Still binary.. (Score:4, Insightful)
damn universe.. (Score:4, Insightful)
I think the universe might disagree. The speed of light is a limiting factor. The speed of electron/transistor switching is what we're hitting now (it takes more than one clock cycle for a signal to propagate across a chip). We would exchange that for the higher switching speed of light/photonic devices, but that is not limitless either.
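A quick back-of-envelope check on the "more than one clock cycle to cross a chip" claim. All numbers here are illustrative assumptions (a 3 GHz clock, a 20 mm die, signals at half the vacuum speed of light), not Intel specs:

```python
# Rough sketch: how far can a signal travel in one clock cycle?
# Assumed numbers, not real chip specs.

C = 3.0e8            # speed of light in vacuum, m/s
clock_hz = 3.0e9     # assumed 3 GHz clock
die_mm = 20.0        # assumed 20 mm die edge

cycle_s = 1.0 / clock_hz
# On-chip electrical signals propagate well below c; assume ~0.5c here.
signal_speed = 0.5 * C
distance_per_cycle_mm = signal_speed * cycle_s * 1000

print(f"~{distance_per_cycle_mm:.0f} mm per cycle vs a {die_mm:.0f} mm die")
```

By this naive transmission-line estimate one cycle would easily cover the die; in practice RC-limited wires are far slower than 0.5c, which is why real cross-chip signals do take multiple cycles, as the parent says.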
Also, not limited by physical distance? Are these guys on crack? My Quake game is limited by physical distance. It takes 100ms to go across the country and back. Latency is the killer here.
-molo
Not much effect on distances (Score:5, Insightful)
What Intel seems to be discussing is much faster transmission rates through the line (i.e. bandwidth), which in itself is a really good thing if it's being done at reasonable heat and power levels.
New Class of Computing Applications? (Score:2, Insightful)
When they say, "new class of computing applications" I take that to mean that this is the type of technology that Microsoft would take advantage of to facilitate a
If the transfer speeds are fast enough for this type of technology, couldn't we expect it to eventually get fast enough to replace set top boxes? We could be buying and running services instead of programs within the next decade, theoretically killing software piracy. Scary.
Still electro-optical (not all optical) (Score:5, Insightful)
Rather than create all-optical processors, this technology will be useful for building gigabit fiber interfaces directly into everyday silicon chips. I'd think the next step for this stuff will be cheap fiber connections between peripherals and internal subsystems (Optical ATA, anyone?). Then they will look to create optical traces that connect Intel processors, cache, RAM, and I/O chips (if they can figure out how to mass-produce optical fiber traces on a PCB).
This breakthrough is more of an interconnection technology than a computation technology.
Re:Can someone tell me.... (Score:3, Insightful)
Not even LEDs are 100% efficient. However, for an optical system, the heat production is related to the duty cycle of the lamps, rather than the switching speed, so the heat production should remain constant regardless of clock speed.
On the one hand, this means you don't need to improve cooling to overclock. On the other, it means that you can't improve the overclock level with improved cooling.
Much hate for the big Intel (Score:1, Insightful)
I get the feeling they could announce the invention of time travel and there would still be 100 posts regarding temperatures, monopolies, power consumption and AMD love.
Oh, and don't forget the 20 posts from bedroom engineers letting us know why it just won't work. - Thanks guys.
Re:damn universe.. (Score:2, Insightful)
The first level: signal propagation is slower in copper than light in air (and probably slower than in silicon waveguides too). So you get an immediate boost by changing mediums.
Second level: Light can travel 30,000,000 m, or 30,000 km, in 100 ms. That's about 18,600 miles in 100 ms. That's pretty darn good. Good enough to bridge the whole globe with acceptable lag times.
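The arithmetic in that level is easy to sanity-check (the 1.609 km/mile conversion is the only assumed constant):

```python
# Distance light covers in 100 ms, in km and miles.
C_KM_PER_S = 300_000        # speed of light in vacuum, km/s
t_s = 0.100                 # 100 ms

km = C_KM_PER_S * t_s       # 30,000 km
miles = km / 1.609          # roughly 18,600 miles
print(km, miles)
```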
Third level: Right now, most of the lag in long distance communications is due to the speed at which your hardware processes the data. In fact, some benchmarks from my computer architecture class show that you can easily get 20% latency decreases by streamlining the hardware. This is just another way to do it.
Fourth level: We may not even be limited by the speed of light. If we could use quantum coupling (Einstein's "spooky action at a distance"), we might be able to send information faster than the speed of light, though standard quantum mechanics (the no-communication theorem) says entanglement alone can't carry information. There are other speculative theories out there that might also allow this. In that case, the lag effectively becomes zero.
Fifth level: Going optical reduces EMF and transmission-line effects within the chip. The circuitry for combating transmission-line effects is getting pretty hefty; going optical could significantly reduce the size of a chip by eliminating the pieces of the chip that counteract transmission-line-related problems.
Face it, this is a pretty damn cool new technology that will likely have some type of impact on the industry. Going optical affords huge advantages -- and the industry will eventually go there. There are immediate tangible benefits despite what you may think.
Re:Still binary.. (Score:5, Insightful)
I think you will find the whole point of binary is that the increased noise margins of having two states mean the speed can generally be increased in a way that more than makes up for the reduced information capacity of two states compared to multiple states. (Multi-level memory cells are actually low speed / duty cycle devices.)
A 'bit' is a mathematical abstraction. In reality, a 'bit' is an analog pulse whose signal-to-noise ratio is just enough to discern two states (read up on eye diagrams).
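The states-vs-noise tradeoff the parent describes can be put in numbers with Shannon's capacity formula, C = B log2(1 + SNR). The bandwidth and SNR figures below are made-up illustrations:

```python
import math

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 1e9  # assumed 1 GHz analog channel
# Two clean states need only a modest SNR; cramming in more levels
# per symbol demands a much cleaner channel.
low_snr = capacity_bps(B, 3)     # SNR = 3   -> 2 bits/symbol ceiling
high_snr = capacity_bps(B, 255)  # SNR = 255 -> 8 bits/symbol ceiling
print(low_snr / 1e9, high_snr / 1e9)  # 2.0 and 8.0 Gbit/s
```

So multi-level signaling only pays off when the extra SNR comes cheaper than simply clocking a binary channel faster, which is exactly the parent's point.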
Re:Not much effect on distances (Score:3, Insightful)
The speed of light is relevant too, but usually only for the number of wait states you need at the start of a bus transaction.
Re:damn universe.. (Score:2, Insightful)
Re:Not much effect on distances (Score:3, Insightful)
Come to think of it, signals through copper travel at about 2/3 the speed of light through air, so unless they're way slower through semiconductors, it's not a speed-of-travel issue, it's a data-per-time issue.
For anyone not familiar with the difference, propagation is the time it takes any particular bit to get from a to b (and is the big downside of using satellites). Transmission is the number of bits per second sent. It's like two cars going from a to b. They can both get there in 10 minutes, but the one carrying 5 passengers is transmitting more than the one with the lone driver.
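The car analogy translates directly into the standard transfer-time formula. The 50 ms one-way delay and link rate below are assumed example values:

```python
# Total transfer time = propagation delay + size / bandwidth.
# A faster link shrinks only the second term; the first is physics.

def transfer_time_s(size_bits: float, bandwidth_bps: float,
                    propagation_s: float) -> float:
    return propagation_s + size_bits / bandwidth_bps

one_way = 0.050  # assumed 50 ms cross-country propagation delay

small = transfer_time_s(8 * 64, 1e9, one_way)     # 64-byte game packet
big = transfer_time_s(8 * 700e6, 1e9, one_way)    # 700 MB file
# The packet is latency-dominated; the file is bandwidth-dominated.
print(small, big)
```

Which is why more bandwidth helps the file transfer but does almost nothing for the Quake ping complained about upthread.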
Re:Still binary...he's onto something (Score:2, Insightful)
But the article was about communications, not logic. What if we had broadband optical fiber transmission, where a single pulse has, say, 128 frequency levels that could be gated? Sure, you'd have to have an array of controls on both ends, but it would be linear (N gates for N levels) and in fact, this is part of the significance of Intel's announcement. They claim the gates can be made more cheaply in masked silicon wafers instead of the more expensive current technology, and that's reasonable.
They claim a 2 GHz clock cycle on the gating; imagine a light pipe transmitting 128-bit words at that rate. That's a fat pipe.
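The fat-pipe arithmetic is just gating rate times word width (the 128-bit word is the parent's hypothetical, not something Intel claimed):

```python
# Aggregate throughput of a hypothetical 128-channel optical link
# gated at the claimed 2 GHz.
gate_rate_hz = 2e9       # claimed 2 GHz gating rate
word_bits = 128          # hypothetical parallel channels/levels

throughput_bps = gate_rate_hz * word_bits
print(throughput_bps / 1e9)  # 256.0 Gbit/s
```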
Optical Speed Limit... (Score:2, Insightful)
Unless actual optical logic gates have been designed (i.e. two optical sources going into a single non-electric device that outputs a single value, bounded by AND/OR/XOR logic), I don't see how this can increase speed.
Re:Intel's secret breakthrough (Score:3, Insightful)
MODS ON CRACK (Score:5, Insightful)
So the immediate question that I have is, "Why would I, a consumer, want that?" One possible answer is that I have fiber to my house.
Short of that, why would I want it? Would I want to convert my existing network to optical? Nope, I want fewer wires, not more wires. One of the quotes even talks about people being able to watch multiple views of the Superbowl.
No, the mod that said this was on topic is full of crap.
A quote from the article to chew on (Score:2, Insightful)
OK, is it just me, or has anyone else thought of the possibilities behind this statement? It could mean a few things, but what rings for me is the end of the "personal" computer and the beginning of the "personal computing" service, where the HPs and Dells of the world keep all the systems while you purchase their own branded access to them. Essentially you don't have a computer any longer, only client access. The end result is still much the same for all intents and purposes, but there's no longer a physical system sitting on your desk. Like Citrix, VNC or rdesktop on crack.
That idea could be way out to lunch but all the same I can't say I really care for it. Hmm...
Re:Still binary.. (Score:5, Insightful)
Photonics have tremendous advantages over electronics... starting with the possibility of insanely high clock rates (think of the difference between microwaves and UV light!). Photonic signal paths can be multiplexed; that is, light pulses of countless frequencies can run down the same channel. Photonics are not at all limited to binary, or any other arbitrary base. Pick one you like, like decimal, and have a party. Photonics can perform massively parallel calculation inside photonic arrays. Those calculations can be used to control logic flow and data organization, allowing a new hierarchy of computing which doesn't even exist in current solid-state devices (i.e. self-modifying, self-optimizing hardware tuned to recursive operational analysis).
As for the whole waste-heat conversation... remember, in a photonic device, the light passing through doesn't necessarily produce significant heat. Photons passing through a transparent medium don't interact with matter the way electrons do; resistance to currents of light is nothing like electrical resistance in its ability to produce heat. As long as the light passing through an optical gate doesn't fluoresce (re-emit light) in the far infrared, there is no reason to expect that gate to get warm. The only true source of light on the chip will be the clock (not exactly true considering pumps and amplifiers, but the concept is operationally correct), and that doesn't need to be a high-wattage source (a 5 mW tunable laser should be more than sufficient as a clock source). Photonics run cool!
Comparing photonics to electronics is missing the whole point of why we want to do photonics in the first place... photonics rock!
Genda Bendte
"And then he said let there be light! And it was good!"
Re:Google link (KW) (Score:4, Insightful)
I guess there's a happy medium somewhere in-between, eh?
Re:Google link (KW) (Score:2, Insightful)
Is Intel seriously sweating? (Score:3, Insightful)
AMD comes out with a nice 64 bit CPU, Intel takes their highest end 32bit CPU, repackages it for a desktop, at twice the price, and barely competes.
AMD's 64 bit solution looks to beat the pants off of Itanium... Intel's statement that they're working on an x86 64 bit CPU says everything we need to know.
Sun partners with AMD - smartest move they could have made, especially if they jointly develop the next generation of AMD CPUs. Can we say massively SMP processing added to a fast core?
Re:Afloat you say? (Score:4, Insightful)
Debt is the accumulation of previous deficits.
A deficit is the net loss for a specific time period (say 1 year).
For example, the US may have had a $6B debt in 1999. But that year government expenditures were $100M less than revenue. Therefore they had a surplus.
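The stock-vs-flow distinction is easy to show in a couple of lines. The dollar figures below just echo the illustrative numbers above, not real budget data:

```python
# Debt is a stock; a deficit/surplus is a flow over one period.
# Each year's deficit (or surplus) accumulates into the debt.

def update_debt(debt: int, revenue: int, spending: int) -> int:
    deficit = spending - revenue   # negative => surplus
    return debt + deficit

debt_1999 = 6_000_000_000          # illustrative $6B starting debt
debt_2000 = update_debt(debt_1999,
                        revenue=1_000_000_000,
                        spending=900_000_000)  # $100M surplus
print(debt_2000)  # debt shrinks to $5.9B after the surplus
```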