The Death of the Silicon Computer Chip
Stony Stevenson sends a report from the Institute of Physics' Condensed Matter and Materials Physics conference, where researchers predicted that the reign of the silicon chip is nearly over. Nanotubes and superconductors are leading candidates for a replacement; they don't mention graphene. "...the conventional silicon chip has no longer than four years left to run... [R]esearchers speculate that the silicon chip will be unable to sustain the same pace of increase in computing power and speed as it has in previous years. Just as Gordon Moore predicted in 2005, physical limitations of the miniaturized electronic devices of today will eventually lead to silicon chips that are saturated with transistors and incapable of holding any more digital information. The challenge now lies in finding alternative components that may pave the way to faster, more powerful computers of the future."
I'll... (Score:5, Insightful)
Re:I'll... (Score:4, Insightful)
Let them speculate ... (Score:5, Insightful)
In the meantime, other researchers will figure out ways to make silicon work smarter, not harder.
Comment removed (Score:5, Insightful)
Re:I'll... (Score:5, Insightful)
Yes, and we're so damned good at manipulating it. All this newfangled stuff is pie-in-the-sky at this point. Yes, I suppose we'll eventually replace it for the likes of high-end processors, as you say, but everything else will be made out of silicon for a long time to come.
People keep bringing up Moore's Law as if it's some immutable law of physics. The reality is that we've invested trillions of {insert favorite monetary unit here} in silicon-based tech. Each new generation of high-speed silicon costs more, so that's a lot of inertia. Furthermore, if Guilder's Rule holds true in this case (and I see no reason why it shouldn't), any technology that comes along to replace silicon will have to be substantially better. Otherwise, the costs of switching won't make it economically viable.
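For what it's worth, the "law" in question is just an empirical doubling trend, and the numbers involved are easy to sketch. A minimal back-of-the-envelope calculation, assuming a fixed two-year doubling period and purely illustrative starting figures (the chip size and years here are made up, not from the article):

```python
# Sketch of Moore's-law-style scaling: transistor count doubling every
# `doubling_period` years. All starting figures are illustrative only.
def transistors(start_count: float, start_year: int, year: int,
                doubling_period: float = 2.0) -> float:
    """Projected transistor count at `year` under a fixed doubling period."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Hypothetical 100M-transistor chip in 2004, projected forward:
for y in (2004, 2008, 2012):
    print(y, f"{transistors(100e6, 2004, y):.0f}")
# 2004 -> 100M, 2008 -> 400M, 2012 -> 1.6B
```

The point of the sketch is that the curve is an extrapolation of manufacturing progress, not physics; nothing in the formula guarantees the doubling continues once device dimensions hit physical limits.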
Much sillio articulo (Score:5, Insightful)
Re:I'll... (Score:5, Insightful)
The real advantage of silicon for many years was that SiO2 was (and is) a decent gate material for MOSFETs and a decent insulator between the metal and the main body of the IC, and could be grown easily on the surface of silicon. But afaict this advantage has dwindled: we need CVD-deposited insulators between the multiple metal layers anyway, and as processes have gotten smaller there is a push to switch to other gate materials for better performance.
The main advantage of silicon right now is probably just that we are very used to it and know what does and doesn't work with it. Other semiconductors are more of an unknown.
Even if silicon gets displaced from things like the desktop/server CPU market, though, I suspect it will stick around in lower-performance chips.
Not every chip needs speed (Score:3, Insightful)
Re:Not again (Score:3, Insightful)
Birth vs. Death (Score:4, Insightful)
Re:I'll... (Score:3, Insightful)
ECHO! Echo! echo! (Score:5, Insightful)
This has been getting bandied about every time someone comes up with a new, spiff-tastic technology/material to build an IC out of.
"THIS COULD REPLACE SILICON! WOOT!"
Yet it keeps NOT happening. Again, and again (and again).
The trailblazers keep forgetting that the silicon infrastructure has a LOT more money to play with than a given exotic-materials research project. And, in many cases, what's being worked on in exotics can be at least partially translated back to silicon, yielding further improvements that keep silicon ahead of the curve in the price/performance ratio. Additionally, we keep getting better at manufacturing exotic forms of silicon too.
So, until silicon hits a real deal-breaker problem that nobody can work around, I SERIOUSLY doubt that the silicon IC is going anywhere. Especially not for a technology that has taken several years, and recockulous amounts of money, simply to get a single flawless chip in a lab.
Not so fast... (Score:3, Insightful)
For something else to replace silicon it will have to not only be better, but so much better that it will justify the investment, or it will have to offer other, significant benefits, like being cheaper to produce, using less power or being smaller. Of these, I think speed is probably the least important, at least for common consumers.
Personally, I still haven't reached the point where my 3-year-old machine is too small or slow - not even close. Simply put, it wouldn't make sense to upgrade. I think most people see it that way; they would probably be more interested in gadgets than in a near-supercomputer.
Four years, eh? Then what? (Score:3, Insightful)
When someone makes a nanotube 80486 that I can buy and use, THEN I'll start to believe we're close to a technology shift. Hell, give me a 4004 - at least it's a product.
Bottom line: We're not there yet.
Re:I'll... (Score:5, Insightful)
Silicon was expensive to refine and manufacture at one point too. Like all new technologies, the REAL cost is in the manufacturing, and the cost goes down once we've manufactured enough of it to refine the process until we know the cheapest and quickest ways to do it.
Re:I'll... (Score:3, Insightful)
Re:I'll... (Score:2, Insightful)
Re:I'll... (Score:2, Insightful)
Re:I'll... (Score:2, Insightful)
Re:Not every chip needs speed (Score:2, Insightful)
Simply put, silicon may begin to find competitors in the high-end market for people with deep pockets, but it will not die out in lower-end devices for decades yet, if ever (we need to come up with some very novel ideas before your wristwatch needs tens of gigahertz of processing power).
Re:I'll... (Score:2, Insightful)
absolutely positively undeniably 100% wrong
Just because your garage door opener can't "solve" Folding@Home doesn't mean that we can't dream. I mean, at some point we truly need to be able to say something like "well my garage door opener has more processing power than BlueGene/L did in 2008"
Seriously, get over yourself and your "reality"
Only half of the story. (Score:2, Insightful)
In addition to some of the points made by other posters (silicon CPUs will live on in smart systems, cheap systems, handheld systems, etc.), there is a whole world of silicon chips that are *not* CPUs! Analog and mixed-signal circuits need highly linear devices - not just switches that turn on and off - which current silicon technology provides wonderfully. Our current analog design technology has nowhere near exhausted the possibilities on the tapestry that ten/twenty-year-old silicon fabrication technologies provide.
Maybe graphene, nanotubes, or the Next Big Thing will change the high performance CPU niche, but silicon still provides everything we can manage to use for the rest of the IC world.
Besides, I bet that graffiti [ieee.org] will be quite a challenge with nanotubes.
Re:I'll... (Score:1, Insightful)