
The Death of the Silicon Computer Chip 150

Stony Stevenson sends a report from the Institute of Physics' Condensed Matter and Material Physics conference, where researchers predicted that the reign of the silicon chip is nearly over. Nanotubes and superconductors are leading candidates for a replacement; they don't mention graphene. "...the conventional silicon chip has no longer than four years left to run... [R]esearchers speculate that the silicon chip will be unable to sustain the same pace of increase in computing power and speed as it has in previous years. Just as Gordon Moore predicted in 2005, physical limitations of the miniaturized electronic devices of today will eventually lead to silicon chips that are saturated with transistors and incapable of holding any more digital information. The challenge now lies in finding alternative components that may pave the way to faster, more powerful computers of the future."
  • I'll... (Score:5, Insightful)

    by PachmanP ( 881352 ) on Friday March 28, 2008 @08:44AM (#22892956)
    ...believe it when I see it!
    • Re:I'll... (Score:4, Insightful)

      by scubamage ( 727538 ) on Friday March 28, 2008 @08:47AM (#22892990)
      I agree. We have the methods to use other materials, but silicon is plentiful and VERY cheap. Like, the majority of the earth's composition cheap. Grab a handful of dirt ANYWHERE and a large portion will be silicon. Even if it gets replaced for certain high-end hardware, I doubt silicon will be going anywhere anytime soon - it's simply too affordable.
      • Comment removed (Score:5, Insightful)

        by account_deleted ( 4530225 ) on Friday March 28, 2008 @08:54AM (#22893076)
        Comment removed based on user account deletion
        • Re: (Score:3, Insightful)

          by gyranthir ( 995837 )
          The issue with carbon is the cost, scalability, accuracy, and timeliness/speed of nanotube production, not the resource itself.
          • Re:I'll... (Score:5, Insightful)

            by twistedsymphony ( 956982 ) on Friday March 28, 2008 @09:45AM (#22893576) Homepage

            The issue with carbon is the cost, scalability, accuracy, and timeliness/speed of nanotube production, not the resource itself.
            What's that quote? "Necessity is the mother of Invention." or something along those lines.

            Silicone was expensive to refine and manufacture at one point too. Like all new technologies, the REAL cost is in the manufacturing, and the cost goes down once we've manufactured enough of it to refine the process until we know the cheapest and quickest ways to do it.
            • Re: (Score:3, Insightful)

              by gyranthir ( 995837 )
              That may be true, but that isn't going to change in 4 years. The replacement ideas have been around for a good while now, and production, repeatability, and scalability are still nowhere near cost-effective even for minimal production needs. And not to nitpick, but it's Silicon, not -cone.
            • Re:I'll... (Score:5, Interesting)

              by Beetle B. ( 516615 ) <beetle_b@@@email...com> on Friday March 28, 2008 @10:07AM (#22893830)

              Like all new technologies, the REAL cost is in the manufacturing, and the cost goes down once we've manufactured enough of it to refine the process until we know the cheapest and quickest ways to do it.
              Cost is not the main problem with nanotubes.

              Nanotubes have a certain chirality - denoted by (m,n) with m and n being integers. Those two numbers define the properties of the nanotube (e.g. if m-n is a multiple of 3, the nanotube is metallic - otherwise it is semiconducting). They also determine the radius.

              So far no one has come up with a way to get a nanotube of a certain chirality. They just synthesize many nanotubes and then manually pick out the ones they want - if they exist in the sample. Until they can do that, the nanotube industry will not become a reality.
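
              A minimal sketch of the chirality rule described above, assuming the standard graphene lattice constant of about 0.246 nm; the (m,n) examples are illustrative, not taken from the discussion.

              ```python
              # Sketch of the (m,n) chirality rule quoted in the parent comment:
              # metallic if m - n is a multiple of 3, otherwise semiconducting.
              # Diameter follows from the graphene lattice constant a ~ 0.246 nm:
              #   d = a * sqrt(m^2 + m*n + n^2) / pi
              import math

              GRAPHENE_LATTICE_NM = 0.246  # approximate lattice constant of graphene

              def classify_nanotube(m: int, n: int):
                  """Return (electronic type, diameter in nm) for an (m, n) nanotube."""
                  kind = "metallic" if (m - n) % 3 == 0 else "semiconducting"
                  diameter = GRAPHENE_LATTICE_NM * math.sqrt(m * m + m * n + n * n) / math.pi
                  return kind, diameter

              for m, n in [(10, 10), (10, 0), (12, 8)]:
                  kind, d = classify_nanotube(m, n)
                  print(f"({m},{n}): {kind}, d ~ {d:.2f} nm")
              ```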
              • I dunno, presumably the tubes with a different chirality have uses for other applications, so could you not just sort the lot rather than throwing away a bunch of them?
                • Re: (Score:3, Interesting)

                  by Beetle B. ( 516615 )
                  The issue is not so much that some are being "wasted". The problem is selecting the ones you want. How do you automate that? You have a process that gives you lots and lots of nanotubes. How do you automatically filter out the ones you want? That's been the problem since day 1, and has not been resolved.
                  • Re: (Score:3, Interesting)

                    by camperdave ( 969942 )
                    My admittedly limited understanding of carbon nanotubes is that they are self producing. What I mean by that is that if one stable tube diameter is 20 atoms, and another stable tube diameter is 30 atoms, then the 20 atom tube is going to continue to grow as a 20 atom tube. It won't spontaneously widen out to a 30 atom tube. If that is the case, then all you would need is a few seed nanotubes, and the right conditions.
                    • Yes, but that's not what I'm referring to.

                      Generally (always?) nanotubes are not grown individually. The synthesis process simply produces many nanotubes. Then you have to pick the one you want among those.

                      So yes, a nanotube of a certain chirality can continue to grow longer and longer without losing that chirality. But you don't get just one nanotube of the desired chirality; you get plenty of nanotubes with different chiralities. No one has so far found a way to get only the ones of a desired chirality.
              • Or selectively remove (etch away) nanotubes with a certain chirality.
            • Pet Peeve (Score:2, Informative)

              Silicone != Silicon
          • Re: (Score:2, Insightful)

            by maxume ( 22995 )
            Those are all issues with silicon as well (crystals vs. nanotubes...); they are just reasonably well solved.
      • Re:I'll... (Score:5, Funny)

        by somersault ( 912633 ) on Friday March 28, 2008 @08:54AM (#22893090) Homepage Journal
        I always wondered why those implants felt like a bag of sand..
      • Re:I'll... (Score:5, Insightful)

        by ScrewMaster ( 602015 ) on Friday March 28, 2008 @08:57AM (#22893106)
        I doubt silicon will be going anywhere anytime soon - it's simply too affordable.

        Yes, and we're so damned good at manipulating it. All this newfangled stuff is pie-in-the-sky at this point. Yes, I suppose we'll eventually replace it for the likes of high-end processors, as you say, but everything else will be made out of silicon for a long time to come.

        People keep bringing up Moore's Law, as if it's some immutable law of physics. The reality is that we've invested trillions of {insert favorite monetary unit here} in silicon-based tech. Each new generation of high-speed silicon costs more, so that's a lot of inertia. Furthermore, if Guilder's Rule holds true in this case (and I see no reason why it shouldn't), any technology that comes along to replace silicon will have to be substantially better. Otherwise, the costs of switching won't make it economically viable.
        • Re:I'll... (Score:4, Informative)

          by geekoid ( 135745 ) <{moc.oohay} {ta} {dnaltropnidad}> on Friday March 28, 2008 @09:09AM (#22893228) Homepage Journal
          replace 'better' with 'more value'.

          For example:
          If it costs 1/100 the price but end users see no gains in 'speed' and/or 'power', it could still replace silicon. It's not better at doing anything, it just has a higher value.

          "All this newfangled stuff is pie-in-the-sky at this point."

          hmmm, some of this is a lot farther along than pie in the sky.

          Most people on /. don't even seem to understand Moore's law and think it has to do with speed and power, which it doesn't; those are artifacts of the law.

          Finally:
          The real problem with silicon is the fabs. They are running into some serious problems at these incredibly small sizes. Some fabs are having problems with metal atoms in the air, atoms that are below the threshold of detection and beyond the ability to remove.

          I am not dooming and glooming silicon here (although there are some advantages to hitting a minimum size); it's just that some problems aren't going away, they're getting harder to deal with, and the past workarounds aren't cutting it.

          • Re:I'll... (Score:5, Interesting)

            by scubamage ( 727538 ) on Friday March 28, 2008 @09:34AM (#22893456)
            You make some good points and I can't really argue them. As the die sizes continue to get smaller, silicon wafers must be more and more pure because tinier artifacts in the wafer can cause issues in the manufacturing process, and that's going to be pretty unavoidable. However it also means that more dies can be stamped onto each wafer, which should negate the number that are lost. I meant more that even if computer hardware is replaced with something else, things which need lower-grade integrated circuits are still going to use silicon. I mean, you don't need a 1thz processor for a car's ECU, or for a garage door opener. And as more and more appliances become "smart" more things are going to need lower-end chips - so I highly doubt that silicon is going anywhere. Maybe not for PCs, but for everything else that is just starting to get 'wired', silicon is going to be around for a VERY long time.
            • Re: (Score:2, Insightful)

              by suggsjc ( 726146 )

              I mean, you don't need a 1thz processor for a car's ECU, or for a garage door opener.

              absolutely positively undeniably 100% wrong

              Just because your garage door opener can't "solve" Folding@Home doesn't mean that we can't dream. I mean, at some point we truly need to be able to say something like "well my garage door opener has more processing power than BlueGene/L did in 2008"

              Seriously, get over yourself and your "reality"

              • Re: (Score:3, Funny)

                by scubamage ( 727538 )
                absolutely positively undeniably 100% wrong
                I deny your reality and substitute my own ;)
              • Re: (Score:3, Funny)

                by ColdWetDog ( 752185 )

                I mean, you don't need a 1thz processor for a car's ECU, or for a garage door opener.

                absolutely positively undeniably 100% wrong

                sometime in the 1Thz-Garage-Door-Opener-Overlord-future:

                GARAGE_OWNER: "Open the garage door please, Hal"

                GARAGE_DOOR: "I'm sorry, Dave, I can't do that."

                You're saying that you want this sort of thing to happen? No thanks. I like my appliances simple and mute, thankyouverymuch.

              • "well my garage door opener has more processing power than BlueGene/L did in 2008"

                "Open garage door, please, HAL."
                "I'm sorry, Dave, I can't do that."
                (pause)
                "Why not?"
                "I think you know the answer to that question."
              • by suggsjc ( 726146 )
                Flamebait, really?
                Guess people's sarcasm detectors aren't working.
      • Re:I'll... (Score:5, Insightful)

        by petermgreen ( 876956 ) <plugwash@p[ ]ink.net ['10l' in gap]> on Friday March 28, 2008 @09:06AM (#22893196) Homepage
        I'm pretty sure the cost of the raw material is a negligible part of the cost of making semiconductor grade silicon. Most of the costs are in the very energy intensive purification processes.

        The real advantage of silicon for many years was that SiO2 was/is a decent gate material for mosfets and an insulator for insulating the metal from the main body of the IC, and could be grown easily on the surface of silicon. But afaict this advantage has dwindled as we need CVD deposited insulators for insulating between multiple metal layers anyway, and as processes have gotten smaller there is a push to switch to other gate materials for better performance.

        The main advantage of silicon right now is probably just that we are very used to it and know what does and doesn't work with it. Other semiconductors are more of an unknown.

        Even if silicon gets displaced from things like the desktop/server CPU market, though, I suspect it will stick around in lower performance chips.
      • Re:I'll... (Score:5, Interesting)

        by iamhassi ( 659463 ) on Friday March 28, 2008 @09:21AM (#22893340) Journal
        "I doubt silicon will be going anywhere anytime soon - its simply too affordable."

        Agreed. Besides, they've been saying this since the 90s, that silicon can't possibly get any faster and it'll be replaced very soon.

        I call BS. They had 350 gigahertz silicon chips 2 years ago [news.com]:
        "At room temperature, the IBM-Georgia Tech chip operates at 350GHz, or 350 billion cycles per second. That's far faster than standard PC processors today, which range from 3.8GHz to 1.8GHz. But SiGe chips can gain additional performance in colder temperatures....SiGe chips, the scientists theorized, could eventually hit 1 terahertz, or 1 trillion cycles a second."

        I think silicon is safe for awhile longer.
        • Re: (Score:2, Interesting)

          by smackt4rd ( 950154 )
          That 350GHz chip is probably much simpler and easier to build than a CPU, but the fact remains that it'd be incredibly difficult to just try and switch from Si to some other semiconductor and be able to build something cheap. We're already starting to see the manufacturers switch their architecture to multi-core CPUs. I think that's a lot more practical than trying to switch to an exotic material.
        • I call BS. They had 350 gigahertz silicon chips 2 years ago:
          Yes, that's a record for silicon based devices, as you mentioned.

          However, the record for fastest transistor has been held by III-V based transistors (i.e. not silicon) for a few years now. See this [sciencedaily.com], for example.

          So the article's not all that wrong.
        • Re: (Score:3, Interesting)

          by imgod2u ( 812837 )
          That's for RF chips and RF signals. Silicon Germanium (SiGe) is the material and the 350 GHz signal being propagated is a sine wave with the FET being kept in the linear region. Digital signals are much more difficult to get to 350 GHz.

          To give you an idea, in a mixed signal BiCMOS chip where the digital components are standard CMOS and there's a SiGe layer on top for the RF circuits, the RF transistors are capable of amplifying an input sine wave all the way to the multiple tens of GHz. In the same proce
      • I know I'm being a little pedantic, but Silicon is NOT the most common element in the Earth. It is the most common element in the CRUST of the Earth. The most common element of the Earth is Iron. The Earth is an impure ball of iron oxides.
      • Re:I'll... (Score:4, Funny)

        by Zaatxe ( 939368 ) on Friday March 28, 2008 @11:34AM (#22894854)
        Grab a handful of tit ANYWHERE and a large portion will be silicone.

        There, I fixed that for you.
        • Re: (Score:3, Funny)

          by speculatrix ( 678524 )
          this being slashdot, I'd like to inform most readers the parent is referring to the mammaries of the human female, which a small number of you will encounter privately in person and not just via pr0n websites or viewing through a pair of binoculars.
    • by sm62704 ( 957197 )
      Silicon is dying! I knew it! The vacuum tube is making a comeback!

      Wait a minute, the monitor I'm staring at is a vacuum tube. They told me vacuum tubes were gone a couple of decades ago and they're still in guitar amps, too.

      I predict that this prediction about the demise of silicon is as accurate as their predictions about the demise of vacuum tubes. But in four years nobody's going to remember their prediction, or mine either.
    • Re: (Score:2, Insightful)

      by maddriller ( 1148483 )
      It is sort of silly to declare the end of life for one technology when the technology to replace it is not yet in place. Every year for the last twenty, people have proclaimed the end of silicon's reign, yet we still use silicon. There is a huge investment in the existing silicon infrastructure that will have to be duplicated in any replacement technology. There is also the educational inertia - engineering schools are still teaching people to use silicon and it will be many years before they start teaching
    • I think the paper only said the conventional silicon chip will go away. There are many nonconventional silicon technologies like finFETs, silicon nanorods, etc.
    • by bkr1_2k ( 237627 )
      Yeah, no doubt. Silicon may or may not be the industry standard in 10 years, but saying it only has 4 more years of life is ridiculous to say the least. We're still using 200 MHz processors, for Christ's sake. The difference is now we're using them smaller, and in more "consumable" resources, rather than as our primary machines. Silicon has at least 2 generations (human generations) of life before we see it truly dead.
    • It's hard to call the end of the reign of silicon when we don't even know who the heir apparent is yet.

      That's kinda like saying the sequel to Duke Nukem Forever is going to be the best game ever.
  • by ScrewMaster ( 602015 ) on Friday March 28, 2008 @08:49AM (#22893006)
    [R]esearchers speculate that the silicon chip will be unable to sustain the same pace of increase in computing power and speed as it has in previous years.

    In the meantime, other researchers will figure out ways to make silicon work smarter, not harder.
    • by Himring ( 646324 ) on Friday March 28, 2008 @09:06AM (#22893194) Homepage Journal
      They can have my silicon chip when they pry it from my cold, dead, motherboard....

    • Absolutely. I don't think that code for multicore cpus is fully baked yet nor near the end of what can be done with it. FPGAs are improving and we still have not seen that component type hit saturation yet. We are nowhere near done doing all that can be done with the silicon we already have never mind what is coming down the pipe in the next few years.

      It's hype, nothing but.

      I'd like to see something that is vastly better, cheaper, more energy efficient, and capable of greater performance... but until that c
    • by fm6 ( 162816 )
      I take that to mean that you think that future improvements will come from better design, instead of squeezing more transistors onto the chips. Somehow, I doubt that chip designers have been sitting on their hands for the last three decades. The fact is, all that die shrink is useless without clever designers. And there's a pretty big limit on what those designers can do without more transistors.

      So if we want better chips, we have to continue to cram in those transistors. Since we're coming up against the 1
      • Now that we've reached the limit on CMOS die shrink, perhaps it's time to revisit the big chip approach.

        I tend to agree. Amdahl tried to develop multilayer chips (essentially a three dimensional layer-cake approach) but failed, for a variety of reasons, although I remember reading that it was a complexity issue (inadequate design tools) as much as failure rate.

        That's probably a logical way to continue increasing complexity: just stack extra circuitry vertically. Discards don't matter much if you have
        • by fm6 ( 162816 )

          Of course, how to really use that much VLSI is another matter entirely.
          Chip scale has increased by a factor of 1,000 since the first commercial microprocessor was introduced in 1971. And yet developers still are screaming for more power. I think we can count on them to use up all that extra logic.
          • I think we can count on them to use up all that extra logic.

            Sorry, what I meant was having the software tools to develop applications to make effective use of all that power. We're having a hard time writing compilers that can parallelize code across eight or nine processors: what if we have chips with a thousand cores? Of course, with that much power it probably won't matter much if your code isn't all that efficient.
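
            A rough illustration of why a thousand cores would still depend heavily on how parallel the code is: Amdahl's law, a standard formula (speedup = 1 / ((1 - p) + p/N)); the parallel fractions and core counts below are made-up examples.

            ```python
            # Amdahl's law: speedup on N cores when only a fraction p of the work
            # can be parallelized. The serial remainder dominates as N grows.
            def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
                serial = 1.0 - parallel_fraction
                return 1.0 / (serial + parallel_fraction / cores)

            for p in (0.50, 0.90, 0.99):
                for n in (8, 1000):
                    print(f"p={p:.2f}, {n:4d} cores -> {amdahl_speedup(p, n):6.1f}x")
            # Even 99%-parallel code tops out at ~91x on 1000 cores.
            ```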
  • Didn't we essentially already talk about a processor replacement with Graphene? [slashdot.org] It wasn't that long ago that such a thing was posted....although I don't know anything about it from a truly technical standpoint whether that is viable or not.
  • Not again (Score:5, Informative)

    by Maury Markowitz ( 452832 ) on Friday March 28, 2008 @08:51AM (#22893034) Homepage
    I've been hearing this claim every few years for the last 25. Remember optical computers in the mid-80s? How about gallium arsenide? CRAY-3 anyone?

    And of course what's really reaching a limit is not the CPU's, but our ability to use them effectively. See "TRIPS architecture" on the wiki as an example end-run around the problem that offers hundred-times improvements using existing fabs.

    Maury
    • Re: (Score:3, Insightful)

      by esocid ( 946821 )
      Yeah, agree with you there. The article said they will be replaced within 4 years...yeah right. Maybe in 10 years something will come out that may be faster, but marginally more expensive. I don't see silicon exiting the technology world altogether within even the next 50 years. Some parts may be replaced but Si chips will still be kicking.
    • by geekoid ( 135745 )
      haha, TRIPS.

      So this other technology that claims it will be going by 2012 - it's not going to happen, but this other technology that claims it will be going by 2012 is a shoo-in!

      Sorry, you will need more than that. All these slashdot articles remind me of when tubes went away*, the same arguments.

      *Yes, I KNOW there are devices that use tubes, seriously. When was the last time you saw a tube tester in a grocery store?
    • I've been hearing this claim every few years for the last 25. Remember optical computers in the mid-80s? How about gallium arsenide? CRAY-3 anyone?
      While GaAs has not replaced silicon for computer CPU's, it has some applications (non-optical ones) that Si cannot compete with. Cell phones, for example.
    • Gallium arsenide was a reasonable technology to pursue at the time. It had teething problems, was expensive to manufacture, and CCC ran into funding problems related to a drop-off in defense spending after the end of the cold war. That is not to say that GaAs was a completely foolish technology for the time. There are many reasons to believe that it offered faster switching times, and smaller module packages, than did ECL logic of the time. CCC was putting out a 500 MHz machine in the early 90's, four years b
      • Vitesse had a 0.35 micron GaAs CMOS process that was capable - in principle - of putting millions of mesfets on a chip. As the internet bubble collapsed and Vitesse lost huge amounts of money, they abandoned the process. They could not afford the step to a smaller geometry.

        As I understand it (I'm no expert; I could easily be wrong) GaAs starts to lose its speed advantage below the 0.35 micron node because the drift velocity saturates.

    • Actually there might be a point this time. You remember the Pentium 4? It was supposed to go to 5 GHz and more, but Intel engineers ran into all sorts of problems approaching that speed, so today most desktop CPUs have stopped at about 3 GHz. Instead, both Intel and AMD prefer to add cores.
  • Well, we use most silicon to display boobs, might as well repay the favour :)

    Overclocking might be fun as well: "Hey, I managed a stable DD at room temperature!"
    • Well, we use most silicon to display boobs, might as well repay the favour :)

      Overclocking might be fun as well: "Hey, I managed a stable DD at room temperature!"


      In that context, you probably should have said, "overcocking". I know I get enough emails on that subject every day.
  • I remember reading about the death of silicon in the 1970s...

    (Okay, dating myself here, but still...)
  • by Ancient_Hacker ( 751168 ) on Friday March 28, 2008 @08:58AM (#22893126)
    Let's think: a technology that has taken 60 years to go from lab to today's level is going to be superseded in five years by a technology that has not yet made a single transistor or gate. Hmmmm..... Meanwhile silicon is not going to be improved in any obvious way, such as with ballistic transistors, gallium arsenide, silicon carbide, 3-D geometries, process shrinkage, etc, etc, etc, etc, etc, etc.... No soup for you.
  • Unlikely (Score:5, Informative)

    by aneviltrend ( 1153431 ) on Friday March 28, 2008 @09:06AM (#22893202) Homepage

    Intel's CTO Justin Rattner just gave a talk at Cornell two days ago; he covered this topic carefully and confirmed that Intel has the technology and plans to carry out Moore's Law for another 10 years on silicon. Technologies such as SOI [wikipedia.org] and optical interconnects will be leveraged to hit this.

    It's not necessarily the size of the transistors that makes chips hard to make these days either (although they are now giving us huge problems with leakage current). It's harder to route the metal between these transistors than it is to pack them onto the silicon. New processors from Intel and AMD have areas with low transistor density just because it was impossible to route the large metal interconnects between them. Before we can take advantage of even smaller transistors we'll need a way to achieve higher interconnect density.

    • Re:Unlikely (Score:5, Interesting)

      by geekoid ( 135745 ) <{moc.oohay} {ta} {dnaltropnidad}> on Friday March 28, 2008 @09:20AM (#22893334) Homepage Journal
      hmmm, I trust the people I know on the floor more than someone whose job it is to say things that maintain consumer confidence.

      It would be a stock hit to say "We will be replacing silicon in X period of time" if X is any longer than 'right now'.

      Some new technologies solve those problems. Technologies in the 'we hobbled something together' proof-of-concept stage, not the 'I wrote this down on paper' stage.

      Some of it is impressive; whether or not there will be a practical way to mass produce it is another thing. If not, I can imagine a time in the future where only large entities that can afford 500K a chip will be using them. Or anyone at home that can afford the latest electron microscope, laser, and super cooling.

      meh, I'm just glad the MHz war has pretty much subsided and we are FINALLY focusing on multi-core.

  • by Enleth ( 947766 ) <enleth@enleth.com> on Friday March 28, 2008 @09:10AM (#22893234) Homepage
    I don't know any numbers, but I think I can safely guess that the computer processor business is just a fraction of the whole silicon chip manufacturing business - maybe not a small fraction, but still. And the rest of the industry doesn't need extreme speeds - there are microcontrollers, integrated buffers, logic gates, comparators, operational amplifiers and loads of other $0.05 crap you got in your toaster oven, blender, wrist watch, remote-controlled toy car, printer, Hi-Fi, etc., etc. And there is an obvious priority for those: cheap and reliable. So the silicon is not going anywhere.
    • And the rest of the industry doesn't need extreme speeds - there are microcontrollers, integrated buffers, logic gates, comparators, operational amplifiers and loads of other $0.05 crap you got in your toaster oven, blender, wrist watch, remote-controlled toy car, printer, Hi-Fi, etc., etc. And there is an obvious priority for those: cheap and reliable. So the silicon is not going anywhere.

      And let's not forget solar cells, whose production is increasing like crazy (and is causing silicon prices to increase).

    • Re: (Score:2, Insightful)

      by MttJocy ( 873799 ) *
      Exactly. If silicon is going to be phased out anywhere in 4 years (note the IF), it will be in extremely high-end supercomputer-type devices. Perhaps a decade or so later this might get enough research, combined with economies of scale, to hit the high-end PC market; maybe another decade may go by and the development dollars earned from this will enable them to enter the price range of the rest of the PC market (note here at this point you will most likely have a motherboard with silicon chips with exception only
  • Birth vs. Death (Score:4, Insightful)

    by 192939495969798999 ( 58312 ) <info AT devinmoore DOT com> on Friday March 28, 2008 @09:19AM (#22893320) Homepage Journal
    This guy is confused. The BIRTH of the silicon chip is nearly over... now is when it will completely take over our environments. To put it another way: demand for silicon chips is as dead as demand for crude oil, corn, or other staples.
  • Wrong tag (Score:3, Funny)

    by Mantaar ( 1139339 ) on Friday March 28, 2008 @09:25AM (#22893368) Homepage
    This should really be tagged software, shouldn't it?

    While we're at it, might add that Duke Bend'Em Forever tag, too...
  • Wake me when they announce the death of the Slashdot dupe [slashdot.org]
  • ECHO! Echo! echo! (Score:5, Insightful)

    by Chas ( 5144 ) on Friday March 28, 2008 @09:31AM (#22893428) Homepage Journal

    This has been getting bandied about every time someone comes up with a new, spiff-tastic technology/material to build an IC out of.

    "THIS COULD REPLACE SILICON! WOOT!"

    Yet it keeps NOT happening. Again, and again (and again).

    The trailblazers keep forgetting that the silicon infrastructure has a LOT more money to play with than a given exotic-materials research project. And, in many cases, what's being worked on in exotics can be at least partially translated back to silicon, yielding further improvements that keep silicon ahead of the curve in the price/performance ratio. Additionally, we keep getting better at manufacturing exotic forms of silicon too.

    So, until silicon comes to a real deal-breaker problem that nobody can work their way around, I SERIOUSLY doubt that silicon IC is going anywhere. Especially not for a technology that has taken several years, and recockulous amounts of money simply to get a single flawless chip in a lab.

  • Not so fast... (Score:3, Insightful)

    by jandersen ( 462034 ) on Friday March 28, 2008 @09:35AM (#22893464)
    The transistor was first patented in 1925 (look it up in Wikipedia) and the integrated circuit in 1949 - both fundamental for microchips - but we still use radio valves today, and not just for nostalgic reasons. Silicon will probably hang around for a long time to come, I think.

    For something else to replace silicon it will have to not only be better, but so much better that it will justify the investment, or it will have to offer other, significant benefits, like being cheaper to produce, using less power or being smaller. Of these, I think speed is probably the least important, at least for common consumers.

    Personally, I still haven't reached the point where my 3-year-old machine is too small or slow - not even near. It simply wouldn't make sense to upgrade. I think most people see it that way, they would probably be more interested in gadgets than in a near-super computer.

    • I think most people see it that way, they would probably be more interested in gadgets than in a near-super computer.

      Well, now that depends. If you mean a supercomputer whose only function is to run Microsoft Office faster ... you're right. Not much point in that. However, if we did have that kind of power in a sub-$1000 computer system, odds are we'll find something way cool to do with it. Something on the order of useful AI, for example.
  • by swordgeek ( 112599 ) on Friday March 28, 2008 @09:41AM (#22893530) Journal
    Even if the hard limits of silicon circuits are reached in four years, we will NOT be switching to nanotubes, graphene, superconductors, or quantum computing. All of those technologies are at least a decade away from commercial applications, and 15 years is more likely. If there's nowhere to advance after four more years (and I rather doubt that--we've got too much history proving us wrong), then we'll just grow out. Bigger silicon dies, bigger cache, more cores. Maybe we'll actually hit the terminus of Moore's law, but that won't stop computers from advancing, and it won't magically make any of the alternative technologies mature.

    When someone makes a nanotube 80486 that I can buy and use, THEN I'll start to believe we're close to a technology shift. Hell, give me a 4004 - at least it's a product.

    Bottom line: We're not there yet.
  • ... why don't we call Nanotubes and superconductors The Microprocessor Killers (TM)?
  • Although I'm no expert, I've been reading that one reason solar photovoltaic panels have not dropped in price is that much of the silicon used to make them is tied up in chip fabrication.

    I wonder if those same silicon wafer production facilities can be converted to make solar panels once the move away from silicon in the microprocessor industry takes place?
  • If this is true, then the players who are overly committed to silicon may lose ground to those moving to new materials and technologies. It could portend quite a shake up.
  • Silicon Scaling (Score:2, Interesting)

    by wilsonjd ( 597750 )
    Silicon scaling will run out. We will reach a point where we can no longer make working circuits any smaller, but it will NOT be in the next four years. 45, 32, and 22 nm circuits are already in the lab. 16nm (which may be the limit) is expected to be in production by 2018 (10 years from now). After 16nm, quantum tunneling may be a problem. http://en.wikipedia.org/wiki/16_nanometer [wikipedia.org]

    Intel thinks we may hit the limit by 2021. http://news.zdnet.com/2100-9584_22-5112061.html [zdnet.com]
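
    A back-of-the-envelope sketch of that node sequence, assuming the classic rule of thumb that each process generation shrinks linear feature size by about 0.7x on roughly a two-year cadence; the starting node and year are assumptions for illustration, and lab demonstration typically leads volume production by several years.

    ```python
    # Project successive process nodes from an assumed 45 nm / 2008 starting
    # point, shrinking linear dimensions by ~1/sqrt(2) every ~2 years.
    import math

    node_nm, year = 45.0, 2008          # illustrative assumptions
    SHRINK, YEARS_PER_NODE = 1 / math.sqrt(2), 2

    while node_nm > 10:
        print(f"{year}: ~{node_nm:.0f} nm")
        node_nm *= SHRINK
        year += YEARS_PER_NODE
    # Prints roughly 45 -> 32 -> 22 -> 16 -> 11 nm, matching the nodes above.
    ```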
    • by Dunbal ( 464142 )
      And THEN you just figure out how to fit more blocks of chips together... quad cores, 8x cores, 16x cores... various multi-core chips linked together on a motherboard, etc.
  • by the_kanzure ( 1100087 ) on Friday March 28, 2008 @10:38AM (#22894194) Homepage
    SciAm is running an April 2008 article on graphene, so here are my notes on graphene fabrication [heybryan.org]. This is pretty neat, and worth some amateur experimentation. You can make the AFM/STM for ~$100 USD [heybryan.org]. As for graphene, there are some instructions on that page for chemically synthesizing it, or just use pencil graphite and write over a piece of paper. Another cool idea is figuring out whether we can use mechanical force to use a very thin pencil tip to write a circuit. JohnFlux in ##physics on freenode mentions that resistors could be used as a poor man's piezo: just heat up the metal (or perhaps pencil) and it will move. It will move very slowly. But it's a start.
    • by Bender_ ( 179208 )
      I liked your homepage; you seem to be really interested in nanotechnology and the like. You need to understand more of the underlying science, though; that will help you to understand how things fall into place. It is too easy to get fascinated with a bunch of hype technologies and the popular science attached to them. All of it comes with a long history of development that is required to understand the merits and how things work. Most of this is hidden to the casual passer-by. I guess university is going to do t
  • by Kjella ( 173770 ) on Friday March 28, 2008 @10:47AM (#22894296) Homepage
    ...because the top speed has barely moved in the last decades. The commercial airplane is dead because the top speed has gone DOWN after the Concorde landed. WTF? If we really hit the hard limits of silicon, then there won't be half a dozen techs for terahertz speed waiting. It might mean that the next generation WON'T see improvements of many orders of magnitude like we have, that's it. Computers will be something that operate at some given performance and the world will shrug at it. In short, the world won't collapse if this completely uncharacteristic development comes to an end. And even then I suspect it will go on elsewhere; did you see flashmicro's 900GB 2.5" flash disk? Yes, at ungodly prices, but I think we have a long way to go yet...
    • Re: (Score:3, Funny)

      by Robotbeat ( 461248 )
      Wait, so does that mean that The Singularity won't happen?!? Say it ain't so! I was so looking forward to the Nerd Rapture by our friend, the Computer!
  • Some very intelligent researchers at the Institute of Physics' Condensed Matter and Material Physics Conference came to some very intelligent decisions about the future of CPU's... but this is hardly the end of the silicon chip.

    In addition to some of the points made by other posters (Silicon CPU's will live on in smart systems, cheap systems, handheld systems, etc.), there is a whole world of silicon chips that are *not* CPU's! Analog and mixed signal circuits need highly linear devices--not just switches
  • Oh man, I was hoping germanium was going to make a comeback as a semiconductor. It holds up better when it's hot, IIRC from my college days. I'll just have to make a lot of retro guitar effects pedals.
    • Germanium has a lower temperature limit than silicon. One reason is that its bandgap is smaller, so it goes to zero at a lower temperature, at which point it's useless. I've also read that germanium typically uses a dopant which melts at a low temperature. Materials with large bandgaps (silicon carbide) are useful for high temp semiconductors.
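
      A minimal sketch of the bandgap point above: intrinsic carrier concentration grows roughly as T^(3/2) * exp(-Eg / 2kT), so a smaller bandgap means intrinsic carriers swamp the doping at a lower temperature. Effective-mass prefactors are ignored here, so only relative magnitudes mean anything; the bandgap values are textbook figures.

      ```python
      # Relative intrinsic carrier concentration n_i ~ T^1.5 * exp(-Eg / (2*k*T)),
      # normalized to silicon at each temperature (prefactors ignored on purpose).
      import math

      K_EV = 8.617e-5                                   # Boltzmann constant, eV/K
      BANDGAPS_EV = {"Ge": 0.66, "Si": 1.12, "SiC": 3.26}

      def relative_ni(eg_ev: float, temp_k: float) -> float:
          return temp_k ** 1.5 * math.exp(-eg_ev / (2 * K_EV * temp_k))

      for temp in (300.0, 500.0):
          si = relative_ni(BANDGAPS_EV["Si"], temp)
          for name, eg in BANDGAPS_EV.items():
              print(f"{temp:.0f} K  {name:3s}  n_i/n_i(Si) ~ {relative_ni(eg, temp)/si:.1e}")
      # Germanium runs out of headroom well before silicon; SiC barely registers.
      ```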
  • This statement:

    ...the conventional silicon chip has no longer than four years left to run... [R]esearchers speculate that the silicon chip will be unable to sustain the same pace of increase in computing power and speed as it has in previous years.

    Does not equal this statement:

    Hardware: The Death of the Silicon Computer Chip

    What the first statement means is that they may have found something faster than silicon chips. That doesn't mean that silicon will suddenly "go away" just because it cannot maintain Moore's law predictions.

    Hell - do you think they're going to put some uber carbon nanotube processor in your TV remote or your microwave oven control panel? Silicon cpu chips have *plenty* of uses other than high end mainframes. They're damn useful - that's why they're

  • The individual transistors may be approaching a limit that, in theory, could be passed by other materials. But silicon still has plenty of potential.

    For starters we're still using a primarily two-dimensional structure on the surface. There's no reason you couldn't build your structures in three dimensions through a cube of the material. (Yes power and cooling become issues, but they're soluble.)

    Going truly 3-D again shortens the wiring, leading to another speed increase with a given speed of components.
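
    A toy geometric estimate of the wiring argument above: model N identical cells laid out as a square versus a cube, and compare the expected Manhattan distance between two random cells (d * side / 3 in d dimensions). The transistor count is an arbitrary assumption and real routing constraints are ignored.

    ```python
    # Compare average point-to-point Manhattan distance for the same number of
    # cells arranged in a 2-D square vs. a 3-D cube (expected value is d*side/3).
    def avg_manhattan(n_cells: float, dims: int) -> float:
        side = n_cells ** (1.0 / dims)
        return dims * side / 3.0

    N = 1e9  # assume ~a billion cells, purely for illustration
    flat, cube = avg_manhattan(N, 2), avg_manhattan(N, 3)
    print(f"2-D: ~{flat:,.0f} cell pitches, 3-D: ~{cube:,.0f} cell pitches")
    print(f"average wire is ~{flat / cube:.0f}x shorter in 3-D")
    ```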
  • That's when I'll believe the replacement is at hand.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...