Hardware

MIT's Hybrid Microchip To Overcome Silicon Size Barrier 77

schliz writes "MIT researchers have successfully embedded a gallium nitride layer onto silicon to create a hybrid microchip. The method could be further developed to combine other technologies such as spintronics and optoelectronics on a silicon chip. It is expected to be commercialized in a couple of years, and allow manufacturers to keep up with Moore's Law despite today's shrinking devices."
  • MIT researchers have successfully embedded a gallium nitride layer onto silicon to create a hybrid microchip.

    Arthur: What do you mean, an african or european gallium nitride layer?
    Bridgekeeper: Both! That's why it's an hybrid!
    Arthur: I didn't know that! Auuuuuuuugh!

    • by Fred_A ( 10934 ) <fred@f r e d s h o m e . o rg> on Monday September 21, 2009 @11:45AM (#29492119) Homepage

      MIT researchers have successfully embedded a gallium nitride layer onto silicon to create a hybrid microchip.

      Arthur: What do you mean, an african or european gallium nitride layer?
      Bridgekeeper: Both! That's why it's an hybrid!
      Arthur: I didn't know that! Auuuuuuuugh!

      Meanwhile, a few hundred years later...

      Customer : So, um, I'll only have to refill my computer half as often right ?
      Best Buy Salesperson : Actually it so happens that we have a promotion on computer tanks in the next aisle.

    • Nicely done, but my first thought on reading the headline was that scientists had found a microchip that allowed for insanely sized artificial breasts.
  • by Anonymous Coward

Unless they figure out a way to make plastic stronger, I think cellphones shouldn't get much thinner or smaller.

    • by jeffb (2.718) ( 1189693 ) on Monday September 21, 2009 @11:07AM (#29491661)

      Smaller equals faster, and can equal lower power. Both of these are good for cellphones, and lots of other things.

      More to the point, this particular advance means fewer individual chips, which means cheaper.

      • Smaller equals faster

        Not necessarily. As transistor gate widths get smaller, parasitic capacitances become more of an issue. When you place metal traces as close to each other as is required for a sub-45 nm MOSFET gate, the capacitance between them becomes greater, thus actually reducing performance.

        • by imgod2u ( 812837 ) on Monday September 21, 2009 @12:32PM (#29492753) Homepage

          You're talking about coupling capacitance, which is something that can be alleviated by design. The biggest issue is that shrinking wires don't result in faster signals due to the load capacitance remaining relatively the same. This becomes the majority of the delay and the speed of the transistor becomes a smaller part of the equation.

          Add to this the fact that transistors themselves aren't getting faster. The speed of a FET is tied to its gate dielectric thickness. That is 1nm at 45nm and 0.9nm at 32nm (for Intel). This can't really shrink much more like it has in the past -- once you're down to a single layer of hafnium, you can't really cut out any more -- and as a consequence, transistors won't be getting faster at the same rate that they have been in the past (for MOS at least).

          Looking at Intel's roadmap, upcoming node shrinks scale in power and size but not in speed.
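
          A back-of-envelope sketch of the wire-delay point above (all numbers are made-up placeholders, not real process data; the fixed 2:1 aspect ratio and ~2 fF load are assumptions purely for illustration):

```python
# Toy model only: as wires shrink, resistance per unit length rises roughly
# with 1/cross-section while the load capacitance a gate has to drive stays
# roughly constant, so the wire RC term grows relative to transistor delay.

def wire_rc_delay(width_nm, length_um=100.0):
    """Crude RC estimate for a minimum-width wire at a given node (toy numbers)."""
    rho = 2.2e-8                     # ohm*m, roughly copper resistivity
    height_nm = width_nm * 2         # assume a fixed 2:1 aspect ratio
    area_m2 = (width_nm * 1e-9) * (height_nm * 1e-9)
    resistance = rho * (length_um * 1e-6) / area_m2   # R = rho * L / A
    load_cap = 2e-15                 # ~2 fF load, assumed constant across nodes
    return resistance * load_cap

for node in (90, 65, 45, 32):
    print(f"{node} nm wire: ~{wire_rc_delay(node) * 1e12:.2f} ps RC")
```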

      • The insides can be small without making the outsides small.
    • The electrical components can get smaller without shrinking the phone. Just more empty space, or maybe a larger battery. Possibly even a faster CPU.

      I want a cell phone that can run my chosen desktop with my chosen word processing software, web browser and a few other productivity apps.

      • why not step up a bit further and play your games like Crysis or GTA4 on max settings
      • I can log in to my laptop from my phone and play WoW... There is some input lag and I wouldn't PvP, but I did an OS run (10-man raid) on it without too many problems.
    • by noundi ( 1044080 )

      Unless they figure out a way to make plastic stronger, I think cellphones shouldn't get much thinner or smaller.

      Two words: carbon nanotubes.

      • by imgod2u ( 812837 )

        Nanotubes have great tensile strength but very very poor compression/lateral strength. Your cell phone would resemble the rigidity of wool fabric.

        • Re: (Score:3, Informative)

          by treeves ( 963993 )
          You don't make the case out of CNTs. You put them in the plastic to make it (much) stronger. A composite material. The carbon fibers needn't be "nano" to work well though.
    • No, Moore's Law doesn't end when things get too tiny. Moore's law says that the number of transistors you can get on a chip for a fixed dollar investment doubles every 18 months. This doesn't need to end once you hit the hard limit of silicon, because then the technology for making things that small will mature and you will still be able to get the same number of transistors but for half the price.
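
      A quick back-of-envelope of that per-dollar reading (the doubling period, starting chip price and transistor count below are illustrative assumptions, not industry figures):

```python
# Sketch of the cost-per-transistor reading of Moore's Law: even if transistor
# count per chip stops growing, a maturing process could keep halving cost.
# All values below are made up for illustration.

transistors_per_chip = 1.0e9      # assume density has hit its wall
cost_per_chip = 100.0             # assumed starting price, in dollars
doubling_period_months = 18

for year in (0, 3, 6):
    factor = 2 ** (year * 12 / doubling_period_months)
    per_dollar = transistors_per_chip / (cost_per_chip / factor)
    print(f"year {year}: chip ~${cost_per_chip / factor:.2f}, "
          f"~{per_dollar:,.0f} transistors per dollar")
```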
      • by Animats ( 122034 )

        This doesn't need to end once you hit the hard limit of silicon, because then the technology for making things that small will mature and you will still be able to get the same number of transistors but for half the price.

        We're starting to hit some fundamental limits. The ultimate one, of course, is that at some point atoms are too big, and you need at least one electron per bit of data. We're not quite there yet, but we're getting close. There's serious work on single electron memory cells. [nd.edu]

        The cur

        • by Imrik ( 148191 )

          You're missing the point: once you get to where you can't make transistors smaller, you can still make them cheaper.

    • Just you wait for Carbon Nano-Phones!

    • phone blobs (Score:2, Interesting)

      by zogger ( 617870 )

      It will just become one plastic blob, with the circuitry embedded right in the plastic, making it semi-immune to bending fatigue breakage. No board and separate case, in other words. I guess they'll need a way to do the SIM card, but perhaps they can do that with bluetooth (or some other short-range wireless tech). Charging the blobbed batt will be inductive. Pros: sturdy, weather proof and most likely pretty cheap. Cons: no user serviceable entry at all without some serious leet dremel skills and a micr

      • SIM card shouldn't be a problem - you'll still need a way to connect the screen, buttons, etc. - some traces for SIM card contacts aren't going to break the cost advantages of this kind of integration.

        I'd say that, if it weren't for smartphones needing so much processing, we'd probably be there already (assuming a dumb-phone were still allowed to come to market).

    • The video screen would be a sheet of paper; audio, just the earphone; text input, just the keyboard; etc. The power source is the other barrier. Batteries are still bulky and less than an order of magnitude more efficient than a century ago. A smaller computing device would shrink the power needs, but the interface consumes lots of power.

      In the more distant future the interface would bypass the senses and connect to the nervous system.
  • Hey now! (Score:1, Funny)

    by Anonymous Coward

    > "...despite today's shrinking devices."

    It's not the size of the device, it's how you use it.

  • by Drakkenmensch ( 1255800 ) on Monday September 21, 2009 @11:05AM (#29491659)
    It's been getting interesting these past couple of years to see chip manufacturers not only observing the results of Moore's Law, but working hard to actually meet it as a self-imposed deadline. Would Intel have come as far as it did recently if Moore had never put his famous observation onto paper?
    • by TheRaven64 ( 641858 ) on Monday September 21, 2009 @11:56AM (#29492263) Journal

      Would Intel have come as far as it did recently if Moore had never put his famous observation onto paper?

      Yes. Someone else would have made a similar guess. Developing a CPU takes around 5 years. When you start, you need to know roughly how many transistors you will be able to use to make it. This depends on the market segment it will be aiming for (and the amount people are willing to pay for a chip in that segment) and the number of transistors you will be able to fit on a chip for that much money. Moore's 'law' is a good rule of thumb that lets you make a reasonable guess as to how many transistors you can fit on a chip by the time it is ready to be made. Sometimes it works, sometimes (e.g. the P4) it doesn't. Without it, Intel would have had to use some other mechanism for making guesses, but given that Moore's law is just a simple extrapolation from their past performance, that's probably what they would have used anyway.
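
      A minimal sketch of that planning exercise (the starting transistor count, five-year horizon and 24-month doubling period are assumptions for illustration only):

```python
# Sketch of the planning use described above: at design start, guess how many
# transistors will be affordable when the chip ships ~5 years later by
# extrapolating from today's parts.  Starting count and doubling period are
# assumptions for illustration, not Intel's actual planning numbers.

def projected_budget(transistors_today, years_to_ship=5, doubling_months=24):
    return transistors_today * 2 ** (years_to_ship * 12 / doubling_months)

# e.g. starting from a hypothetical 700-million-transistor design today:
print(f"~{projected_budget(700e6):,.0f} transistors plausible at ship date")
```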

    • Would Intel have come as far as it did recently if Moore had never put his famous observation onto paper?

      James Burke talked a lot about the phenomenon of the exponential explosion of technology in his Connections series [wikipedia.org]. Many others have commented about this as well (Toffler, Vinge, Kurzweil, to name a few). Technology often makes other technology easier, so you have an exponential chain reaction. Moore's law is just a consequence of this acceleration of technological advance in a highly technical field.

      I am also reminded of a chip industry quip: "Gallium Arsenide, the technology of the future! Always was

      • by sean.peters ( 568334 ) on Monday September 21, 2009 @12:33PM (#29492765) Homepage

        Technology often makes other technology easier, so you have an exponential chain reaction.

        I hear a lot about the "exponential" growth of technology. I'm not sure whether technology is really growing exponentially, but I do know this: exponentially growing processes don't go on forever - they can't. Rather quickly, they hit upon some underlying limitation in the physical world, and progress stops. I think it's much more likely that growth in technology follows a logistic curve [wikipedia.org], which grows pseudo-exponentially for a while, but then plateaus. We're just in the steep part of the curve right now.
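
        A minimal sketch of the exponential-vs-logistic comparison (growth rate and ceiling are arbitrary illustrative values):

```python
# Both curves look the same early on, but the logistic flattens near its
# ceiling; rate and ceiling here are arbitrary illustrative values.
import math

def exponential(t, rate=0.5):
    return math.exp(rate * t)

def logistic(t, rate=0.5, ceiling=100.0, start=1.0):
    return ceiling / (1 + (ceiling / start - 1) * math.exp(-rate * t))

for t in (0, 5, 10, 15, 20):
    print(f"t={t:2d}  exponential={exponential(t):10.1f}  logistic={logistic(t):6.1f}")
```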

        • People have been predicting such limitations for some time. Time and again someone thinks of something new that the predictor of the limit hadn't thought of, and the improvement continues.
        • Re: (Score:2, Redundant)

          by vertinox ( 846076 )

          Rather quickly, they hit upon some underlying limitation in the physical world, and progress stops.

          The problem is that you are viewing the technology industry as the same as natural evolution.

          Yes, bacteria and various species do hit limitations on their exponential growth because they run out of food.

          But technology in general expands on those limitations and raises the bar faster than the limitations can keep up.

          I mean when is the last time you saw a bacteria species create their own irrigation system and f

      • Connections is awesome. You just don't get documentaries like that any more.
    • Or would they have progressed at a far faster rate, because the progress would have been based on competition instead of an arbitrary expected development?

      • Or would they have progressed at a far faster rate, because the progress would have been based on competition instead of an arbitrary expected development?

        Either that or the industry as a whole would have fallen into a lethargic pause, feeling no need to reinvent themselves as no standards were imposed on them to create faster hardware. The demands of the gaming market may have done this eventually, but then again we might still be playing Pong 12 Remix Championship 30th Anniversary Edition...

    • by BryanL ( 93656 )

      I thought that was the point of Moore's law: to double the number of transistors on a circuit every 18 months to two years. The "law" is more of a business strategy or practice than a true law. That would, by definition, make it self-fulfilling. The reason most people think it is a law proper is that chip manufacturers have been able to continue this strategy for so long.

  • Great idea (Score:5, Funny)

    by AP31R0N ( 723649 ) on Monday September 21, 2009 @11:21AM (#29491829)

    i should get my girlfriend to use silicon to overcome her size barrier.

    • Re: (Score:2, Funny)

      by Anonymous Coward

      i should get my girlfriend to use silicon to overcome her size barrier.

      Perhaps she'd say the same about you.

    • Re: (Score:2, Funny)

      by RivenAleem ( 1590553 )

      I'd certainly layer then!

    • Re: (Score:1, Funny)

      by rcamans ( 252182 )

      heh heh heh. Hey, he said girlfriend! heh heh.

    • by Anonymous Coward

      Step 1 is to get her to overcome the virtuality barrier.

    • Took me a minute to realize you weren't talking about her barrier toward dealing with your inadequacies.

    • I should get my girlfriend to use silicon to overcome her size barrier.

      You want her boobs miniaturized? Maybe that's why your example is purely hypothetical...

  • Leave it to the guys at MIT to find that. I guess if you do enough 'research' on silicone sizing, you'll find the barrier.
  • by imgod2u ( 812837 ) on Monday September 21, 2009 @11:37AM (#29492019) Homepage

    They aren't talking about shrinking existing MOS transistors (which make up 99.999% of digital circuits), which is what Moore's Law talks about. They're talking about the ability to integrate transistors with better matching characteristics (CMOS is terrible at it) for analog and photoelectric circuits onto existing silicon. This idea has been done again and again, from Intel's hybrid silicon laser to Silicon Germanium, which is already widely used in cell phone chips.

    This won't make digital circuits smaller and isn't a solution to that problem, so the headline isn't accurate. What this will mean is that, potentially, cell phones won't need 4-5 separate chips for RF, digital, baseband, etc. You can integrate all those functions into one. But again, that's nothing new. IBM already provides BiCMOS with a SiGe layer on top for analog circuits. It hasn't been economical since it usually lags behind their bulk CMOS process for digital-only chips.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      There is another impact of this technology that has nothing to do with Moore's law. If a GaN device such as a blue or green LED can be grown on a Si substrate, then it could be a lot less expensive. This could pave the way for much less expensive white-light LEDs. Why? Because some of the methods of making a "white" LED require a blue and/or green LED. You can make a red LED (in AlGaAs, not GaN), a green LED (in GaN), and a blue LED (in GaN) and combine them to make white. Or, you can make a blue (or pur

  • by onkelonkel ( 560274 ) on Monday September 21, 2009 @12:01PM (#29492347)
    Well, ok, it is. But in my day we called them chips. A micro was a microprocessor. So unless you were talking about a microprocessor chip, using the word microchip marked you as a clueless non-technical luser of the sort that writes the science articles for the local paper. Now get off my lawn, uphill both ways, in the snow.
    • Integrated Circuit. If you are in a hurry/need column space say 'IC' and hope no automobile engineers are around.

  • Boy, it's a good thing this guy's theories [juliansimon.org], demonstrated to have great predictive value, are being followed rather than a politician's theories about the impending Moore's Law crash.

    It's why I create karma in the first place.
