
IBM Beats The Rest of the World To 7nm Chips, But You'll Need to Wait For Them

Mickeycaskill writes: IBM's research division has successfully produced test chips containing 7nm transistors, potentially paving the way for slimmer, more powerful devices. The advance was made possible by using silicon-germanium instead of pure silicon in key regions of the molecular-scale switches, making transistor switching faster and reducing the chips' power requirements. Most current smartphones use processors containing 14nm technology, with Qualcomm, Nvidia and MediaTek looking for ways to create slimmer chips. However, despite its evident pride, IBM is not saying when the 7nm technology will become commercially available. Also at ComputerWorld and The Register.
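To put the node numbers in rough perspective: if transistor pitch really scaled with the node name, moving from 14nm to 7nm would allow about four times as many transistors in the same area. The sketch below is just that idealized arithmetic, not IBM's figures; real process names are partly marketing and actual density gains are smaller.

```python
# Idealized area scaling if feature pitch tracked the node name exactly.
# Treat the result as an upper bound, not real process data.

def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    """Transistors per unit area scale with the inverse square of feature size."""
    return (old_nm / new_nm) ** 2

print(ideal_density_gain(14, 7))   # 4.0 -> up to ~4x the transistors per unit area
print(ideal_density_gain(14, 10))  # ~2.0
```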


Comments Filter:
  • by suso ( 153703 ) * on Thursday July 09, 2015 @08:49AM (#50075359) Journal

    In 2006 they developed a 350GHz room temperature capable silicon gallium CPU. Where is that?

    • Re: (Score:3, Funny)

      by Anonymous Coward

      The world isn't ready for it.

      Christ, you'd actually be able to run Crysis.

    • Re: (Score:1, Interesting)

      by Anonymous Coward

      By 'developed' you mean, mocked-up-one-for-patent-purposes....

      To deliver 7nm devices you need to solve real-world production problems. But to mock up a basic chip and simulate how it would run if you ever solved those problems, for the purposes of writing a patent, none of that actual work needs to be done.

      The output of this work is a patent, not a working silicon gallium processor. The market there is to hijack profits from any company that actually intends to make silicon gallium processors.

    • by kamapuaa ( 555446 ) on Thursday July 09, 2015 @09:21AM (#50075521) Homepage

      No they didn't. They developed a 350 GHz room temperature transistor.

      • by suso ( 153703 ) * on Thursday July 09, 2015 @09:50AM (#50075693) Journal

        No they didn't. They developed a 350 GHz room temperature transistor.

        According to this article it was a CPU:

        http://www.techspot.com/news/2... [techspot.com]

        Maybe the article is wrong?

        • It wasn't a CPU in the sense that you could actually process things with it and make it the central design element of a computer.

          • by suso ( 153703 ) *

            It wasn't a CPU in the sense that you could actually process things with it and make it the central design element of a computer.

            That's funny, I thought CPU meant Central Processing Unit. Unless you can provide some reference to back up your claim, I'm more inclined to believe the article I referenced. Back in the mid-90s there was a Scientific American article talking about how IBM was trying to use gallium arsenide and other materials to make higher-frequency CPUs. I would think they would already have had a transistor that fast back then, so it makes sense that after 10 years they may have some basic general-purpose CPU. So the timeline...

        • by Art3x ( 973401 )

          In 2006 they developed a 350GHz room temperature capable silicon gallium CPU. Where is that?

          No they didn't. They developed a 350 GHz room temperature transistor.

          According to this article it was a CPU:

          http://www.techspot.com/news/2... [techspot.com]

          Maybe the article is wrong?

          In 2002 [eetimes.com] they developed a 350 GHz silicon-germanium transistor.
          In 2006 [ibm.com] they developed a silicon-germanium processor that reached 350 GHz at room temperature and 500 GHz when supercooled with liquid helium.

    • Theoretically the maximum we tap out at is around 10 THz before raising the speed is just not practically doable. Even at 1 THz you need a clock for every core at a small scale. The big problem is power/heat that comes from capacitors. Since their power draw is theoretically infinite at the first instant and gradually falls, the shorter you run them as switches, the more time they spend at this peak theoretical power and the hotter they get. With non-capacitive transistors, there is no power increase with frequency...
      • "There is no practical reason in any textbook I've found or any chip architecture designer or physicist I've talked to why we can't have processors running at least 500x faster than we currently run them at lower power usage." You need to read different textbooks, and talk with more knowledgeable chip architecture designers and physicists, preferably ones who actually work on CPU design. Not only is capacitance inherent in transistor design, it also impacts the interconnect layers and the substrate to whic
        • Are you being serious? Capacitance is NOT inherent to transistor design. Anybody who has taken even their first circuits lab knows this. I still have printouts from the oscilloscope for the lab in question. With CMOS, the power usage increased exponentially with the frequency we switched it at, and it heated up rapidly, but the BJT had constant power draw all the way from 1 Hz to 100 THz with no change. Of course, beyond 1 THz it couldn't switch fully before switching again, so our readout of the signal was less...
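For reference on the frequency/power question in this sub-thread: the textbook first-order model for CMOS is dynamic switching power P ≈ α·C·V²·f plus a frequency-independent leakage term, i.e. the dynamic part grows linearly with clock frequency. The sketch below uses made-up capacitance, voltage and leakage values purely for illustration; they are not measurements of any real chip.

```python
# First-order CMOS power model: P = alpha * C * V^2 * f + P_leak
# All numbers are illustrative placeholders, not data for any actual processor.

def cmos_power(alpha: float, c_switched: float, vdd: float, freq_hz: float,
               p_leak: float) -> float:
    """Dynamic switching power plus frequency-independent leakage, in watts."""
    return alpha * c_switched * vdd ** 2 * freq_hz + p_leak

for f_ghz in (1, 2, 4, 8):
    p = cmos_power(alpha=0.1, c_switched=100e-9, vdd=1.0,
                   freq_hz=f_ghz * 1e9, p_leak=5.0)
    print(f"{f_ghz} GHz -> {p:.1f} W")  # the dynamic part doubles when frequency doubles
```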
          • by bws111 ( 1216812 )

            Good grief. Bipolar? Are you kidding?

            In the mid-90s (hardly the 'early days of computing'), IBM made the decision to switch from bipolar to CMOS for its mainframes. This was quite painful, as CMOS did not have anywhere near the performance of bipolar, and they risked losing customers. So why did they do it? Because the fastest machine they offered, with 10 CPUs, used 165 kVA and created 475,000 BTU/hr of heat, and the roadmap showed ever-increasing energy usage. By the time CMOS caught up in performance...

              • And back then the fastest processors were running in the low MHz range, and the thermal runaway and leakage were high in large BJT transistors. But when you scale them down much of that thermal runaway disappears, as does the leakage, and, since they are not capacitors in their function, you can run them at speeds 2-3 orders of magnitude faster. I'm not ignorant of the problems of early BJTs, but CMOS cannot be pushed faster due to its reliance on capacitance for switching; that is a fundamental upper limit...
              • by bws111 ( 1216812 )

                No, the fastest processors were certainly not in the 'low MHz' range. As I recall, they ran at about 700MHz. Not as fast as today, but far from low MHz.

                I do not know exactly why BJT is not used, I am not a semiconductor expert. But I do know that there are specialized applications where they mix BJT and CMOS on the same chip, which is way harder than just making a BJT chip, so there must be some reason they do it. And I know that there are very smart people all over the world looking for any way to increase...

        • Now if you have a real reason it won't work aside from angry "if we aren't doing it now then clearly it doesn't work" kind of quackery, I'd love to hear it. Really, I'd hate to have my hopes up for no reason. But I'm not going to roll over at such a stupid post with no real information in it.
        • It was one of the labs we did in this course, so I've seen it first hand: http://classes.engineering.wus... [wustl.edu]
      • I think the problem is the higher the clock the less time the signals have to get to other parts of the chip, so the number of transistors you can have in lockstep gets smaller and smaller for a given size of transistor... so even if everything was switching at 1 thz most of the time they'd be waiting for signals to reach them from other parts of the chip...
        • The big issue is clocks. Right now, at current sizes, propagation is already a big issue around 500 GHz, well before 1 THz. With smaller technologies like 15nm and 7nm the distances get even smaller, so propagation is even less of an issue. With a clock per core in a typical Nvidia chip, you could get up to about 5 THz before propagation delay becomes an overriding factor. And the power efficiency gains would be amazing. Even if we just ran them at 5 GHz to quintuple speed, or even 50 GHz, it would just mean no increase in power usage...
          • Just some back-of-the-envelope calculation, but at light speed at 1 THz a signal could travel 300,000,000 / 1,000,000,000,000 = 0.0003 meters, and i5 processors are about 17x17 mm, so a signal could only get about 2% of the way across the chip in one clock at light speed. I don't know how much slower than light speed they actually move though...
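Filling in that calculation: the distance covered per cycle is just c/f, and real on-chip signals travel well below the speed of light, so these are upper bounds. A quick sketch, using an assumed 17 mm die edge for illustration:

```python
# Distance a signal can cover per clock cycle at the speed of light (an upper bound;
# real on-chip propagation is considerably slower).

C_LIGHT = 3.0e8      # m/s
DIE_EDGE_M = 0.017   # 17 mm, rough desktop CPU die edge (assumed for illustration)

for f_hz in (3e9, 100e9, 1e12):
    dist_m = C_LIGHT / f_hz  # metres per cycle
    print(f"{f_hz / 1e9:6.0f} GHz: {dist_m * 1000:7.3f} mm per cycle "
          f"({100 * dist_m / DIE_EDGE_M:5.1f}% of a 17 mm die edge)")

# At 1 THz a signal covers ~0.3 mm per cycle, under 2% of the die edge, which is
# why a single global clock at such rates is impractical without per-region clocks.
```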
    • Those high-frequency transistors (like the 350 GHz one) are used for communications; it wasn't for a CPU.
    • In 2006 they developed a 350GHz room temperature capable silicon gallium CPU. Where is that?

      Ahem:

      http://arstechnica.com/uncategorized/2006/06/7117-2/

    • by delt0r ( 999393 )
      That was not a "switching" transistor. They are also physically quite big.
    • I was taught that very high speeds came with very short wires, and getting chips faster than about 5 GHz meant that you could not even have external support chips, never mind the constant wait for memory to respond and sitting around idle at the inevitable cache misses if you went faster.

      Hence, the push from Intel, etc., to massive numbers of cores rather than faster cores as transistors shrank farther.

  • It's about yield (Score:4, Insightful)

    by cerberusss ( 660701 ) on Thursday July 09, 2015 @08:52AM (#50075375) Journal

    IBM is not saying when the 7nm technology will become commercially available

    No, because a big hurdle is of course lithography at 7 nanometers, but the even bigger hurdle is using it with a high enough yield to make it commercially viable.
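To make the yield point concrete: a common first-order model is the Poisson yield Y = e^(-D·A), where D is the defect density and A is the die area, so the same defect density hits big dies much harder. The defect densities and die size below are assumptions for illustration, not figures from IBM or any fab.

```python
import math

# Poisson yield model: fraction of good dies = exp(-defect_density * die_area).
# The defect densities and die area are assumed for illustration only.

def poisson_yield(defects_per_cm2: float, die_area_cm2: float) -> float:
    return math.exp(-defects_per_cm2 * die_area_cm2)

die_area = 1.5  # cm^2, roughly a large SoC (assumed)
for d in (0.05, 0.2, 0.5):
    print(f"defect density {d:.2f}/cm^2 -> yield {poisson_yield(d, die_area):.1%}")

# A new node typically starts at the high-defect end of this range and only becomes
# commercially viable once the defect density has been driven down.
```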

    • And no mention of the leakage power. Curious. Smaller transistors have less dynamic power, but higher static power.

    • high enough yield

      You mean they've got to make weapons grade Silicon Galanium?

  • Incorrect... (Score:4, Insightful)

    by Junta ( 36770 ) on Thursday July 09, 2015 @08:55AM (#50075395)

    Most current smartphones use processors containing 14nm technology

    Only a few use 14nm today. It's still relatively scarce.

    Also, a company that no longer had a fab did a proof of concept in a lab. This is not what the headline suggests. It's nice to know that we have a proven hypothetical to get down to 7, but the practical side of things has a tenuous relation to research.

    • Meh ... spelling out IBM by dragging around individual atoms years ago hasn't really turned into much in the way of practical results.

      Nonetheless, IBM does a lot of basic research into things, because some of it will eventually pay off.

      At least someone is doing it.

  • Not sure.. (Score:5, Interesting)

    by Virtucon ( 127420 ) on Thursday July 09, 2015 @08:58AM (#50075417)

    Yeah, because IBM sold their fabs, they don't know when anybody will produce chips based on this 7nm technology. They'll be happy to license it to chip manufacturers, they just won't produce it themselves.

  • Slimmer devices (Score:3, Interesting)

    by hsa ( 598343 ) on Thursday July 09, 2015 @09:16AM (#50075499)
    Most likely not.

    The CPU/GPU is not the bottleneck anymore. The screen and wireless consume more power. The sad truth is, everything else has advanced, but battery technology is still stuck in the last decade.

    • Re: (Score:3, Informative)

      by Anonymous Coward
      I'm sorry, but this is absolutely not true. You think the batteries in your devices are merely smaller because of reduced CPU consumption? The truth of the matter is that battery technology is improving at a constant rate, and the amount of R&D being poured into the industry is impressive. Just because you don't know that it's taking place doesn't mean it's not happening. And seriously, you can confirm this for yourself with some simple google searches.
    • Most likely not. The CPU/GPU is not the bottleneck anymore. The screen and wireless consume more power. The sad truth is, everything else has advanced, but battery technology is still in the last decade.

      This is very true. However, the battery doesn't need to be the bottleneck for significant improvements. At the moment much of the energy is wasted in the screen backlight on LCD devices, and as more displays move towards new technologies like quantum dot or AMOLED this will be reduced significantly. On the wireless side it is a bit harder to get gains, but as smaller mobile cells get built out this will allow for a reduction in power in the situations where most people use their phones.

      With these developments it...
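A rough illustration of why the display and radio dominate the discussion above: battery life is just capacity divided by total draw, so halving the SoC's share moves the total far less than halving the screen's. Every figure below is an assumed placeholder, not a measurement of any real phone.

```python
# Toy power budget for a phone in active use. All numbers are made up for
# illustration; real budgets vary widely with brightness, signal and workload.

BATTERY_WH = 10.0  # roughly a 2700 mAh cell at 3.7 V

budget_w = {"screen": 1.2, "radio": 0.8, "soc": 0.6, "other": 0.4}

def hours(budget: dict) -> float:
    return BATTERY_WH / sum(budget.values())

print(f"baseline:            {hours(budget_w):.1f} h")
print(f"SoC power halved:    {hours(dict(budget_w, soc=budget_w['soc'] / 2)):.1f} h")
print(f"screen power halved: {hours(dict(budget_w, screen=budget_w['screen'] / 2)):.1f} h")
```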

    • Comment removed based on user account deletion
    • Shrinking GPU and CPU die is one of the major reasons battery power has been able to keep up until recently.

    • The chip is a tiny sliver of silicon, it has made no significant contribution to device "thickness" anytime in my 40 years of life.

      It affects power usage, and therefore battery size, and through that, device thickness.

      Going from 14 to 7 is nothing about physical size and everything about electron flows.

      They can't actually make the dies smaller even on the new process because you still have to dissipate the heat it generates. AND bring in hundreds of electrical connections to the package so it can actually co

      • And the problem of heat dissipation still exists. I remember a graph comparing various power densities: the P4 at full bore was putting out more power per unit volume than a nuclear reactor (sub-critical, of course).

  • Pull the Other One (Score:4, Interesting)

    by Jahoda ( 2715225 ) on Thursday July 09, 2015 @10:07AM (#50075783)
    Given the challenges Intel faced with yields at 14nm, and indications they face the same challenges with 10nm, evidenced by that technology being pushed back to 2017, I'm pretty goddamned skeptical that IBM has "beat" anyone to anything. Could I go to an Intel laboratory today and see a proof-of-concept 7nm chip? 5nm? Probably using all manner of interesting silicon replacements? I bet that I could.

    No, as you can see from the market today, this is merely an attempt by IBM to resurrect their flagging stock price (which has worked).
    • You are silly. IBM is showing a working device that proves 7nm is possible using a heavier atom in the alloy, which is a first step.

      Mass production lithography and multipatterning issues are another step that needs to be addressed.

  • If a huge percentage of an entire wafer ends up unusable due to defects, then it doesn't really matter how tiny you've made the transistors, because it'll be too expensive to be marketable. What we've got here is at best a proof-of-concept. At some future date I'm sure the process will be refined to the point where it's mass-producible enough to be practical, profitable, and affordable, but who can say how long that will take?
  • ...but I don't know when it will be commercially available
  • www.crn.com/news/components-peripherals/300077387/why-ibms-breakthrough-7-nanometer-chip-matters-to-partners.htm
  • "7-Elevens. 7 dwarves. 7, man, that's the number. 7 chipmunks twirlin' on a branch, eatin' lots of sunflowers on my uncle's ranch. You know that old children's tale from the sea. It's like you're dreamin' about Gorgonzola cheese when it's clearly Brie time, baby. Step into my office." Sorry. It could not be helped.
