IBM Beats The Rest of the World To 7nm Chips, But You'll Need to Wait For Them
Mickeycaskill writes: IBM's research division has successfully produced test chips containing 7nm transistors, potentially paving the way for slimmer, more powerful devices. The advance was made possible by using silicon-germanium instead of pure silicon in key regions of the molecular-size switches, making transistor switching faster and meaning the chips need less power. Most current smartphones use processors containing 14nm technology, with Qualcomm, Nvidia and MediaTek looking for ways to create slimmer chips. However, despite its evident pride, IBM is not saying when the 7nm technology will become commercially available. Also at ComputerWorld and The Register.
Re: (Score:1)
Yes they did. They beat everyone in the technology, but they have not yet beaten anyone for actual commercial products.
Where is our 350GHz room temp CPU? (Score:4, Insightful)
In 2006 they developed a 350GHz room temperature capable silicon gallium CPU. Where is that?
Re: (Score:3, Funny)
The world isn't ready for it.
Christ, you'd actually be able to run Crysis.
Re: (Score:1, Interesting)
By 'developed' you mean mocked-up-one-for-patent-purposes...
To deliver 7nm devices you need to solve real-world production problems. But to mock up a basic chip and simulate how it would run if you ever solved those problems, for the purposes of writing a patent, none of that actual work needs to be done.
The output of this work is a patent, not a working silicon gallium processor. The market there is to hijack profits from any company that actually intends to make silicon gallium processors.
Re:Where is our 350GHz room temp CPU? (Score:4, Insightful)
No they didn't. They developed a 350 GHz room temperature transistor.
Re:Where is our 350GHz room temp CPU? (Score:4, Informative)
No they didn't. They developed a 350 GHz room temperature transistor.
According to this article it was a CPU:
http://www.techspot.com/news/2... [techspot.com]
Maybe the article is wrong?
Re: (Score:2)
It wasn't a CPU in the sense that you could actually process things with it and make it the central design element of a computer.
Re: (Score:2)
It wasn't a CPU in the sense that you could actually process things with it and make it the central design element of a computer.
That's funny, I thought CPU meant Central Processing Unit. Unless you can provide some reference to back up your claim, I'm more inclined to believe the article I referenced. Back in the mid 90s there was a Scientific American article about how IBM was trying to use gallium arsenide and other materials to make higher-frequency CPUs. I would think they already had a transistor that fast back then, so it makes sense that after 10 years they may have some basic general-purpose CPU. So the timeline
Re: (Score:1)
Re: (Score:2)
The article is titled:
"IBM produces 500GHz silicon-germanium CPU"
Re: (Score:3)
In 2006 they developed a 350GHz room temperature capable silicon gallium CPU. Where is that?
No they didn't. They developed a 350 GHz room temperature transistor.
According to this article it was a CPU:
http://www.techspot.com/news/2... [techspot.com]
Maybe the article is wrong?
In 2002 [eetimes.com] they developed a 350 GHz silicon-germanium transistor.
In 2006 [ibm.com] they developed a silicon-germanium processor that reached 350 GHz at room temperature and 500 GHz when supercooled with liquid helium.
Re: (Score:2)
Re: (Score:3)
Re: (Score:1)
Re: (Score:2)
Good grief. Bipolar? Are you kidding?
In the mid-90s (hardly the 'early days of computing'), IBM made the decision to switch from bipolar to CMOS for its mainframes. This was quite painful, as CMOS did not have anywhere near the performance of bipolar, and they risked losing customers. So why did they do it? Because the fastest machine they offered, with 10 CPUs, used 165KVA and created 475,000 BTUs/hr of heat, and the roadmap showed ever-increasing rates of energy usage. By the time CMOS caught up in performance
Re: (Score:1)
Re: (Score:2)
No, the fastest processors were certainly not in the 'low MHz' range. As I recall, they ran at about 700MHz. Not as fast as today, but far from low MHz.
I do not know exactly why BJT is not used; I am not a semiconductor expert. But I do know that there are specialized applications where they mix BJT and CMOS on the same chip, which is way harder than just making a BJT chip, so there must be some reason they do it. And I know that there are very smart people all over the world looking for any way to increase
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
In 2006 they developed a 350GHz room temperature capable silicon gallium CPU. Where is that?
Ahem:
http://arstechnica.com/uncategorized/2006/06/7117-2/
Re: (Score:2)
Re: (Score:1)
I was taught that very high speeds came with very short wires, and that getting chips faster than about 5 GHz meant you could not even have external support chips, never mind the constant wait for memory to respond and sitting around idle at the inevitable cache misses if you went faster.
Hence, the push from Intel, etc., to massive numbers of cores rather than faster cores as transistors shrank farther.
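The short-wires point can be sketched with a back-of-envelope calculation. The propagation speed here is an assumed rule-of-thumb figure (roughly half the speed of light on a copper trace), purely for illustration:

```python
# How far a signal can travel in one clock cycle.
# Assumes signals propagate at ~half the speed of light
# on a copper trace (a common rule of thumb, not measured).
C = 3.0e8          # speed of light, m/s
PROP = 0.5 * C     # assumed trace propagation speed, m/s

def reach_per_cycle(freq_hz: float) -> float:
    """Distance (in cm) a signal covers in one clock period."""
    return PROP / freq_hz * 100

print(f"5 GHz:   {reach_per_cycle(5e9):.1f} cm per cycle")    # ~3 cm
print(f"350 GHz: {reach_per_cycle(350e9):.3f} cm per cycle")  # well under a mm
```

At 5 GHz a signal barely crosses the motherboard in a cycle, which is why off-chip anything becomes the bottleneck.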
It's about yield (Score:4, Insightful)
IBM is not saying when the 7nm technology will become commercially available
No, because a big hurdle is of course lithography at 7 nm, but the even bigger hurdle is achieving a high enough yield to make it commercially viable.
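To illustrate why yield dominates: under the classic Poisson yield model, the fraction of good dies falls off exponentially with defect density times die area. The defect densities below are made-up values just to show the shape of the curve:

```python
import math

def poisson_yield(defect_density: float, die_area_cm2: float) -> float:
    """Classic Poisson yield model: Y = exp(-D * A),
    where D is defects per cm^2 and A is die area in cm^2."""
    return math.exp(-defect_density * die_area_cm2)

die = 1.0  # cm^2, hypothetical die size
for d in (0.1, 0.5, 1.0, 2.0):  # hypothetical defect densities
    print(f"D={d} defects/cm^2: yield {poisson_yield(d, die):.0%}")
```

A new process node typically starts at the high-defect-density end of that curve, which is exactly why 'works in the lab' and 'commercially viable' are years apart.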
Re: (Score:2)
What kind of imbecile actually believes that the z12 has the performance of a 1990s pentium? The kind that thinks a mainframe MIPS has ANY relation at all to an Intel MIPS.
Re: (Score:2)
Any evidence at all for that no benchmarking claim? Thought not.
AS400s are not mainframes and never have been. You seem to know very little about a subject you are talking about.
Re: (Score:2)
The fact that you could even write that shows just how clueless you are. MIPS has not actually meant Millions of Instructions Per Second for about 3 decades.
Even if we assume it is a meaningful count of instructions, what instructions are they? Are they register-register adds, decimal floating-point divides, a move of 100MB of data, encryption of a block of data? All of those can be done in a single instruction on Z, yet they all require different amounts of time to execute.
And even if you can choose the instr
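To make the point concrete: two workloads with the exact same instruction count can give wildly different MIPS figures. The per-instruction timings below are made-up numbers, purely for illustration:

```python
# Hypothetical per-instruction costs in nanoseconds (illustrative only).
cost_ns = {"reg_add": 0.3, "dfp_divide": 15.0, "bulk_move": 50_000.0}

mix_a = {"reg_add": 1_000_000}                          # a million cheap adds
mix_b = {"dfp_divide": 500_000, "bulk_move": 500_000}   # a million heavy ops

def mips(mix: dict) -> float:
    """Instruction count divided by execution time, in millions/sec."""
    instrs = sum(mix.values())
    time_s = sum(n * cost_ns[op] for op, n in mix.items()) / 1e9
    return instrs / time_s / 1e6

print(f"mix A: {mips(mix_a):,.0f} MIPS")
print(f"mix B: {mips(mix_b):.3f} MIPS")
```

Same one million instructions, MIPS figures several orders of magnitude apart, which is why comparing a mainframe MIPS to an Intel MIPS tells you nothing.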
Re: (Score:2)
Cost of germanium? Per chip? (Score:2)
Sure, germanium is rarer and more expensive than silicon. And given the mass of germanium required to add the necessary layer to a single chip, that might raise the per-unit material cost by multiple cents -- but probably not.
SiGe devices are more expensive because they're harder to make and (so far) don't enjoy the same economies of scale as silicon. As I understand it, material cost, especially raw material cost, is a vanishingly small contributor.
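A quick sanity check on the "multiple cents, but probably not" claim. Every figure here (price, layer thickness, alloy fraction) is a rough assumption for illustration, not industry data:

```python
# Back-of-envelope germanium cost per die, all numbers assumed.
ge_price_per_kg = 1500.0    # USD, assumed order of magnitude
layer_thickness_nm = 50.0   # assumed SiGe layer thickness
ge_fraction = 0.3           # assumed Ge content of the alloy
die_area_cm2 = 1.0          # assumed die size
ge_density = 5.32           # g/cm^3 (germanium)

volume_cm3 = die_area_cm2 * layer_thickness_nm * 1e-7  # nm -> cm
ge_grams = volume_cm3 * ge_density * ge_fraction
cost_usd = ge_grams / 1000.0 * ge_price_per_kg
print(f"Ge per die: ~{ge_grams * 1e6:.1f} micrograms, ~${cost_usd:.6f}")
```

Even with generous assumptions the germanium itself comes out to a tiny fraction of a cent per die, supporting the point that processing cost, not material cost, is what makes SiGe expensive.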
Re:It's about yield -- and leakage (Score:2)
And no mention on the leakage power. Curious. Smaller transistors have less dynamic power, but higher static power.
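The dynamic-vs-static trade-off can be sketched with the standard first-order relations: dynamic power scales as C·V²·f, while static power is leakage current times voltage, and leakage tends to rise as gates shrink. All the numbers below are illustrative assumptions:

```python
# First-order power model; all parameter values are made up.
def dynamic_power(cap_f: float, volts: float, freq_hz: float) -> float:
    """Switching power: P = C * V^2 * f."""
    return cap_f * volts ** 2 * freq_hz

def static_power(leak_amps: float, volts: float) -> float:
    """Leakage power: P = I_leak * V."""
    return leak_amps * volts

# A smaller node: lower capacitance and voltage cut dynamic power...
p_dyn_old = dynamic_power(1.0e-9, 1.0, 2e9)   # hypothetical old node
p_dyn_new = dynamic_power(0.5e-9, 0.8, 2e9)   # hypothetical new node
# ...but thinner gates leak more, raising static power.
p_sta_old = static_power(0.1, 1.0)
p_sta_new = static_power(0.3, 0.8)
print(f"dynamic: {p_dyn_old:.2f} W -> {p_dyn_new:.2f} W")
print(f"static:  {p_sta_old:.2f} W -> {p_sta_new:.2f} W")
```

So a shrink can cut switching power while total idle power actually goes up, which is why the summary's silence on leakage is worth noticing.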
Re: (Score:2)
high enough yield
You mean they've got to make weapons-grade Silicon Gallium?
Incorrect... (Score:4, Insightful)
Most current smartphones use processors containing 14nm technology
Only a few use 14nm today. It's still relatively scarce.
Also, a company that no longer had a fab did a proof of concept in a lab. This is not what the headline suggests. It's nice to know that we have a proven hypothetical to get down to 7, but the practical side of things has a tenuous relation to research.
Re: (Score:3)
Meh ... the work from years ago when they spelled out "IBM" by dragging around individual atoms hasn't really turned into much in the way of practical applications.
Nonetheless, IBM does a lot of basic research into things, because some of it will eventually pay off.
At least someone is doing it.
Not sure.. (Score:5, Interesting)
Yeah, because IBM sold their fab, they don't know when anybody will produce chips based on this 7nm technology. They'll be happy to license it to chip manufacturers; they just won't produce it themselves.
Re: (Score:2)
My guess is 2-3 years from now is when we start to see this tech. 10nm is just coming online with most of the 'big guys' having a bit of trouble with it. 10nm was basically 'labware' 2-3 years ago.
On the roadmap [wikipedia.org], it's predicted to be 2017-2018.
As for me, I really hope we get another clock-speed doubling or two before we get to 5nm. Not sure if it'll happen, though.
Re: (Score:2)
I do ignore them, because GF isn't going to build something without a market or orders standing out the door waiting to be filled. I can see this as a boon to mobile first, so yes, maybe Samsung, but they're no slouches either and may just do their own 7nm process. The bet for IBM is that they can get somebody to license it and turn it into hard bottom-line profit. Either that, or in a year Ginni will axe the chip R&D group because they're not "profitable"
Re: (Score:2)
True, yeah it was pretty much "here, take it. just take it away" but now they come up with new technology and they'll either have to partner or license to get it to market. It's worked out so well for AMD after all.
Slimmer devices (Score:3, Interesting)
The CPU/GPU is not the bottleneck anymore. The screen and wireless consume more power. The sad truth is, everything else has advanced, but battery technology is still stuck in the last decade.
Re: (Score:3)
Thinness is mostly a by-product of being able to cram more circuits in a smaller space which cuts
Re: (Score:3, Informative)
Re: (Score:2)
Re: (Score:3)
The results are absolutely there. Your phone today has a bigger battery with smarter management technology than last year's model. But your phone is also faster, more powerful, and doing more stuff.
You can get a phone that lasts for days. But not one running iOS 8 or Android 5.
Re: (Score:2)
I'd rather have a nice beefy tablet over something thinner. A beefy laptop over literally the worst piece of crap ever, MacBook Air. Fuck you Apple, you don't get to decide optical is dead. Or Flash. Or anything.
I get a lot of use out of my Macbook Air. I don't need optical, I have an SDXC slot and thumb drives. If I really want an optical drive I have a USB one in my bag. Apple hasn't said that Flash is dead. It works fine on a Macbook Air. They just don't support it on iOS.
In the end, Apple tech works great for a lot of people. If you don't like it, don't buy it. Personally, I actually LIKE having a commercially supported desktop/mobile UNIX system with a standard suite of off-the-shelf familiar desktop p
Re: (Score:2)
Most likely not. The CPU/GPU is not the bottleneck anymore. The screen and wireless consume more power. The sad truth is, everything else has advanced, but battery technology is still in the last decade.
This is very true. However, the battery doesn't need to be the bottleneck for significant improvements. At the moment much of the energy is wasted in the screen backlight on LCD devices, and as more displays move towards new technologies like quantum dot or AMOLED this will be reduced significantly. On the wireless side it is a bit harder to get gains, but as smaller mobile cells get built out, this will allow for a reduction in power in the situations where most people use their phones.
With these developments it
Re: (Score:2)
Re: (Score:2)
Shrinking GPU and CPU die is one of the major reasons battery power has been able to keep up until recently.
Re: (Score:2)
The chip is a tiny sliver of silicon, it has made no significant contribution to device "thickness" anytime in my 40 years of life.
It affects power usage, and therefore battery size, and through that, device thickness.
Going from 14 to 7 is nothing about physical size and everything about electron flows.
They can't actually make the dies smaller even on the new process because you still have to dissipate the heat it generates. AND bring in hundreds of electrical connections to the package so it can actually co
Re: (Score:2)
And the problem of heat dissipation still exists. I remember a graph comparing various materials' energy densities--the P4 at full-bore was putting out more energy per volume than a nuclear reactor (sub-critical, of course).
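The comparison in that graph roughly checks out with a back-of-envelope calculation. The figures below (chip power, die dimensions, reactor power density) are approximate assumptions, order-of-magnitude only:

```python
# Rough power-density comparison, all figures approximate assumptions.
p4_watts = 100.0          # assumed Pentium 4 power at full load
die_area_cm2 = 1.5        # assumed die area
die_thickness_cm = 0.05   # assumed die thickness (~0.5 mm)

p4_density = p4_watts / (die_area_cm2 * die_thickness_cm)  # W/cm^3
reactor_core_density = 100.0  # rough PWR core figure, W/cm^3

print(f"P4: ~{p4_density:.0f} W/cm^3 vs reactor core ~{reactor_core_density:.0f} W/cm^3")
```

The chip wins by an order of magnitude on volumetric power density; the reactor just has vastly more volume.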
Pull the Other One (Score:4, Interesting)
No, as you can see from the market today, this is merely an attempt by IBM to resurrect their flagging stock price (which has worked).
Re: (Score:3)
You are silly. IBM is showing a working device that proves 7nm is possible by using a heavier atom in the alloy, which is a first step.
Mass-production lithography and multi-patterning issues are another step that needs to be addressed.
'Possible' != 'Practical' (Score:2)
I beat the world to making a time machine... (Score:1)
Also covered on CRN (Score:1)
7's the key number here. Think about it. (Score:2)