Graphene Won't Replace Silicon In CPUs, Says IBM
arcticstoat writes "IBM has revealed that graphene can't fully replace silicon inside CPUs, as a graphene transistor can't actually be completely switched off. In an interview, Yu-Ming Lin from IBM Research (Nanometer Scale Science and Technology) explained that 'graphene as it is will not replace the role of silicon in the digital computing regime.' Last year, IBM demonstrated a graphene transistor running at 100GHz, while researchers at UCLA produced a graphene transistor with a cut-off frequency of 300GHz, prompting predictions of silicon marching towards its demise, making way for a graphene-based future with 1THz CPUs. However, Lin says, 'there is an important distinction between the graphene transistors that we demonstrated and the transistors used in a CPU. Unlike silicon, graphene does not have an energy gap, and therefore, graphene cannot be "switched off," resulting in a small on/off ratio.' That said, Lin also pointed out that graphene 'may complement silicon in the form of a hybrid circuit to enrich the functionality of computer chips.' He gives the example of RF circuits, which aren't dependent on a large on/off ratio."
Re:Whoah (Score:1, Informative)
> 100GHz? 300GHz? 1-fucking-THz?
> I started drooling just a bit. Talk about a jump in speed.
The US government has had 100 GHz graphene transistors for around 20 years. It is only "new" to the civilian world, which can't afford the multi-million dollar cooling system that goes along with it.
But still, yeah, damn cool. :-)
Re:Congratulations (Score:5, Informative)
Graphene actually can be made to have a bandgap, so this problem may be only a temporary one.
Here's a paper which discusses graphene's band gap: Direct observation of a widely tunable bandgap in bilayer graphene [nature.com]
Here's a free article discussing this: Tunable Graphene Bandgap Opens The Way To Nanoelectronics And Nanophotonics [sciencedaily.com]
In fact, here's an article about IBM doing research on this very topic: IBM opens bandgap for graphene [eetimes.com]
Note that the article discussing graphene with a bandgap is dated after the article linked in the Slashdot summary claiming that graphene can't have a bandgap. Sounds like the authors of the articles need to talk to some more people and get their facts ironed out.
Ever hear of leakage? (Score:5, Informative)
Most of the freaking chip is cache. Have a look at the floorplan sometime.
Intel engineers sometimes joke that they're the biggest memory vendor that nobody heard of.
The fundamental problem with "doesn't turn off" is that leakage current (IDS(OFF)) is already a major component of chip dissipation, even when we use all sorts of tricks to reduce it. With graphene, that goes from "problem" to "useless." Except for analog, where the transistors don't turn off anyway.
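To put rough numbers on why off-state leakage matters, here's a back-of-the-envelope sketch. All figures here are illustrative assumptions (a made-up supply voltage, per-transistor off-current, and transistor count), not measurements of any real process:

```python
# Static dissipation = supply voltage * total off-state leakage current.
# Every number below is an assumption for illustration only.

def static_power_watts(v_dd, i_off_per_fet, n_fets):
    """Rough static power burned by n_fets transistors that never fully turn off."""
    return v_dd * i_off_per_fet * n_fets

N = 1e9   # assume a billion transistors on the chip
V = 1.0   # assume a 1 V supply

silicon  = static_power_watts(V, 1e-9, N)  # assume ~1 nA off-current per FET
graphene = static_power_watts(V, 1e-6, N)  # assume ~1000x worse off-current

print(f"silicon-ish leakage:  {silicon:.1f} W")   # ~1 W: a "problem"
print(f"graphene-ish leakage: {graphene:.1f} W")  # ~1000 W: "useless"
```

Even with generous assumptions, an off-current three orders of magnitude higher turns a manageable static-power problem into something no cooling solution can handle.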
Re:SSD application? (Score:5, Informative)
Here's the electronics 101 version: transistors have three terminals. Electrons flow from one (the source/emitter) to another (the drain/collector). How many electrons make it across depends on the mode the transistor is in, and the mode is controlled by the voltage applied to the third terminal (the gate/base). There are three modes: off, linear, and saturation. In saturation, the electrons are flowing as fast as they can, and small changes in the gate voltage don't matter. In linear mode, the current is directly proportional to the gate voltage - this mode is key to analog circuits. When the transistor is off, very little current gets across (on the order of femtoamps). When they say graphene transistors can't be completely turned off, they mean the amount of current that gets through in the off state is much larger than for normal transistors. It can still be "turned off" in the sense that if you take away all of the electricity, it loses its state, so there's no particular reason it would be useful for storage.
As the article notes, a likely use would be in combination with more traditional transistors, wherein you could take advantage of graphene's speed, and then have a silicon "boot" to turn off the circuit when it's not in use by cutting off all of the power to that block.
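For a sense of what "small on/off ratio" means numerically, a toy sketch with made-up but order-of-magnitude-plausible currents (the specific values are assumptions, not device data):

```python
# On/off ratio: on-state drain current divided by off-state (leakage) current.
# The currents below are illustrative assumptions only.

def on_off_ratio(i_on, i_off):
    return i_on / i_off

si = on_off_ratio(1e-4, 1e-13)  # ~100 uA on, femtoamp-scale off
gr = on_off_ratio(1e-4, 1e-6)   # same on-current, but leaky when "off"

print(f"silicon-like on/off ratio:  {si:.0e}")
print(f"graphene-like on/off ratio: {gr:.0e}")
```

A ratio in the hundreds is fine for an RF amplifier, which is why the hybrid-circuit idea in the summary makes sense, but hopeless for a logic gate that's supposed to sit idle without burning power.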
Ya well don't get too excited (Score:5, Informative)
The other problems with graphene aside, you start to run into speed-of-light issues at extremely high frequencies. At high frequencies the wavelength is so short that a signal can't travel across a chip in a single cycle, and that creates some real design issues. For example, at the full speed of light a 5GHz signal has a wavelength of 6cm. OK, not a problem: a core in a CPU is smaller than that, so the signal can travel anywhere in a single clock, even taking into account that wire runs could be longer. At 50GHz, however, you are only talking 6mm. That's a potential problem - current chips are larger than that, never mind the wire runs. Maybe if cores are kept small and simple it is fine, but it is getting problematic. At 1THz you are talking a wavelength of only 300 micrometers.
So even if graphene becomes practical, speeds that high may never make it into CPUs. That a transistor can operate at those speeds doesn't mean a whole CPU can.
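The wavelength figures above are easy to sanity-check with lambda = c / f:

```python
# Free-space wavelength at each clock frequency: lambda = c / f.
C = 3.0e8  # speed of light in m/s (rounded)

for f_hz, label in [(5e9, "5 GHz"), (50e9, "50 GHz"), (1e12, "1 THz")]:
    wavelength_mm = C / f_hz * 1000  # convert metres to millimetres
    print(f"{label}: {wavelength_mm:g} mm")
# 5 GHz -> 60 mm (6 cm), 50 GHz -> 6 mm, 1 THz -> 0.3 mm (300 um)
```

In practice it's even worse than this: signals on chip travel well below the free-space speed of light, so these are optimistic upper bounds on how far a signal can get in one cycle.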
Re:This is a problem? (Score:4, Informative)
We're not talking "a bit high". ECL logic is *insanely* power hungry. If we implemented all of a processor's logic in source-coupled logic, it would consume 10-100 times more power than it does currently.
Re:Ya well don't get too excited (Score:4, Informative)
This is a problem with or without graphene. Chip design usually doesn't have signals travel all the way across the chip. In fact, good place and route engineers will keep signal nodes as close as possible and logic designers will try to keep high-connection nodes to a minimum, separating out logical clusters to be as isolated as possible.