IBM Builds First Graphene Integrated Circuit 77
AffidavitDonda writes "IBM researchers have built the first integrated circuit (IC) based on a graphene transistor. The circuit, built on a wafer of silicon carbide, consists of field-effect transistors made of graphene. The IC also includes metallic structures, such as on-chip inductors and the transistors' sources and drains. The circuit the team built is a broadband radio-frequency mixer, a fundamental component of radios that processes signals by finding the difference between two high-frequency wavelengths."
This is an extremely important accomplishment. (Score:2, Funny)
I don't think that the article goes into enough detail about just how important this accomplishment is. Frankly, this is our only hope going forward. With so much slow software written in languages like Ruby and JavaScript becoming popular, it will again fall back to the hardware guys to really make things fast again. This will probably be the way they'll do it!
Re: (Score:2)
Don't lay it all on the hardware side. We'll continue to see speed gains from interpreters too. Java, for example, was dog slow when it started; its speed has improved significantly since then, and not solely due to processor improvements.
Or was this just an opportunity to troll?
Re: (Score:2)
And imagine if all that effort speeding up these slow languages was actually put to use in writing code in an efficient language to begin with.
Re: (Score:3)
Not that much effort compared to the savings of writing in a portable language like Java.
Re: (Score:3, Funny)
Re:This is an extremely important accomplishment. (Score:5, Interesting)
15 years of optimization from Sun just to bring it within a magnitude of C? That seems quite a bit of effort wasted.
Re: (Score:1)
It's OK. Big enterprise has billions of dollars to waste on using a portable language to process textual data.
Re:This is an extremely important accomplishment. (Score:4, Informative)
portable language like Java.
Right.... One can find a C compiler for pretty much every processor since the 80s. I can point out a number of still widely used architectures that have no JVM.
Re: (Score:3, Insightful)
Things like porting from one UNIX to another UNIX on a different arch - stuff that most armchair programmers view as "just recompile" - can take hundreds of man-hours or more on complex codebases. C is not portable.
Re: (Score:2)
What Lunix Nutcase meant was that you can write C code for practically any processor out there. The same can't be said for Java. And yes, C is portable; all you have to do is use #ifdef directives to select the right code for each architecture.
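As a minimal sketch of that #ifdef style (the function name and structure here are mine, not from any particular codebase): a compiler-specific fast path, with portable C everywhere else.

```c
/* Hypothetical example of compiler/architecture-conditional C:
   use a builtin where available, a portable fallback elsewhere. */
#include <stdint.h>

static int count_bits(uint32_t x)
{
#if defined(__GNUC__) || defined(__clang__)
    /* GCC/Clang builtin; may compile down to a single POPCNT */
    return __builtin_popcount(x);
#else
    /* Portable fallback: Kernighan's loop clears one set bit per pass */
    int n = 0;
    while (x) {
        x &= x - 1;
        n++;
    }
    return n;
#endif
}
```

Both branches compute the same result, which is the whole point: the #ifdef selects an implementation, never a behavior.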
Re:This is an extremely important accomplishment. (Score:4, Insightful)
While it is certainly possible to write C code that is endian-dependent, I consider such code to be broken--- as would any sane, professional C programmer.
To wit, the eleventy-million lines of C code in the Linux kernel are fully endian-agnostic. And largely independent of integer representation size, too!
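A sketch of what "endian-agnostic" means in practice (the function name is mine, not the kernel's): decode multi-byte fields with shifts rather than pointer casts, so the host byte order never enters into it.

```c
/* Read a little-endian 32-bit value from a byte buffer.
   Works identically on big- and little-endian hosts because it
   never reinterprets the buffer as a wider integer. */
#include <stdint.h>

static uint32_t load_le32(const uint8_t *p)
{
    return (uint32_t)p[0]
         | ((uint32_t)p[1] << 8)
         | ((uint32_t)p[2] << 16)
         | ((uint32_t)p[3] << 24);
}
```

The broken alternative, `*(uint32_t *)p`, gives different answers on different architectures (and may fault outright on alignment-strict ones).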
Re: (Score:2)
Now, obviously, J2ME is not J2SE, but it's all that was available on those platforms.
Re: (Score:1)
Unless you are coding device drivers or the like (in which case the JVM wouldn't be very helpful anyway), the endianness shouldn't affect your code at all. If it does, you've done something wrong.
Re: (Score:3)
Re: (Score:2)
Re:This is an extremely important accomplishment. (Score:4, Interesting)
This does not explain the slow languages that are difficult to use, but it does explain why assembly has fallen from favor, and why C is in decline.
Re: (Score:3)
and why C is in decline.
Hahaha, lol wut? According to the TIOBE index, much beloved by Java programmers, C is in 2nd place among the most popular languages.
Re: (Score:1)
Re:This is an extremely important accomplishment. (Score:5, Insightful)
Remember when it used to be first, by a huge margin?
C is still more widely used by a huge margin. Just because you "enterprise" developers don't use it (despite the infrastructure of your managed languages being written in C and C++) doesn't change that.
Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.
Yeah right. What are you going to write your kernels in? What are you going to use for those millions if not billions of microcontrollers that will still be in use that can't run a JVM? What exactly are you going to write your VMs and interpreters in? Right, they will have to be written in C or C++ and assembly.
Re: (Score:2)
Remember when it used to be first, by a huge margin?
C is still more widely used by a huge margin. Just because you "enterprise" developers don't use it (despite the infrastructure of your managed languages being written in C and C++) doesn't change that.
Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.
Yeah right. What are you going to write your kernels in? What are you going to use for those millions if not billions of microcontrollers that will still be in use that can't run a JVM? What exactly are you going to write your VMs and interpreters in? Right, they will have to be written in C or C++ and assembly.
Add to that C99 (the basis of OpenCL's kernel language) and the latest C++ standard: with everything you mentioned, OpenCL will only expand.
Re:This is an extremely important accomplishment. (Score:5, Informative)
Remember when it used to be first, by a huge margin? It's not dead by any means, and still a very active language, but it's not taught as much anymore. Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.
... and kernels, and drivers, and embedded applications, and core libraries, and runtimes, too, unless those go away.
C is a fantastic language that very effectively performs a much-needed role in software development: to provide a lightweight, usable, and readable language while retaining (most of) the capabilities of machine code. C is intended to interface directly with the system, or closely with the operating system.
C is in decline because many modern programming challenges don't benefit from working on the level of machine code or operating system, nor should they. If I want to write a game, I want to focus on the game design and mechanics, not bit blitting pixels onto a buffer. Libraries, interfaces, and abstraction levels are all things higher-level languages leverage to constrain the perspective and duty of the developer to the most productive (and, oftentimes, interesting) areas.
Also, let's not forget that in the common non-kernel case, most of the reason C is even usable is because C, itself, leverages a massive host of support libraries and a not-so-lightweight runtime.
Re: (Score:3)
C appears to be in decline only because of the explosive growth in the number of applications produced in higher-level languages. Total annual C output is increasing year-on-year--- mostly to implement systems that themselves support the aforementioned applications.
Put another way, none of the growth in Java, C#, Python, Ruby, etc. etc. etc. would be possible without growth in C output as well. C won't ever go away, because every higher-level language in existence depends on it.
So you can have your Java.
Re: (Score:2)
Absolutely! C is totally a part of Ruby. We'd be suffering without it.
If something is too slow, we write it as an extension... in C.
In fact, the ease of integrating with C is one of the biggest advantages in Ruby over Perl, where you need a whole additional language (XS) to glue them together. In Ruby all we need is a lightweight C API and we're there.
Re: (Score:1)
Re: (Score:3)
No. FORTRAN was replaced. You can do anything you could do in FORTRAN in more modern languages (like C, for instance). However, you cannot write operating systems in Java. I don't think there's any replacement on the near horizon that fills C's low-level niche.
I can see the use of C and C++ in most applications decreasing, although not when speed or performance is more critical than the expense of extra labor.
Re: (Score:2)
Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps. No. FORTRAN was replaced. You can do anything you could do in FORTRAN in more modern languages (like C, for instance). However, you cannot write operating systems in Java. I don't think there's any replacement on the near horizon that fills C's low-level niche. I can see the use of C and C++ in most applications decreasing, although not when speed or performance is more critical than the expense of extra labor.
FORTRAN is making a comeback, especially given its phenomenal uses in the Applied Sciences needing Numerical Analysis [yes, C is its bride in this area]; FORTRAN was designed from the ground up for such work.
Re: (Score:2)
Re: (Score:1)
So we start with a CPU that processes native Java bytecode, then build a kernel, then userspace, then ...
and we shall call it Javux ...
Re: (Score:2)
Re: (Score:2)
Legacy apps like the JVM?
Re: (Score:1)
Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.
You're making a fool of yourself. New codes are still written in Fortran, just not in your niche. The Fortran standard is still evolving (we've had Fortran 2003, Fortran 2008 recently). Massively parallel (think thousands of cores) high-performance computing ("number crunching") scientific programs are often written in Fortran. This is because the language is rather simple and hence compilers can be heavily optimized -- often *beyond* what is possible in C (e.g. the compiler can use the fact that arguments
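The comment is cut off, but the usual version of this argument is about aliasing: a Fortran compiler may assume dummy arguments don't overlap, while C only gets the same guarantee when the programmer promises it with C99 `restrict`. A sketch under that reading (the function name is mine):

```c
/* C99 'restrict' makes the Fortran-style no-aliasing promise explicit,
   letting the compiler vectorize or keep loads cached across iterations. */
void axpy(int n, double alpha,
          const double *restrict x, double *restrict y)
{
    for (int i = 0; i < n; i++)
        y[i] += alpha * x[i];  /* caller promises x and y don't overlap */
}
```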
Re:This is an extremely important accomplishment. (Score:5, Interesting)
Programmer time is more expensive than hardware time. If a less efficient language is easier to use, it makes business sense to use it to save money. This does not explain the slow languages that are difficult to use, but it does explain why assembly has fallen from favor, and why C is in decline.
I honestly hate this idea. You only have to write a program once. Most programs run thousands of times, and some will run millions or billions of times. If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly more than it would've been to write it properly to begin with.
Re: (Score:2)
I honestly hate this idea. You only have to write a program once. Most programs run thousands of times, and some will run millions or billions of times. If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly more than it would've been to write it properly to begin with.
If you're developing OpenOffice or MySQL, perhaps. I have many scripts and procedures at work that run once a day or once a month on a centralized system for many users; the carbon footprint of that is probably smaller than the first user who drove to the office. And right now I'm doing a migration that's only going to be done once: things that are wasteful but make no sense optimizing.
If we assume you get less done with a more "to the metal" language, you also have to consider the cost of what we wouldn't
Re: (Score:3)
Programmer time is more expensive than hardware time. If a less efficient language is easier to use, it makes business sense to use it to save money.
I honestly hate this idea. You only have to write a program once.
If you made a mistake, and the language you're using makes it difficult to track down that mistake, you may write that program many times over.
If you have a hotshot programmer on your team who thinks he's more clever than he is, no matter what language you're using, you may have to rewrite it again, or parts of it. The easier that is, the less of a loss it was to have made that hiring decision.
If your language of choice does not support certain features natively (concurrency, garbage collection, variable t
Re: (Score:2)
On the other hand would you rather have an application in which the developer spent most of their time writing code or adding features?
Sometimes performance is a feature but on most modern systems for most applications performance is secondary to functionality.
I write most of my tools in an abstracted scripting language instead of C++ SDKs. Why? Because I can crank out a tool in about an hour. Writing a C++ SDK would take several days.
I'm less interested in how fast it is than what it can do. JIT scripti
Re: (Score:2)
Most programs run thousands of times, and some will run millions or billions of times. If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly more than it would've been to write it properly to begin with.
Yeah, but that's a cost to the user who pays for the hardware, not to the company that writes the software.
Re: (Score:2)
If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly more than it would've been to write it properly to begin with.
- but there is another side to this coin. What if writing an application in a higher-level language is not only simpler than in a lower-level language, due to a greater variety of specialized libraries, but also produces fewer errors per some unit of code (per line, per transaction, per function, whatever)?
If this is true (and I do think it is, after working in this industry since 1995), then besides having a possibly slower program, you also end up with a program that is possibly more correct
Re: (Score:2)
But efficient in man hours? I prefer C++/Qt and without Qt on top I'd probably drop C++ entirely. Even with Qt there are many things I miss or think work illogically that are as they are because C++ was designed in 1979 as an extension to C designed in 1973 and C++0x only adds toppings. The standard library is completely barebones compared to what you'd expect from Java, C# or any other modern language in 2011.
Honestly so many defaults should have been switched for sanity, for example I'd make all numbers d
Re: (Score:2)
You assembler snobs make me sick.
Re: (Score:2)
Trolling against whom? I thought it was obvious that these languages were slower than, say, C#, C++ or Java. They make up for it in terms of flexibility (dynamic code generation) and ease of use, but I thought that was also taken for granted.
Re: (Score:2)
It's certainly useful for analog systems.
Now we just need something with that kind of electron mobility that still has a band gap so it can be shut off for digital.
Re:This is an extremely important accomplishment. (Score:4, Interesting)
Re: (Score:3)
I don't think that the article goes into enough detail about just how important this accomplishment is. Frankly, this is our only hope going forward. With so much slow software written in languages like Ruby and JavaScript becoming popular, it will again fall back to the hardware guys to really make things fast again. This will probably be the way they'll do it!
While I agree with your statement that this is likely incredibly important, your concept of the state of software is absurd.
Non-specialized (e.g., consumer-grade) software platform choices - language, compiler, interpreter, execution environment, and operating system - are made largely based on the current hardware status quo of the typical software user. If hardware (CPU, GPU, network, etc.) continues to get faster, software will be written to complement that hardware. The second hardware becomes a limitation, software will back off, trim down, and optimize.
Re: (Score:1)
The second hardware becomes a limitation, software will back off, trim down, and optimize. While the factors behind it are too numerous to fully detail in a post, the key idea here is that waste and bloat can often be the consequences of a tradeoff for functionality, stability, speed, and development time to the net benefit of the consumer.
lol -- meanwhile back in the real world...
Re: (Score:1)
Gigahertz are useless... (Score:2)
...without efficient static memory, mostly because of the CPU-memory gap. A faster CPU would require the memory and the bus to keep up at a similar frequency. That's already a problem, and even if it were possible, it would lead to increased power consumption with dynamic RAM, and frankly, I think that's the last thing we need.
So faster CPUs will only be a viable alternative when we manage to get something like those memristors they keep talking about. Until then, it's larger caches and higher-frequen
Re: (Score:2)
Re: (Score:2)
Re:Gigahertz are useless... (Score:5, Interesting)
Wow, you're full of shit. Graphene isn't useful for digital circuits at all (at least yet) because it has crappy on-off ratios, but GHz are most definitely useful for radio work.
Remember how everybody's using their mobiles for everything these days, and streaming video keeps getting more popular and higher bitrate? Well, when you run out of spectrum below 5GHz (where all mobile networks currently operate), getting up into the 10-100GHz range is extremely useful to provide that extra bandwidth.
Since nobody but actual electrical engineers seems to know anything about radio anymore (used to be a common hobby for geeks, but I guess it's not "cool" anymore), let me explain one application of a mixer like the one described in TFA. You can use it to make a transverter, which takes a signal from your UHF radio (maybe a mobile phone, wifi card, whatever), and kicks it up to ~10GHz for transmission. And flips received signals back down to the original ~2GHz band.
Not impressed? Sure, 10GHz isn't much, we can easily beat that already -- it's only a prototype. But it's quite likely we'll have 100GHz-1THz in the lab inside a year, and on the market in ~5. There's a whole lot of bandwidth north of 30GHz, and (as long as you stay out of absorption bands like O2 at 60GHz, which limit you to short-range stuff like wifi/bluetooth replacements), it's eminently usable for urban cellular networks -- if you have the ICs to handle it.
Re: (Score:1)
It's a diode! (Score:4, Informative)
The circuit the team built is a broadband radio-frequency mixer, a fundamental component of radios that processes signals by finding the difference between two high-frequency wavelengths.
Was that written by someone paid directly by the IEEE? "Two high-frequency wavelengths?"
The device is a nonlinear summing element. In other words, it has a transfer function of the form y = Sum(a_n * x^n), with nonzero coefficients for integer n from zero up to at least 2. A very common example is a diode. But it could also be a transistor in the saturation region, or something more esoteric.
Due to the nonzero second-order transfer function coefficient, the device provides not only the superposed sum of the two signals at their original frequencies, but also components at the sum and difference of the two input frequencies. Add filters to throw away the parts you don't want, and you can make a modulator, a frequency upconverter, or a downconverter... all of these are used every day inside things you probably have in your pocket or purse, from cellphones to car stereos, television receivers to communications satellites.
But basically, it does the same thing a diode does... just faster.
Re: (Score:2, Funny)
all of these are used every day inside things you probably have in your pocket or purse, from cellphones to car stereos, television receivers to communications satellites.
Lesse:
1) cellphone: yup!
2) car stereo: well, I don't carry a purse, but if I did, I guess so.
3) television receiver: yup, though an interesting one would probably warrant a purse in place of a pocket.
4) communications satellite: erm, here's where I run into the problem of one fitting in my pocket, or a purse.
So, is this "suggest three plau
Re: (Score:2)
I think you're being unfair to the writer. "Finding the difference between two high-frequency wavelengths" is an accurate* description of a downconverter, which is a likely use for this device. And saying it's no different from a faster diode is like saying a GaAs transistor is just like a faster vacuum tube.
* Okay, technically, using the word "wavelength" to describe the signals is a bit ...off. But it's close enough.
Re: (Score:3)
They happen to be using it as a mixer, but the article clearly says that it's a FET (which certainly qualifies as a non-linear device). It might not be suitable for digital logic yet, but it is a transistor I believe. Also, 10 GHz for a proof-of-concept is damn fast.
More than binary? Wave propagation and harmonics. (Score:1)
PMMA (Score:1)