IBM Hardware Technology

IBM Builds First Graphene Integrated Circuit

AffidavitDonda writes "IBM researchers have built the first integrated circuit (IC) based on a graphene transistor. The circuit, built on a wafer of silicon carbide, consists of field-effect transistors made of graphene. The IC also includes metallic structures, such as on-chip inductors and the transistors' sources and drains. The circuit the team built is a broadband radio-frequency mixer, a fundamental component of radios that processes signals by finding the difference between two high-frequency wavelengths."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward

    I don't think that the article goes into enough detail about just how important this accomplishment is. Frankly, this is our only hope going forward. With so much slow software written in languages like Ruby and JavaScript becoming popular, it will again fall back to the hardware guys to really make things fast again. This will probably be the way they'll do it!

    • Don't lay it all on the hardware side. We'll continue to see speed gains on the interpreter side too. Java, for example, was dog slow when it started; its speed has improved significantly since then, and not solely due to processor improvements.

      Or was this just an opportunity to troll?

      • And imagine if all that effort speeding up these slow languages was actually put to use in writing code in an efficient language to begin with.

        • Not that much effort compared to the savings of writing in a portable language like Java.

          • Re: (Score:3, Funny)

            by Anonymous Coward
            Write Once, Debug Everywhere.
          • by Lunix Nutcase ( 1092239 ) on Friday June 10, 2011 @05:51PM (#36406796)

            15 years of optimization from Sun just to bring it within an order of magnitude of C? That seems like quite a bit of wasted effort.

            • It's OK. Big enterprise has billions of dollars to waste on using a portable language to process textual data.

          • by Lunix Nutcase ( 1092239 ) on Friday June 10, 2011 @06:00PM (#36406886)

            portable language like Java.

            Right.... One can find a C compiler for pretty much every processor since the 80s. I can point out a number of still widely used architectures that have no JVM.

            • Re: (Score:3, Insightful)

              Porting C code between architectures is a pain in the ass. Endianness issues alone can fuck up plenty of code, without even getting into differences in compilers and standard libraries.

              Things like porting from one UNIX to another UNIX on a different arch - stuff that most armchair programmers view as "just recompile" - can take hundreds of man-hours or more on complex codebases. C is not portable.
              • What Lunix Nutcase meant was that you can write C code for practically any processor out there. The same can't be said for Java. And yes, C is portable; all you have to do is use #ifdef blocks to select the right implementation wherever the code is architecture-specific.

              • by bgat ( 123664 ) on Friday June 10, 2011 @09:07PM (#36408140) Homepage

                While it is certainly possible to write C code that is endian-dependent, I consider such code to be broken--- as would any sane, professional C programmer.

                To wit, the eleventy-million lines of C code in the Linux kernel are fully endian-agnostic. And largely independent of integer representation size, too!
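For readers who haven't fought this battle: endian-dependence usually comes from reinterpreting raw bytes through the host's native integer layout instead of assembling values explicitly. A minimal sketch of the difference (illustrative only, not taken from any real codebase):

```c
#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

/* Endian-DEPENDENT (broken): reinterprets the buffer in the host's
 * native byte order, so the result differs between machines (the cast
 * also risks unaligned access and strict-aliasing violations). */
uint32_t read_u32_broken(const uint8_t *buf)
{
    return *(const uint32_t *)buf;
}

/* Endian-AGNOSTIC: assembles the value from individual bytes, here
 * assuming a big-endian (network-order) wire format. This yields the
 * same result on any host. */
uint32_t read_u32_be(const uint8_t *buf)
{
    return ((uint32_t)buf[0] << 24) |
           ((uint32_t)buf[1] << 16) |
           ((uint32_t)buf[2] << 8)  |
            (uint32_t)buf[3];
}

int main(void)
{
    const uint8_t wire[4] = { 0x12, 0x34, 0x56, 0x78 };
    /* Prints 0x12345678 regardless of host endianness. */
    printf("portable read: 0x%08" PRIx32 "\n", read_u32_be(wire));
    return 0;
}
```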

              • Heh... I worked for a while at a company that had a product for Brew, written in C, and in J2ME, which is of course Java (between them covering the vast majority of programmable cell phones at the time). We had three guys working full time to port the J2ME version to different platforms, because all the manufacturers had different quirks. It only took one guy to take care of the Brew ports, and it was easier.

                Now, obviously, J2ME is not J2SE, but it's all that was available on those platforms.
              • Endianness issues alone can fuck up plenty of code

                Unless you are coding device drivers or the like (in which case the JVM wouldn't be very helpful anyway), the endianness shouldn't affect your code at all. If it does, you've done something wrong.

          • As one of my colleagues said, saying Java is good because it's multi-platform is like saying anal sex is good because it works on both sexes.
        • by gman003 ( 1693318 ) on Friday June 10, 2011 @05:48PM (#36406762)
          Programmer time is more expensive than hardware time. If a less efficient language is easier to use, it makes business sense to use it to save money.

          This does not explain the slow languages that are difficult to use, but it does explain why assembly has fallen from favor, and why C is in decline.
          • and why C is in decline.

            Hahaha, lol wut? According to the TIOBE Index, much beloved by Java programmers, C is in 2nd place among the most popular languages.

            • Remember when it used to be first, by a huge margin? It's not dead by any means, and still a very active language, but it's not taught as much anymore. Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.
              • by Lunix Nutcase ( 1092239 ) on Friday June 10, 2011 @06:04PM (#36406924)

                Remember when it used to be first, by a huge margin?

                C is still more widely used by a huge margin. Just because you "enterprise" developers don't use it (despite the infrastructure of your managed languages being written in C and C++) doesn't change that.

                Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

                Yeah right. What are you going to write your kernels in? What are you going to use for those millions if not billions of microcontrollers that will still be in use that can't run a JVM? What exactly are you going to write your VMs and interpreters in? Right, they will have to be written in C or C++ and assembly.

                • by tyrione ( 134248 )

                  Remember when it used to be first, by a huge margin?

                  C is still more widely used by a huge margin. Just because you "enterprise" developers don't use it (despite the infrastructure of your managed languages being written in C and C++) doesn't change that.

                  Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

                  Yeah right. What are you going to write your kernels in? What are you going to use for those millions if not billions of microcontrollers that will still be in use that can't run a JVM? What exactly are you going to write your VMs and interpreters in? Right, they will have to be written in C or C++ and assembly.

                  Let's add C99 and the latest C++ standard, which drive OpenCL, to everything you mentioned; OpenCL will only expand.

              • by Jahava ( 946858 ) on Friday June 10, 2011 @06:07PM (#36406954)

                Remember when it used to be first, by a huge margin? It's not dead by any means, and still a very active language, but it's not taught as much anymore. Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

                ... and kernels, and drivers, and embedded applications, and core libraries, and runtimes, too, unless those go away.

                C is a fantastic language that very effectively performs a much-needed role in software development: to provide a lightweight, usable, and readable language while retaining (most of) the capabilities of machine code. C is intended to interface directly with the system, or closely with the operating system.

                C is in decline because many modern programming challenges don't benefit from working on the level of machine code or operating system, nor should they. If I want to write a game, I want to focus on the game design and mechanics, not bit blitting pixels onto a buffer. Libraries, interfaces, and abstraction levels are all things higher-level languages leverage to constrain the perspective and duty of the developer to the most productive (and, oftentimes, interesting) areas.

                Also, let's not forget that in the common non-kernel case, most of the reason C is even usable is because C, itself, leverages a massive host of support libraries and a not-so-lightweight runtime.

                • by bgat ( 123664 )

                  C appears to be in decline only because of the explosive growth in the number of applications produced in higher-level languages. Total annual C output is increasing year-on-year--- mostly to implement systems that themselves support the aforementioned applications.

                  Put another way, none of the growth in Java, C#, Python, Ruby, etc. etc. etc. would be possible without growth in C output as well. C won't ever go away, because every higher-level language in existence depends on it.

                  So you can have your Java.

                  • Absolutely! C is totally a part of Ruby. We'd be suffering without it.

                    If something is too slow, we write it as an extension... in C.

                    In fact, the ease of integrating with C is one of the biggest advantages in Ruby over Perl, where you need a whole additional language (XS) to glue them together. In Ruby all we need is a lightweight C API and we're there.
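For anyone curious just how lightweight that C API is, here is a hypothetical minimal extension (the fastsum name and its one function are invented for illustration; the ruby.h calls are the standard ones):

```c
/* fastsum.c -- a hypothetical minimal Ruby C extension. Build it with
 * the usual extconf.rb: require 'mkmf'; create_makefile('fastsum') */
#include "ruby.h"

/* Sum an array of Integers in C rather than in interpreted Ruby. */
static VALUE fastsum_sum(VALUE self, VALUE ary)
{
    long total = 0;
    Check_Type(ary, T_ARRAY);   /* raises TypeError on non-arrays */
    for (long i = 0; i < RARRAY_LEN(ary); i++)
        total += NUM2LONG(rb_ary_entry(ary, i));
    return LONG2NUM(total);
}

/* Ruby invokes this entry point when it loads fastsum.so. */
void Init_fastsum(void)
{
    VALUE mod = rb_define_module("FastSum");
    rb_define_module_function(mod, "sum", fastsum_sum, 1);
}
```

After building, require 'fastsum' and FastSum.sum([1, 2, 3]) returns 6; that is the entire glue layer, with no separate binding language required.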

              • by Toonol ( 1057698 )
                Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

                No. FORTRAN was replaced. You can do anything you could do in FORTRAN in more modern languages (like C, for instance). However, you cannot write operating systems in Java. I don't think there's any replacement on the near horizon that fills C's low-level niche.

                I can see the use of C and C++ in most applications decreasing, although not when speed or performance is more critical than the expense of extra labor.
                • by tyrione ( 134248 )

                  Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

                  No. FORTRAN was replaced. You can do anything you could do in FORTRAN in more modern languages (like C, for instance). However, you cannot write operating systems in Java. I don't think there's any replacement on the near horizon that fills C's low-level niche. I can see the use of C and C++ in most applications decreasing, although not when speed or performance is more critical than the expense of extra labor.

                  FORTRAN is making a comeback, especially given how phenomenally useful it is in the applied sciences that need numerical analysis [yes, C is its bride in this area], but FORTRAN was designed from the ground up for such work.

                  • Yep, I have been considering writing some MapReduce programs in FORTRAN, and as tyrione points out, FORTRAN is still used in full-on technical programming where you want to solve a problem rather than reinvent the wheels C doesn't have. FORTRAN also has decades of work behind its compilers, including the specialized parallel and CUDA-aware variants.
                • by GNious ( 953874 )

                  So we start with a CPU that processes native Java bytecode, then build a kernel, then userspace, then ...

                  and we shall call it Javux ...

              • Dude, you can have my FORTRAN compiler when you pry it out of my cold, dead hands. Now where did I leave those punchcards for the latest project again?? AND GET OFF MY LAWN!
              • Legacy apps like the JVM?

              • Within a generation, it'll be in the same class as FORTRAN - only used to support legacy apps.

                You're making a fool of yourself. New codes are still written in Fortran, just not in your niche. The Fortran standard is still evolving (we've had Fortran 2003, Fortran 2008 recently). Massively parallel (think thousands of cores) high-performance computing ("number crunching") scientific programs are often written in Fortran. This is because the language is rather simple and hence compilers can be heavily optimized -- often *beyond* what is possible in C (e.g. the compiler can use the fact that arguments
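To make the no-aliasing point concrete: a Fortran compiler may assume that dummy arguments do not overlap, while a C compiler must assume they might unless the programmer says otherwise. A minimal sketch of the C side, using C99 restrict to grant the guarantee Fortran gets by default:

```c
/* Without restrict, the compiler must assume a, b, and out may alias,
 * forcing reloads of memory and often blocking vectorization. */
void add_conservative(int n, const double *a, const double *b, double *out)
{
    for (int i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}

/* With C99 restrict, the programmer promises no overlap -- the same
 * assumption a Fortran compiler makes automatically for this loop --
 * so the compiler can vectorize freely. */
void add_restrict(int n, const double *restrict a,
                  const double *restrict b, double *restrict out)
{
    for (int i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}
```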

          • by parlancex ( 1322105 ) on Friday June 10, 2011 @07:44PM (#36407676)

            Programmer time is more expensive than hardware time. If a less efficient language is easier to use, it makes business sense to use it to save money. This does not explain the slow languages that are difficult to use, but it does explain why assembly has fallen from favor, and why C is in decline.

            I honestly hate this idea. You only have to write a program once. Most programs run thousands of times, some programs will run millions or billions of times. If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly greater than it would've been to write it properly to begin with.

            • by Kjella ( 173770 )

              I honestly hate this idea. You only have to write a program once. Most programs run thousands of times, some programs will run millions or billions of times. If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly greater than it would've been to write it properly to begin with.

              If you're developing OpenOffice or MySQL, perhaps. I have many scripts and procedures at work that run once a day or once a month on a centralized system for many users; the carbon footprint of that is probably smaller than that of the first user who drove to the office. And right now I'm doing a migration that's only going to be done once -- things that are wasteful but make no sense to optimize.

              If we assume you get less done with a more "to the metal" language, you also have to consider the cost of what we wouldn't

            • Programmer time is more expensive than hardware time. If a less efficient language is easier to use, it makes business sense to use it to save money.

              I honestly hate this idea. You only have to write a program once.

              If you made a mistake, and the language you're using makes it difficult to track down that mistake, you may write that program many times over.

              If you have a hotshot programmer on your team who thinks he's more clever than he is, no matter what language you're using, you may have to rewrite it again, or parts of it. The easier that is, the less of a loss it was to have made that hiring decision.

              If your language of choice does not support certain features natively (concurrency, garbage collection, variable t

            • On the other hand would you rather have an application in which the developer spent most of their time writing code or adding features?

              Sometimes performance is a feature but on most modern systems for most applications performance is secondary to functionality.

              I write most of my tools in an abstracted scripting language instead of C++ SDKs. Why? Because I can crank out a tool in about an hour. Writing a C++ SDK would take several days.

              I'm less interested in how fast it is than what it can do. JIT scripti

              Most programs run thousands of times, some programs will run millions or billions of times. If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly greater than it would've been to write it properly to begin with.

              Yeah, but that's a cost to the user who pays for the hardware, not to the company that writes the software.

              If you actually calculated the global collective waste due to slow, heavily abstracted languages running across the globe, that cost is significantly greater than it would've been to write it properly to begin with.

              - but there is another side to this coin. What if writing an application in a higher-level language is not only simpler than in a lower-level language, due to the greater variety of specialized libraries, but also produces fewer errors per some unit of code (per line, per transaction, per function, whatever)?

              If this is true (and I do think it is, after working in this industry since 1995), then aside from having a possibly slower program, you also end up with a program that is possibly more correct

        • by Kjella ( 173770 )

          But efficient in man hours? I prefer C++/Qt and without Qt on top I'd probably drop C++ entirely. Even with Qt there are many things I miss or think work illogically that are as they are because C++ was designed in 1979 as an extension to C designed in 1973 and C++0x only adds toppings. The standard library is completely barebones compared to what you'd expect from Java, C# or any other modern language in 2011.

          Honestly so many defaults should have been switched for sanity, for example I'd make all numbers d

        • You assembler snobs make me sick.

      • by Twinbee ( 767046 )

        Trolling against whom? I thought it was obvious that these languages were slower than, say, C#, C++ or Java. They make up for it in terms of flexibility (dynamic code generation) and ease of use, but I thought that was also taken for granted.

    • by Hartree ( 191324 )

      It's certainly useful for analog systems.

      Now we just need something with that kind of electron mobility that still has a band gap so it can be shut off for digital.

    • by lurgyman ( 587233 ) on Friday June 10, 2011 @05:49PM (#36406774)
      Well, let's not get ahead of ourselves. A mixer is an analog circuit, and silicon carbide is an expensive substrate to work with (very high processing temperatures), so it is typically only worthwhile for high-power analog devices. There is no discussion about anything digital in this article, so this is not related to programming languages or computers. Many analog devices have been made beyond 100 GHz on plain old silicon too; graphene on SiC may be important by enabling greater power density at high frequencies. As a microwave engineer, I'm excited about this, but this needs to happen on an inexpensive IC process for very small devices to be useful for digital circuits.
    • by Jahava ( 946858 )

      I don't think that the article goes into enough detail about just how important this accomplishment is. Frankly, this is our only hope going forward. With so much slow software written in languages like Ruby and JavaScript becoming popular, it will again fall back to the hardware guys to really make things fast again. This will probably be the way they'll do it!

      While I agree with your statement that this is likely incredibly important, your concept of the state of software is absurd.

      Non-specialized (e.g., consumer-grade) software platform choices - language, compiler, interpreter, execution environment, and operating system - are made largely based on the current hardware status quo of the typical software user. If hardware (CPU, GPU, network, etc.) continues to get faster, software will be written to complement that hardware. The second hardware becomes a limitation, software will back off, trim down, and optimize. While the factors behind it are too numerous to fully detail in a post, the key idea here is that waste and bloat can often be the consequences of a tradeoff for functionality, stability, speed, and development time to the net benefit of the consumer.

      • by osu-neko ( 2604 )

        The second hardware becomes a limitation, software will back off, trim down, and optimize. While the factors behind it are too numerous to fully detail in a post, the key idea here is that waste and bloat can often be the consequences of a tradeoff for functionality, stability, speed, and development time to the net benefit of the consumer.

        lol -- meanwhile back in the real world...

      • I like the way you wrote "consumer grade", as if there is another
  • ...without efficient static memory, mostly because of the CPU-memory gap. A faster CPU would require the memory and the bus to keep up at a similar frequency. That's already a problem, and even if it were possible, it would lead to increased power consumption with dynamic RAM, and frankly, I think that's the last thing we need.

    So faster CPUs will only be a viable alternative when we manage to get something like those memristors they keep talking about. Until then, it's larger caches and higher-frequen

    • Kind of. I agree memory bandwidth is important, especially for parallel computing problems, which can be very demanding of memory bandwidth, but for modern processors with 4 or even 8 cores you'll see very little gain in performance from increasing DRAM frequency. The reason is that modern processors have a lot of (ostensibly wasted) die space that allows them to make intelligent decisions in prefetching, register renaming, and yes, caching.
    • by Anonymous Coward on Friday June 10, 2011 @06:35PM (#36407200)

      Wow, you're full of shit. Graphene isn't useful for digital circuits at all (at least yet) because it has crappy on-off ratios, but GHz are most definitely useful for radio work.

      Remember how everybody's using their mobiles for everything these days, and streaming video keeps getting more popular and higher bitrate? Well, when you run out of spectrum below 5GHz (where all mobile networks currently operate), getting up into the 10-100GHz range is extremely useful to provide that extra bandwidth.

      Since nobody but actual electrical engineers seems to know anything about radio anymore (used to be a common hobby for geeks, but I guess it's not "cool" anymore), let me explain one application of a mixer like the one described in TFA. You can use it to make a transverter, which takes a signal from your UHF radio (maybe a mobile phone, wifi card, whatever), and kicks it up to ~10GHz for transmission. And flips received signals back down to the original ~2GHz band.

      Not impressed? Sure, 10GHz isn't much, we can easily beat that already -- it's only a prototype. But it's quite likely we'll have 100GHz-1THz in the lab inside a year, and on the market in ~5. There's a whole lot of bandwidth north of 30GHz, and (as long as you stay out of absorption bands like O2 at 60GHz, which limit you to short-range stuff like wifi/bluetooth replacements), it's eminently usable for urban cellular networks -- if you have the ICs to handle it.
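To put rough (invented) numbers on that transverter example: a mixer produces outputs at f_LO ± f_RF, so with a local oscillator at 7.6 GHz, a 2.4 GHz signal from your radio mixes up to 7.6 + 2.4 = 10.0 GHz for transmission (the 5.2 GHz difference product is filtered off), and on receive the same oscillator mixes a 10.0 GHz signal back down to 10.0 - 7.6 = 2.4 GHz.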

  • It's a diode! (Score:4, Informative)

    by bughunter ( 10093 ) <bughunter@@@earthlink...net> on Friday June 10, 2011 @06:13PM (#36407016) Journal

    The circuit the team built is a broadband radio-frequency mixer, a fundamental component of radios that processes signals by finding the difference between two high-frequency wavelengths.

    Did someone paid directly by IEEE write that? "Two high-frequency wavelengths?"

    The device is a nonlinear summing element. In other words, it has a transfer function of the form y = Sum(a_n·x^n) for integer values of n from zero to at least 2. A very common example is a diode. But it could also be a transistor in the saturation region, or something more esoteric.

    Due to the nonzero second-order transfer-function coefficient, it provides not only the superposed sum of the two signals at their original frequencies, but also components at the sum and difference of the two input frequencies. Add filters to throw away the parts you don't want, and you can make a modulator, a frequency upconverter, or a downconverter... all of these are used every day inside things you probably have in your pocket or purse, from cellphones to car stereos, television receivers to communications satellites.

    But basically, it does the same thing a diode does... just faster.
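For the mathematically inclined, the mixing products fall out of the square-law term in two lines (a sketch, using the transfer function above with coefficients a_1 and a_2):

$$y = a_1 x + a_2 x^2, \qquad x(t) = \cos\omega_1 t + \cos\omega_2 t$$

$$a_2 x^2 = a_2\left[\tfrac{1}{2}(1 + \cos 2\omega_1 t) + \tfrac{1}{2}(1 + \cos 2\omega_2 t) + \cos(\omega_1 - \omega_2)t + \cos(\omega_1 + \omega_2)t\right]$$

by the identity $2\cos A\cos B = \cos(A-B) + \cos(A+B)$. The $\cos(\omega_1 \pm \omega_2)t$ terms are exactly the sum and difference products described above; the filters keep the one you want and discard the rest.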

    • Re: (Score:2, Funny)

      by Anonymous Coward

      all of these are used every day inside things you probably have in your pocket or purse, from cellphones to car stereos, television receivers to communications satellites.

      Lesse:

      1) cellphone: yup!

      2) car stereo: well, I don't carry a purse, but if I did, I guess so.

      3) television receiver: yup, though an interesting one would probably warrant a purse in place of a pocket.

      4) communications satellite: erm, here's where I run into the problem of one fitting in my pocket, or a purse.

      So, is this "suggest three plau

    • by artor3 ( 1344997 )

      I think you're being unfair to the writer. "Finding the difference between two high-frequency wavelengths" is an accurate* description of a downconverter, which is a likely use for this device. And saying it's no different from a faster diode is like saying a GaAs transistor is just like a faster vacuum tube.

      * Okay, technically, using the word "wavelength" to describe the signals is a bit ...off. But it's close enough.

    • They happen to be using it as a mixer, but the article clearly says that it's a FET (which certainly qualifies as a non-linear device). It might not be suitable for digital logic yet, but it is a transistor I believe. Also, 10 GHz for a proof-of-concept is damn fast.

  • Will this allow a different way to signal other logic gates? Is it worthwhile to think about the frequencies that are usually discarded? It seems to me that two or more gates might reinforce each other enough to trigger a third that wasn't directly linked. Could you do something with the extra information?
  • by Ramble ( 940291 )
    The article mentions PMMA and resist for use in electron beam lithography, yet PMMA is THE resist used in e-beam.
