
Nvidia Wins $20M In DARPA Money To Work On Hyper-Efficient Chips

coondoggie writes "Nvidia said this week it got a contract worth up to $20 million from the Defense Advanced Research Projects Agency to develop chips for sensor systems that could boost power efficiency from today's 1 GFLOPS/watt to 75 GFLOPS/watt."

  • If used as space heaters, that is.

  • by kthreadd ( 1558445 ) on Saturday December 15, 2012 @06:00AM (#42300083)
    But you will need their proprietary driver.
  • by iceco2 ( 703132 ) <meirmaor@gmai l . com> on Saturday December 15, 2012 @06:38AM (#42300219)

    It seems to me a 75x increase in power efficiency should be worth much more than $20M to nVidia (or any competitor), so why does DARPA need to fund this? This seems like exactly the kind of work that doesn't need DARPA money. DARPA should spend money where it is not clearly economic for others to do so.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Maybe DARPA also wants the thing to withstand radiation. Maybe they want it to have so little computational power that it would not sell in the market. Maybe they want every component to have been made in the US which the market won't care about. Maybe they have other restrictions. You don't know.

      • by rmstar ( 114746 )

        Maybe they want it to have so little computational power that it would not sell in the market.

        That's an interesting point. How much power would a 300 MHz Pentium from 1998 consume if built with modern technology? That thing could do quite a bit of computing but, as you say, wouldn't survive in today's PC market. For embedded applications it would be great though, not least because you get the comfort of a full computer.

      • Maybe they want every component to have been made in the US which the market won't care about.

        ... I don't think you understand the chip fab process (that's what we're talking about here). $20 million is a lot of money in any other business, though.

    • It seems to me a 75x increase in power efficiency should be worth much more than $20M to nVidia (or any competitor), so why does DARPA need to fund this? This seems like exactly the kind of work that doesn't need DARPA money. DARPA should spend money where it is not clearly economic for others to do so.

      You were under the assumption that we live in a purely capitalistic society? My mistake. Even countries that are into extreme capitalism shift back to subsidies and support when it comes to certain things.

      Though, usually those things involve essential services like fire-fighters, road maintenance, and policing ...

    • DARPA should spend money where it is not clearly economic for others to do so.

      Well, that's good advice, and I'm sure they'll take it over at DARPA. On the other hand, DARPA has certain goals, whatever they might be, and if this is the most economical way to achieve them, then it's money well-spent from their perspective. If you're going to have a thing like DARPA, then you need to permit it to do things like this if you want it to be efficient. On the other hand, if you're going to do things like this, you need substantial oversight in place to prevent abuse. And on the gripping han

      • There is no way anyone can make an intelligent decision on what to purchase today. There are way too many variables, so one can only guess. I am sure that a gigaflop per watt is much better than the computer I am typing on now. So if they can get to 75 gigaflops per watt, why would anyone ever buy a computer again, since a dumb terminal could be better? But that decision would take a lot more knowledge than I have, and I know a lot more about computers than the average person. I contribute to a project sponsored
        • There is no way anyone can make an intelligent decision on what to purchase today.

          Why not? If I'm shopping for an Android tablet, say, there's only a small handful of credible processors, there are only so many screen technologies and manufacturers, and I can gauge a manufacturer's past quality and hope that it will serve as a useful predictor of future performance. I can look at their financial statements and find out if they have been purchased by vulture capitalists. And you can do all of this from a free or nearly-free computer. (I've given away computers more than adequate to the

    • There are only really two players in the GPU market. The only incentive they have is "be slightly better than the other guy".

  • by Anonymous Coward

    DARPA money is taxpayer money.

  • Why GFLOPS/watt? That is (operations/second)/(joules/second). Why not just operations/joule?
    • Practicality. When talking about energy consumption, it's usually given in watts because the practical implications are time-dependent. You've got to account for the time it takes to run the calculations (which may be time-critical: you don't want your amalgamated radar data on a five-minute delay) and you need to know the wattage to calculate cooling requirements. While operations/joule and flops/watt are equivalent, it's easier to think in terms of the latter. (A worked conversion is sketched just after this thread.)

    • Maximizing ops/power is not the same as maximizing ops/energy. You would think that somebody who knows the difference between power and energy would also know the difference here.
      • Yes, you maximize ops/power by running for longer! Prior to you, no one had mentioned this concept; we had only considered ops/energy, either as such or as (ops/sec)/power.
    • I work with high-performance computing in physics -- all of my peers know the difference between energy and power. Sometimes people use "flops" as an abbreviation for "floating point operations" ("It takes XYZ flops per site to compute the Wilson Dirac operator" or "The flops/bytes ratio describes the balance between processing and communication in the algorithm") without the "per second".
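
    A worked conversion to go with the units discussion above (a minimal sketch in plain Python; the 1 and 75 GFLOPS/watt figures are the ones from the summary, the rest is just unit arithmetic):

        # GFLOPS/watt is (1e9 flop/s) / (J/s); the seconds cancel, so it is
        # exactly 1e9 floating point operations per joule.
        JOULES_PER_KWH = 3.6e6  # 1 kWh = 1000 W * 3600 s

        def ops_per_joule(gflops_per_watt):
            """Convert a GFLOPS/watt rating to floating point operations per joule."""
            return gflops_per_watt * 1e9

        def ops_per_kwh(gflops_per_watt):
            """Convert a GFLOPS/watt rating to floating point operations per kWh."""
            return ops_per_joule(gflops_per_watt) * JOULES_PER_KWH

        for rating in (1.0, 75.0):  # today's ~1 GFLOPS/W vs DARPA's 75 GFLOPS/W target
            print("%5.1f GFLOPS/W = %.1e flop/J = %.1e flop/kWh"
                  % (rating, ops_per_joule(rating), ops_per_kwh(rating)))

    The flop/kWh figure is the same unit used in the operations-per-kWh milestones quoted further down the thread.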

  • The latest GPUs are already at 15-18 GFLOPS/watt. *confused*
    • Comment removed based on user account deletion
      • by Rockoon ( 1252108 ) on Saturday December 15, 2012 @08:47AM (#42300621)
        We passed 1e+07 operations per kWh in 1965.
        We passed 1e+08 operations per kWh in 1971.
        We passed 1e+09 operations per kWh in 1976.
        We passed 1e+10 operations per kWh in 1981.
        We passed 1e+11 operations per kWh in 1987.
        We passed 1e+12 operations per kWh in 1992.
        We passed 1e+13 operations per kWh in 1997.
        We passed 1e+14 operations per kWh in 2001.
        We passed 1e+15 operations per kWh in 2008.

        citation and graph [economist.com]

        Energy efficiency consistently doubles approximately every 1.6 years, so if we are at ~16 GFLOPS/watt right now, then we will blow past DARPA's target early in 2016... just a little over 3 years from now.
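
        The parent's extrapolation is easy to reproduce; a minimal sketch in Python, assuming the ~16 GFLOPS/watt starting point and the ~1.6-year doubling time quoted above (both are the parent's figures, not anything from TFA):

            import math

            # Assumptions from the parent comment: ~16 GFLOPS/W in late 2012,
            # with efficiency doubling roughly every 1.6 years.
            current = 16.0        # GFLOPS/watt now
            target = 75.0         # GFLOPS/watt, DARPA's goal
            doubling_time = 1.6   # years per doubling

            doublings = math.log(target / current, 2)
            years = doublings * doubling_time
            print("%.2f doublings -> about %.1f years from late 2012"
                  % (doublings, years))
            # ~2.23 doublings -> ~3.6 years, i.e. sometime in 2016 if the trend holds.

        Whether that lands in early or mid 2016 depends on the exact starting figure, but it is in line with the parent's point.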
    • by Entropius ( 188861 ) on Saturday December 15, 2012 @10:11AM (#42300937)

      Out of curiosity (and I ask because I genuinely don't know), how many flops/watt do modern smartphones do? What about the GPU coprocessors in them?

      Modern GPUs are great, but they're not even optimized that strongly for power consumption.

      • by Kjella ( 173770 )

        Out of curiosity (and I ask because I genuinely don't know), how many flops/watt do modern smartphones do? What about the GPU coprocessors in them? Modern GPUs are great, but they're not even optimized that strongly for power consumption.

        I would think GPUs actually get more work on peak efficiency, because top cards have been consuming hundreds of watts, and workstation and compute cards in particular will often run at 100% when in use for a big render/compute job. Smartphones are much more about dynamic power: adjusting clocks and voltages, plus tons of sleep modes. If you're doing 100% load on all cores, then none of that will have an effect. Sure, they care about power usage at peak too, but I don't think more than GPUs do.

        • by fnj ( 64210 )

          All true and insightful, but it would still be nice to know the actual per-watt figures operating all out, and compare them with desktop figures.

  • by account_deleted ( 4530225 ) on Saturday December 15, 2012 @07:20AM (#42300343)
    Comment removed based on user account deletion
  • by MSTCrow5429 ( 642744 ) on Saturday December 15, 2012 @08:18AM (#42300533)
    So of course the Federal government needs to blow $20 million of taxpayer money, regardless of its fiscal condition.
    • by DavidClarkeHR ( 2769805 ) <david.clarke@hr g e n e r a l i s t .ca> on Saturday December 15, 2012 @09:23AM (#42300749)

      So of course the Federal government needs to blow $20 million of taxpayer money, regardless of its fiscal condition.

      I'd prefer it were spent on computing rather than explosions.

      • re: I'd prefer it were spent on computing rather than explosions.

        Don't forget that they can use that improved computational power, and that improved computational power efficiency, to simulate and design better explosions! But look at how much innovation comes about from war and defense funding (it's not hard to search for it). Heck, even canned food had its research and development funded by Napoleon to help the French military.
        And it cuts both ways: any innovation can be put to use in the aid of def
    • by tyrione ( 134248 )

      So of course the Federal government needs to blow $20 million of taxpayer money, regardless of its fiscal condition.

      Joint research. Grow up.

  • "chips for sensor systems"

    Wonder what they mean by "sensor systems".
  • ..and 20 mil is enough to develop that??? 75x the capability?

    I'm no genius in development or marketing, but if that could have been done, it would have already.

    I don't see where TFA says how long they have to complete this project. That makes one wonder whether (given Moore's Law) they'll just have it out one week earlier than all their competitors with that small lump of change.

  • As an avid nVidia fan, I do hope they will share their findings with AMD (and Intel, if applicable) to prevent a monopoly and to encourage even more innovation.
  • by DanielRavenNest ( 107550 ) on Saturday December 15, 2012 @08:39PM (#42304607)

    The current NVIDIA K20X compute card produces 5.575 GFLOPS/watt in double precision:

    http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last [anandtech.com]

    Note that these cards are slightly different from consumer graphics cards. They have more double-precision pipelines, because scientific computing cares more about that kind of math, and they are also much more expensive than consumer cards. The underlying chip design is similar to the 600-series graphics cards; you can think of it as a modified version optimized for math, since the 600 series came out first and is produced in higher volume.
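
    As a sanity check, that 5.575 figure falls straight out of the specs usually quoted for the K20X; a minimal sketch in Python, assuming the ~1.31 TFLOPS peak double precision and 235 W board power listed in published reviews (assumed figures, not from TFA):

        # Tesla K20X peak double-precision throughput and rated board power,
        # as commonly listed in reviews (assumed here, not taken from TFA).
        peak_dp_gflops = 1310.0    # ~1.31 TFLOPS double precision
        board_power_watts = 235.0  # rated board power

        efficiency = peak_dp_gflops / board_power_watts
        print("K20X: ~%.3f GFLOPS/W double precision at peak" % efficiency)
        # ~5.57 GFLOPS/W at peak, matching the parent's figure and still more
        # than an order of magnitude short of DARPA's 75 GFLOPS/W target.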

  • My theory is that by putting investment capital into the tech, they get a "mob-like" hand in the technology. Yes, it doesn't seem like a good investment of DARPA's money now, but the favor WILL be repaid by nVidia at some point, probably to the tune of much more than $20 million. GPUs have incredible potential for processing power even today, and DARPA is one of the government divisions that I would expect might need such power for various project(s).
