Nvidia Wins $20M In DARPA Money To Work On Hyper-Efficient Chips 72
coondoggie writes "Nvidia said this week it got a contract worth up to $20 million from the Defense Advanced Research Projects Agency to develop chips for sensor systems that could boost power efficiency from today's roughly 1 GFLOPS/watt to 75 GFLOPS/watt."
Both AMD and Nvidia are already hyper-efficient (Score:2)
If used as space heaters, that is.
Re: (Score:2)
There aren't (and never will be) enough bitcoins.
Result will work great (Score:4, Funny)
Re: (Score:2)
If you give me $20 million at once, I can give you the source code of my drivers, you know ;-). Under an NDA, of course.
Re: (Score:1)
Fine with me.
Isn't this economic without DARPA funding? (Score:5, Insightful)
It seems to me a 75× increase in power efficiency should be worth much more than $20M to Nvidia (or any competitor), so why does DARPA need to fund this? This seems exactly like the kind of work that doesn't need DARPA money. DARPA should spend money where it is not clearly economic for others to do so.
Re: (Score:2, Interesting)
Maybe DARPA also wants the thing to withstand radiation. Maybe they want it to have so little computational power that it would not sell in the market. Maybe they want every component to have been made in the US which the market won't care about. Maybe they have other restrictions. You don't know.
Re: (Score:2)
That's an interesting point. How much power would a 300 MHz Pentium of 1998 consume if built on a modern process? That chip could do quite a bit of computing but, as you say, wouldn't survive in today's PC market. For embedded applications, though, it would be great, not least because you get the comfort of a full computer.
Re: (Score:1)
Re: (Score:1)
Maybe they want every component to have been made in the US which the market won't care about.
... I don't think you understand the chip fab process (that's what we're talking about here). $20 million is a lot of money in any other business, though.
Re: (Score:2)
Who needs the free market? (Score:1)
It seems to me a 75× increase in power efficiency should be worth much more than $20M to Nvidia (or any competitor), so why does DARPA need to fund this? This seems exactly like the kind of work that doesn't need DARPA money. DARPA should spend money where it is not clearly economic for others to do so.
You were under the assumption that we live in a purely capitalistic society? My mistake. Even countries that are into extreme capitalism shift back to subsidies and support when it comes to certain things.
...
Though usually those things involve essential services like firefighting, road maintenance, and policing.
Re: (Score:3)
DARAPA should spend money where it is not clearly economic for others to do so.
Well, that's good advice, and I'm sure they'll take it over at DARAPA. On the other hand, DARPA has certain goals, whatever they might be, and if this is the most economical way to achieve them, then it's money well-spent from their perspective. If you're going to have a thing like DARPA, then you need to permit it to do things like this if you want it to be efficient. On the other hand, if you're going to do things like this, you need substantial oversight in place to prevent abuse. And on the gripping han
Re: (Score:2)
Re: (Score:2)
There is no way anyone can make an intelligent decision on what to purchase today.
Why not? If I'm shopping for an Android tablet, say, there's only a small handful of credible processors, and there's only so many screen technologies and manufacturers, and I can gauge a manufacturer's past quality and hope that it will serve as a useful predictor of future performance. I can look at their financial statements and find out if they have been purchased by vulture capitalists. And you can do all of this from a free or nearly-free computer. (I've given away computers more than adequate to the
Re: (Score:2)
There are only really two players in the GPU market. The only incentive they have is "be slightly better than the other guy".
It's not darpa money (Score:1)
DARPA money is tax payer money.
Units (Score:2)
Re: (Score:3)
Practicality. When talking about power consumption, figures are usually given in watts because the practical implications are time-dependent. You've got to account for the time it takes to run the calculations (which may be time-critical: you don't want your amalgamated radar data on a five-minute delay), and you need to know the wattage to calculate cooling requirements. While operations/joule and flops/watt are equivalent, it's easier to think in terms of the latter.
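A quick sketch of the point above, with made-up figures (the efficiency and power numbers are illustrative, not from the article): for a fixed-size job, energy depends only on the efficiency, but runtime and cooling depend on the wattage.

```python
# Sketch: FLOPS/watt and operations/joule are the same quantity, but
# watts determine runtime and cooling. All figures are illustrative.
def job_runtime_and_energy(gflop_required, gflops_per_watt, power_watts):
    """Return (runtime in seconds, energy in joules) for a fixed job."""
    perf_gflop_s = gflops_per_watt * power_watts  # delivered GFLOP/s
    runtime_s = gflop_required / perf_gflop_s
    energy_j = power_watts * runtime_s            # J = W * s
    return runtime_s, energy_j

# Same efficiency, different power budgets: identical energy, but the
# low-watt chip takes 10x longer -- hence the radar-delay worry.
slow = job_runtime_and_energy(75000, 75, 10)    # (100.0 s, 1000.0 J)
fast = job_runtime_and_energy(75000, 75, 100)   # (10.0 s, 1000.0 J)
```

The function name and all parameters are hypothetical, chosen only to show why two equivalent units can still suggest different engineering questions.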
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
I work with high-performance computing in physics -- all of my peers know the difference between energy and power. Sometimes people use "flops" as an abbreviation for "floating point operations" ("It takes XYZ flops per site to compute the Wilson Dirac operator" or "The flops/bytes ratio describes the balance between processing and communication in the algorithm") without the "per second".
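The flops/bytes ratio mentioned above is what the standard roofline model calls arithmetic intensity; a minimal sketch (peak and bandwidth numbers are made up for illustration, not tied to any real machine):

```python
# Sketch: arithmetic intensity (flops/byte) in a simple roofline bound.
# Peak throughput and memory bandwidth below are illustrative only.
def attainable_gflops(flops_per_byte, peak_gflops, mem_bw_gb_s):
    """Upper bound on throughput: either compute- or bandwidth-limited."""
    return min(peak_gflops, mem_bw_gb_s * flops_per_byte)

low = attainable_gflops(2.0, 1000.0, 100.0)    # bandwidth-bound: 200 GFLOP/s
high = attainable_gflops(50.0, 1000.0, 100.0)  # compute-bound: 1000 GFLOP/s
```

Below the machine's balance point the algorithm is communication-bound; above it, processing-bound, which is exactly the trade-off the ratio describes.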
did i misread something ? (Score:1)
Re: (Score:2)
Re:did i misread something ? (Score:5, Interesting)
We passed 1e+08 operations per kWh in 1971.
We passed 1e+09 operations per kWh in 1976.
We passed 1e+10 operations per kWh in 1981.
We passed 1e+11 operations per kWh in 1987.
We passed 1e+12 operations per kWh in 1992.
We passed 1e+13 operations per kWh in 1997.
We passed 1e+14 operations per kWh in 2001.
We passed 1e+15 operations per kWh in 2008.
citation and graph [economist.com]
Energy efficiency has consistently doubled approximately every 1.6 years, so if we are at ~16 GFLOPS/watt right now, then we will blow past DARPA's target around 2016, roughly three and a half years from now.
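The extrapolation above works out as follows (the ~16 GFLOPS/watt starting point and the 1.6-year doubling time are the parent's assumptions, not measurements):

```python
import math

# Sketch of the parent's extrapolation; starting efficiency and doubling
# time are the commenter's assumptions, not measured data.
current_gflops_w = 16.0
target_gflops_w = 75.0   # DARPA's stated goal
doubling_years = 1.6

doublings = math.log2(target_gflops_w / current_gflops_w)  # ~2.2
years_to_target = doublings * doubling_years               # ~3.6 years
```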
Re: (Score:2)
Re:did i misread something ? (Score:4, Insightful)
Out of curiosity (and I ask because I genuinely don't know), how many flops/watt do modern smartphones do? What about the GPU coprocessors in them?
Modern GPUs are great, but they're not even optimized that strongly for power consumption.
Re: (Score:2)
Out of curiosity (and I ask because I genuinely don't know), how many flops/watt do modern smartphones do? What about the GPU coprocessors in them? Modern GPUs are great, but they're not even optimized that strongly for power consumption.
I would think GPUs actually get more work on peak efficiency, because top cards have been consuming hundreds of watts, and workstation and compute cards in particular will often run at 100% for a big render/compute job. Smartphones are much more about dynamic power: adjusting clocks and voltages, and tons of sleep modes. If you're running 100% load on all cores, none of that has any effect. Sure, they care about power usage at peak too, but I don't think more than GPUs do.
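The dynamic-power point above comes from the usual CMOS rule of thumb that switching power scales roughly as C·V²·f; a minimal sketch with illustrative constants (none of these numbers describe a real phone SoC):

```python
# Sketch: why dropping voltage and clock together saves far more than
# linearly. C * V^2 * f is the standard CMOS dynamic-power approximation;
# the capacitance, voltage, and frequency values are illustrative only.
def dynamic_power_watts(switched_cap_farads, volts, freq_hz):
    return switched_cap_farads * volts**2 * freq_hz

full_speed = dynamic_power_watts(1e-9, 1.0, 1e9)   # 1.0 W at full clock
throttled = dynamic_power_watts(1e-9, 0.8, 0.5e9)  # 0.32 W at half clock
```

Halving the clock while lowering the voltage cuts power to about a third, which is why this helps phones at partial load but does nothing for a GPU pinned at 100%.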
Re: (Score:2)
All true and insightful, but it would still be nice to know the actual per-watt figures operating all out, and compare them with desktop figures.
Comment removed (Score:3)
Re: (Score:2)
Re: (Score:2)
They also don't have CUDA. Some people in my field have considered doing high performance computing on Radeons and generally stick with the thing that's easier to code for.
Re: (Score:1)
AMD generally has faster chips
Perhaps, but at what cost [slashdot.org]?
Re: (Score:2)
performance per watt - yeah, sure, give it to Nvidia.
Or they could have chosen ARM to design faster processors that still use less power. Or give it to AMD who sure need the money.
NVIDIA is worth $7.87 billion (Score:4, Interesting)
Money matters, but ... (Score:4, Insightful)
So of course the Federal government needs to blow $20 million of taxpayer money, regardless of its fiscal condition.
I prefer it was spent on computing, rather than explosions.
It'll help with explosions ultimately anyway. (Score:2)
Don't forget that they can use the improved computational power, and the improved power efficiency, to simulate and design better explosions! But look at how much innovation comes from war and defense funding (it's not hard to search for). Heck, even canned food had its research and development funded by Napoleon to help the French military.
And it cuts both ways: any innovation can be put to use in the aid of def
Re: (Score:1)
it's a far-from-flawless opinion of mine, though.
Re: (Score:2)
So of course the Federal government needs to blow $20 million of taxpayer money, regardless of its fiscal condition.
Joint research. Grow up.
sensor systems (Score:2)
Wonder what they mean by "sensor systems".
Math (Score:2)
...and $20 million is enough to develop that? 75× the capability?
I'm no genius in development or marketing, but if that could have been done, it would have been already.
I don't see where TFA says how long they have to complete this project, which makes one wonder whether (given Moore's Law) they'll have it out just one week earlier than all competitors for that small lump of change.
Share the Knowledge (Score:1)
Article Summary Is Incorrect (Score:4, Informative)
The current NVIDIA K20X compute card delivers 5.575 double-precision GFLOPS/watt:
http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last [anandtech.com]
Note that these cards are slightly different from consumer graphics cards. They have more double-precision pipelines, because scientific computing cares more about that kind of math, and they are also much more expensive than consumer cards. The underlying chip design is similar to the 600-series graphics cards; you can think of it as a modified version optimized for math, since the 600 series came out first and is produced in higher volume.
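The parent's figure can be reproduced from the peak double-precision throughput and TDP reported in the linked AnandTech article (assuming the commonly cited ~1.31 TFLOPS DP and 235 W TDP for the K20X):

```python
# Reproducing the parent's ~5.575 GFLOPS/watt figure for the Tesla K20X,
# using the peak DP throughput and TDP as reported by AnandTech.
peak_dp_gflops = 1310.0   # ~1.31 TFLOPS double precision
tdp_watts = 235.0

gflops_per_watt = peak_dp_gflops / tdp_watts   # ~5.57 GFLOPS/watt
```

The small difference from 5.575 comes down to rounding in the quoted peak-TFLOPS number.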
DARPA money is like mob money... (Score:1)