DARPA Targets Computing's Achilles Heel: Power

coondoggie writes "The power required to increase computing performance, especially in embedded or sensor systems, has become a serious constraint and is restricting the potential of future systems. Technologists from the Defense Advanced Research Projects Agency are looking for an ambitious answer to the problem and will next month detail a new program the agency expects will develop power technologies that could boost system power efficiency from today's 1 GFLOPS/watt to 75 GFLOPS/watt."
This discussion has been archived. No new comments can be posted.

  • Does the government know of an upcoming energy crysis?
    • by Anonymous Coward on Sunday January 29, 2012 @01:46PM (#38858719)

      No, the problem is getting hold of raw materials for batteries. Mobile computing is on the rise and the west doesn't want to be too dependent on foreign mineral deposits. More efficient computers = smaller batteries = smaller amounts of lithium etc needed.

      • I think another fundamental question is: why does the entire western world need such large amounts of processing power on the move?
        • by FooAtWFU ( 699187 ) on Sunday January 29, 2012 @02:09PM (#38858823) Homepage

          It occurred to me the other day that, while I have been programming and working with network monitoring tools and the like for a while, and can get an email alert (or text message) whenever a piece of equipment goes down, the rest of the world doesn't have that sort of capability. A big chunk of California Highway 1 could fall into the ocean, and people could fall off after it, and no one would notice until someone called it in. If my hard disk is on fire, I can get a message, but if the woods are on fire, you need to wait for someone to see the smoke.

          Sensors and the like are pretty awesome to have.

          • The problem is connectivity with someone who cares. The last mile is notoriously expensive, even with wireless. You could put lots of sensors along Hwy 1, but you'd need something to say where it started and stopped sliding into the ocean. You can actually run a piece of wire, calibrate it, and use two of them to figure out the start and stop points using time domain reflectometry -- the technique used to find data cable problems. Somewhere, that wire needs to be connected so that a computer will cough up an alert when condit
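
            As a rough sketch of the arithmetic behind TDR (the velocity factor and timing below are illustrative assumptions, not figures from any real deployment): the distance to a fault is the pulse's round-trip time multiplied by the propagation speed in the cable, divided by two.

            ```python
            # Time-domain reflectometry: locate a cable fault from the
            # round-trip time of a reflected pulse.
            C = 299_792_458          # speed of light in vacuum, m/s
            VELOCITY_FACTOR = 0.66   # fraction of c; depends on the cable type (assumed)

            def fault_distance_m(round_trip_s: float) -> float:
                # The pulse travels out to the fault and back, hence the /2.
                return C * VELOCITY_FACTOR * round_trip_s / 2

            # An echo arriving 10 microseconds after the pulse puts the break ~990 m out.
            print(f"{fault_distance_m(10e-6):.0f} m")
            ```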

          • Regarding the smoke in the forest: there are many research projects into smart grids of sensors. These sensors communicate with each other, creating communication lanes to save energy. They are very small and can be dropped from a plane. These sensor networks can then monitor humidity and temperature and, as a result, immediately notify the operators about fires.
            Ah, and they are cheap as well!

          • Sensors and the like are pretty awesome to have.

            Indeed.

            - BIG BROTHER
        • How else will we keep track of you, Citizen? We need the drones to be able to find you when you step out of line.
        • Why do we need so much energy? Why do we need so much processing power? Why do we need so much stuff?

          All those questions are along the same lines. They all have the same answer. To the best of my knowledge the answer is either a deterministic one based on Darwinism, or "people are addicted to power".

        • Indeed. There's an implicit assumption that the utility of an embedded device scales linearly with its processing power.

          But of course, a new media format needs to be introduced sooner or later, since Blu-ray prices are already starting to lose their "premium" justification and become just plain ordinary.

      • by Anonymous Coward

        I don't know if DARPA has other things in mind, but the main reason most research into the power efficiency of computing gets done is that energy consumption is becoming more and more of both a cost and a logistical problem for supercomputing clusters. In fact, in the government-sponsored research on what it'll take for us to develop an exaflop computer (two years old now, I grant you), significantly increased power efficiency is considered absolutely necessary. Mobile computing is more of an inspiration for ef

      • Like the lithium ore in Afghanistan? But, but, we won the war, now everything belongs to us... I mean the USA.
        • by Luckyo ( 1726890 ) on Sunday January 29, 2012 @05:11PM (#38859905)

          The problem with lithium is that it isn't mushrooms and berries; you can't just walk in there and pick it up. It's also not oil; you can't just put a hole in the ground, connect it to the pumping machinery, and have oil. You need actual ore mines, with huge, easy-to-sabotage, hard-to-fix machinery.

          And finally, it's solid and heavy. It's a total bitch to move from the center of a war-torn nation that has the world's best specialists in asymmetric warfare fighting against you, both economically and in terms of general feasibility.

          • by Anonymous Coward on Sunday January 29, 2012 @06:05PM (#38860191)

            In a pinch you can extract lithium from sea water. That's basically what a lithium deposit is... an old sea that dried up and left the salts. Lithium isn't a big fraction of a battery's cost, weight or volume. Please everyone stop being silly. The cobalt that is often used in lithium batteries is far more expensive, rare and used in larger proportions. We just don't call them cobalt batteries so no one knows about that part.

    • by unts ( 754160 ) on Sunday January 29, 2012 @01:51PM (#38858745) Journal

      The problem is not just generating the power... it's delivering it and consuming it without breaking/melting. And that's what they're getting at here - getting more FLOPS per watt... not finding out how to push more watts into a system. A silly amount of the energy going into a supercomputer comes out as heat... and a silly amount of energy is then used to remove that heat. Hopefully, by significantly improving the energy efficiency of chips and systems, we can make them a lot more powerful without them needing a whole lot more power. And I haven't even mentioned the mobile/embedded side of the spectrum, where it's about battery life and comfortable operating temperatures... the same energy efficiency goals apply.

      This is the sort of thing we over the pond are very interested in too. Like for example *cough* the Microelectronics Research Group [bris.ac.uk] that I'm a part of.

      • by ultramk ( 470198 )

        Erm, not to be overly pedantic, but isn't *all* of the energy consumed by a supercomputer (or any other device) eventually converted into heat? First law of thermodynamics and all that?

    • by stevelinton ( 4044 ) <sal@dcs.st-and.ac.uk> on Sunday January 29, 2012 @02:21PM (#38858895) Homepage

      In a sense. There is a widespread view that we will need 1-exaflop supercomputers by roughly 2019 or 2020 for a whole range of applications, from aircraft design and biochemistry to processing data from new instruments like the Square Kilometre Array. On current trends, such a computer would need gigawatts of power (literally), which among other things would force it to be located right next to a large power station that wasn't needed for other purposes. This is felt to be a bit of a problem, and this DARPA initiative is just one small part of the effort to tackle it and get the exaflop machine down to 50 MW or so, which is the most that can routinely be supplied by standard infrastructure.
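
      A quick sanity check of that arithmetic in Python, using the efficiency figures from the summary (1 GFLOPS/watt today vs. DARPA's 75 GFLOPS/watt target):

      ```python
      # Power draw of a 1-exaflop machine at a given energy efficiency.
      EXAFLOP = 1e18  # floating-point operations per second

      def required_watts(flops: float, gflops_per_watt: float) -> float:
          return flops / (gflops_per_watt * 1e9)

      print(f"{required_watts(EXAFLOP, 1) / 1e9:.1f} GW")   # 1.0 GW at ~1 GFLOPS/W today
      print(f"{required_watts(EXAFLOP, 75) / 1e6:.1f} MW")  # 13.3 MW at the 75 GFLOPS/W goal
      ```

      At the target efficiency, the machine lands comfortably under the ~50 MW that standard infrastructure can supply.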

      • With an exaflop computer, simulating the human brain is looking like it might be possible. If we can get a simulated brain working as well as a real brain, there's a good chance we can make it better too, because our simulated brain won't have the constraints that real brains have (i.e. not limited by power/food/oxygen supply, not limited by relatively slow neurones, and not having to deal with cell repair and disease).

        Basically, if current models of the brain are anywhere near correct, and current estimates

          • What makes a brain "better"? Thinking faster, or thinking better thoughts?

        • With an exaflop computer, simulating the human brain is looking like it might be possible.

          Take a moment, relax, and then try to answer this question: What does computational speed have to do with it?

          The point is that simulations are not linked to computational speed. Some simulations that we do today run thousands of times faster than "reality" while most others that we do today run thousands, or even millions, of times slower. The speed of a simulation is irrelevant to its existence, so stop pretending that speed has any sort of importance to simulating something like

          • by Anonymous Coward

            Fair enough, although it shouldn't be forgotten that just the memory requirement for a simulation on the scale of an entire human brain is huge (the specific order of magnitude necessary for such a computation is unknown as it is unclear at precisely what level the human brain does its computation). A modern supercomputer can't simulate a human brain even at a trillion+ times slowdown due to simply not having the memory for the computation.

            Furthermore, for medical use, a million times slowdown on a simulati
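
            For a sense of scale, here is a naive synapse-level estimate. The neuron and synapse counts are widely cited anatomical figures; the per-synapse state size is purely an assumption:

            ```python
            # Naive memory estimate for a synapse-level brain simulation.
            NEURONS = 8.6e10          # ~86 billion neurons (common estimate)
            SYNAPSES = 1e15           # ~10^14-10^15 synapses; using the high end
            BYTES_PER_SYNAPSE = 8     # assumed state per synapse; real models vary widely

            total_bytes = SYNAPSES * BYTES_PER_SYNAPSE
            print(f"{SYNAPSES / NEURONS:.0f} synapses per neuron on average")  # ~11628
            print(f"{total_bytes / 1e15:.0f} PB of synaptic state")            # 8 PB
            ```

            Even under these crude assumptions, the state alone is petabytes, well beyond any 2012 machine's memory.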

          • If you simulate a human brain a trillion times slower than realtime, and want to spend 10 simulated years teaching it stuff, you're going to be a very old man by the time your experiments complete...

            Speed is important...

            • I think the point is that we already have human brains that we can teach. There's no point having a computer pulling down a whole power station's worth of power just to simulate what is in the end only another human brain.

              I am interested in AI and physics simulations myself, so I'm not trying to say that simulating a brain isn't an interesting goal that might have something to teach us - but IMO, if your end goal is useful intelligence for use in everyday life, there is no point in it. We already have billi

            Some simulations that we do today run thousands of times faster than "reality" while most others that we do today run thousands, or even millions, of times slower. The speed of a simulation is irrelevant to its existence

            For a simulation to be useful it must reach the desired results in a reasonable amount of time. If you are simulating something that only takes a few milliseconds in real life, then a simulation that runs a thousand times slower than reality will still feel basically instant, and one a million times slower will be done in around an hour. OTOH, if you are simulating something that takes years in real life, then with a 1000x slowdown your simulation will be running for millennia.
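
            The wall-clock arithmetic is easy to check:

            ```python
            # Wall-clock time of a simulation at a given slowdown factor.
            SECONDS_PER_YEAR = 365.25 * 24 * 3600

            def wall_clock_s(real_seconds: float, slowdown: float) -> float:
                return real_seconds * slowdown

            # A few milliseconds at a 1,000,000x slowdown: about an hour.
            print(wall_clock_s(3.6e-3, 1e6) / 3600, "hours")  # 1.0 hours

            # Ten real years at a 1,000x slowdown: ten millennia of computing.
            print(wall_clock_s(10 * SECONDS_PER_YEAR, 1e3) / SECONDS_PER_YEAR, "years")  # 10000.0
            ```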

        • Exaflop computer

          - limited by power constraints (as per this article)
          - limited by connectivity (nowhere near as many connections as a neurone)
          - limited by lack of unit repair (has downtime when repair needed)
          - limited by possibility of rogue programs, and damage

          Slow neurones and the slow links between them don't actually seem to be an issue ...?

          Seems more limited than a brain to me ...?

          • But each of those limitations improves approximately with Moore's Law; the brain hasn't changed much this century. At some point one will surpass the other.

            • Computers have been faster than brains for most of their history; the thing that seems to matter is not speed but connections...

              Computers still have relatively limited numbers of these (compared to brains).

              More of what we have now does not seem to be the solution; we are just getting power-hungry behemoths that are very good at hyper-complex tasks but still no good at what we think is simple...

              Moore's Law has a limit; we are nearly at the atomic scale, and quantum effects are becoming more and more of an i

        • by dkf ( 304284 )

          With an exaflop computer, simulating the human brain is looking like it might be possible.

          It's looking like it's going to be rather more complex than that. Human brains use lots of power (for a biological system), and they get their capability not from being able to switch circuits very rapidly, but rather from being massively parallel. How to map that into silicon is going to be really challenging because it will require a totally different approach from the current ones; dealing with failures of individual components will really be a large part of the problem. To what extent will the power consumption itself prov

            dealing with failures of individual components will really be a large part of the problem.

            Highly doubtful, since the brain itself is very sloppy about the whole process. Neurons don't fire at exact thresholds, frequent permanent damage events plague them as we go through life, and even diet can have measurable effects on brain chemistry that affect how signals propagate, as well as cause damage.

            What I'm saying is that there is clearly an extremely high degree of redundancy built into brains because of the reality of physical randomness, and that there is no reason to believe that any small part

        • With an exaflop computer, simulating the human brain is looking like it might be possible.

          The main problem of simulating a brain isn't the computational power required.

    • by Teun ( 17872 ) on Sunday January 29, 2012 @02:24PM (#38858919)
      Why do you need an energy crisis to make something work more efficiently?
      Considering that energy does not come cheap, there is a very good commercial reason to save on one of the larger costs in computing (or any other activity).

      And even though the US hosts the world leaders in denial of CO2-related climate change, it is still an ever more important consideration for many people, even in the US.

    • by AHuxley ( 892839 )
      More a "Battle of Stalingrad" fuel supply convoy, fly in vs huge demand on the ground math problem.
      Every aspect of fuel use and cooling is been looked at. From HQ servers, air conditioning, servers in a tank to sensor networks.
      They all need lots of electrical power that comes from very long fuel supply networks.
    • The word you are looking for is "crisis".

      Crysis is a pun based on the Crytek company name and the aforementioned word.

  • Turing Tax (Score:5, Interesting)

    by Wierdy1024 ( 902573 ) on Sunday January 29, 2012 @02:31PM (#38858969)

    The amount of computation done per unit energy isn't really the issue. Instead, the problem is the amount of _USEFUL_ computation done per unit energy.

    The majority of power in a modern system goes into moving data around and other tasks which are not the actual desired computation. Examples of this are incrementing the program counter, figuring out instruction dependencies, and moving data between levels of caches. The actual computation on the data is tiny in comparison.

    Why do we do this, then? Most of the power goes to what is informally called the "Turing Tax" - the extra things required to allow a given processor to be general purpose, i.e. to compute anything. A single-purpose piece of hardware can only do one thing, but is vastly more efficient, because all the machinery for figuring out which bits of data need to go where can be left out. Consider it like the difference between a road network that lets you go anywhere and a road with no junctions in a straight line between your house and your work. One is general purpose (you can go anywhere); the other is only good for one thing, but much quicker and more efficient.

    To get nearer our goal, computers are getting components that are less flexible. Less flexibility means less Turing Tax. For example, video encoder cores can do massive amounts of computation, yet they can only encode video - nothing else. For comparison, an HD video camera can record 1080p video in real time with only a couple of watts. A PC (without a hardware encoder) would take 15 minutes or so to encode each minute of HD video, using far more power along the way.

    The future of low-power computing is to find clever ways of making special-purpose hardware for the most computationally heavy stuff, so that the power-hungry general-purpose processors have less left to do.
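
    As a rough illustration of the energy gap described above (the wattages are assumptions picked for the arithmetic, not measurements):

    ```python
    # Energy to encode one minute of 1080p video, software vs. dedicated hardware.
    CPU_WATTS = 100.0  # assumed power draw of a PC doing a software encode
    HW_WATTS = 2.0     # "a couple of watts" for a camera's encoder core

    cpu_joules = CPU_WATTS * 15 * 60  # 15 minutes of CPU time per minute of video
    hw_joules = HW_WATTS * 1 * 60     # real time: one minute

    print(f"software: {cpu_joules:.0f} J")             # 90000 J
    print(f"hardware: {hw_joules:.0f} J")              # 120 J
    print(f"ratio:    {cpu_joules / hw_joules:.0f}x")  # 750x with these assumptions
    ```

    The exact ratio depends heavily on the codec settings, as the replies below point out.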

    • Re: (Score:3, Informative)

      by Anonymous Coward

      For comparison, an HD video camera can record 1080p video in real time with only a couple of watts. A PC (without a hardware encoder) would take 15 minutes or so to encode each minute of HD video, using far more power along the way.

      While it makes your point, you're actually off by orders of magnitude on both: a modern PC can easily encode at 2-4x realtime for 1080p... and a good hardware encoder often uses less than 100 milliwatts. A typical rule of thumb is that dedicated hardware is roughly 1000 times more e

      • You are indeed correct - it all depends on the codecs, the desired PSNR, and the bits/pixel available. For modern codecs, the motion search is the part that takes most of the computation, and doing it better is a super-linear-complexity operation - hence both your numbers and mine could be correct, just for different desired output qualities.

        The ratio, though, is a good approximate rule of thumb. I wonder how this ratio has changed as time has moved on? I suspect it may have become bigger as software focus has move

    • If you want to talk about encoding, anime fansubbers are at the forefront. The latest is 10-bit encoding. It has a lot of benefits, but its main downside is that there is no hardware for it; you need to run it on the CPU. Someday hardware like a GPU might support it, but that takes far too long to stay current.

      That is the reason the general-purpose CPU has won out so far, and why mobile phones and tablets come with one as the main computing unit: keeping up in hardware with the latest developments j

    • I have four words for you. Field, Programmable, Gate, and Array.
      • A couple more words: Power Hog.

        - at least when compared to ASICs. But there are new developments in the area; see Silicon Blue Technologies [wikipedia.org]. It will be interesting to see how things work out in the future. It looks like all the players are trying to create power-efficient FPGAs.

        • by gtall ( 79522 )

          And some words for you: volume and change. If you have a large enough application, in the sense that you need millions of the things, and the application is set in stone forevermore, then ASICs are fine. If you ever intend to change it, or your run is small, FPGAs are a better choice.

          • or your run is small, FPGAs are a better choice

            Yes, but the topic of discussion is power consumption not purchase price. My point was that FPGAs do not solve the problem of power consumption - at least not yet. They are getting better but then so are ASICs.

            It appears that, looking forward, the best solution will be a combination of the two techniques. Specialized ASIC components glued together with FPGA elements. Most FPGA manufacturers already do this to a limited extent. It is common to see embedded CPUs in FPGAs - and I'm not referring to sof

            • Yes, but the topic of discussion is power consumption not purchase price.

              Power consumption is part of it but I don't think you can draw reasonable conclusions from power consumption alone. It's important but so are upfront cost and flexibility.

              My point was that FPGAs do not solve the problem of power consumption - at least not yet. They are getting better but then so are ASICs.

              Yes, ASICs are the most power-efficient way of performing a repetitive computation task because they have neither the data-pushing overhead of CPUs/GPUs nor the reconfigurable-wiring overhead of FPGAs. However, putting a design into an ASIC is expensive, so it's only practical if you want a lot of copies of the design, plan to run each cop

    • Less flexibility means less Turing Tax. For example, video encoder cores can do massive amounts of computation, yet they can only encode video - nothing else.

      And a Turing machine makes sense when transistors are expensive. But what's the actual cost of adding an h.264 encoder to a hardware die today? I bet it's less than the electricity cost of doing much encoding over the ownership time of the part.

      I suppose DSPs, VMX, MMX, SSE, etc. can all be seen as ways this has held true over time as transistor

    • Re:Turing Tax (Score:5, Interesting)

      by Kjella ( 173770 ) on Sunday January 29, 2012 @07:42PM (#38860617) Homepage

      To get nearer our goal, computers are getting components that are less flexible.

      Actually, computers have lost lots of dedicated processing units because it just wasn't worth doing in dedicated hardware; that's where, for example, softmodems (aka winmodems) came from. And with GPUs going from fixed pipelines to programmable shader units, they too have gone the other way. Dedicated hardware only works if you are doing a large number of exactly defined calculations from a well-established standard, like AES or H.264. Even in a supercomputer the job just isn't static enough; if the researchers have to tweak the algorithm, are you going to build a new computer? You have parameters, but the moment they say "oh, and we have to add a new correction factor here" you're totally screwed. Not going to happen.

      • by tlhIngan ( 30335 )

        Actually, computers have lost lots of dedicated processing units because it just wasn't worth doing in dedicated hardware; that's where, for example, softmodems (aka winmodems) came from. And with GPUs going from fixed pipelines to programmable shader units, they too have gone the other way.

        It's cyclical - going from specialized to general to specialized, etc.

        Early computers used character generator chips - specialized processors that took ASCII(ish) inputs and generated the onscreen information. This evolve

  • So they're researching how to create computronium? Will we then turn the whole solar system into a Matrioshka brain and all live in a virtual world?
    • First we'll need nuclear fusion and some kind of autonomous robots.

      And if we ever want that brain to be ours, we'd better get a deep understanding of neurology and neural implants before we get those autonomous robots...

      • Getting the level of detail from a brain you'd need to simulate it might be less a matter of implants than destructive readout. Slice-and-scan.
        • Getting the level of detail from a brain you'd need to simulate it might be less a matter of implants than destructive readout. Slice-and-scan.

          Well, right. That also eliminates the potential issues from having duplicate persons in virtual space and meat space.

  • by Gravis Zero ( 934156 ) on Sunday January 29, 2012 @03:00PM (#38859133)

    TI's line of MSP430 chips runs on little solar cells. Hell, they practically run on their own self-esteem. So scale that technology up and bam, you've got a supercomputer that runs on a couple of AA batteries.

  • if this was applied to American companies and western manufacturing ONLY. Sadly, the neo-cons will push for it to be applied to everybody, esp. China.
    • if this was applied to American companies and western manufacturing ONLY. Sadly, the neo-cons will push for it to be applied to everybody, esp. China.

      Yeah, I hate it when those xenophobic racist neo-cons push to share our advances with China (who has the fastest-growing need for power and one of the dirtiest power sources, namely coal). Hell, a move like that just might reduce poverty AND pollution, and no one wants that!

      Thank god we liberals know that the only way to make the world a better place is to reflexively oppose not just everything the conservatives do, but everything we imagine they might do!

      • Of course, I am not a liberal. But the problem is that China is cheating at the WTO/IMF/FTAs. So we would be paying for tech like this and then allowing it to simply transfer over there because a bunch of neo-cons say to transfer it.
        • OK, maybe China's cheating (by giving us goods/services at below cost - those bastards!)

          But how did "neo-cons" get involved? I didn't see them (or any mention of politics at all) anywhere in this story until you conjured them up out of thin air. Let me guess: the reason you're not a liberal is because the liberals are way too far to the right - correct?

          Besides, the last time I looked, the neo-cons had been out of power for several years, so I don't think you have anything to worry about.

          P.S. Neo-co
          • First off, the neo-cons are VERY much alive and doing well. They control the Republican party. "Neo-con" was originally applied to Dems who switched to the Republican party; Reagan was the head of that. The big difference is that the group that adopted Reagan's beliefs (changing the Republican party's core beliefs dramatically) called themselves neo-cons. IOW, they declared themselves a group by doing that. That includes not just Reagan, but those who follow him, such as W., Cheney, Rove, Rumsfeld, Boehner, Ca
  • Is there any sort of rule of thumb for measuring power consumption - i.e., X amount of processing uses Y blocks of power? Is there a theoretical minimum amount of energy required to perform certain types of calculations?
    • Re: (Score:2, Informative)

      by alreaud ( 2529304 )
      Yes, and it's actually very simple (SI units):

      P = C*V^2*f, where P is power in watts, C is capacitance in farads, V is voltage in volts, and f is frequency in hertz. C is kind of hard to measure, and is dynamic depending on processor load. A design value can be determined from processor data sheets.

      Power is only consumed in MOS transistors during transitions, following I = C*dv/dt, where C is the overall transistor capacitance to the power supply, in this instance. If dv is 0, i.e. at a stable logic
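
      Plugging illustrative values into that formula (the effective switched capacitance here is an assumption, not a datasheet figure) shows why lowering the core voltage pays off quadratically:

      ```python
      # Dynamic power P = C * V^2 * f with illustrative values.
      C = 1e-9   # effective switched capacitance, farads (assumed)
      f = 2e9    # clock frequency, hertz

      print(f"{C * 1.2**2 * f:.2f} W")  # 2.88 W at 1.2 V
      print(f"{C * 0.9**2 * f:.2f} W")  # 1.62 W at 0.9 V, ~44% less at the same clock
      ```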
  • The real concern is how to maintain mission-critical applications when the power grid fails. The only fallback, outside of tons of fuel (and even that won't last for decades), is a sustainable solution.
  • The machines already solved this problem in the fictional world.
  • Everyone who checks in at a hospital should have their electrical energy harvested! That way we can pay for healthcare, and all the fancy computers, electronic medical records, etc. that are needed in hospitals these days. Even though mortality has not been reduced by any of these measures.

"Someone's been mean to you! Tell me who it is, so I can punch him tastefully." -- Ralph Bakshi's Mighty Mouse

Working...