Intel Gets Serious With Solar-powered CPU Tech 74

angry tapir writes "Intel's experimental solar-powered processor may have started off as a fun project, but the chip maker is now looking to extend the technology to hardware such as graphics processors, memory and floating point units. Intel last year showed the low-power processor — charged only by the light from a reading lamp — running Windows and Linux PCs. Intel is expected to share further details about the processor, which is code-named Claremont, at the International Solid-State Circuits Conference in San Francisco. The company is also expected to reveal information about efforts to integrate wireless capabilities into Atom chips for mobile devices."
This discussion has been archived. No new comments can be posted.

  • by icebike ( 68054 ) * on Sunday February 19, 2012 @09:12PM (#39096303)

    Yes, Intel did demo a solar cell powering a Pentium, but that was merely to make a point about the efficiencies of near-threshold voltage (NTV) CPUs. They have no particular focus on solar-powered processors.

    Near-threshold voltage (NTV) CPUs are the focus of Intel's research here.
    NTV transistors can switch at voltages just above the threshold for the device's powered state, and CPUs made of these can idle along at extremely low voltage doing real work (slowly), or they can ramp up the voltage and work much faster.

    The Register has a much better explanation of this technology [theregister.co.uk] than the linked article.

    The idea is to have devices run at low voltages and power consumption rates that would be akin to a sleep mode in today's chips. And NTV techniques are not just limited to processors used in hand-held devices like smartphones and tablets, but to everything all the way up to exascale supercomputers, says Rattner. The important thing is that NTV techniques allow a chip's performance and power to scale as voltage scales up and down, and to do so across a wide dynamic range.

    Also a good summary here [theregister.co.uk]:

    Marketing spin aside, the "near-threshold voltage" chip is quite an achievement. Intel first revealed in March 2010 that it had a prototype chip running at such low voltages, but Claremont's creators took that technology and baked it into a full IA architecture processor. Based on a Pentium core, Claremont can not only be throttled down to "within a couple of hundred millivolts of the threshold voltage of the transistors," said Intel engineer Sriram Vangal, who demoed the chip during Rattner's turn, but – equally important – it also has a high dynamic range that allows it to be cranked up to deliver ten times the low-power performance by increasing the voltage.

    Once again, the Register does a better job of reporting than Techworld.
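The dynamic-range argument above can be sanity-checked with a rough sketch. Dynamic CMOS switching power scales roughly as P ≈ C·V²·f, so dropping voltage pays off quadratically. The operating points below are assumptions pulled from figures quoted elsewhere in this thread (0.28V @ 3MHz near threshold, 1.2V @ 1GHz nominal); the capacitance term is a placeholder, so only the ratios mean anything:

```python
# Back-of-the-envelope dynamic-power model: P ~ C * V^2 * f.
# C is an arbitrary placeholder constant; only the ratios are meaningful.

def dynamic_power(volts, freq_hz, capacitance=1.0):
    """Relative dynamic switching power for one operating point."""
    return capacitance * volts ** 2 * freq_hz

ntv = dynamic_power(0.28, 3e6)   # assumed near-threshold point: 0.28 V @ 3 MHz
fast = dynamic_power(1.2, 1e9)   # assumed nominal point: 1.2 V @ 1 GHz

print(f"power ratio (nominal/NTV): {fast / ntv:,.0f}x")
print(f"speed ratio (nominal/NTV): {1e9 / 3e6:,.0f}x")
print(f"energy per cycle ratio:    {(1.2 / 0.28) ** 2:.1f}x")
```

On these assumed numbers the chip burns thousands of times less power near threshold, and each cycle costs roughly 18x less energy, which is the wide dynamic range the Register quote describes.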

    • So it's essentially a throttle? You can use however little power you have time for? So my netbook can render a big Blender animation on a single battery charge, I'd just have to wait for a few weeks? Sounds very useful indeed.
      • Re: (Score:2, Informative)

        by froggymana ( 1896008 )

        But you also forgot that your processor is running 10x slower. That Blender render will then also take roughly 10x longer, so no, you don't get a magical boost in processing power.

        • by Anonymous Coward

          I don't think he forgot. He said:

          So my netbook can render a big Blender animation on a single battery charge, I'd just have to wait for a few weeks

        • by fatphil ( 181876 )
          It's generally more efficient to work hard, and then rest more, than to work
          slowly. This is sometimes called "race to sleep", "race to idle", or similar.

          Your final word "power" is inappropriate, as power is a rate over time. Something like "capacity" would have been a better word.
      • by icebike ( 68054 ) * on Sunday February 19, 2012 @10:53PM (#39096743)

        Essentially a throttle, but more likely a demand based system, such that non-busy processors can run at the lowest possible speed and voltage, and when work stacks up, it ramps up.

        Great for the smart phone in your pocket which has nothing to do for hours at a time other than check the email and listen for calls.
        Since its screen is off, you really don't care how fast it does those things as long as they are just barely fast enough.

        There is a great deal of "stare time" that happens when people look at computers, and the processors are spinning away all the time while you are reading this. They could just as well drop to an extremely low power state, and wait for a mouse move, finger tap, or something else.

        This much we've been doing all along, for the last 20 years. But power consumption still remained high, because even simple tasks like checking the clock to see if it's time to increment that digital time readout took processing power, and historically any use of the processor kept it awake at something like full power for that task.

        Now, those tasks can be performed at extremely low power, without ramping up the speed. Only when the processor can't meet the demand would the system increase the voltage and speed up the chip.
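A demand-based governor of the kind described above could be sketched like this. The operating-point ladder and the utilization thresholds are invented for illustration; a real governor (e.g. Linux cpufreq's "ondemand") works with hardware P-states rather than a hand-rolled table:

```python
# Toy sketch of a demand-driven voltage/frequency governor.
# Operating points and thresholds are hypothetical, loosely based on
# NTV figures quoted in this thread.

OPERATING_POINTS = [  # (label, volts, MHz)
    ("near-threshold", 0.28, 3),
    ("low", 0.6, 200),
    ("nominal", 1.2, 1000),
]

def pick_operating_point(load_fraction):
    """Step up the ladder only when utilization gets high."""
    if load_fraction < 0.05:
        return OPERATING_POINTS[0]
    if load_fraction < 0.7:
        return OPERATING_POINTS[1]
    return OPERATING_POINTS[2]

for load in (0.01, 0.3, 0.95):
    label, volts, mhz = pick_operating_point(load)
    print(f"load {load:>4.0%} -> {label}: {volts} V @ {mhz} MHz")
```

The idea matches the comment: a phone waiting for mail sits at the bottom rung indefinitely, and only when work stacks up does the governor spend the voltage to go fast.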

        • Another method which (I assume) addresses the problem is running a more full-featured BIOS [betanews.com] that could operate the "basic" applications like web browsers and skype without having to load a full-fledged OS.

          You have essentially instant-on access to the most basic popular applications, and then boot to a real operating system when you have to edit that film or play that game.

          Toshibas (and probably others) had the ability to play audio CDs from within the BIOS way back in 2003 when I was fixing laptops fo
          • by Belial6 ( 794905 )
            ASUS does this with their ExpressGate feature. http://expressgate.asus.com/ [asus.com]
          • And some Dell Latitude models even had a separate low-power CPU for that task.

            Power up the ARM and boot into the in-BIOS Linux for basic web etc.
            Power up the Intel and boot into the full installed OS (windows or whatever) for a full environment (but power hungry)

        • Until Intel makes something useful of it and ships it at retail, it may as well be that "single atom transistor" tech from a few stories ago, or the holographic storage from yesterday, or cold fusion.
      • So you can run your CPU 4x slower while you're not doing much at 16x less power.

        Source: imaginary approximate numbers based on a 4x voltage decrease. The article mentions 280mV and 1.2V.
    • It's like a hybrid running off batteries in low-power mode, with the gas engine kicking in at high power.

      All we need now is "regenerative braking" using a thermocouple to harness all that heat!
  • It's a Race (Score:5, Insightful)

    by GLMDesigns ( 2044134 ) on Sunday February 19, 2012 @09:17PM (#39096335)
    So many people are worried about how technological advances are ruining the environment. What many often forget is that technology is also the answer (unless you want to go back to a hunter-gatherer lifestyle, and I hear that the drum/smoke-signal bandwidth really sucks; it takes forever to download the latest movie).

    We're in a race - computational speed, new materials, new efficiencies versus the rate at which we're polluting the environment. Many things make me optimistic: photovoltaic paints for one - and now processing power so efficient that it can be solar powered. Wow. We may win this race after all.

    • by KazW ( 1136177 ) *

      So many people are worried about how technological advances are ruining the environment. What many often forget is that technology is also the answer (unless you want to go back to a hunter-gatherer lifestyle, and I hear that the drum/smoke-signal bandwidth really sucks; it takes forever to download the latest movie).

      We're in a race - computational speed, new materials, new efficiencies versus the rate at which we're polluting the environment. Many things make me optimistic: photovoltaic paints for one - and now processing power so efficient that it can be solar powered. Wow. We may win this race after all.

      You insensitive clod! Smoke signals release carbon into the atmosphere!

    • Re: (Score:3, Interesting)

      When I consider that the human brain is many orders of magnitude more powerful than any electronic computer, and uses only a few hundred calories a day, it makes me realize that our electronic computers have a huge potential for improvement in both energy efficiency and power.

      • Oh? How many floating-point operations a second (seven-digit mantissa plus two-digit exponent) can your brain do? 0.01? The brain isn't a digital computer; rather, it's some kind of funky, kludgy signal-processing system. It's not a question of less or more power, rather a different kind of power.
      • Re:It's a Race (Score:5, Interesting)

        by ZankerH ( 1401751 ) on Sunday February 19, 2012 @10:11PM (#39096595)
        How many FLOPS does that get you at peak performance?

        Saying the human brain is "more powerful" makes no sense by itself. It's better at certain tasks (like pattern recognition, jumping to conclusions and holding contradictory beliefs) because it's hard-wired to do them. When it has to use general-purpose computing (like when you try to do floating-point math), you'll find most computers a great deal faster and more efficient.
        • When it has to use general-purpose computing (like when you try to do floating-point math), you'll find most computers a great deal faster and more efficient.

          True, there are "sweet spots" such as this where computers have an advantage over humans. However, as the math gets more advanced, computers rapidly start losing steam. Humans can prove advanced theorems such as Fermat's Last Theorem that computers can't even begin to touch, even with state of the art automated theorem provers.

          • Re:It's a Race (Score:5, Interesting)

            by TheCouchPotatoFamine ( 628797 ) on Monday February 20, 2012 @12:50AM (#39097223)

            The basic rule is that neural networks can solve problems without knowing *how* precisely, and digital computers can do anything if you know exactly how. See the difference? You can't compare brains and computers. They are good at diametrically opposed things and always will be. That's the law (of physics and computation).

            • A standard meme in the A.I. circles is that Neural Nets are always the second best way to solve a problem.
        • About 100 teraflops, according to what I could dig up on Google. By comparison, the highest end single GPUs can do about 2.5 teraflops (and at raw computation they destroy general purpose CPUs), and those generally consume a few hundred watts. Obviously supercomputers are a lot faster, but by input energy, the human brain is much faster than a computer. Our minds just aren't designed to handle numerical calculations, but they certainly could outperform a computer. Granted, someone whose brain was wired to do that would probably be completely non-functional, since we need so much power for our other activities, but it is certainly possible.
          • Granted, someone whose brain was wired to do that would probably be completely non-functional, since we need so much power for our other activities, but it is certainly possible.

            Indeed you are right at this point. I once saw a documentary about savants [wikipedia.org] and this is one of the ideas you got. These people master skills like no one else could imagine. The most famous savant, Kim Peek [wikipedia.org] (the inspiration for the character of Dustin Hoffman in Rainman) had memorized thousands of books. Other played instruments flawlessly even after hearing a new piece just once. Other could draw Manhattan with impressive detail after seeing only one photo. The list goes on.
            The inabilities they had were qui

        • When it has to use general-purpose computing (like when you try to do floating-point math), you'll find most computers a great deal faster and more efficient.

          Is that true? I thought that the human brain was very good at all sorts of calculation, but that was hidden from consciousness. The computational power required to walk and chew gum at the same time is impressively high, no?

      • While it's accurate that HYBRID processing systems are certainly a bright spot in the future, it's amazing to me how many people totally fail to realize that neurons are analog and computers are digital. They solve problems in completely different ways and domains, and there are tasks suited to both but rarely at the same time. For instance, as mentioned in a sibling post, brains ain't gonna have FLOPS. More like FLOMS - bad jokes aside, digital computers are not going to be able to identify orthogonal pat

      • 1) The brain is more parallel and fuzzy than traditional CPUs, but "more powerful" is getting really blurry with today's machines.

        2) Remember that each "Calorie" is 1000 calories. You're going through at least hundreds of thousands, more likely millions, of calories of energy per day. All estimates of average human energy usage I've seen tend to be in the range of 200-300W. Though that's not just the brain, one can assume that a reasonable percentage of that is spent on it, and even the majority when sit
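The conversion in point 2 is easy to check. A quick unit conversion (1 dietary Calorie = 1 kcal = 4184 J) puts a typical 2000 kcal/day diet at roughly 100 W of average power, somewhat below the 200-300 W range cited above; the commonly cited ~20% brain share is assumed here:

```python
# Sanity-check conversion: daily food energy (kcal/day) -> average watts.
# 1 dietary Calorie = 1 kcal = 4184 J; 86400 seconds in a day.

JOULES_PER_KCAL = 4184.0
SECONDS_PER_DAY = 86400.0

def kcal_per_day_to_watts(kcal):
    """Average power corresponding to a daily energy intake."""
    return kcal * JOULES_PER_KCAL / SECONDS_PER_DAY

print(f"2000 kcal/day ~= {kcal_per_day_to_watts(2000):.0f} W average")
print(f"brain at ~20% of that ~= {0.2 * kcal_per_day_to_watts(2000):.0f} W")
```

That puts the brain around 20 W, which is the figure usually contrasted with the hundreds of watts a high-end GPU draws.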

    • by dak664 ( 1992350 )

      Sure, technology helps; why not use all the available tools. But executing business as usual with the expectation that technology will save you is not responsible. You don't have to increase pollution to accommodate new discoveries, so why make a race between technology and death? Einstein didn't have a computer, and he was no hunter-gatherer.

      • No, Einstein didn't have a computer. Technology amplifies your ability to do a task. It allows us to find research information faster, to disseminate important information FASTER. This is important. We're not only exponentially increasing the amount of information but also getting this information to people who can then use it to create something new. I personally don't think we are in *danger* of losing this race (except for wars over resources ballooning out of control when it combines with religious fervor).
  • by Theovon ( 109752 ) on Sunday February 19, 2012 @09:50PM (#39096513)

    I'm not sure what transistor geometry Claremont is manufactured at, but for really small transistors (e.g. 32nm), process variation is a serious problem, making it hard to scale voltage down that low. The results are unpredictable performance from die-to-die and within die and major reliability problems. Static RAMs are hit the hardest, because they use the smallest transistors. "http://www.cse.ohio-state.edu/~millerti/parichute-camera.pdf" is an example of a paper that explores the consequences of ultra-low voltage SRAMs and tries to solve it with forward error correction.

    • From the first linked article, with spelling errors intact:

      The CPU was made using the 32-nanometer processor

      RTFA.

  • by Lumpy ( 12016 ) on Sunday February 19, 2012 @09:55PM (#39096529) Homepage

    I was running a processor off of solar years ago, using the VIA C3 processors from 4 years ago. Glad to see Intel catching up to the rest of the industry.

  • by gstrickler ( 920733 ) on Sunday February 19, 2012 @10:29PM (#39096659)

    1.2V @ 1GHz is not power efficient at speed. Existing Core designs are running much faster at lower voltages. Based upon what they've demonstrated so far, it's useful for devices that need moderate speed on an occasional basis, but spend the majority of their time at idle.

    Now, if they can scale it up to 2-3GHz at around 1V and idle at less than 0.5V at a reduced freq, then it'll be something worth looking at for common applications.

    • by Kell Bengal ( 711123 ) on Sunday February 19, 2012 @11:09PM (#39096805)
      Imagine CPUs that run at 1.1V... you could power them with a potato!
    • by Khyber ( 864651 )

      Voltage doesn't mean shit as far as power efficiency without an amp-hours number with it, pal. Re-join the discussion when you can provide full numbers.

    • Did you miss the part where the CPU runs at 0.28V @ 3MHz? That's lower than any other digital logic IC in the world.
      • No, did you miss the part where I said these would only be useful where they're running at idle most of the time? That's an idle speed, and yes, it's extremely low voltage, and presumably ultra low power. But 3MHz isn't fast enough to do much work these days, so it'll have to ramp up the speed and voltage to do any useful work.

        • You mean like how your phone sits idle waiting for network activity or background tasks to run? Or when you're reading the content of that document? Or between the key presses when you're writing the document? Intel SpeedStep is already very quick at ramping up voltages and frequency.
  • Sunscreen (Score:2, Informative)

    by erick99 ( 743982 ) *
    I don't know about using my computer outside, especially in the summer when it's very hot and in the winter when it's very cold. I might be able to manage spring & fall, but not on windy days, as my papers would fly about.
  • They must really be intimidated by rise of ARM. I wonder where this will take us in terms of the evolution of embedded computers.
  • Who wants a solar powered cpu or gpu? Pretty sure it's a dark and dusty place in my computer, not the sort of place the sun shines.
    • by Jeremi ( 14640 )

      Who wants a solar powered cpu or gpu? Pretty sure it's a dark and dusty place in my computer, not the sort of place the sun shines.

      Cell phone towers, outdoor electronic signs, drones, satellites, navigation equipment, weather monitors, wi-fi base stations...

      Not that Intel particularly cares about solar powered devices; they just use solar power to make the point that their experimental CPU can operate on a very limited amount of power.

  • Every Linux box/laptop I have ever installed/used consumes more energy than its Windows counterpart. My Nokia phone battery lasted up to 6 days. My android phone, 6 hours.

    • by fatphil ( 181876 )
      Android is not Linux; Android is Linux plus a whole lot of crud that's waking up too often. Blame userspace, not Linux. Nokia's Linux devices have far better battery life (over 2 weeks on my N9).

      And in a desktop/laptop context, you also have to remember that MS have got NDAs with the hardware manufacturers and BIOS writers regarding power control, which prevents Linux from being as aggressive. Linux hackers are trying to reverse engineer these interfaces, clearly, but progress is slow. Have you run powertop?
  • I live off the grid. Solar is finally ready for prime time; I've been waiting since the seventies. http://lenny.com/ [lenny.com]
