
Samsung Develops Power-Sipping DDR4 Memory

Alex writes with this excerpt from TechSpot: "Samsung Electronics has announced that it completed development of the industry's first DDR4 DRAM module last month, using 30nm class process technology, and provided 1.2V 2GB DDR4 unbuffered dual in-line memory modules (UDIMM) to a controller maker for testing. The new DDR4 DRAM module can achieve data transfer rates of 2.133Gbps at 1.2V, compared to 1.35V and 1.5V DDR3 DRAM at an equivalent 30nm-class process technology, with speeds of up to 1.6Gbps. In a notebook, the DDR4 module reduces power consumption by 40 percent compared to a 1.5V DDR3 module. The module makes use of Pseudo Open Drain (POD) technology, which allows DDR4 DRAM to consume just half the electric current of DDR3 when reading and writing data."


  • by HockeyPuck ( 141947 ) on Wednesday January 05, 2011 @02:15AM (#34762892)

    In a typical notebook, how much power does memory actually consume compared to other components (CPU, HD, screen, wireless transmitter etc..)?

    • Leave your laptop in Sleep mode. It will last many hours, but it will eventually get critically low and shut down.

    • by Kjella ( 173770 )

      Good question. I don't have a direct answer, but I found a desktop memory review here [legitreviews.com]. Given that they found a 7W difference under full load, and that power scales with voltage squared ((1.64^2) / (1.34^2) ≈ 1.5), you can estimate the memory draws about 14W at the low voltage and 21W at the high voltage (quick sketch of the arithmetic below).

      Of course, in a laptop you'll have rather different low-power RAM, just as you have low-power CPUs, but I'm guesstimating that yes, it's significant. If you have a CPU that draws 30W at max, the RAM probably draws 5-10W too. Divide everything
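A minimal sketch of that arithmetic, assuming only the figures quoted above (the 7 W full-load delta and the 1.64 V / 1.34 V settings from the linked review) and that power scales roughly with voltage squared:

```python
# Back-of-the-envelope check of the parent's estimate.
# Assumed inputs: a measured 7 W full-load difference between 1.64 V and 1.34 V
# settings (figures quoted from the linked review), and power ~ V^2.
v_high, v_low = 1.64, 1.34
measured_diff_w = 7.0

ratio = (v_high ** 2) / (v_low ** 2)        # ~1.5
# If p_high = ratio * p_low and p_high - p_low = 7 W, then:
p_low = measured_diff_w / (ratio - 1)       # ~14 W at 1.34 V
p_high = p_low * ratio                      # ~21 W at 1.64 V

print(f"scaling ratio: {ratio:.2f}")
print(f"estimated RAM draw: {p_low:.0f} W @ {v_low} V, {p_high:.0f} W @ {v_high} V")
```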

      • How about this: you can double the amount of RAM for the same power budget. Batteries are not getting better as fast as our appetite for more RAM is growing.

    • by c0lo ( 1497653 )
      Interesting question. Wild guess: up to 5-8 watts?

      Some data on what other components [codinghorror.com] consume. Not very rigorously determined, but good enough to get an idea.

      Some other data on how much switching DDR3 RAM from 1.5V to 1.35V to 1.25V [tomshardware.com] impacts power consumption at idle (scroll to the bottom of the page: about 1W).

      The RAM power consumption will, though, have an impact on how long you can keep a laptop/notebook idle (little CPU, no HDD or LCD, no graphics) before it shuts down and you lose everything.

      • by satuon ( 1822492 )

        So this raises another interesting question - do laptops with less RAM have better battery life and if so, should people refrain from getting laptops with more than 2 GB? That is more than enough for normal usage. Maybe bleeding edge games would require more.

        • It depends on the workload. No modern OS that I'm aware of powers down the RAM, so if you only ever use 2GB, a machine with 2GB installed will probably draw less power than one with 8GB. If you have more RAM, however, the OS will cache more things. That means fewer disk reads, which reduces power usage. It may also reduce the CPU's power draw, because the CPU may not be able to drop into a low-power state while it is handling a page fault.
    • Hardly any. I remember skimming through a study of component power consumption and IIRC memory topped out at something like 5% total draw. So memory with half the power draw will buy you about 10 minutes.
      Whoopdeefuckingdoo.

      • Re: (Score:3, Insightful)

        by galvanash ( 631838 )

        Hardly any. I remember skimming through a study of component power consumption and IIRC memory topped out at something like 5% total draw. So memory with half the power draw will buy you about 10 minutes. Whoopdeefuckingdoo.

        That is with the display turned on... Most portable devices spend a considerable amount of time with the display turned off to conserve power. To put this into perspective, on an HTC Desire Android device with an AMOLED display the screen uses about 50%-60% of total power; memory is p

        • Comment removed based on user account deletion
          • by MBCook ( 132727 )

            Turning off DIMMs can be done. Some high-end servers will let you add/remove RAM on the fly; it's basically the same thing, except the RAM isn't physically going anywhere.

            There are two problems you'd run into. First you'd have to move everything to the DIMMs that you're keeping on. This means that all the pointers would change, so you'd have to have a way to keep track of that. If you moved things and then moved them back when the power came 'back on', that may suffice.

            The second is how much stuff is in memory

            • First you'd have to move everything to the DIMMs that you're keeping on. This means that all the pointers would change, so you'd have to have a way to keep track of that.

              Fortunately, if you come from some time after the 1980s, this is done already. Nothing except the kernel sees physical memory addresses; everything else sees virtual addresses. These are mapped to physical addresses by the MMU / page tables, and the mapping often changes over an app's lifetime (e.g. when a page is swapped out and then back in, it is not always returned to the same physical page).
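As an illustration of that indirection (nothing DDR4-specific), Linux exposes the current virtual-to-physical mapping of a process through /proc/self/pagemap. A minimal, Linux-only sketch; note that on recent kernels the frame number reads back as 0 unless you run it as root:

```python
# Look up which physical page frame currently backs a virtual address,
# using /proc/self/pagemap (one little-endian 64-bit entry per virtual page).
# Linux-only; PFN bits are reported as 0 for unprivileged processes on
# recent kernels, so run as root to see real frame numbers.
import ctypes
import os
import struct

PAGE_SIZE = os.sysconf("SC_PAGE_SIZE")

def physical_frame(virtual_addr):
    """Return (present, page_frame_number) for the page containing virtual_addr."""
    with open("/proc/self/pagemap", "rb") as f:
        f.seek((virtual_addr // PAGE_SIZE) * 8)
        entry, = struct.unpack("<Q", f.read(8))
    present = bool(entry & (1 << 63))       # bit 63: page is present in RAM
    pfn = entry & ((1 << 55) - 1)           # bits 0-54: physical frame number
    return present, pfn

# Touch a buffer so its page is definitely mapped, then look it up.
buf = ctypes.create_string_buffer(b"x" * PAGE_SIZE)
addr = ctypes.addressof(buf)
print("virtual 0x%x -> present=%s, frame=%d" % (addr, *physical_frame(addr)))
```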

          • WTH? Another genius with a billion dollar idea? Sorry, but no...

            There's no such thing as unused memory, and hasn't been for a couple decades. Any portions of memory not allocated by programs are used to cache data read from the disk...

            So shutting them off will mildly reduce power consumption, right up until one bit of that cached data is needed... then, hitting up the disk will consume more power than you ever saved, and as an added bonus, you've taken a performance hit, too.

            SSDs don't change that fact,

    • Every little bit (or watt, as the case may be) counts.
      We should now 'encourage' the vendors of CPUs, HDs, screens... to reduce power consumption too. The easy way out for each of them is to say: "Meh, my component uses way less power than everything else." Then you end up with a laptop power adapter that is larger than the laptop itself and lets you boil water for coffee.
    • Power consumption = heat that needs to be removed. Heat becomes a bigger problem the smaller the components are. Reduce the amount of heat produced and you've just made it easier to produce even smaller components.

    • In my 4-year-old desktop (E6300 processor; 4x1GB RAM; nVidia 9500 GT; 2x HDD), total power draw ranges between 150 watts and 220 watts (full load), as measured by an in-line watt meter.

      Laptops tend to use quite a bit less power; 15-60 watts is what I found from some quick googling, and the last laptop I had drew around 18 watts as measured with laptop-tools under Linux.

      This [interfacebus.com] indicates 10 watts for DDR2, vs. 4 watts for DDR3, vs. (presumably) 4 * 0.6 = 2.4 watts for DDR4. Not sure whether that link is accurate, or whet
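A rough sketch of what that per-generation difference could mean for runtime. The battery capacity and average system draw below are assumptions for illustration, not measurements:

```python
# Back-of-the-envelope battery-life impact of a ~4 W (DDR3) vs ~2.4 W (DDR4)
# memory subsystem.  The 50 Wh battery and 20 W average draw are assumptions.
battery_wh = 50.0
avg_draw_w = 20.0
ddr3_w = 4.0
ddr4_w = 4.0 * 0.6            # the "40 percent less" figure assumed above

hours_ddr3 = battery_wh / avg_draw_w
hours_ddr4 = battery_wh / (avg_draw_w - (ddr3_w - ddr4_w))

print(f"DDR3: {hours_ddr3:.2f} h, DDR4: {hours_ddr4:.2f} h, "
      f"gain: {(hours_ddr4 - hours_ddr3) * 60:.0f} min")
```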

  • by Anonymous Coward

    I just bought my Sandy Bridge rig, now they announce this?! Ffffffuuuuuuuuu-

  • Good news (Score:5, Interesting)

    by del_diablo ( 1747634 ) on Wednesday January 05, 2011 @02:17AM (#34762904)

    Now, let's pair this with an ARM core and hope we get a reasonable hack that gives us wireless that does not eat power like the current chips do...
    Then let's enjoy our ARM-puter: portable, powerful, and with battery for more than a day of use.

    • by Khyber ( 864651 )

      Not happening. Higher-frequency transmissions need more power to go further. Lower frequencies don't carry as much data, so there's a huge trade-off in play.

      • But why hasn't somebody developed some low-frequency wireless that is suitable for internet speeds? (Other than the obvious licensing restrictions?) Sure, Wi-Fi is great and getting 100+ Mbps is awesome and all, but most US residents really only use Wi-Fi for the internet, not for transferring huge files back and forth. Someone needs to make something that's limited to 20 Mbps or so, but at a lower frequency for increased range/penetration.

        • by Khyber ( 864651 )

          Lower frequencies don't carry as much data, as I stated before.

          Also, bear in mind that for each extra device connected to the wireless access point, there's less available bandwidth overall. That 54Mbps rating for 802.11g is a pooled rating, meaning every time you add a new client to the AP you lose some bandwidth - not every single person gets 54Mbps of throughput unless you're the only person using that access point/that channel.

          Also, most of the lower frequencies are already allocated for certain services by

          • How many people seriously use 100% of that 54Mbps of wireless... constantly? It's just like ISPs overselling bandwidth; most of the time it goes underutilized. It would be fine to have 20 devices on a 20Mbps Wi-Fi connection... they aren't going to saturate it in a normal household.

        • by Korin43 ( 881732 )

          But why hasn't somebody developed some low-frequency wireless that is suitable for internet speeds?

          Because lower frequency means slower data transfer. They're directly related. And 20 Mbps is seriously overestimating the current speed of wireless technology.
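For a rough sense of the trade-off being argued here, the usual yardstick is the Shannon capacity C = B * log2(1 + SNR): throughput is capped by channel bandwidth, and the low-frequency bands simply have much less spectrum to allocate. The channel widths and SNR below are illustrative assumptions only:

```python
# Shannon capacity: the hard upper bound on throughput for a given channel
# bandwidth and signal-to-noise ratio.  Numbers are illustrative assumptions.
from math import log2

def capacity_mbps(bandwidth_hz, snr_db):
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * log2(1 + snr) / 1e6

print(f"{capacity_mbps(20e6, 20):.0f} Mbps  (20 MHz channel, a Wi-Fi-sized slice)")
print(f"{capacity_mbps(1e6, 20):.1f} Mbps   (1 MHz channel, what a narrow low band might offer)")
```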

    • by Anonymous Coward

      Unfortunately, in that case, the bit that eats power like the current ones is the wireless bit itself, and that's not going to change. To quote a famous TV engineer, we "cannae change the laws of physics". What we need is better battery technology... which is being worked on as well, so, hey, maybe someday.
       

      • Can't we bribe god for faster light or something instead? :(
        But has battery tech evolved at all in the last 10 years? I think IBM ThinkPads had 6-7 hours of battery life back in the glory days; the "improvements" in battery tech so far seem to consist of getting rid of the battery-decay problem, not adding more capacity.

        • Can't we bribe god...

          He seems to be against that sort of thing.

          Proverbs 17:23 The wicked accepts a bribe in secret to pervert the ways of justice.
          Ecclesiastes 7:7 Surely oppression drives the wise into madness, and a bribe corrupts the heart.
          Isaiah 5:23 Who acquit the guilty for a bribe, and deprive the innocent of his right!

          It goes on and on...

          Although prayer might work, I wouldn't hold my breath.

    • Or just connect smarter - my N900 connects to wireless networks on demand and auto-disconnects when not in use. I can get 1.5 - 2 days of light usage off its battery.

    • Diablo, you get it, but it seems no one else does. You could make a computer comparable to one from a few years back with incredible battery life, but no one does. Marketing, and to some extent the idiocy of customers who bought into that marketing. I find, for example, that with decent code an old Pentium 3 is just fine, and nowadays you could make a very low-power machine with equivalent computational power, such that batteries would last quite a while.

      But it wouldn't allow people to belie

    • Bad news... The screen is probably consuming more power than all other components combined, even if the display is off the majority of the time. This may not be true if you use your device as an MP3 player, but otherwise display power consumption overwhelmingly dominates...

  • Most power goes to the CPU, HDD and WiFi devices. The power consumption of the RAM is minimal.
  • "Power Sipping" (Score:5, Insightful)

    by Aboroth ( 1841308 ) on Wednesday January 05, 2011 @02:46AM (#34762986)
    Does anyone else besides me hate that term?
    • Yes. (Well "dislike" instead of "hate"--hate is such a harsh term.) And I dislike it just as much as these:

      "Samsung has been actively supporting the IT industry with our green memory initiative by coming up with eco-friendly, innovative memory products providing higher performance and power efficiency every year," Dong Soo Jun, Samsung's president of the memory division, said in a statement.

      Add "ecosystem" as well.

      • In what context do you object to "ecosystem"? What word would you prefer we use for the system of biological interdependency?

        • There's only one appropriate usage and that is (as you indicated): biological.
          These are examples of inappropriate usage, taken from the first page of search results querying /. for 'ecosystem':
          • Java ecosystem
          • FLOSS ecosystem
          • to make sure that what we do maximizes innovation and investment across the ecosystem
          • 'Open Web App ecosystem.'
          • Microsoft Server ecosystem
          • "Drools (sometimes called 'JBoss Rules') is a Business Rules Engine and supporting ecosystem"
          • SDK ecosystem
          • PostgreSQL ecosystem
          • Internet ecosystem
          • what if Oracle bought
    • Power Sipping belongs in the same family as Speed Walking.

      OTOH, sounds like someone might have a case of Powerthirst [collegehumor.com].

    • by jez9999 ( 618189 )

      Yep.

      And 'renewable energy'. What's renewable about it?

    • Don't anthropomorphize DRAM devices - they hate that.

  • by PatPending ( 953482 ) on Wednesday January 05, 2011 @02:48AM (#34763000)

    PatPending (talking to friend on phone during a bash help session): It's called Pseudo Open Drain (POD) technology
    Friend: Okay, I'll try that...
    Friend(typing): sudo open drain
    Friend: Argh! I hate this command line bullshit!
  • What's up with the pseudo-open drain? Is that new and exciting or just marketing speak? I know what open drain is, but how do you have a "pseudo" open drain?

    • POD explained (Score:5, Informative)

      by overshoot ( 39700 ) on Wednesday January 05, 2011 @05:42AM (#34763506)
      In a classical open-drain connection, the active device pulls down and the bus termination pulls up. For a pure transmission line, this works just fine -- the current wave from the turn-off of the driver is effectively identical to the current wave from the turn-on. In practice, open-drain uses more static current than a push-pull driver against a center termination, and since the line isn't a pure transmission line (lumped capacitances, stubs), the rising edge is slower than the falling edge.

      POD addresses this by actively pulling up at the beginning of a rising edge, then releasing the pullup to avoid bus contention later. This reduces the termination current (at some cost in impedance mismatch, but it's already a sloppy line) and improves switching symmetry.
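A highly idealized sketch of the static-current difference between a centre-terminated (SSTL-style, DDR3) line and a VDDQ-terminated (POD-style, DDR4) line. The resistor values are assumptions chosen only to make the numbers concrete, not JEDEC figures:

```python
# Idealized static line current per data pin.
#  - SSTL (DDR3): terminated to VTT = VDDQ/2, so roughly the same current flows
#    whether the pin is driven high or low.
#  - POD (DDR4):  terminated to VDDQ, so a driven-high pin draws (ideally) no
#    static current; only driven-low pins do.
# Resistor values below are assumptions for illustration only.
def line_current_ma(v_across, r_term=40.0, r_on=34.0):
    return v_across / (r_term + r_on) * 1e3

vddq_ddr3, vddq_ddr4 = 1.5, 1.2
print(f"SSTL driven high / low: {line_current_ma(vddq_ddr3 / 2):.1f} / "
      f"{line_current_ma(vddq_ddr3 / 2):.1f} mA")
print(f"POD  driven high / low: 0.0 / {line_current_ma(vddq_ddr4):.1f} mA")
```

Note that a driven-low POD pin actually draws more than an SSTL pin, which matches the parent's point about classical open drain; the net saving also leans on the lower 1.2 V rail and, if I recall the standard correctly, on features like data bus inversion keeping more lines high.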

  • One thing not mentioned in the article or summary is whether or not this technology reduces standby power consumption in DRAM. Under normal use, especially if you have a lot of memory in your system, the standby power consumption is going to matter as much as read/write, if not more.
    • Have the RAM dump into a non-volatile solid-state chip (e.g. flash)? It would allow for very fast power on/off while keeping the power draw really low.
      • Take a look at the write speeds on cheap flash and realize that even laptops are coming with 4GB of RAM these days before you make a suggestion like this.

    • Nope (Score:4, Informative)

      by overshoot ( 39700 ) on Wednesday January 05, 2011 @05:47AM (#34763514)

      One thing not mentioned in the article or summary is whether or not this technology reduces standby power consumption in DRAM.

      POD by itself doesn't reduce power consumption in standby, since both POD and SSTL turn off the bus drivers then. The older POD technologies from the GDDR families use Thevenin termination, though, so the terminators draw a lot of unnecessary current when they're enabled (as distinct from the result with a dedicated termination supply).

      If you really want to know how this all works, JEDEC [jedec.org] has the DDR4 standard available for free download. Follow the "free standards" link.

  • Meh (Score:5, Insightful)

    by lennier1 ( 264730 ) on Wednesday January 05, 2011 @03:20AM (#34763130)

    I'd rather have them finally mass-produce 8 and 16 GB modules for the desktop market.

    • Sir, you have my vote.
      Actually, less swap means less HDD churning, so the power consumption might be the same even with the addition of more RAM.
    • As someone who regularly edits images larger than 100 megapixels with multiple layers at 16bpp, let me just ask: why the heck would you want that much on your desktop?
      • 3D CAD. My colleagues regularly run into RAM limits with 4 GB.

        • The mass-produced desktop market doesn't run 3D CAD. High-end workstations and servers hardly count and wouldn't be a major driver of the market.

          Sure, there are plenty of applications for loads of RAM (multiple huge VMs, 3D CAD, gigapixel photos), but that is a niche. What is a consumer driver for large cheap supplies of RAM?
          • The mass-produced desktop market doesn't run 3D CAD. High-end workstations and servers hardly count and wouldn't be a major driver of the market.

            Any recent gaming box can run 16GB of RAM, though admittedly those aren't ECC setups. The high end ain't so high these days.

            What is a consumer driver for large cheap supplies of RAM?

            Games, mostly. That and Photoshop. I suspect that it's a wash between the billion-odd computers in circulation and the ones in datacenters that drive chip prices these days.

          • Most workstations are still based on desktop technology. Take a "gamer" mainboard, put in a powerful CPU and the maximum amount of RAM you can get your hands on, garnish it with twin GFX cards (or a decent QuadroFX card), and desktop components become a decent graphics workstation.
            Servers are a different story, but if you look at components like Xeons and ECC-capable RAM, many of them are basically just more specialised versions of what consumers can buy.

            The smaller demand now is mainly a niche which can op

            • Gaming is not really all that memory-intensive, and we are far more limited by processing power and graphics memory. Sure, workstations are a consumer market, but they are still a very small one. Sure, a drafting company will be full of workstations, but the word "workstation" still implies a very special line of work, and not general use for the desktop market.
          • by geekoid ( 135745 )

            Better gaming? Richer environments? More stuff open? Quicker video editing? Processing movies on the fly? 3D processing?

            But hey, you don't need it, so clearly no one does.

              Better gaming? Richer environments? More stuff open? Quicker video editing? Processing movies on the fly? 3D processing?

              But hey, you don't need it, so clearly no one does.

              All you're saying is better, better, better, but I ask: how much better, practically?

              - The most taxing games on my system take less than 1GB of RAM.
              - Same with environments; we are far more limited by CPU and video processing capabilities.
              - How much more stuff? Outlook, 10 instances of Word, 5 Excel files, an image editor? I do that regularly on 2GB of RAM. Or are you talking about multiple instances of insanely complicated apps such as CAD or commercial software, which are not in the realm of the normal user?
              - Quicker vi

        • I recently bought 8GB of RAM for my laptop for $170. If 4GB is a barrier at all, it's because you need the 64-bit version of your OS.
      • I'm going from 4GB to 8GB because I'm running low when running VMs and apps at the same time, and I am allergic to swap (so I have none). It's not hard to imagine someone using four times as much as I'm using; by my own standards I'm not doing anything all that amazing any more.

        Sometimes I want to have a VM or two open and edit an image at the same time while a video encodes in the background. And that's just on three cores! My next desktop system will probably have at least eight, and even this system will

        • You would have to admit though that running two VMs while encoding a movie and editing an image is not something the average consumer would do. And even leaving the average consumer aside, do you think there would be enough people with your (if I may say so) insane and highly taxing use of a computer to justify mass producing a product?

          I mean, compare that to a real catalyst for memory usage, such as the release of Vista, where every new computer sold suddenly gobbled up 1.5GB of RAM just displaying t
          • You would have to admit though that running two VMs while encoding a movie and editing an image is not something the average consumer would do.

            Actually, I think it's probably getting to where this kind of thing will be more common. The average consumer is getting more interested in video editing etc. Also many Windows 7 users have a virtual machine manager installed already to provide XP Mode, so it's not much of a jump to believe they might use a VM appliance... or to think they're already running a VM at least part of the time.

            do you think there would be enough people with your (if I may say so) insane and highly taxing use of a computer to justify mass producing a product?

            Not really, but amusingly, the original suggestion was to produce them "for the server market"; I was only pointing out

            • Actually, I think it's probably getting to where this kind of thing will be more common. The average consumer is getting more interested in video editing etc. Also many Windows 7 users have a virtual machine manager installed already to provide XP Mode, so it's not much of a jump to believe they might use a VM appliance... or to think they're already running a VM at least part of the time.

              Bolded the key bit. I was talking about the right here and now; there's no doubt that in the future we will be. When people stop editing 30-second clips from their mobiles and start editing 1080p videos from their cameras, that will be the killer app for more RAM. As for the VM thing, I run a few programs using that mode. It adds a few hundred MB of overhead to the system, once, and you can run as many apps inside the mode as you wish without the overhead increasing too much. XP mode is (for a VM anyway) quite light on mem

                • Bolded the key bit. I was talking about the right here and now; there's no doubt that in the future we will be.

                You don't get the future applications you want until people have the RAM to run them. Catch-22.

                XP mode is (for a VM anyway) quite light on memory usage.

                It's just the beginning. And it's not magic. You can gobble up plenty of memory with it. The performance is poor compared to VMware, which doesn't use any more memory.

                  The catch-22 is very right, but look at it in the context of the reply to the OP. He wishes for mass-produced 8GB and 16GB sticks. We currently have 4GB sticks costing pennies, and realistically the 6GB and 8GB sticks aren't too far off anyway. Your average motherboard has 3 slots, so...

                  So mass-produced chips can already get you 12GB on your average motherboard. And that's not taking into account the fact that 6GB and 8GB sticks are already available for the desktop market.

                  So let me ask again why would
        • I'm not judging here, just wondering: have you considered having more than one box? If money is no object then fair enough, but it sounds like you're shelling out a lot for top-notch hardware to do lots of mid-level tasks, when you could distribute the work across a KVM setup. You'd save a bundle in hardware, reduce your VM overheads, and introduce some healthy redundancy for when that very expensive rig does something smoky and difficult to diagnose.
    • by rednip ( 186217 )
      As I'm hungry, I'd rather have a ham sandwich than mass-produced 8 and 16 GB modules. -- is that insightful too?

