
Intel To Support 128GB of DDR4 on Core 9th Gen Desktop Processors (anandtech.com) 172

Ian Cutress, writing for AnandTech: One of today's announcements threw up an interesting footnote worthy of further investigation. With its latest products, HP announced that their mainstream desktop platforms would be shipped with up to 32GB of memory, which was further expandable up to 128GB. Intel has confirmed to us, based on new memory entering the market, that there will be an adjustment to the memory support of the latest processors.

Normally mainstream processors only support 64GB, by virtue of two memory channels, two DIMMs per memory channel (2DPC), and the maximum size of a standard consumer UDIMM being 16GB of DDR4, meaning 4x16GB = 64GB. However, the launch of two different technologies, double-height double-capacity 32GB DDR4 modules from Zadak and G.Skill as well as new 16Gb DDR4 chips coming from Samsung, means that in a consumer system with four memory slots, up to 128GB might technically be possible.
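
A quick sanity check of that arithmetic (a back-of-envelope sketch in Python; the module sizes are the ones named above):

    channels = 2            # memory channels on mainstream desktop CPUs
    dimms_per_channel = 2   # 2DPC
    old_udimm_gb = 16       # largest standard consumer UDIMM until now
    new_udimm_gb = 32       # double-capacity / 16Gb-die modules

    print(channels * dimms_per_channel * old_udimm_gb)  # 64, the old ceiling
    print(channels * dimms_per_channel * new_udimm_gb)  # 128, the new ceiling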

  • I mean, beyond shits and giggles, is there anything out there that could use 128GB of RAM and even get close to that number?
    Or anything in the near future? Say, the next 5-10 years?


    Chrome doesn't count. That will eat up all the RAM anyways.
    • Re: Why? (Score:4, Insightful)

      by Anonymous Coward on Monday October 15, 2018 @12:26PM (#57480658)

      VMs obviously. Adobe as well.

    • by Anonymous Coward

      I run analyses that require ~30GB, so I'd love to be able to multi-thread that and get four instances going.

    • by Anonymous Coward

      640k ought to be enough for anybody.

      • It is for a 16-bit program. During that era most applications could fit in 64k of RAM, which left enough room for the OS and a fully operational BASIC interpreter.

        Then we wanted an 80-column display, then graphics, then we wanted the graphics with higher resolution and more colors...

    • Re:Why? (Score:4, Interesting)

      by Lord_Byron ( 13168 ) on Monday October 15, 2018 @12:28PM (#57480684)

      Certainly, with virtualization. Perhaps not mainstream, but my home server is creaking under its current memory limits if I have the Windows VMs up. Yes, there are other approaches, but this is a valid use for gobs of RAM.

      Maybe gaming too? Being able to cache the *entire* game to RAM would seem likely to speed things up, maybe make loading screens a thing of the past.

      • Use Cases (Score:5, Informative)

        by JBMcB ( 73720 ) on Monday October 15, 2018 @12:55PM (#57480864)

        Multitrack high-res audio editing. Video editing and compositing. Medium format 48-bit image editing.

        Anything needing a few gigabytes of RAM just to load a project will get faster the more you can buffer into memory.

        • Medium format 48-bit image editing

          Image editing in general:

          - Normal images with many layers eat RAM.
          - Creating gigapixel images.
          - Advanced processing for images, e.g. stacking to 64-bit images, deconvolution, etc. They can all happily eat up as much RAM as you let them.
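
          A rough sketch of the arithmetic (illustrative sizes, assuming uncompressed buffers):

            # Memory for one uncompressed buffer = pixels x bytes per pixel.
            width, height = 40_000, 25_000  # a 1-gigapixel image (assumed size)
            bytes_per_pixel = 6             # 48-bit color: 3 x 16-bit channels
            layers = 8                      # hypothetical layer count
            gib = width * height * bytes_per_pixel * layers / 2**30
            print(f"{gib:.1f} GiB")         # ~44.7 GiB before undo history or caches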

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      I mean, beyond shits and giggles, is there anything out there that could use 128GB of RAM and even get close to that number?

      Virtual machines ... CAD software ... databases ... rendering software ... huge data sets.

      There's a lot of things for which "too much RAM" can never be true.

      On a desktop I can burn through 16GB without even trying, and 32GB I can fill without trying that hard.

      I can guarantee you, someone somewhere can chew through 128GB of RAM for their specific problem.

    • Re:Why? (Score:5, Insightful)

      by mi ( 197448 ) <slashdot-2017q4@virtual-estates.net> on Monday October 15, 2018 @12:39PM (#57480752) Homepage Journal

      is there anything out there that could use 128GB of RAM and even get close to that number?

      I'm dealing with 100GB-1TB databases at work on a regular basis, and I'm sure others have encountered even bigger ones. Fitting all (or most) of the dataset into RAM greatly speeds things up. Indeed, there are database-software packages already (such as, ugh, "memcache") that must load it all into RAM, offering dramatically improved speeds in exchange for this requirement.

      On the OS level, swap — and the associated complexity of the kernels — is becoming unnecessary in more and more cases. On some of my FreeBSD machines, for example, I'm already compiling the kernel with options NO_SWAPPING.

      On the filesystem-level, ZFS — the revolutionary filesystem — can offer much better speed with more RAM. The abundance of RAM is also making its advanced features (like deduplication) practical.

      And for a layman's personal computer, editing a 4K video becomes much snappier too, if the entire (uncompressed) clip fits into RAM.
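
      Back-of-envelope for the 4K case (assuming uncompressed 8-bit RGB at 30 fps; purely illustrative):

        frame_bytes = 3840 * 2160 * 3      # one uncompressed 4K RGB frame
        fps, seconds = 30, 60              # a one-minute clip
        gib = frame_bytes * fps * seconds / 2**30
        print(f"{gib:.1f} GiB")            # ~41.7 GiB; 10-bit or HDR is bigger still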

      And then come things like "machine learning" — I'm waiting for a Thunderbird add-on, for example, to automatically sort my incoming e-mail. Not just "spam/not-spam", but all of it, based on the ongoing analysis of how I've been sorting it through the years... For those things to be effective, they need both CPU and memory — continuously...

      Other examples — legitimate and otherwise (like Chrome) — abound...

      • Re: (Score:2, Informative)

        by Anonymous Coward

        > On the filesystem-level, ZFS — the revolutionary filesystem — can offer much better speed with more RAM. The abundance of RAM is also making its advanced features (like deduplication) practical.

        At scale, ZFS deduplication is a non-starter. The requirements of 5GB/TB are just not workable. I'm in the early investigative stages of a petabyte-scale project and will be testing Red Hat's VDO layered on top of ZFS. The deduplication requirements of VDO are 268MB/TB, which means our 480TB of storage would need only about 128GB of RAM.
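
        Using those numbers (decimal units, as the vendors quote them):

          storage_tb = 480
          zfs_gb = storage_tb * 5      # ~5 GB of RAM per TB -> 2400 GB for ZFS dedup
          vdo_gb = storage_tb * 0.268  # 268 MB per TB -> ~128.6 GB for VDO
          print(zfs_gb, round(vdo_gb, 1))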

        • by mi ( 197448 )

          The requirements of 5GB/TB are just not workable

          Why? By your numbers, you can have a 50TB filesystem needing 250GB of RAM for deduplication. That's only two 128GB server DIMMs; your server can still use the rest of the RAM slots for other things (like caching)...

      • by Targon ( 17348 )

        For databases that large, wouldn't something like AMD Threadripper or an Intel Xeon make more sense? Those would offer more memory channels to better access the amount of RAM you are looking to put in there.

    • There is a 20/80 rule: 20% of the data is used 80% of the time.
      For a database server of a modest 600 gigs, 128 gigs of RAM would be handy for most of the data requests being handled, speeding up data access in the app.
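
      A quick sketch of that arithmetic (the 20% figure is the rule of thumb above, not a measurement):

        db_gb = 600
        hot_fraction = 0.20          # the slice serving ~80% of requests
        print(db_gb * hot_fraction)  # 120.0 GB of hot data fits in 128 GB of RAM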

      I happen to do a lot of data processing, and the more I can stuff in RAM the better, because I don't need to go back and optimize code to handle slower drive reads, or deal with my OS thrashing because I gave it too much data.

      For home use, not so much. My laptop has 32 gigs.

    • by sconeu ( 64226 )

      I remember thinking the same thing when the original Mac II came out, supporting 128MB.

      Never underestimate Parkinson's Law.

    • by darkain ( 749283 )

      Under some NDAs, so I cannot fully answer it. But I did talk with a client once that had a use case for 128GiB RAM on a laptop in order to run a specific type of database for presentations to high ranking government officials. These of course were not standard laptops, but over-sized and high-powered, specialized systems. Essentially mobile desktops with attached screens.

      • The new ThinkPad P52/P72 laptops support 128GB using 4x32GB DIMMs with their Xeon laptop chips. I'd _love_ to get my hands on one of those bad boys!

    • by Sique ( 173459 )
      Any database. You want as many tables and indices as possible in RAM to speed up your queries. And you want the rollback buffer in RAM too. And you want queries distributed across threads to run as many queries in parallel as possible, which means you need several rollback buffers.
    • Re:Why? (Score:4, Informative)

      by WilliamGeorge ( 816305 ) on Monday October 15, 2018 @01:35PM (#57481120)

      Photoscan and other photogrammetry applications, when working with large image sets (1000+ photos) and high quality settings.

      After Effects uses RAM to store rendered frames, so increasing from 64 to 128GB means you can have twice as many frames stored in RAM preview at a time.
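
      Roughly how that scales (a sketch assuming 4K 32-bit float RGBA frames; After Effects' actual buffer sizes vary):

        frame_mib = 3840 * 2160 * 4 * 4 / 2**20  # ~126.6 MiB per frame
        for ram_gib in (64, 128):
            print(ram_gib, int(ram_gib * 1024 / frame_mib))  # ~517 vs ~1035 frames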

      Video editing with 6K and 8K footage, though usually in those situations you would want a CPU with more cores anyway (so a Core X processor, which can already support 128GB of memory without more dense modules).

      That is just what I can think of off the top of my head, and that others in this thread haven't already mentioned.

    • I mean, beyond shits and giggles, is there anything out there that could use 128GB of RAM and even get close to that number? Or anything in the near future? Say, the next 5-10 years? Chrome doesn't count. That will eat up all the RAM anyways.

      Less than 20 years ago 32 megabytes of memory was the norm under Windows NT Workstation. Not sure why you fail to see another exponential increase in memory demand, particularly for power users. Also, ever heard of VMs before? It's this "new" thing we've been playing with for about two decades now...

      • Less than 20 years ago 32 megabytes of memory was the norm under Windows NT Workstation.

        Minor quibble, but you might be off by a few years. 20 years ago, Dell was selling regular desktop computers that supported 256-512 MB of RAM. 32 MB would have been common more than 20 years ago, but probably less than 25 years ago.

        Just thought I'd mention it, since I remember upgrading my old 486 computer to 16 MB in 1996 or so, and the P2-400 I bought in 1998 came with 128 MB, I think.

        • Back then ram was relatively expensive. Computers could support a lot more RAM than they usually shipped with. You could have a computer with 512 MB of ram in 1998 but be prepared to pay for it. A typical configuration would probably be 32-64 MB. This did come in handy a few years later when upgrading those PCs from Windows 98 (perfectly happy with 64 MB) to Windows XP (256 MB needed to run halfway decently).

          Nowadays, RAM is comparatively cheap, though more expensive than it was just a few years ago...

    • Since you're asking this question on Slashdot, I'm gonna assume you're trolling. Most people here probably deal with a hell of a lot more RAM than this at their workplace. Hell, I "only" do video editing and I would jump on 128GB if I could afford it right now.
    • by EvilSS ( 557649 )
      I'm not sure of many non-enterprise workloads that would use that much, but it's nice to be given the option. My current i9 supports 128GB and I have 64 in it currently, using it for most of my daily activities, running a handful of lab VMs in the background, and gaming. All at the same time if I want. If it wasn't for the VMs I could easily get away with half of that.

      For enterprise users, it means they can build beefy workstations without having to resort to Xeon W processors. There is a Xeon W version...
    • I work with 3D, 2D, Video.
      At any one time I might have Photoshop, a couple instances of Maya, Mudbox, After Effects open.

      I have a 6-year-old computer at home; it still works great, but I'm starting to bump into its 32GB RAM limit.
      At work I have 64GB, and RAM is not a problem.

      If I were buying a computer today I'd get 64GB with the intention of getting another 64GB somewhere down the lifecycle.
      We might at some point reach the limit of how many pixels and polygons we need or want in a scene, but we're not there yet.

    • Heavy graphics processing, usually involving stacking and/or stitching very large 16bpc images, will. I've been doing that on a box with 32GB of RAM and an SSD with 128GB allocated as swap space. A great example is stacking astro images. I don't have an equatorial mount, but my camera has some tracking built in. So instead of taking 60x180s exposures like most would, I take 200x30s; before I was able to enable the tracking (I didn't have the camera add-on at first), it was 2000x1s. Then I stack them.
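
      Rough math for holding such a stack in RAM (assuming 24 MP frames at 16 bits per channel; stacking tools stream from disk when this doesn't fit, hence the big swap file):

        frames = 200                # the 200 x 30s subs
        pixels = 24e6               # assumed sensor resolution
        bytes_per_px = 6            # 16-bit RGB
        gib = frames * pixels * bytes_per_px / 2**30
        print(f"{gib:.1f} GiB")     # ~26.8 GiB; the 2000 x 1s case is ~268 GiB
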
    • by rhewt ( 649974 )

      Load the OS into RAM, duh.

    • Anything involving analysis of 3D structures (the stuff I remember using was an electromagnetic simulator called HFSS). Doubling your mesh resolution increases memory usage by a factor of 8, so you can very quickly eat up as much memory as you can throw at it in search of more accurate and/or higher-frequency results.
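
      The scaling in a couple of lines (in 3D, halving the cell size multiplies the cell count, and roughly the memory, by 2**3):

        base_gib = 2.0                          # assumed footprint at the coarsest mesh
        for level in range(4):
            print(level, base_gib * 8**level)   # 2, 16, 128, 1024 GiB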

    • by mikael ( 484 )

      CAD workstations. Try visualizing a modern jumbo jet with over 100,000 components represented in commercial CAD formats. Each of those carries so much metadata that you need all those gigabytes. Then there are GIS systems that model entire cities and states using multi-layer information and terabytes of textures.

    • The last "big computer" I worked on was a Sun E10k. It had 64 CPUs with 4 cores. And one terrabyte of RAM.

  • Can you imagine what Apple would charge you for 128GB of RAM?
    They already charge you $1,200 for 64GB. I am guessing it will be massively more than that.

    At the moment I have 32GB on my machine. I have never gotten even remotely close to using it up.
    Then again, I am not editing huge video files or doing renderings. Likely those people would welcome the extra RAM.
    Assuming they are using a Windows machine, so they could actually afford to buy it ;)

  • by king neckbeard ( 1801738 ) on Monday October 15, 2018 @12:28PM (#57480680)
    128GB ought to be enough for anybody.
    • Still not enough to make a simulation of the universe on the subatomic level. (Granted, making a universe on a subatomic level would require a system the size of the universe, unless you are going to take shortcuts.)

    • by EvilSS ( 557649 )
      If you build it, a software developer will find a way to exhaust it.
  • I'm running 128GB of RAM on an i7-6800K on an Asus TUF X99.

    Works great. Virtual machines FTW.

    • by djhertz ( 322457 )

      I read this article and thought the same thing. I'm only at 64GB on my X99 but I could certainly slip some more sticks in there :)

  • And I'm confiscating your 128GB of RAM for telemetry purposes.
  • Threadripper (Score:5, Informative)

    by Weaselmancer ( 533834 ) on Monday October 15, 2018 @12:48PM (#57480816)

    From this link: [wikichip.org]

    Max Mem: 1 TiB

    • From this link: [wikichip.org]

      Max Mem: 1,536 GiB

      50% more memory bandwidth too, due to 6 channels instead of 4.
      They're not just for servers. HP sells them in workstations with up to 3TB of installed RAM for dual-CPU models.

  • by Anonymous Coward

    I have 64 GB, and I'm not limited to that on my HP Z820 workstation (with dual Xeons, I can go to 512 GB), but it's barely enough to do FLIP fluid simulations in SideFX Houdini 17.
    With 128 GB it would be a lot more comfortable.

  • Double-height is nothing.

    How about octuple-height [lucidscience.com]?

  • A Ryzen CPU can support half that, up to 64 GB of RAM, but a Threadripper CPU can support eight times as much, up to 1 TB.

    • by Anonymous Coward

      TFA actually states that Ryzen supports the 128GB. AMD will test the new quantities, and once qualified they will update their official spec.
      Intel doesn't, but you can search the web and learn whether this worked on older CPUs.

      E.g. the i7-920 officially supported 24GB and the i7-2600K 32GB. The i7-920 worked perfectly with 48GB, but Intel will never tell you, in the hope you'll buy something bigger, newer, or both.

  • Considering that Dell is already shipping laptops with 128GB of memory as an option (Dell Precision 7530, 7730) — these are single-CPU, 8th gen systems — hasn't the ship sailed on this already?
  • Epyc supports 2TB per chip. [extremetech.com] WTF is up with Intel?

    • by ZiakII ( 829432 ) on Monday October 15, 2018 @01:41PM (#57481146)
      You are looking at server CPUs... Intel supports up to 3.06 TB per CPU.
      • Intel supports up to 3.06 TB a CPU.

        Intel's high-end Xeons have 6 channels; looks like 1.5TB to me.
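
        The arithmetic behind that (assuming 2 DIMMs per channel and 128GB LRDIMMs, the largest common server modules):

          channels, dpc, dimm_gb = 6, 2, 128
          print(channels * dpc * dimm_gb)  # 1536 GB, i.e. 1.5 TB per socket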

  • I can finally run all the tabs I would ever want!

    • by nojayuk ( 567177 )

      I can finally run all the tabs I would ever want!

      Firefox sez: "Hold my beer and watch this!"

  • It's worth bearing in mind that as you increase the amount of RAM, particularly in high-performance systems like those with i9 processors, the system has to reduce the memory access speeds accordingly.

    I know that this has something to do with the actual RAM timing profile, but I am not aware of the precise technical driver behind it.

    In other words, if you are adding RAM to gain maximum performance, then there is a sweet spot that you can actually go beyond, and going beyond it will have the effect of slowing your memory down.
    • It's worth bearing in mind that as you increase the amount of RAM, particularly in high-performance systems like those with i9 processors, the system has to reduce the memory access speeds accordingly.

      AIUI

      A processor has a limited number of memory channels. On Intel desktop chips (not sure about the AMD side) each channel can drive up to two modules with up to two "ranks" each. Each rank normally consists of 8 chips, and the more ranks present on a channel, the higher the loading on the bus and the slower the timings needed to make the channel work reliably.

      Server chips often use "registered" memory, which allows a larger number of ranks (both more modules and more ranks per module) but adds an extra clock cycle of latency.
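
      A sketch of the rank-counting logic (the speeds here are hypothetical placeholders, not datasheet values):

        # Total electrical load on a channel grows with total ranks.
        dimms, ranks_per_dimm = 2, 2                 # 2DPC with dual-rank UDIMMs
        total_ranks = dimms * ranks_per_dimm         # 4 ranks loading the bus
        assumed_speed = {1: 2666, 2: 2666, 4: 2400}  # ranks -> DDR4 MT/s (made up)
        print(total_ranks, assumed_speed[total_ranks])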

  • As long as Intel insists on not supporting ECC on desktop chips, they don't stand a chance of getting my business. Even with "only" 16 GB, I want ECC.

  • Threadrippers support 128GB of RAM already.
