AMD Is Working On a Monster 64-Core Threadripper CPU, Landing As Early As Q4 2019 (wccftech.com) 206
AMD is preparing a monstrous 64-core/128-thread Threadripper CPU for launch in Q4 2019. "AMD's largest HEDT processor right now is the 2990WX, which tops out at 32 cores," reports Wccftech. "This is nothing to sneeze at and is already the highest-core-count HEDT part around, but because the world can't get enough of these yummy cores, AMD is planning to launch a 64-core version in Q4 2019." From the report: The platform is called X599 right now, although I am told AMD is considering changing the name to avoid confusion with Intel. This is not really surprising, since both Intel and AMD HEDT platforms share the same nomenclature and it can get really confusing. I am also told that they plan to retain the "99" suffix. AMD is planning to launch the 64-core Threadripper part and the corresponding platform in Q4 2019. In fact, that is when you can expect these motherboards to start popping up from various AIBs.
Now my source did not mention a new socket, so as far as I know, this should be socket-compatible with the existing TR4 motherboards, and only a BIOS update should be needed if you already own one. What I don't know right now is whether this is a 14nm part or a 7nm part. Conventional wisdom would dictate that this is a 14nm part trickling down from their server space, but who knows, maybe the company will surprise all of us? This is pretty exciting news, because knowing AMD, the 64-core Threadripper CPU will probably be priced in the $2,500 to $3,000 range, making it one of the most affordable workstation processors around with this many threads.
Take my money! (Score:2, Insightful)
My Christmas present is all lined up! Please notice I said 'Christmas' not 'Holiday'...
Re: (Score:2)
That'll be a $4,000 Christmas present, like a nice camera. I'm in, I think. On the other hand, 32 cores with higher boost clock... decisions, decisions. What is not in doubt is that my next workstation will be a Threadripper. Enough with these mere desktop class chips. Four memory channels for the win.
Re: (Score:2)
Threadripper is a NUMA architecture and is similar to a dual-CPU 2-channel system in a single die
No, wrong. You are talking about the old Threadripper. The new Threadripper will be symmetric just like Rome, the only slight asymmetry being off-chiplet L3 cache access, which is still, um, symmetric. Not NUMA. BTW, both old and new Threadripper are quad-channel, one of the major reasons why somebody would want to spend the extra money on it compared to the same number of cores in a dual-channel part.
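For anyone who wants to check a given box themselves, here's a quick sketch (assuming a Linux machine with the numactl package installed; the node counts below are my expectation, not a vendor spec):
numactl --hardware   # lists the NUMA nodes the kernel sees, per-node memory, and the node distance matrix
A uniform design should report a single node, while a 2990WX in its default NUMA mode reportedly shows four nodes, two of them without local memory.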
Re: (Score:2)
Eight channel you say? That would be sweet, but is it accurate? If so, why would datacenters buy Rome?
Re:Take my money! (Score:5, Funny)
At last... a CPU that will deliver a computing solution capable of keeping up with me when I play Solitaire!
Brilliant!
Re: (Score:2)
Please notice I said 'Christmas' not 'Holiday'...
Fuck off troll. Please notice I said fuck off. No need for you to reply.
Re: Take my money! (Score:5, Informative)
Literally nobody is offended by the word "Christmas". That's always been a shitty straw man by the right. Atheists don't care what it's called and Muslims aren't offended by the holiday since Jesus is one of their prophets. And Jews couldn't give a shit either way.
Re: (Score:2)
...Muslims aren't offended by the holiday since Jesus is one of their prophets.
The Christian Jesus is nothing like the Muslim Jesus. Aside from the status of Christ in one vs the other, there's also the other minor detail that at the end of times, the Muslim Jesus is supposed to rise again and convert everybody to Islam by the sword.
If those two were one and the same, I'm sure the Bible would have mentioned that prophecy somewhere or other. So to say that Jesus is one of their prophets is downright misleading.
Re: (Score:3)
Some yahoo idiot with more time on his hands than brain cells between his ears wants something, and some public servant or private business caves in.
That story can be played with idiots from the left who berate you when you wish them a Merry Christmas and with idiots from the right who want to push their story book into science classes.
Could we, the sane ones, assemble in the middle and laugh at the clowns from BOTH sides?
I want htop screenshots! (Score:5, Funny)
64 cores?
I want htop screenshots!
Not to verify the performance, not to dispute the number of cores. Just... some guys like to have pictures of Ferraris and Grand Nationals on their desktops.
Re:I want htop screenshots! (Score:5, Funny)
This is going to force all those 80-column window holdouts to finally get with the times.
Re: (Score:3)
A 132-column card in the IIe let me do WYSIWYG with a 15-inch dot matrix printer; with a ZIPChip it flew.
Re: (Score:2)
Wow, someone else knows about 132 columns. I still use that as the auto-wrap column for my IDE.
Re: (Score:2)
But CPU usage is shown vertically :/
Re: (Score:2)
We have a 50k+ core supercomputer where I work, but I don't know if htop runs on such a thing...
Re: (Score:2)
See, THIS is why I read Slashdot -- to get tips on which tools are useful. I've learned a ton of things over the years through random throw-away comments from posters who didn't even realize the gems they were laying down. htop! Cool -- I'd never heard of it, but, now that it's installed, I see its value.
Personally, for a geek-cool screenshot, I would prefer the view from xosview +cpus. With 64 cores, yep, that would be sweet!
Re: (Score:2)
There's a screenshot of a 64 core machine on the htop website [hisham.hm]. It's the one with the white background.
Full-blown and blueprinted (Score:5, Funny)
I'm holding out for the 350-thread supercharged Pantyripper with the dual-overhead cams.
Re: (Score:3)
I'd give you the funny mod if I ever had a mod point to give.
I'm taking your actual point to be "Who needs it?"
My feelings are actually kind of mixed. On the one hand, I like to feel like I have a muscle machine, and there was a time when I needed it.
On the other hand, the hardware has gone so far that I just don't need that much anymore. Actually, my smartphone has plenty of power for my everyday applications and almost all of my waiting time is caused by the network, not because I'm CPU bound.
On the third
Re: (Score:3)
Side note: you get nerd points if you use "on the gripping hand" [amazon.com] in place of "third".
Re: (Score:3)
Really lowering the bar for being a traitor, if you're now asserting that listening to someone from Norway before calling the FBI is all it takes. You know Trump is the boss of the FBI, right?
If that's the bar, how many people who were involved in getting information from Russia via a foreigner from England are going to be put in jail for being traitors?
Re: (Score:2)
I'm a big believer in "power for power's sake".
Re: (Score:2)
I'm taking your actual point to be "Who needs it?"
My home workstation is a 32-core 2-socket Xeon box. I use it to transcode Bluray into high-quality rips. That gets me transcoding in somewhat better than runtime, so there's lots of room for meaningful improvement.
Re: (Score:2)
Once the rendering quality exceeds your limited human perceptual capacity, why would you need moor?
Easiest to take the example of digital cameras (since you've gone into that application area). I may be a bit off on the exact numbers, but I think a pre-digital color photograph corresponded to less than 4 million pixels with a color depth less than 24 bits. The exact numbers may be a little different, and of course you can escalate and ask "But what about 8 by 10s?" or "But there's no limit to closeups?", bu
Re: (Score:2)
Once the rendering quality exceeds your limited human perceptual capacity, why would you need moor?
You don't. But Bluray looks better than DVD, and is much more compute-intensive to rip. I can't see a difference between Bluray and 4K, even up close to a big 4K TV, so I don't bother with 4k (not to mention transcoding time would be nuts). Of course, I've only seen streamed 4k content, which might be all crap.
Re: (Score:2)
I hate typos.
s/moor/more/
s/can saw/saw/
Re: (Score:2)
Currently everything is h.264. I use Handbrake with its "constant quality" setting, at 21 for Bluray and 19 for DVD. I picked those by doing some test scenes, then going frame-by-frame, flipping back and forth with the original, looking for any pixel-level differences (this won't catch something subtle like a 1-bit color difference, but anything more than that jumps out at you). I found the settings at which I couldn't see any difference, then set the quality 1 higher.
Unlike some people who do this, I don't k
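For reference, a rough HandBrakeCLI equivalent of those settings (a sketch, not the poster's exact pipeline; the file names are made up, and HandBrakeCLI is typically a separate download from the GUI):
HandBrakeCLI -i bluray_rip.mkv -o movie.mkv -e x264 -q 21   # constant quality, RF 21 for Bluray
HandBrakeCLI -i dvd_rip.mkv -o show.mkv -e x264 -q 19       # RF 19 for DVD
Lower RF means higher quality; the encoder spends bits wherever a scene needs them, which is why one RF value can hold across very different material.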
Re: (Score:2)
For most people, that many threads is overkill. However, if you use a computer for creating instead of consuming, then there is really no such thing as too much performance, assuming the hardware can pay for itself.
Resource requirements also expand as hardware expands. There is no reason any application in MS Office today should use appreciably more resources to do the same task as previous versions. But they do. As one arrogant dev said to me, "my time is worth more than your ram." Clearly he doesn't keep up
Re: (Score:2)
Substantive reply. Reminds me of how expenses rise to meet income. I'm having a bit of trouble interpreting exactly what you mean by "pay for itself" in terms of justifying more CPU capacity.
Maybe I should try to reword it in traditional economic terms? Essentially I'm saying that demand for computing power is inelastic, rather like food. There's a limit to how much food you can eat and there's a limit to how much computing power any person can consume. (However I do reject most of economics in favor of ekr
Re: (Score:2)
If you are secretly an avatar of Shiva, I think you just blew your cover.
Re:Full-blown and blueprinted (Score:5, Funny)
I'm waiting for the 640-core version. 640 cores will be enough for anyone.
Re: (Score:2)
Annnnnd... GPUs passed 64 cores long ago. Radeon VII has 3840 I believe.
Re: (Score:2)
ahem, passed 640 cores.
Re: (Score:2)
https://images.app.goo.gl/8pcw... [app.goo.gl]
Re: (Score:2)
You mean the tired Billg meme? IOW, whoosh you.
Re:Full-blown and blueprinted (Score:5, Informative)
GPU "cores" aren't cores in the traditional sense. They are glorified FPUs.
Re: (Score:2)
Wrong, GPU cores are cores. Over time, they are travelling the same trajectory as CPU cores, that is, more superscalar, more shadow registers, more branch optimization, etc. Check out the scalar component of GCN; it's even more like a typical CPU core than previously, as opposed to the vector component, which is rather like AVX. I believe the scalar cores are doubled in RDNA.
Re: (Score:2)
Right. It's interesting to me that AMD actually increased the scalar cores; I would have thought the vector units would dominate, but it seems the need for flexible branching etc. is actually quite significant in a GPU. Note that this is Navi we are talking about, which is really a dedicated graphics design, not primarily used for compute, where I expect Vega to continue to dominate for quite some time.
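A concrete way to see the distinction: marketing counts like 3840 are SIMD lanes, and the driver groups them into compute units, which are closer to what a CPU person would call a core. A sketch, assuming a working OpenCL driver and the clinfo utility:
clinfo | grep -i 'compute units'   # prints the "Max compute units" line per device
On a Radeon VII that should report 60 compute units; 60 CUs at 64 lanes each is where the 3840 "stream processors" figure comes from.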
Re: (Score:2)
Touche!
Re: (Score:2)
I guess I could've said 640kors...
Re: (Score:2)
I'm holding out for the 350-thread supercharged Pantyripper
metoo!
We'll look back at this one day (Score:5, Funny)
Re: (Score:2)
Good conditional branching performance.
Re: (Score:2)
In case you weren't going for funny: there's a significantly larger difference between GPUs and CPUs than core count. GPUs are only useful for specific calculations, on top of the requirement that the problem be embarrassingly parallel.
Re: We'll look back at this one day (Score:2)
Remember when CPU power was all about MHz and later even GHz and not about cores? Because they all had just ONE?
Re: (Score:2)
So sad that the 8086 became the de facto standard for performance CPUs. There were so many better choices, like the 68000.
Re: (Score:2)
68k wasn't all that great. Would have been nice if PPC had taken off; it's a much cleaner ISA than the minefield of documentation x86 has become. But all of the old archs from back then are inferior to modern developments anyway. We can and have done better. What really needs to happen is breaking away from the necessity of uArch binary compatibility. Potentially one of the benefits (probably the only one, because performance, stability and security definitely aren't) of everything becoming a javascript app is
Re: (Score:2)
Tragically, I think that race was over years ago, once NT became the mainstream Windows kernel in XP and Intel and AMD both started developing multi-core CPUs. Cost too is a factor, and at that point, one could no longer point to an Alpha or POWER and point out that they were the fastest per core. Once that happened, and Linux too became mainstream, both proprietary Unixes and proprietary RISC platforms died together. Also, the hype over the Itanic just accelerated the demise of not just PA-RISC, bu
Will 100 cores be enough to read email? (Score:3, Informative)
In 20 years' time, apps will be so bloated that you would not be able to boot a toaster with only 100 cores. 100 gigs of RAM? You've got to be joking. Could not even fit the toaster's security software in that.
But the good news is that the toaster of 2040 will only be able to toast properly authorized bread. No fake bread in 2040.
Re: (Score:2)
With Intel 14 nm+++...++ *) and 100 cores, the good news is that you can get rid of the heating elements and toast directly with the CPU in a matter of milliseconds.
*) Slashdot doesn't allow posting as many pluses as Intel is going to have, as those are considered "junk" characters, which is quite fitting in my opinion.
Re: (Score:2)
Pfft... Kids today. Remember when the 8086/8088 processor had 29,000 transistors? Now get off my lawn!
8086? How about the 8085, when we had to do programming in assembly?
Why did Apple not use this for the Mac Pro? (Score:4, Interesting)
Why did Apple not use this for the Mac Pro? They have AMD video cards; why not also use their CPUs?
Re: (Score:2)
Why did Apple not use this for the Mac Pro?
Dunno, but it's gonna make the Mac Pro CPU look like a cow turd. A fresh one, that you can't even burn for fuel.
Re: (Score:2)
Dunno, but it's gonna make the Mac Pro CPU look like a cow turd. A fresh one, that you can't even burn for fuel.
You were setting fire to it wrong.
things I won't work with (Score:2)
Sand Won't Save You This Time [sciencemag.org]
Probably just not proven yet... (Score:2)
Why did Apple not use this for the Mac Pro? They have AMD video cards; why not also use their CPUs?
That is pretty interesting. I think for the Mac Pro, maybe Intel really did have the best high-end option to be found there...
I find it more mysterious why they don't use AMD chips in the laptops; I've always thought that would be a nice mix in recent years, with really good AMD chips available. But maybe the roadmap to move to ARM is so close that it makes no sense to make that change at this point or in recent history.
Could also jus
Re: (Score:2)
Actually, a better question would be: why didn't Apple use their A10s in the Mac Pro? Compatible w/ the iOS devices, comes from the same supply as iPhones and iPads, has the power savings that they might want, and Apple would also have greater control over the supply chain, given that it's used for iPhones as well.
Re: (Score:2)
They were pondering it, but the socket cost $999.
Re: (Score:2)
The answer to your question is in the phrase "is working on".
There's no way you could have a machine on sale by the fall based on a processor that doesn't have engineering samples available now. Note that the supposed source the article cites actually claims a January 2020 release, not the Q4 2019 that the article writer then claims.
Re: (Score:2)
Why did Apple not use this for the Mac Pro? They have AMD video cards; why not also use their CPUs?
Same thing that holds the iPhone back in many cases from using the latest and greatest. When they went to x86 from PowerPC, they were asked the same thing, and they simply stated they went with Intel because they had expectations of selling models in the millions of units, and Intel could promise those production numbers and AMD couldn't. They're a small part of the market, but they only have a relatively small number of models of computers. Since they don't want models to sit on their store marked as "Out of Stoc
Re: (Score:2)
Apple don't like AMD. They hate NVIDIA. There's a difference.
Re: (Score:2)
Why did Apple not use this for the Mac Pro? They have AMD video cards; why not also use their CPUs?
Prices for mass manufacturers are nothing like consumer or small-scale prices. Intel likely gave Apple an offer they couldn't refuse and locked it in with a contract.
Unless you make the operating system (Score:2)
> AMD is a drop-in replacement for Intel, unless you're running an operating system specifically made for one or the other
Ftfy
Re: (Score:2)
Why? There's nothing that ever stopped them from adapting FreeBSD as their kernel, particularly since they employ some BSD devs. They started years ago in NEXTSTEP w/ that combination of Mach and BSD, and later on, revved up both the Mach version as well as the BSD one. But if there was anything monstrous about Darwin, they'd have dumped it for just a FreeBSD core, and gone w/ it. Not just that, Darwin is something that is open source: what is not open source is Quartz. Those are the company jewels
Bu
Why is this "monstrous"? (Score:2)
Did it lurch from the ocean and stomp a city?
Wccftech (Score:4, Insightful)
next...
Conventional Wisdom?? (Score:2)
It's got to be 7nm to fit within the power budget and still keep reasonable single-thread performance. Plus, that's only conventional for Intel. AMD's TR parts have lined up with the Ryzen parts of the same generation; the first number of the model name has matched.
Re: (Score:2)
Correct. If this comes in under 200W they'll fly off the shelves. I believe the boards expect up to 250W parts, so they might slip that high.
Yay for AMD64 (Score:2)
Java streams? (Score:2)
But if they can take my huge file and magically parallel-process it, then we have a... really, do I have to say it?
Re: (Score:2)
> But if they can take my huge file and magically parallel process it
xz --threads=0 -0 mybigfile
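For the curious: --threads=0 tells xz to run one worker per available core (this requires xz 5.2 or newer), and -0 picks the fastest preset. Note that multithreaded xz splits the input into independent blocks, so the compression ratio drops slightly versus a single-threaded run. A quick sketch to measure the scaling yourself (-k keeps the input file, -f overwrites any previous output):
time xz -kf -T1 -0 mybigfile   # single-threaded baseline
time xz -kf -T0 -0 mybigfile   # one worker per core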
Take my money (Score:2)
64 cores with abundant PCIe lanes? I don't even care how much it costs. But do make sure it works in the existing sTR4 boards.
Wow, and it (Score:2)
Cider has many cores too (Score:2)
Why would this be 14nm? (Score:2)
Is Epyc Rome going to be 14nm?
I figured this would be a rebranded Rome with fewer memory channels?
Imagine... (Score:3)
who came up with the name Threadripper? (Score:2)
Can someone please fire AMD's marketing department? In particular, whoever thought that was a good name for a CPU. They should just name all of their CPUs after various sharks or dinosaurs. Really, anything would be better than Threadripper.
Bad old days (Score:2)
$4K just for the CPU?? This is going back to the Bad Old Days where you either had to take out a loan to buy your PC, or put it on a credit card. And then it takes forever to pay off. I'll wait till prices come down.
Re: Binford? (Score:2)
Re:Ignorant rumors (Score:5, Interesting)
32 cores of the current Threadripper ... two clusters of cores are not directly connected to RAM controllers in the highest models and accessing memory has additional latency.
That was the original Threadripper; the new chiplet-based design is very different. All cores are connected to a central I/O die, so every core is equally close to (or distant from) main memory. There is some overhead to access L3 cache on a different chiplet, but this new design is definitely not NUMA.
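You can actually look at this layout yourself; a sketch, assuming a Linux box with the hwloc and util-linux packages installed:
lstopo --no-io topo.png   # draws the package/L3/core hierarchy to an image, omitting PCI devices
lscpu | grep -i numa      # shows how many NUMA nodes the kernel sees; one node means uniform access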
Re: (Score:2)
a typo
Re: (Score:2)
I have 32GB of RAM and that's not enough.
Re: (Score:2)
The massive amount of L3 (256 MB) should help
Re: (Score:2)
Yeah but imagine that number of cores with useful performance!