Intel Launches New Core i9-9980XE 18-Core CPU With 4.5GHz Boost Clock (hothardware.com)
MojoKid writes: When Intel officially announced its 9th Generation Core processors, it used the opportunity to also unveil a refreshed line-up of 9th Gen-branded Core-X series processors. Unlike other 9th Gen Core i products, which leverage an updated Coffee Lake microarchitecture, the new Core-X series processors remain based on the Skylake-X architecture but employ notable manufacturing and packaging tweaks, chief among them a solder TIM (Thermal Interface Material) under the heat spreader for better cooling and more overclocking headroom. The Core i9-9980XE is the new top-end CPU that supplants the Core i9-7980XE at the top of Intel's stack. The chip features 18 Skylake-X cores (36 threads) with a 3.0GHz base clock, 400MHz higher than the previous generation. The Core i9-9980XE has maximum Turbo Boost 2.0 and Turbo Boost Max 3.0 frequencies of 4.4GHz and 4.5GHz, which are 200MHz and 100MHz higher, respectively, than those of the previous-gen Core i9-7980XE.
In the benchmarks, the new Core i9-9980XE is easily the fastest many-core desktop processor Intel has released to date, outpacing all previous-gen Intel processors and AMD's Threadripper X series processors in heavily threaded applications. However, the 18-core Core i9-9980XE typically trailed AMD's 24- and 32-core Threadripper WX series processors. Intel's Core i9-9980XE also offered relatively strong single-threaded performance, with an IPC advantage over any current AMD Ryzen processor.
Pricing (Score:5, Insightful)
The pricing [hothardware.com] though... AMD still edges it out in my book.
Re: (Score:3)
If that's the price of bragging rights then I'll skip this one.
Re: (Score:2)
If that's the price of bragging rights then I'll skip this one.
The price of bragging rights is always more than what most can afford. Otherwise, what's there to brag about?
Re: (Score:2)
For bragging rights it's tough to beat a 32-core 2990WX Threadripper, [techradar.com] now going for $1,730; that is, $150 less than the Intel part, with 14 more cores. For that matter, a 16-core 1950X for $650 probably still makes you the best desktop on the block.
Of course, what we all really want is a 7nm 32-core Castle Peak Threadripper, possibly to be announced about eight weeks from now. The ultimate desktop hotrod. Still TR4, so you can do the build now with a 1950X as a placeholder, or a 1900X for $353, still a highly r
Re: (Score:2)
Go watch the Linus Tech Tips review released yesterday. That is how I wish all product reviewers were, whether they get the parts for free or pay for them out of their own pocket.
Re: (Score:3)
SMP should be perma-off I'd say
So you want a single-core computer? It has nothing to do with hyperthreading (which is Intel's implementation of SMT). Speculative execution is only one of the optimizations involved, and it's the one that has all the security issues.
Re: (Score:2)
As soon as you look at prices and availability, Intel is utterly naked.
Re: (Score:3)
You would need to look at where the chip is diffused. I believe that the current AMD Zen chips are being produced at GlobalFoundries Fab 8, which is located in New York. The next batch are being done at TSMC, so you would have a point there.
I believe that the design teams for both CPUs are largely in the US.
Re:Pricing (Score:4, Informative)
Actually, you are both wrong. Intel's major fabs are in the US, and AMD uses GlobalFoundries, whose main CPU fabs are in Germany and the US. They both have plants elsewhere, but those handle other products. Most other products, including Apple's SoCs, are made in Taiwan under contract with TSMC, which, aside from GlobalFoundries, is probably the largest gun-for-hire fab around. Samsung has its own fabs as well. There are a lot of fabs in the PRC and many Chinese chips are made there, but Intel and AMD chips aren't.
Re: (Score:2)
Afaik AMD chips are assembled in the USA. Some in New York and some in Florida. My R7-1700 was assembled in Florida. Both AMD and Intel are US companies. Meaning they were all designed here.
Re: (Score:2)
In other words, the Israeli guys knew what they were doing.
Re: (Score:2)
Those smart guys who were busy elongating the pipeline?
Re: (Score:2)
From reading, well, at least attempting to... you have zero room to speak.
Re: (Score:2)
My Ryzen says "diffused in USA", which translates as "made in USA, packaged in China". Wherever it's made, the lion's share of the profits are from selling the chip, not making it, and that all comes back to USA.
Re: (Score:2)
I'm going to assume at least one more once consumer Zen 2 hits in the middle of next year, since they won't have 10nm ready until the end of the year (maybe).
Solder TIM? (Score:2)
So Intel finally adopts something that the modding community has been doing for years? Seriously late to the game, guys. There's a reason Intel de-lidding is frequently done, while there's borderline no point in doing it on AMD's high-end offerings.
Fewer Cores and Hyperthreading is likely better (Score:3)
For almost all desktop use.
Unless your desktop is doing something that parallelizes really well, you probably will never notice the benefits of this.
Even things that benefit from parallel processing are far better served by running them on truly parallel architectures. If you have an application that can support fine-grained parallelism, why run it on 18 cores of x86 when you can run it on 1,500 cores on a graphics card?
Re: (Score:3)
One thing someone could use this for is virtualization on your desktop. But at that price you might as well get a Xeon proc and call it a day.
Re: (Score:2)
Sounds about right. I am sure it would be the cat's pajamas at simulating a small network of processors and testing your program on them.
Re: (Score:3)
Shared-memory parallel codes (OpenMP) could benefit, though. Many originally single-threaded or home-grown scientific applications run in this space: you get some parallelism for relatively little work (insert pragmas, be careful to be thread-safe, and test, test, test), without all the extra work of redesigning those simulations for efficient message passing.
You certainly find problems where it is much better bang for the buck to throw an expensive processor (on the order of $100 to $1,000) and OpenMP at a problem than to
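To make the "insert pragmas, be careful to be thread-safe" point concrete, here is a minimal sketch (my own illustrative toy example, not taken from any particular application): a single OpenMP pragma parallelizes a loop with independent iterations, and a reduction clause keeps the shared accumulator thread-safe.

    /* Minimal OpenMP sketch (illustrative only). Build with an OpenMP-capable
       compiler, e.g. cc -fopenmp sketch.c; without -fopenmp the pragma is
       simply ignored and the loop runs serially. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        const int n = 1 << 20;
        double *a = malloc(n * sizeof *a);
        double sum = 0.0;
        if (a == NULL) return 1;

        /* Iterations are independent, so they can be split across threads;
           the reduction clause avoids a data race on 'sum'. */
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < n; i++) {
            a[i] = 0.5 * i;
            sum += a[i];
        }

        printf("sum = %f\n", sum);
        free(a);
        return 0;
    }

The real work in an existing simulation, as the parent notes, is verifying that each loop you tag really is thread-safe and then testing it.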
Re: (Score:2)
I don't agree with your blanket generalization, ending in an unresolved comparison. Better for what?
The processor is made and optimized for running parallel tasks. Chores like raytracing, video transcoding and export, photo editing filters, science applications, and web-servers generally like this kind of processor.
I am in no way an Intel fan, but a 4.5GHz boost clock is quite respectable. I should think that it can handle single-core and low-core tasks well.
Re: (Score:2)
Unless your desktop is doing something that parallelizes really well
I thought I was pretty clear saying that. Just how much of a performance boost is your web browser going to get from extra cores when it should just stop running scripts/videos in windows/tabs that don't have focus? Or, for that matter, your word processor or video game?
Chores like raytracing, video transcoding and export, photo editing filters, science applications, and web-servers generally like this kind of processor
Well, ray tracing is almost certainly better handled by a GPU these days; same for video transcoding. A web server that needs parallelism is a server application, not a desktop application. Some science applications, certainly, but if they parallelize
Re: (Score:3)
If you have an application that can support fine-grained parallelism, why run it on 18 cores of x86 when you can run it on 1,500 cores on a graphics card?
Because a graphics card is not just 375 traditional CPUs jammed into a single package, and just because something can scale to 18 cores doesn't mean it would run better on 1,500 GPU cores.
Re: (Score:2)
If you have an application that can support fine-grained parallelism, why run it on 18 cores of x86 when you can run it on 1,500 cores on a graphics card?
Because a graphics card is not just 375 traditional CPUs jammed into a single package, and just because something can scale to 18 cores doesn't mean it would run better on 1,500 GPU cores.
You really have a desperate need to learn how to read or at least learn what the relevant terminology actually means.
Re: (Score:2)
And you desperately need to understand the differences between a CPU and a GPU in the way that their processing actually works. Or you need to go home and spin up 1500 VMs using only a single core on your graphics card. Good luck getting that to boot before Christmas.
"I have cores" != "I can do anything you can do" and regardless of how parallel your application gets they will not necessarily run faster or better on a GPU, a specific device designed to run a very VERY specific subset of instructions compare
Re: (Score:2)
And you desperately need to understand the differences between a CPU and a GPU in the way that their processing actually works. Or you need to go home and spin up 1500 VMs using only a single core on your graphics card. Good luck getting that to boot before Christmas.
You can't seriously be that stupid?
"I have cores" != "I can do anything you can do", and regardless of how parallel your application gets, it will not necessarily run faster or better on a GPU, a specific device designed to run a very, VERY specific subset of instructions compared to a CPU.
I guess you can be that stupid. But at least you looked up the meaning of fine-grained parallelism. Unfortunately, you failed to comprehend it.
You have heard of this new thing that was invented called math, maybe? Why don't you just do the numbers and work out how much of a per-core performance advantage would be needed for 18 processors to outperform 1,500, or say 3,000 if you run two... well, you get the idea.
Oh, and this part, "I have cores" != "I can do anything you can do", is just fundamentally wrong.
Re: (Score:2)
Fine-grained parallelism does not have anything to do with whether something is better or worse to do on a CPU than on a GPU. But since you're all insults and no substance, I'm sure you realised that a while back too. But whatever, I'll go down to your level.
Oh, and this part, "I have cores" != "I can do anything you can do", is just fundamentally wrong.
Oh wow. I can't believe you called me stupid and then wrote a line like that when we were discussing performance. Tell you what: go dig out the old Turing machine (which, as I think I may need to point out to you anyway, is actually Turing complete), have it process
Re: (Score:2)
Oh wow. I can't believe you called me stupid
Why is that difficult to believe? With your attitude, I am sure lots of people call you stupid.
Re: (Score:2)
Interesting, given who it was that started the name-calling. I think everyone has learnt a bit about you today. Your other post just now was equally retarded.
Re: (Score:2)
Your desktop is almost always doing something that parallelizes really well. For example, browsing: each tab runs in a separate thread. And with Vulkan/DX12, games will now use as many cores as you have to feed a big GPU. The classic one is gaming plus streaming, which used to be an issue before Ryzen.
If you are compiling or doing anything with video, there is no such thing as too many cores.
Re: (Score:3)
Browsing separate tabs really shouldn't be taking CPU at all. I haven't looked at Firefox's source, but I will typically have a hundred-plus tabs open and notice very little draw on my CPU. Right now it's pulling 5.1% on a quad-core CPU with a guess of around 100 tabs open and at least 5 that are interactive. Streaming, once again, I suspect the optimum use of the dollars is buying a higher-end graphics card.
Re: (Score:2)
Play videos in multiple tabs. Access multiple crappy javascript sites. There are any number of ways to consume cpu in multiple tabs. I can only presume that you never looked at CPU consumption while browsing. In theory, browsing should be efficient. In practice, it isn't.
Re: (Score:2)
Actually, I just did; that's where my numbers came from. Can't say I am ever playing more than two videos at a time.
Re: (Score:2)
And you don't seem to be clear on the distribution of work between CPU and GPU. It takes more cores to feed a bigger GPU. You go tell the streamers that they don't need multiple cores. They know otherwise.
Re: (Score:2)
And you don't seem to be clear on the distribution of work between CPU and GPU. It takes more cores to feed a bigger GPU. You go tell the streamers that they don't need multiple cores. They know otherwise.
Yeah, that's a function of memory bandwidth more than anything else. You can throw all the cores you want at it; it doesn't matter if you don't have the bandwidth. You may have noticed that GPUs generally have much, much wider memory buses?
Anyway, just for reference: you're taking data from the frame buffer, ideally encoding it on the card using the card's hardware encoder, then moving it out and maybe formatting it with the CPU.
You might want to look up how the architecture is actually laid out and how this works.
Re: (Score:2, Informative)
You are still confused. GPUs have high on-board memory bandwidth because they use it internally for texel and vertex fetching etc. Graphics features like filtering are highly memory-intensive, with typically multiple accesses per texel per raster op in on-board memory. The bandwidth the CPU uses to upload primary data to the GPU is comparatively much lower. Unless you made a major mistake, like not populating both memory controller channels, your streaming setup is unlikely to bottleneck on memory, including reading
Re: (Score:2)
You are still confused. GPUs have high on-board memory bandwidth because they use it internally for texel and vertex fetching etc.
Says the guy who can't separate outcome from cause. Why they were initially designed that way is irrelevant.
You don't encode video on the GPU while rendering unless you are OK with dropping the frame rate.
Well, seeing as encoding on a GPU is 4 to 5 times faster than on a CPU, and it saves the time of fetching unencoded video from the frame buffer to system memory, I'll be glad to trade the overhead. That is a trade.
I feel that you are just burping out random factoids
Projection seems to be strong with you; so is feeling over actual understanding.
Re: (Score:2)
So, crashing the party to showcase your awe-inspiring intellect, then.
You don't encode video on the GPU while rendering unless you are OK with dropping the frame rate.
seeing as encoding on a GPU is 4 to 5 times faster than on a CPU, and it saves the time of fetching unencoded video from the frame buffer to system memory, I'll be glad to trade the overhead.
Why is it necessary to explain it to you in words of one syllable? Lose frame rate.
Re: (Score:3)
Thanks for showing me wrong.
I should have gone with my first impression that you were an idiotic troll when you said this
Play videos in multiple tabs. Access multiple crappy javascript sites. There are any number of ways to consume cpu in multiple tabs
https://slashdot.org/comments.... [slashdot.org]
But I gave you the benefit of the doubt. My bad.
Now you are coming up with this
Why is it necessary to explain it to you in words of one syllable? Lose frame rate.
And you have removed all doubt.
If you turn on streaming you are going to lose frame rate no matter what you do. The questions are how you are going to lose more, and what the best use of resources is to build the system.
Re: (Score:2)
The questions are how you are going to lose more, and what the best use of resources is to build the system.
Tying up GPU compute units isn't it. You obviously are no gamer. But you are a loudmouth.
Geez, can't you even get the terms you're talking about right? It isn't the set of gamers but the subset of gamers that stream.
Instead of being an asshole, try actually learning about what you are talking about. Then you won't wind up saying stupid things and cherry-picking horrifically bad examples, like "trying to watch videos in closed tabs," to show you're not an idiot.
Re: (Score:2)
Did you really just type "watch videos in closed tabs"? You're losing it, go take your meds.
No, you did:
Play videos in multiple tabs. Access multiple crappy javascript sites. There are any number of ways to consume cpu in multiple tabs
https://slashdot.org/comments.... [slashdot.org]
So you are lecturing others on not being an asshole, got it.
I expect you get lectured about being an asshole a hell of a lot.
Re: (Score:2)
Just curious, how did "multiple tabs" become "closed tabs" in your mind?
Re: (Score:2)
Still don't comprehend how little you know, do you?
Re: (Score:2)
I comprehend that you have a screw loose.
Re: (Score:2)
And I comprehend that you don't know anything about how a browser works.
Re: (Score:2)
Perhaps your brain fever leads you to imagine that videos stop decoding when you switch tabs.
Re: (Score:2)
Mine, no. The people who make the browser could be.
Re: (Score:2)
So you understand that a browser can consume CPU per tab by decoding a video per tab.
Re: (Score:2)
So you understand that a browser can consume CPU per tab by decoding a video per tab.
HAHAHA
You still are so fucking stupid.
Try doing a little research on how this actually works and how CPU video decoding actually works.
Re: (Score:2)
Just in case, I verified using appropriate tools. I doubt you are capable of that.
Re: (Score:2)
Unh hunh
Somehow I doubt you even knew what to check
Re: (Score:2)
You've made it abundantly clear your technical skill rounds to zero. And you're delusional, that's heady stuff. Must be confusing to be inside you.
Re: (Score:2)
You've made it abundantly clear your technical skill rounds to zero. And you're delusional, that's heady stuff. Must be confusing to be inside you.
Unh hunh. Just for the fun of it, I am going to suggest you bounce your position off someone else.
I am going to guess they will do exactly what I did, which is first try to explain why you are wrong and then write you off as the moron you are.
Ciao
Re: (Score:2)
Your attempt at explanation was exactly what confirmed you have no technical skill. You're the old fart shouting at the cloud.
Re: (Score:2)
Oh I don't know
I am not the guy who thinks playing videos you can't see is a good reason to have more cores.
Re: (Score:2)
So the straw you're hanging onto is, exactly one use case doesn't apply to you.
Re: (Score:2)
Ryzen and Threadripper have more than just extra cores. More PCIe lanes, for example. Stuff that matters for workstations.
Re: (Score:2)
Ryzen and Threadripper have more than just extra cores. More PCIe lanes, for example. Stuff that matters for workstations.
Absolutely true. Without looking, I would also bet the supporting chipsets are higher-end as well.
Re: (Score:2)
The main reason to have more cores is to be able to do more things at once. Remember when you used to be limited by your CPU in how many things you could do at once without bogging everything down? You no longer have to worry about closing Chrome when you go to play your game for a few hours, provided you have the RAM overhead to make up for Chrome's memory leaks (seriously, they have had memory leaks for a decade, literally since day one; what gives, Google?).
Re: (Score:2)
Yeah, that's true.
It's just that, for most people, the number of things they want to do at the same time is less than what their rigs can currently handle.
Re: (Score:2)
But we're nerds... Do we really care about those people?? :)
Vulnerabilites (Score:5, Insightful)
I didn't see any mention of addressing Meltdown, Spectre, L1TF... so I assume those general architecture issues are not yet addressed; this is still Skylake.
Re: (Score:2)
And no one running these processors will care. In fact most of the people affected by Spectre and Meltdown are likely running Xeons.
Re: (Score:2)
Fact is, people do care. Whether it is a perception of sloppy Intel engineering, or security, people do care.
Re: (Score:2)
Nope. A couple of angry nerds care, and a couple of system administrators of large virtual servers care. If there is one thing that has been made 100% clear by people, their reaction to this, Intel's share price, and Intel's market share, it's that people in the general case, meaning the vast majority of computer users, most definitely do NOT care.
Re: (Score:2)
You are out of touch. Take a quick run around the comments section on any Intel vs AMD article and you will find Meltdown frequently cited. And Meltdown has gone mainstream. [theguardian.com] Even the business pages talk about it because it is affecting Intel's stock price.
See, it's like GMO: it may or may not affect you directly, but it is always a concern and a source of endless debate, such as this. The only way out of this for Intel is to fix it definitively in hardware as opposed to papering it over with minor circuit tweaks,
Re: (Score:2)
Frequently cited ultimately means nothing. Bitcoin and blockchain technology are frequently cited too, and ultimately that was a management discussion fad that went nowhere. We can frequently cite things, but it means nothing when business practices ultimately haven't changed.
Now, that isn't universal. There has definitely been work in the cloud space, which makes perfect sense too, since they actually have direct exposure to the issue, as their business model relies on having people run their code on machines you
Re: (Score:2)
actual working exploits outside of carefully controlled lab experiments, or balls out just prove I can copy some random bits which I can't identify as belonging to something have yet to be seen or developed... a full year later.
Wow, where have you been? [github.com]
Q: Has Meltdown or Spectre been abused in the wild?
A: We don't know. [meltdownattack.com]
Re: (Score:2)
Thanks for proving my point. You just linked me to a whole series of lab experiments which require up-front knowledge of the computer in question.
If someone is in a position to gain enough knowledge about your machine to use any of the examples you just linked to, then, to pardon my French, you're already properly fucked... or you're a cloud/VM provider, which, as I pointed out earlier, is exactly the kind of operation that is actively at risk here.
In terms of security risk for the 99.9% of people out there, this ranks lower than...
Re: (Score:2)
In terms of security risk for the 99.9% of people out there, this ranks lower than...
Says random internet guy, knowing better than the security researchers.
branch prediction vulnerability? (Score:2)
Re: (Score:2)
If they did, they could not fake benchmarks like this.
Re: (Score:2)
It sure addresses them! They're even faster than before!
Of Course A Threadripper Outperforms It (Score:2)
I have a system encoding 8 1080p video streams
Re: (Score:2)
No, you cannot build a quad-TR system. It's one CPU per machine until you get to Epyc, and even then two sockets is the current maximum.
Hello again Mr.7980XE (Score:5, Insightful)
I see you got a fancy new power curve, soldered TIM and nothing else!
As Linus said: (Score:2)
Now it can do it in under 2 seconds!
It's not the fastest desktop processor (Score:2)
It's not the fastest desktop processor when it trails 24 and 32 core ThreadRippers.
That's not how it works. Fastest doesn't mean slower.
Re: (Score:2)
It's not the fastest desktop processor when it trails 24 and 32 core ThreadRippers.
That's not how it works. Fastest doesn't mean slower.
In the benchmarks, the new Core i9-9980XE is easily the fastest many-core desktop processor Intel has released to date
I didn't realize Intel was releasing ThreadRipper CPUs.
95-core Pantyripper (Score:2)
I'm holding out for all the cores.
Hardware vs Software (Score:2)
Re: (Score:2)
This also kills me with cell-phones. People will pay ~$1000 for a phone... but refuse to buy $1 apps to use on it. They will go WAY out of their way to find a "free" app that does something similar...
I absolutely cannot understand this phenomenon.
Re: (Score:2)
Many do the opposite: they will buy (or pirate) expensive software because "it's the thing to have" but then skimp out on the hardware to run it on.
Re: (Score:2)
I'd rather pay nothing for a pirated version that removes the DRM code and thus runs faster than the paid version...
Re: (Score:2)
All you fanboys need to chill (Score:2)
Re: (Score:2)
AMD is claiming a 29% IPC lift for Zen 2. If it's anything like what they said before the release of Zen 1, we may see a 35-40% IPC lift. Even the former puts Intel in 2nd place. Hopefully Intel can get 10nm working sooner rather than later to keep innovation flowing.
YAY more Hz... (Score:2)
Steve Jobs-era R&D showed that going beyond two cores yields diminishing throughput on Intel for Darwin. A lot has changed since then, Darwin included, as well as macOS with on-board GPU processing et al., with cores doing look-ahead, graphics, memory, etc.
Could a generous, anonymous, Avie Tevanian-type kernel nerd step in and raise everyone's knowledge to the state of the art on silicon? Are the 'Intel' Hz in the marketing real-world throughputs 'Inside'?
Re: (Score:3)
warm my home and play Crysis.
On medium settings
Re: (Score:2)
Re:Or maybe not (Score:5, Insightful)
Yep, and you demonstrate that wonderfully. The AMD Threadripper is better for some things, the Intel chip is better for some things, and *depending on your needs and budget* each could be "better".
Intel wins on IPC, but is crushingly expensive. Many people would take the AMD part at half the price and be quite happy with it. Others, for whom money isn't that much of an object, will go with the 9980XE. Still others who need Blender, Cinebench, or POVRay workloads done would be fools to buy anything but the Threadripper.
All in all, everyone has a different need and will cherrypick based on that need.
Re: (Score:3)
Intel wins on IPC
Not for very much longer. [wccftech.com] And for me, like most of us, value is the decider. I'm also finding AMD's thermal performance excellent these days, and I just love how long the sockets last. AM4 really delivered on its future-proof promise.
Re: (Score:2)
That's one of the leading reasons AMD is considered the budget-conscious chip maker. Sure, you may not be able to use all of the latest and greatest features of the new chip, but you can buy it and use it until you can afford the newer chipset that supports all of the new features. And it will still be faster than the old-generation chip you replaced it with. If you're not like me and keep all old hardware, you could even sell it used to offset the price of the new motherboard with no downtime waiting to buy
Re: (Score:2)
Zen 2 should fix that IPC offset, putting Intel in 2nd place for the first time in a decade! The next few CPU cycles should be great for us consumers! I love it when the competition is razor-sharp on both sides! Consumers always win in these situations. But as of today you're 100% correct, and you always need to use the right tool for the job.
Re: (Score:3)
I like the TR parts but they really need to cut down the idle power, 100W+ at idle (!)
Tom's Hardware says 35 watts for the 2990WX [tomshardware.com] at idle.
For my trusty Ryzen 1700 box, the entire system power measured at the wall is 38 watts at idle.
Re: (Score:2)
That 64-core processor is going to have a hefty price tag at launch, though. Not to mention the board and cooler for it.
Re: (Score:2)
Well, you will get fu**** by Intel. No idea whether that counts for you.
Re: (Score:2)
Aaaaaaand fail. There are tasks that are inherently single-threaded, and these are not exotic ones.
Re: (Score:2)
Define "Linux can now complete infinite loops in 2 seconds, instead of 5!" please. I have never had such problem. I have seen the infinate boot loops due to bad motherboard firmware or windows updates. I have not had either of those two issues on Linux what wasn't on an embedded device.
Re: (Score:2)
It's Intel; did you expect anything less? Let's be thankful we have reviewers like Linus who care more about informing the viewer/consumer than appeasing the company that gives them free hardware.