Preview of AMD Ryzen Threadripper Shows Chip Handily Out-Pacing Intel Core i9 (hothardware.com)
MojoKid writes: AMD is still days away from the formal launch of their Ryzen Threadripper family of 12- and 16-core processors, but OEM system builder Dell and its Alienware gaming PC division had an inside track on first silicon in the channel. The Alienware Area-51 Threadripper Edition sports a 16-core Ryzen Threadripper 1950X processor that boosts to 4GHz, with a base clock of 3.4GHz and an all-core boost of 3.6GHz. From a price standpoint, the 16-core Threadripper chip goes head-to-head with Intel's 10-core Core i9-7900X at a $999 MSRP. In early benchmark runs of the Alienware system, AMD's Ryzen Threadripper is showing as much as a 37% performance advantage over the Intel Core i9 Skylake-X chip in highly threaded general compute workload benchmarks like Cinebench and Blender. In gaming, Threadripper shows rough performance parity with the Core i9 chip in some tests, but trails by as much as 20% in lower-resolution 1080p gaming in certain games, as is currently characteristic of many Ryzen CPUs. Regardless, when you consider the general performance upside of Ryzen Threadripper versus Intel's current fastest desktop chip, along with its more aggressive per-core pricing (12-core Threadripper at $799), AMD's new flagship enthusiast/performance workstation desktop chips are lining up pretty well against Intel's.
AMD DECLARED WINNER! (Score:5, Funny)
When setting a mug of coffee on the AMD CPU it will heat it faster than the puny Intel CPU for the same amount of processing!
Re: (Score:2)
Re: AMD DECLARED WINNER! (Score:1)
250 watts on AMD's side. Intel is much cooler (relatively speaking).
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
The reason Intel was eating AMD's lunch for over half a decade was that Intel was two generations ahead on processor fab technology, and as a result Intel had an absolutely huge advantage in power efficiency.
AMD made the difficult decision to skip one generation completely and they are now fabbing 14 nm chips; they have caught up to Intel. (Someday Intel will move to 10 nm and the race will continue.)
According to a table released by Intel [pcworld.com] the top i9 chips will be rated for 165 Watts TDP. AMD's chips are r
Re: (Score:1)
Generally speaking, AMD get ahead when Intel screw up. Which is what they've been doing for the last few years, getting lazy with only making minor tweaks to the same architecture.
Once Intel sharpen their pencils and get to work, AMD have a hard time keeping up when Intel's R&D budget is larger than AMD's revenue.
Then Intel screw up again and the cycle repeats.
Re: (Score:2)
Then Intel screw up again and the cycle repeats.
There's something to what you say. But AMD's current lineup looks very strong, and AMD should be able to carve out a niche as the price/performance brand.
It was very tough for AMD to compete when they were two generations behind. Now they should do very well for a while. And even if Intel goes to 10 nm, AMD should do okay with 14 nm parts... 14 vs. 10 is an easier battle than 32 vs. 14!
Re:AMD DECLARED WINNER! (Score:4, Insightful)
Or Intel screws up and slows down to avoid killing AMD. When AMD is in trouble, Intel is in trouble - you don't want the nice cushy arrangement with patents and market leadership to be upset because your competition dies out do you?
AMD was in dire straits running out of money. They got a reprieve in the form of Sony and Microsoft, likely because Intel pawned them off to give AMD 10 years of guaranteed cash.
Intel's letting Ryzen/Epyc/Threadripper play out on purpose - let AMD build up its cash reserves to the point where folding is no longer likely, which keeps government regulators and competition bureaus off Intel's back. Let AMD get some more market share so they appear to be good competition, and then keep them where they are.
Killing AMD does no one any good - not us as users, not Intel (they'd lose those nice zero-dollar cross-patent licenses, and likely have to pay others like ARM for the same patents, plus face who knows how many years of government oversight, maybe even be forced to break up - you can have the fab side, or you can have the design side, but not both). AMD where they are is good for Intel. AMD looking good is also good for Intel - hopefully AMD puts all the money in the bank for the lean times.
Re: (Score:2)
Or Intel screws up and slows down to avoid killing AMD. When AMD is in trouble, Intel is in trouble - you don't want the nice cushy arrangement with patents and market leadership to be upset because your competition dies out do you?
AMD was in dire straits running out of money. They got a reprieve in the form of Sony and Microsoft, likely because Intel pawned them off to give AMD 10 years of guaranteed cash.
Intel's letting Ryzen/Epyc/Threadripper play out on purpose - let AMD build up its cash reserves to the point where folding is no longer likely, which keeps government regulators and competition bureaus off Intel's back. Let AMD get some more market share so they appear to be good competition, and then keep them where they are.
Killing AMD does no one any good - not us as users, not Intel (they'd lose those nice zero-dollar cross-patent licenses, and likely have to pay others like ARM for the same patents, plus face who knows how many years of government oversight, maybe even be forced to break up - you can have the fab side, or you can have the design side, but not both). AMD where they are is good for Intel. AMD looking good is also good for Intel - hopefully AMD puts all the money in the bank for the lean times.
You must be in la-la land. Business is business; this quarter, Intel is bleeding. Intel is actually looking to expand outside of chips and hardware. When you can't compete, you flee to alternatives.
Re: (Score:2)
The reason Intel was eating AMD's lunch for over half a decade was that Intel was two generations ahead on processor fab technology, and as a result Intel had an absolutely huge advantage in power efficiency.
That, and the unfortunate decision to share FPUs between cores, which resulted in great integer (compilation) performance but less great gaming, compression and spreadsheeting. But those are secondary effects. The primary cause was Intel cutting off AMD's air supply with illegal trust-making stratagems.
Re: (Score:2)
The reason Intel was eating AMD's lunch for over half a decade was that Intel was two generations ahead on processor fab technology, and as a result Intel had an absolutely huge advantage in power efficiency.
AMD made the difficult decision to skip one generation completely and they are now fabbing 14 nm chips; they have caught up to Intel. (Someday Intel will move to 10 nm and the race will continue.)
According to a table released by Intel [pcworld.com] the top i9 chips will be rated for 165 Watts TDP. AMD's chips are rated for 180 Watts TDP. A 15 Watt difference is not a big deal, and AMD chips are so much less expensive that you will save money even if electricity is expensive where you live.
The most wasteful AMD chips would be the 220 Watt Vishera-core chips... fabbed on 32 nm, ouch. Newegg still sells them but I'd sooner buy a Threadripper.
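To put that TDP gap in dollar terms, here is a minimal back-of-the-envelope sketch in Python (the 24/7 full-load duty cycle and the $0.25/kWh rate are deliberately pessimistic assumptions, not figures from the article):

# Rough yearly cost of a 15 W TDP difference.
# Assumptions, not measurements: the delta is sustained 24/7 at full load,
# and electricity costs $0.25/kWh (a deliberately expensive rate).
watt_delta = 15                 # W, the 180 W vs 165 W rating gap
hours_per_year = 24 * 365
price_per_kwh = 0.25            # USD, assumed rate

kwh_per_year = watt_delta * hours_per_year / 1000
annual_cost = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> about ${annual_cost:.2f}/year extra")

Even under those worst-case assumptions the difference works out to roughly $33 a year, which is small next to the gap in sticker price.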
From what I read and what AMD presents, their 1700 series has a TDP of 65 watts. Intel's is twice that wattage, with less performance.
What about power consumption? (Score:2)
Gaming is great and all, but my real interest is in computing power per watt. This is a tech site and I would think people would want to know if datacenters are about to switch their boxen to AMD in the near future. This actually is something that matters.
Re: (Score:2)
Re: (Score:3)
Sorry to break it to you, but AMD has a backdoor of its own called the PSP. I just hope they make it open for scrutiny [slashdot.org] but I wouldn't hold my breath.
Re: (Score:3)
You can't have a CISC chip, or even a RISC-based chip that has CISC features (like ARM), if you want to be protected from backdoors. You need a real RISC chip where your registers are literally registers, and there is no microcode. Basically, the smallest of the microcontrollers from the 1980s. You'll have to build everything from the ground up with a cluster of those things.
Or, accept that you don't know what microcode does, you don't really trust it, and just decide which one to use anyway. And then you'r
Re: (Score:2)
It isn't firmware in the sense of an operating system; it is essentially instruction scheduling and other related things. It makes it easier to resolve large classes of minor bugs that would otherwise be showstoppers without microcode.
Microcode can in a sense add instructions to the CPU... as that is where they are defined. But the hardware to execute said instru
Re: (Score:2)
As a firmware programmer, I don't even read comments that start off with idiocies like telling me what I don't know. I do know that there could be nothing in my comment that would mean I don't know things I really do know, so when I read you say that shit I know you're not even being remotely logical and you're just going to spew. So I spend the time I would have spent reading your words to tell you I'm not reading your words.
If you assume I *do* know what I was talking about, you could always re-read it wi
Re: (Score:2)
Today's CPUs (incl. ARM) have bloated "multimedia/3D" SIMD instruction sets and out-of-order execution with register renaming and dynamic VLIW data paths.
So for MIPS that would be... R5000? What is it for SPARC, Super? Hyper? Couldn't have been all the way to Ultra. And it should be every POWER chip, right?
Re: (Score:2)
The SS20 also has an SX SIMD engine... used mainly as its built-in graphics processor: basically the CPU shoves instructions for it onto a hardware queue and it runs them independently. It's good enough to let a 50MHz SuperSPARC run KDE 3.x, for instance, which honestly is fairly impressive.
Bear in mind that RISC was never intended to mean a minimal instruction set... merely minimized / op
Re: (Score:2)
POWER did: AltiVec
Re: (Score:2)
You can literally still buy the "true" RISC micros from the 1980s, built with modern processes so they consume very little power. None of it went out of production.
They cost about the same as what you pay for a single discrete transistor.
Re: What about power consumption? (Score:1)
Re: (Score:2)
You lost all credibility with "boxen"
Re: (Score:2)
You lost all credibility with "boxen"
Hand in your geek card.
Re: (Score:2)
I would think people would want to know if datacenters are about to switch their boxen to AMD in the near future.
No big operator is going to share their plans with you. Suffice it to say that data center operators are the least loyal users on the planet... they're probably already running QA on Threadripper boxes.
Re: (Score:2)
My point was that if it is cheaper and more efficient, it's clear that datacenters would switch, and that's something people on this site would like to know about.
Re: (Score:2)
My point was that if it is cheaper and more efficient, it's clear that datacenters would switch, and that's something people on this site would like to know about.
My point was that you don't really need to ask that question because you know they will. Of course, I appreciate a nice leak as much as the next guy. So data center guys hanging on /. as anoncows: tell us all the juicy details of what you're thinking about installing and why, please, thx.
Re: (Score:2)
My point was that you don't really need to ask that question because you know they will.
The question was about power consumption: expressly, is it higher or lower than what Intel is offering? The rationale for that question is that it dictates what datacenters would do. So, do you know the answer? Because I do not.
Re: (Score:2)
The question was about power consumption: expressly, is it higher or lower than what Intel is offering? The rationale for that question is that it dictates what datacenters would do. So, do you know the answer? Because I do not.
Not quite right: in case of a tie, datacenters select on cost. So I will settle for a tie (likely), and so will data center operators. And just keep asking for those leaks; they will come.
Re: (Score:2)
The point about datacenters was an aside. My question was about the power consumption of the chips. Do you know which is lower?
Re: (Score:2)
I presume that MIPS/watt is roughly a tie, because the process is roughly a tie. The big tiebreaker is price.
Re: (Score:2)
Re: (Score:2)
AMD Threadripper Hackintosh? (Score:2)
Is it possible to build an AMD Threadripper Hackintosh? The performance data looks very good: high performance, low power. Time to rip some threads!!!
Re: (Score:2)
Re: (Score:2)
No, you need to use a processor at least closely related to one which appeared in an actual Mac. Last I checked they had un-open-sourced the components that would allow the community to fix that, but that was a long time ago; maybe that bit of source is available again.
Pity you can't run BSD or Linux on it (Score:2, Insightful)
https://hothardware.com/news/freebsd-programmers-report-ryzen-smt-bug-that-hangs-or-resets-machines
They will issue a Microcode update to fix it... (Score:2)
... at the cost of 30% of the performance of the chip...
Déjà vu...
Bulldozer, anyone?
Re: (Score:2)
Re: (Score:2)
Pity you can't run BSD or Linux on it
So apparently you don't know about this thing called "microcode". Worst case scenario: they have to write a driver that is tweaked for Ryzen. If we can make Linux run on a Tamagotchi, we can make it run on Ryzen.
Re: (Score:2)
Do modern CPUs even have writable microcode? I know IBM systems in the 1970s and 1980s had microcode that loaded from floppies at boot.
Re: (Score:2)
Are you kidding me?! There has been microcode for every x86 processor made in the last decade!
Re: (Score:2)
You missed the part about writable...
Re: (Score:2)
They are ALL writable.
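For anyone who wants to check for themselves: on Linux the currently loaded microcode revision is reported per logical CPU in /proc/cpuinfo. A minimal sketch, assuming an x86 box whose kernel exposes the "microcode" field there (recent kernels do):

# Print the microcode revision reported for each logical CPU on Linux.
# Assumes an x86 system with a "microcode" field in /proc/cpuinfo; a
# vendor update loaded by the distro shows up as a changed revision here.
def microcode_revisions(path="/proc/cpuinfo"):
    revisions = {}
    cpu = None
    with open(path) as f:
        for line in f:
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip()
            if key == "processor":
                cpu = value
            elif key == "microcode" and cpu is not None:
                revisions[cpu] = value
    return revisions

if __name__ == "__main__":
    for cpu, rev in sorted(microcode_revisions().items(), key=lambda kv: int(kv[0])):
        print(f"cpu{cpu}: microcode revision {rev}")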
Threadripper has more PCIe than Intel at 2x the cost (Score:3)
Threadripper at $550 has more PCIe lanes than Intel at 2x the cost.
For $599 you still only get 28 lanes with Intel.
Even on the mainstream desktop platform, not the high-end one, you get more as well.
And it still sucks at gaming (Score:2, Informative)
Re: (Score:2)
Re: (Score:3)
Gaming on Intel vs. AMD is quite a bizarre, complex case.
If you check this digitalfoundry video:
https://www.youtube.com/watch?... [youtube.com]
You will see that an Intel or AMD victory depends a lot not only on the game, but on what the game is doing.
Crysis 3, for example, gets the best performance on the Intel chips while only displaying a few characters up close, but as soon as the camera pans to action with grass, helicopters, explosions, etc., the AMD part starts to win, and win hard.
Also you only get a significant difference IF you'r
Re: (Score:1)
"I'd personally settle for 60 FPS gaming."
VR needs 90+, and has to render once for each eye.
Re: (Score:2)
"VR needs 90+" - Sure, and my 3D TV needed a new HDMI spec when it was bleeding edge. Of course, I haven't turned on 3D in a few years, because outside of a few fun demos, it just wasn't justifiable.
I finally pulled the trigger on a Ryzen system, because my Phenom II Black platform was finally starting to have driver issues. I didn't really have any major issues with whatever I wanted to throw at it in 2K, but I don't spend my nights juicing a few FPS out of benchmarks anymore. The biggest gains for years
Re: (Score:2)
Too bad for Intel with MB incompatibility (Score:2)
Just a double whammy there. The new Intel CPUs aren't compatible with the old motherboards
http://www.pcgamer.com/intels-... [pcgamer.com]
It looks like they are practically driving people AMD's way. Nice to see the shakeup though it's been far too long.
Re: (Score:2)
Re: (Score:2)
My AM2+/AM3 mobo supported both DDR2 and DDR3.
Intel ever offer anything like that?
Re: (Score:2)
If you're buying a new CPU you're also going to want faster memory too. So that would mean... buying a new motherboard!
Me? Absolutely. It's been at least 20 years since I did anything but a full system upgrade. Other people, you'd have to ask them. I know people who upgrade their systems component by component. Right now I'm at 32GB of high-end RAM, so I could see someone not wanting to make that purchase again.
What a surprise (Score:2)
Re: (Score:3)
AMD performance per watt anything Intel has (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Um if you only look at synthetic benchmarks yes it does win but sadly rest of the results are don't put it so amd chip's way.
Could you revise your post so that we know what "it" refers to with regards to winning? And could you revise your sentence "sadly rest of the results are don't put it so amd chip's way." so that it uses grammar and makes sense? I would like to understand your point.
Re:yea um (Score:5, Funny)
He said that porn doesn't feel the same on an AMD chipset.
Re: (Score:2, Funny)
Re: (Score:3)
It's not all that surprising that gaming benchmarks don't scale as well to large numbers of cores. Videogame programming isn't a field in which performance can simply scale nearly linearly based on the number of hardware threads available. That's because the CPU is performing a huge number of very diverse tasks among all its engine components, and there's a great deal of global coordination that occurs on a central database. It's essentially a heterogeneous workload, and those just don't scale as well.
T
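The parent doesn't name it, but the math behind that intuition is Amdahl's law: if only a fraction p of the frame's work parallelizes, extra cores stop helping very quickly. A small illustrative sketch (the p values are guesses for the sake of the example, not measurements of any real engine):

# Amdahl's law: theoretical speedup from n cores when a fraction p of the
# work parallelizes. The p values below are illustrative, not engine data.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.60, 0.90, 0.99):
    for n in (4, 10, 16):
        print(f"p={p:.2f} cores={n:2d} speedup={amdahl_speedup(p, n):5.2f}x")

With a modest parallel fraction, going from 10 to 16 cores barely moves the needle, which is consistent with the rough gaming parity seen in the preview.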
Re: (Score:2)
That's because the CPU is performing a huge number of very diverse tasks among all its engine components, and there's a great deal of global coordination that occurs on a central database.
Which is more likely to be just old architecture. In general, game worlds do exhibit at least some kind of locality or metrics amenable to distributing work. If nothing else, it should at least allow the emergence of new classes of games. For example, virtual worlds with smarter NPCs with actions based on actual reasoning or emulating more complex economies and such.
Re: (Score:2)
For rendering it comes down to how the scene is set up which is based on the viewport of the player as well as the action in the game. There is no way to predict how many points or triangles there will be on-screen at any given time and you can't run these calculations in parallel because there is only one player. It's not an architecture problem. It's a nature-of-the-problem problem that has been well known in gaming since at least Quake 3.
Re: (Score:2)
Re: (Score:3)
Because of optimization. You don't render all points in a space. You only render points in a viewport. That means part of the job of the game loop is constantly keeping track of which points in space you need to worry about for rendering purposes versus physics purposes. There are tons of algorithms to do this and it is one of the areas that people optimize pretty aggressively.
Once you have a point list, you can build a scene and render it based on a ton of other factors (z-order, shaders, shadows, ligh
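To make the viewport idea concrete, here is a toy 2D view-cone cull in Python. Real engines test bounding volumes against frustum planes and lean on spatial structures like BVHs or octrees; the camera parameters and points below are made up purely for illustration:

# Toy 2D view-cone culling: keep only the points the camera can "see".
# Real engines cull AABBs/BVH nodes against frustum planes; this just
# illustrates the "only worry about what's in the viewport" step.
import math

def visible_points(points, cam_pos, cam_dir, fov_deg, max_dist):
    half_fov = math.radians(fov_deg) / 2.0
    dx, dy = cam_dir
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm
    kept = []
    for px, py in points:
        vx, vy = px - cam_pos[0], py - cam_pos[1]
        dist = math.hypot(vx, vy)
        if dist == 0 or dist > max_dist:
            continue
        cos_angle = (vx * dx + vy * dy) / dist
        if cos_angle >= math.cos(half_fov):
            kept.append((px, py))
    return kept

points = [(1, 0), (5, 1), (0, 5), (-3, 0), (8, 8)]
print(visible_points(points, cam_pos=(0, 0), cam_dir=(1, 0), fov_deg=90, max_dist=10))
# Only the points roughly in front of the camera survive; everything else
# is culled before the scene ever reaches the renderer.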
Re: (Score:2)
The micro-op cache probably does have a massive effect on synthetics... which is why it was ignored for Bulldozer. Bulldozer was actually competitive when given real workloads in many cases. The fact is a Ryzen CPU is very similar in many ways to an Intel one... just different. The similarities are there mainly in order to remain competitive at ticking checkboxes.
In the e
Re: (Score:2)
Videogame programming isn't a field in which performance can simply scale nearly linearly based on the number of hardware threads available.
Actually, it is. But there is a limiting factor: videogame management are cheapass bitches who can't stomach the idea of paying more money than they pay themselves to bring in the kind of engineer who can design and develop high performance parallel code.
Re: yea um (Score:2)
For home use, gaming performance counts; in business, TPC-C may be more interesting.
Re: (Score:1)
Yep. What I see is a 16-core AMD CPU losing out to the 10-core Intel CPU in all of the 3DMark tests and Tomb Raider, which is the only real test that was conducted.
AMD's Ryzen 1950X has 60% more cores and a 100MHz clock speed advantage, and it still can't keep up with Intel.
Re: (Score:2)
Re: (Score:2)
That 18 core Intel processor will probably cost $600 more than the 16 core Threadripper processor, though. I'm not sure if the 15% performance boost will be worth the price.
Re: (Score:2)
The Threadripper probably still had plenty of cores on tap in those GPU benchmarks, while the Intel CPU was taking advantage of its longer, higher-clocking and more wasteful pipelines to boost one core a bit faster and drive the GPU a tiny bit harder.
Let's not forget that Threadripper and EPYC have twice the I/O and 50% more memory bandwidth than their competitor chips.
Re: (Score:2)
Re: (Score:3, Informative)
3D Applications Have Used GPUs For a Long Time (Score:4, Informative)
While nowhere near the power we have now, SGI was making dedicated 3D chips that were used not only in the creation of 3D scenes, but also in the final render. That was over 20 years ago. Professional houses have been using PC cards all the way back to the Voodoo 2 in 1999.
Now it would be almost unheard of for any final rendering stage not to use the GPU.
Heck ILM has their own rendering plug-in with customized graphics drivers to try to cope with the rendering load.
No, a graphics card cannot handle all the textures, polygons and shaders needed to render a final scene, but it doesn't have to. It loads in what is needed at the time, renders its part, then loads in the next part, keeping only the frame in the card's memory.
Actually, it is very common on blockbuster movies for multiple cards to be working on one scene at the same time, with each card rendering a section of the frame.
Re: (Score:1)
What?
Have you seen the real power consumption on Skylake-X?
Re: (Score:2, Insightful)
uh... electricity is cheap dude.
Re:Still a power hog (Score:5, Informative)
You are wrong. https://img.purch.com/o/aHR0cD... [purch.com]
Ryzen 1700 uses 35W less than a 7700K, and the 1800X uses 25W more. In gaming a Ryzen uses around 15% less, which is typically also the upper end of how much slower it is in games compared to a 7700K. In other words, it is as efficient (in games) as an Intel i7, or tons more efficient (when all cores can be used).
Intel, however, is certainly ignoring its own power envelope with its factory-overclocked CPUs, and from all the news, the Skylake-X parts are worse, even the low-end chips, in the mad dash to beat AMD. I doubt this will change with Threadripper, which uses the same dies as Ryzen.
It doesn't matter if it's AMD or Intel: they always ignore your mythical "power envelope", especially when they are behind (like Intel now, and AMD before) or when they have to squeeze out the last bit of performance from an aging architecture (like Intel now, or AMD with the 9590).
Re: (Score:3)
I consider myself a gamer, but I WILL NOT burn through the power that a GTX 1080 consumes.
So you wouldn't buy a $500 graphics card because it'd cost you $5/year in electricity. Got it.
Re: Still a power hog (Score:3, Funny)
Re: (Score:2)
"I WILL NOT burn through the power that a GTX 1080 consumes"
I bet you had no problem saying that while running a fucking GTX 780, at nearly DOUBLE the TDP of a fucking Threadripper or GTX 1080.
Re: (Score:1)
I use a GTX 960, topping out at less than one third the consumption of a GTX 1080, and a fraction of the price. It would have been worse had I chosen the equivalent AMD product at the time, which is why I chose Nvidia on that occasion; AMD had gotten my vote the time before. Having shredded your desperate little hypocrisy theory, I wonder just what it is that motivated you to post it? What sacred little cow are you hiding in your barn?
Re: Still a power hog (Score:2)
That would be nice and all if it were true, but it's complete bullshit. TDP of a 960 is 120W, per Nvidia's site. S
Re: Still a power hog (Score:2)
Bleeding edge performance always consumes a lot of power. Doesn't matter if it's computers or cars.
Re: (Score:2, Troll)
Re: (Score:2)
What are you talking about? These benchmarks show a 35% performance gap. Other benchmarks (do a quick Google search) show a 16% power penalty. The Ryzen is more efficient, based on the benchmarks we've seen so far.
Still open-source friendly. (Score:2)
On the other hand, AMD's GPUs are still much more open-source friendly compared to Nvidia's, and at the same time are still relevant when compared to Intel's (even if not as power-efficient as Nvidia's).
AMD has Linux devs on their payroll and is supporting two stacks, one of which (their long-term goal) is open source (it runs a classical DRI/Mesa stack), while the other (eventually targeting professionals who need some weird features) leverages the same kernel driver.
The AMD open-source drivers are decent, offer support
Re: (Score:2)
I know a lot of you don't think small power consumption issues are a big deal but I thought I'd highlight a few points:
x) We now operate in a space where the physics of chips well into the future is already known, planned, and targeted for production. 7nm and below is atoms-wide production that we have been theorizing about for over a decade, and what you are seeing is the culmination of a lot of that work today. This requires high skill and high tech just to prototype, let alone mass produce.
Re: (Score:1)
I think it's amusing how you assume that I have a favorite doggie in the contest just because you do. Having a perpetual favorite is the behavior of idiots who can't even protect their own self-interest. Have fun with that.
Re: (Score:3, Informative)
But you have, it's blindingly obvious. Either that, or you're actually retarded.
Look at the percentage difference in power consumption. Then look at the difference in cost of acquisition. Now, think about how much electricity you can buy for that sum.
You'll find you'll be burning an awful lot of electricity before the Intel even hypothetically begins to pay for itself.
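A quick sketch of that payback arithmetic (the $200 price gap, 50 W draw difference, $0.12/kWh rate and 8-hour duty cycle are assumed round numbers, not figures from any review):

# How long does a pricier-but-thriftier chip take to pay for itself in
# electricity? All inputs are assumed round numbers, not review data.
price_gap = 200.0        # USD saved up front on the cheaper, hungrier chip
extra_watts = 50.0       # extra average draw of the cheaper chip
price_per_kwh = 0.12     # USD per kWh
hours_per_day = 8.0      # heavy-use duty cycle

extra_cost_per_year = extra_watts / 1000 * hours_per_day * 365 * price_per_kwh
years_to_break_even = price_gap / extra_cost_per_year
print(f"Extra electricity: ${extra_cost_per_year:.2f}/year")
print(f"Break-even after about {years_to_break_even:.1f} years")

Under those assumptions the extra draw costs about $17.50 a year, so the more expensive chip takes over a decade to pay for itself.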
Re:Still a power hog (Score:4, Informative)
"...but AMD is doing it at the continued cost of a significantly larger chunk of electricity..." [tomshardware.com]
Talk about "having a perpetual favorite" :)
Re: (Score:2)
Re: (Score:2)
Not true.
Intel is continuing their idiotic practice of using poor quality thermal goo between the chip and the package.
This is not only untrue, it's the opposite of the truth. AMD's choice to group cache around units of 4 cores kills inter-process communication between cores when