AMD Intel Hardware Technology

Preview of AMD Ryzen Threadripper Shows Chip Handily Out-Pacing Intel Core i9 (hothardware.com) 180

MojoKid writes: AMD is still days away from the formal launch of its Ryzen Threadripper family of 12- and 16-core processors, but OEM system builder Dell and its Alienware gaming PC division had an inside track on first silicon in the channel. The Alienware Area-51 Threadripper Edition sports a 16-core Ryzen Threadripper 1950X processor with a 3.4GHz base clock, a 3.6GHz all-core boost, and a 4GHz maximum boost. From a price standpoint, the 16-core Threadripper chip goes head-to-head with Intel's 10-core Core i9-7900X at a $999 MSRP. In early benchmark runs of the Alienware system, AMD's Ryzen Threadripper shows as much as a 37 percent performance advantage over the Intel Core i9 Skylake-X chip in highly threaded general compute workloads like Cinebench and Blender. In gaming, Threadripper shows rough performance parity with the Core i9 in some tests, but trails by as much as 20 percent in certain games at lower resolutions like 1080p, as is currently characteristic of many Ryzen CPUs. Regardless, when you consider the general performance upside of Ryzen Threadripper versus Intel's current fastest desktop chip, along with its more aggressive per-core pricing (12-core Threadripper at $799), AMD's new flagship enthusiast/workstation desktop chips are lining up well against Intel's.
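For a rough sense of the "more aggressive per-core pricing" claim, here is a quick back-of-envelope sketch in Python, using only the MSRPs quoted above (list prices, not street prices):

    # Per-core list price, from the MSRPs in the summary.
    chips = {
        "Threadripper 1950X": {"cores": 16, "msrp": 999},
        "Threadripper 1920X": {"cores": 12, "msrp": 799},
        "Core i9-7900X":      {"cores": 10, "msrp": 999},
    }
    for name, c in chips.items():
        print(f"{name}: ${c['msrp'] / c['cores']:.0f}/core")
    # -> roughly $62, $67, and $100 per core respectively

By this crude measure Intel is charging about 60 percent more per core, though per-core throughput differs between the chips, so this is pricing context, not a performance verdict.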
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Monday August 07, 2017 @09:38PM (#54960581)

    Set a mug of coffee on the AMD CPU and it will heat up faster than on the puny Intel CPU, for the same amount of processing!

    • From what I've read, the TDP of AMD's current chips is very acceptable, perhaps even better than Intel's given that the two companies measure it differently. Time will tell, but it seems like you won't burn away the AMD chip's cost savings in electrical waste with this lineup. I just wish the chipset supported NVMe RAID.
    • by steveha ( 103154 )

      The reason Intel was eating AMD's lunch for over half a decade was that Intel was two generations ahead on processor fab technology, and as a result Intel had an absolutely huge advantage in power efficiency.

      AMD made the difficult decision to skip one generation completely and they are now fabbing 14 nm chips; they have caught up to Intel. (Someday Intel will move to 10 nm and the race will continue.)

      According to a table released by Intel [pcworld.com], the top i9 chips will be rated for 165 Watts TDP. AMD's chips are rated for 180 Watts TDP. A 15 Watt difference is not a big deal, and AMD chips are so much less expensive that you will save money even if electricity is expensive where you live.

      The most wasteful AMD chips would be the 220 Watt Vishera-core chips... fabbed on 32 nm, ouch. Newegg still sells them but I'd sooner buy a Threadripper.
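      As a hedged sanity check on that 15 Watt gap: assuming (hypothetically) a box pinned at full load 24/7 and electricity at $0.12/kWh, both assumptions, the difference is on the order of $16 a year:

          # Annual cost of a 15 W TDP delta; duty cycle and $/kWh are assumed.
          delta_w = 180 - 165                        # AMD 180 W vs Intel 165 W, per the table
          kwh_per_year = delta_w * 24 * 365 / 1000   # ~131 kWh
          print(f"{kwh_per_year:.0f} kWh/yr, ${kwh_per_year * 0.12:.2f}/yr at $0.12/kWh")

      That is small next to the price gap between the chips, which is the point being made above.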

      • by 0123456 ( 636235 )

        Generally speaking, AMD get ahead when Intel screw up. Which is what they've been doing for the last few years, getting lazy with only making minor tweaks to the same architecture.

        Once Intel sharpen their pencils and get to work, AMD have a hard time keeping up when Intel's R&D budget is larger than AMD's revenue.

        Then Intel screw up again and the cycle repeats.

        • by steveha ( 103154 )

          Then Intel screw up again and the cycle repeats.

          There's something to what you say. But AMD's current lineup looks very strong, and AMD should be able to carve out a niche as the price/performance brand.

          It was very tough for AMD to compete when they were two generations behind. Now they should do very well for a while. And even if Intel goes to 10 nm, AMD should do okay with 14 nm parts... 14 vs. 10 is an easier battle than 32 vs. 14!

        • by tlhIngan ( 30335 ) <slashdot.worf@net> on Tuesday August 08, 2017 @04:22AM (#54962617)

          Generally speaking, AMD get ahead when Intel screw up. Which is what they've been doing for the last few years, getting lazy with only making minor tweaks to the same architecture.

          Once Intel sharpen their pencils and get to work, AMD have a hard time keeping up when Intel's R&D budget is larger than AMD's revenue.

          Then Intel screw up again and the cycle repeats.

          Or Intel "screws up" on purpose and slows down to avoid killing AMD. When AMD is in trouble, Intel is in trouble - you don't want the nice cushy arrangement of patents and market leadership to be upset because your competition dies out, do you?

          AMD was in dire straits, running out of money. They got a reprieve in the form of the Sony and Microsoft console contracts - likely business Intel pawned off on them to give AMD 10 years of guaranteed cash.

          Intel's letting Ryzen/Epyc/Threadripper play out on purpose - let AMD build up its cash reserves to the point where folding is no longer likely, which keeps government regulators and competition bureaus off Intel's back. Let AMD pick up some market share so the competition looks healthy, and then keep them where they are.

          Killing AMD does no one any good - not us as users, and not Intel. Intel would lose those nice zero-dollar cross-patent licenses, likely have to pay others like ARM for the same patents, and face who knows how many years of government oversight - maybe even a forced breakup (you can have the fab side or the design side, but not both). AMD where they are is good for Intel. AMD looking good is also good for Intel - hopefully AMD puts all the money in the bank for the lean times.

          • Intel's letting Ryzen/Epyc/Threadripper play out on purpose... AMD where they are is good for Intel. AMD looking good is also good for Intel.

            You must be in la-la land. Business is business, and this quarter Intel is bleeding. Intel is actually looking to expand outside of chips and hardware; when you can't compete, you flee to alternatives.

      • The reason Intel was eating AMD's lunch for over half a decade was that Intel was two generations ahead on processor fab technology, and as a result Intel had an absolutely huge advantage in power efficiency.

        That, and the unfortunate decision to share FPUs between cores, which produced great integer (compilation) performance but less great gaming, compression, and spreadsheet performance. But those are secondary effects. The primary cause was Intel cutting off AMD's air supply with illegal anticompetitive stratagems.

      • The reason Intel was eating AMD's lunch for over half a decade was that Intel was two generations ahead on processor fab technology, and as a result Intel had an absolutely huge advantage in power efficiency.

        AMD made the difficult decision to skip one generation completely and they are now fabbing 14 nm chips; they have caught up to Intel. (Someday Intel will move to 10 nm and the race will continue.)

        According to a table released by Intel [pcworld.com] the top i9 chips will be rated for 165 Watts TDP. AMD's chips are rated for 180 Watts TDP. A 15 Watt difference is not a big deal, and AMD chips are so much less expensive that you will save money even if electricity is expensive where you live.

        The most wasteful AMD chips would be the 220 Watt Vishera-core chips... fabbed on 32 nm, ouch. Newegg still sells them but I'd sooner buy a Threadripper.

        From what I read and what AMD presents, their 1700 series has a TDP of 65 watts. Intel's comparable part draws twice that wattage, with less performance.

  • Gaming is great and all, but my real interest is computing power per watt. This is a tech site, and I would think people would want to know if datacenters are about to switch their boxen to AMD in the near future. That actually matters.

    • by Dunbal ( 464142 ) *
      So long as performance is in the ballpark, I'll be willing to switch. Better a small performance hit than a chip with a built-in backdoor.
      • Sorry to break it to you, but AMD has a backdoor of its own, called the PSP. I just hope they open it up for scrutiny [slashdot.org], but I wouldn't hold my breath.

      • You can't have a CISC chip, or even a RISC-based chip that has CISC features (like ARM), if you want to be protected from backdoors. You need a real RISC chip where your registers are literally registers, and there is no microcode. Basically, the smallest of the microcontrollers from the 1980s. You'll have to build everything from the ground up with a cluster of those things.

        Or, accept that you don't know what microcode does, you don't really trust it, and just decide which one to use anyway. And then you'r

        • by cb88 ( 1410145 )
          Microcode isn't what you think it is. Microcode is essentially a big lookup table for decisions about how the processor interprets an instruction...

          It isn't firmware in the sense of an operating system; it is essentially instruction scheduling and other related things. It makes it easier to resolve large classes of minor bugs that would otherwise be showstoppers.

          Microcode can in a sense add instructions to the CPU... as that is where they are defined. But the hardware to execute said instru
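          To make the "big lookup table" framing concrete, here is a deliberately toy sketch in Python - nothing here resembles real x86 microcode, and all the names are invented:

              # Toy model: each visible instruction maps to a sequence of internal
              # micro-ops, and a "microcode update" just swaps table entries.
              UCODE = {
                  "ADD":  ["read_regs", "alu_add", "write_reg"],
                  "PUSH": ["read_reg", "dec_sp", "store_mem"],
              }

              def microcode_update(table, patch):
                  """Apply a vendor patch: replace buggy micro-op sequences in place."""
                  table.update(patch)

              # Hypothetical errata fix: reorder the PUSH micro-op sequence.
              microcode_update(UCODE, {"PUSH": ["dec_sp", "read_reg", "store_mem"]})
              print(UCODE["PUSH"])

          The point is that a patch can reroute how an instruction executes without touching the silicon, which is why it can fix classes of bugs that would otherwise be showstoppers.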
          • As a firmware programmer, I don't even read comments that start off with idiocies like telling me what I don't know. I do know that there could be nothing in my comment that would mean I don't know things I really do know, so when I read you say that shit I know you're not even being remotely logical and you're just going to spew. So I spend the time I would have spent reading your words to tell you I'm not reading your words.

            If you assume I *do* know what I was talking about, you could always re-read it wi

    • -1,000 for using the word boxen.
    • You lost all credibility with "boxen"

    • I would think people would want to know if datacenters are about to switch their boxen to AMD in the near future.

      No big operator is going to share their plans with you. Suffice it to say that data center operators are the least loyal users on the planet... they're probably already running QA on Threadripper boxes.

      • My point was that if it is cheaper and more efficient, it's clear that datacenters would switch - and that's something people on this site would like to know about.

        • My point was that if it is cheaper and more efficient, it's clear that datacenters would switch - and that's something people on this site would like to know about.

          My point was that you don't really need to ask that question because you know they will. Of course, I appreciate a nice leak as much as the next guy. So, data center guys hanging out on /. as anoncows: tell us all the juicy details of what you're thinking about installing, and why, please, thx.

          • My point was that you don't really need to ask that question because you know they will.

            The question was about power consumption - specifically, whether it is higher or lower than what Intel is offering. The rationale for the question is that the answer dictates what datacenters will do. So, do you know the answer? Because I do not.

            • The question was about power consumption - specifically, whether it is higher or lower than what Intel is offering. The rationale for the question is that the answer dictates what datacenters will do. So, do you know the answer? Because I do not.

              Not quite right: in case of a tie, datacenters select on cost. So I will settle for a tie (likely), and so will data center operators. And keep asking for those leaks; they will come.

              • The point about datacenters was an aside. My question was about the power consumption of the chips. Do you know which is lower?

                • I presume that MIPS per watt is roughly a tie, because the process is roughly a tie. The big tiebreaker is price.
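                  Expressed as a decision rule, that tiebreaker logic looks something like this sketch (the figures are placeholders, not measurements):

                      # Rank on perf-per-watt; anything within 5% of the best
                      # (an arbitrary tie margin) then competes on price.
                      def pick(cpus, tie_margin=0.05):
                          best = max(c["perf"] / c["watts"] for c in cpus)
                          contenders = [c for c in cpus
                                        if c["perf"] / c["watts"] >= best * (1 - tie_margin)]
                          return min(contenders, key=lambda c: c["price"])

                      cpus = [
                          {"name": "chip_a", "perf": 3000, "watts": 180, "price": 999},
                          {"name": "chip_b", "perf": 2900, "watts": 170, "price": 1199},
                      ]
                      print(pick(cpus)["name"])   # chip_a: near-tie on perf/watt, cheaper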

  • Is it possible to build an AMD Threadripper Hackintosh? The performance data looks very good, high performance, low power. Time to rip some threads!!!

    • Comment removed based on user account deletion
    • No, you need to use a processor at least closely related to one that appeared in an actual Mac. Last I checked, they had un-open-sourced the components that would let the community fix that, but that was a long time ago; maybe that bit of source is available again.

  • Seems that if you max out the Ryzen on Linux or BSD, it can (under certain conditions) cause a reset:
    https://hothardware.com/news/freebsd-programmers-report-ryzen-smt-bug-that-hangs-or-resets-machines
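    For anyone who wants to poke at "max it out" conditions, a generic all-core load generator is a few lines of Python - note this is just a burner and makes no claim to reproduce the specific SMT bug in the report (the reported trigger was heavy parallel compilation):

        # Pin every logical CPU at 100% until interrupted (Ctrl-C to stop).
        import multiprocessing as mp

        def burn(_):
            x = 0
            while True:
                x = (x * 1103515245 + 12345) % 2**31   # cheap integer churn

        if __name__ == "__main__":
            n = mp.cpu_count()
            with mp.Pool(n) as pool:
                pool.map(burn, range(n))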
  • Threadripper at $550 has more PCIe lanes than Intel parts at twice the cost.

    For $599 you still only get 28 lanes with Intel.

    Even on the mainstream desktop, not just the high-end platform, you get more lanes as well. A rough lanes-per-dollar comparison follows below.
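    Put as lanes-per-dollar, taking the commonly cited 64 platform lanes for Threadripper (usable counts vary by board, so treat this as rough) against the 28 lanes at $599 mentioned above:

        # PCIe lanes per $100 of CPU, from the numbers cited above.
        for name, lanes, price in [("Threadripper", 64, 550), ("Intel HEDT", 28, 599)]:
            print(f"{name}: {lanes / price * 100:.1f} lanes per $100")
        # -> ~11.6 vs ~4.7 lanes per $100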

  • Are we still waiting for those mystery drivers/patches to make any new AMD CPU decent at games? What processor do you buy if you want raw grunt and good gaming performance? Hint: it's not AMD.
    • Comment removed based on user account deletion
    • by Z80a ( 971949 )

      Gaming on Intel vs. AMD is a bizarre, complex case.
      If you check this Digital Foundry video:
      https://www.youtube.com/watch?... [youtube.com]
      you will see that an Intel or AMD victory depends a lot not only on the game, but on what the game is doing.
      Crysis 3, for example, performs best on the Intel chips when only displaying some characters in close-up, but as soon as the camera pans to action with grass, helicopters, explosions, etc., the AMD part starts to win, and win hard.

      Also you only get a significant difference IF you'r

  • Just a double whammy there - the new Intel CPUs aren't compatible with the old motherboards:

    http://www.pcgamer.com/intels-... [pcgamer.com]

    It looks like they are practically driving people AMD's way. Nice to see the shakeup - it's been far too long.

    • Agreed. A friend is rebuilding/upgrading his wife's PC and his own, and he's going full AMD - something we both used to run but gave in on and switched to Intel because AMD was just so far behind.
  • Summary of the video: the 16-core Threadripper beats the 10-core i9 in a highly parallel job. Wow, wow, wow.
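    The snark has a point worth quantifying: with perfect scaling, 16 cores against 10 should win by 60 percent, so the ~37 percent lead from the summary actually implies lower effective per-core throughput (clocks, IPC, or scaling losses):

        # Rough arithmetic only; assumes the workload scales with core count.
        ideal = 16 / 10          # 1.60x if per-core performance were equal
        observed = 1.37          # the ~37% lead from the summary
        print(f"implied per-core ratio: {observed / ideal:.2f}")   # ~0.86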
  • by bongey ( 974911 ) on Tuesday August 08, 2017 @01:19AM (#54961629)
    Zen/Ryzen has better performance per watt than anything Intel currently offers or will offer for at least the next six months. For example, a 4-core Intel 7700K uses more power than an 8-core Ryzen 1700. It is basically impossible for Intel to get wattage down without a new lithography node and a new architecture, and that isn't happening for at least six months: Cannon Lake got pushed back to the second half of 2018, and 10nm isn't going well.
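    A hedged check on that claim using rated TDPs (7700K: 91 W, Ryzen 7 1700: 65 W) and approximate Cinebench R15 multi-thread scores from contemporary reviews (~970 vs. ~1400 - ballpark figures, not measurements):

        # Points per TDP-watt; TDP is a rating, not measured draw, so this is rough.
        for name, score, tdp in [("i7-7700K", 970, 91), ("Ryzen 7 1700", 1400, 65)]:
            print(f"{name}: {score / tdp:.1f} points/W")
        # -> ~10.7 vs ~21.5 in this heavily threaded test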
  • Comment removed based on user account deletion
