'Intel 11th-Generation Rocket Lake-S Gaming CPUs Did Not Impress Us' (arstechnica.com) 68

ArsTechnica: Today marks the start of retail availability for Intel's 2021 gaming CPU lineup, codenamed Rocket Lake-S. Rocket Lake-S is still stuck on Intel's venerable 14 nm process -- we've long since lost count of how many pluses to tack onto the end -- with features backported from newer 10 nm designs. Clock speed on Rocket Lake-S remains high, but thread counts have decreased on the high end. Overall, most benchmarks show Rocket Lake-S underperforming last year's Comet Lake -- let alone its real competition, coming from AMD Ryzen CPUs. Our hands-on test results did not seem to match up with Intel's marketing claims of up to 19 percent gen-on-gen IPC (Instructions Per Clock cycle) improvement over its 10th-generation parts. It shouldn't come as an enormous surprise that Core i9-11900K underperforms last year's Core i9-10900K in many multithreaded tests -- this year's model only offers eight cores to last year's 10. On the plus side, Intel's claims of 19% gen-on-gen IPC are largely borne out here, mostly balancing the loss out in Passmark and Geekbench. This year's Core i5 makes a much better showing than its Core i9 big sibling. In Cinebench R20, Core i5-11600K almost catches up with Ryzen 5 5600X, and it easily dominates last year's Comet Lake i5 equivalent. It doesn't catch up to its Ryzen competitor in Passmark or Geekbench multithreaded tests, but it outpaces last year's model all the way around.
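
A quick back-of-the-envelope check of the core-count versus IPC trade-off described above, as a short Python sketch. The numbers are illustrative: it assumes similar clocks and perfect multithreaded scaling, which real benchmarks (Passmark, Geekbench) only loosely follow.

    # 19% IPC gain on 8 cores vs. 10 older cores at similar clocks.
    old = 10 * 1.00   # Core i9-10900K: 10 cores, baseline IPC
    new = 8 * 1.19    # Core i9-11900K: 8 cores, +19% IPC
    print(f"11900K relative multithreaded throughput: {new / old:.0%}")   # -> 95%
    # Single-threaded, the 19% IPC gain applies in full, which is how the chip
    # can confirm Intel's IPC claim yet still lose multithreaded benchmarks.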
  • So what? (Score:4, Informative)

    by fred6666 ( 4718031 ) on Tuesday March 30, 2021 @12:30PM (#61217746)

    The i9 is slower because it has 8 cores instead of 10.
    All other CPUs are faster than before. And they seem competitive with AMD.

    • Yeah, seems like it's not terrible, not great -- just kind of the status quo with where we already are. The thing is, though, that Intel can crank out a shitload of these things on their very (too, lol) mature 14nm process while everyone else is going to be fighting over TSMC capacity.

      Kind of a missed opportunity, though. If they had 10nm cranking on all cylinders right now and were rolling out 7nm, they'd be dominating massively, given the capacity constraints everyone else is facing. Instead they…

    • Re:So what? (Score:5, Insightful)

      by UnknowingFool ( 672806 ) on Tuesday March 30, 2021 @12:56PM (#61217836)

      Is that all you read? All the reviews I have watched say Intel's top-of-the-line 11900K is barely better than their previous-generation 10900K in many benchmarks. The 11900K also loses many benchmark tests to the Ryzen 7 5800X, which is not even AMD's top of the line. The Ryzen 9 5950X handily beats the 11900K in multicore workloads due to having more cores. Some reviewers noted that in matching the 5800X's performance, the 11900K can consume significantly more power.

      The 11900K wins in a few cases, like AVX-512 workloads, and it should be readily available. While the Ryzen 7 5800X is not available everywhere all the time, multiple outlets have it. The 5900X and 5950X are harder to get. Price-wise, the 11900K is about $50 more at MSRP than the 5800X.

      In summary, Intel's newest top-of-the-line Core i9-11900K loses out to AMD's midrange Ryzen 7 5800X on many benchmarks, on price, and on power consumption. The only real advantage is that you may be able to buy one more easily, and that is a strong "maybe". It is a "meh".

      • Re: (Score:3, Interesting)

        by nathanh ( 1214 )

        Ryzen 9 5950X handily beats the 11900K in multicore workloads

        Best of luck finding one. Those things are rarer than comments in enterprise code. The good thing about Intel is that at least you will be able to buy their chips -- owning their own fabs is one advantage Intel still maintains over the competition.

        Winning on paper and then failing to produce (and sell) enough silicon is no way to build market share.

        • Then get the 5800X if you do anything but AVX workloads. Supply is the only thing saving Intel right now. If AMD could supply more 5950X chips, no one would buy an 11900K.
    • by Z00L00K ( 682162 )

      Intel is in 20th place in the Passmark test [cpubenchmark.net], so I'd hardly call them competitive.

      They are only living well on single-thread benchmarks, but on modern systems that's rarely a big advantage.

      • Correction: Intel used to live well on single-core execution. These days that advantage is increasingly minor. Technically the 11900K is going to be the fastest gaming chip by a small margin; however, it loses out in multicore performance. Someone building a gaming PC has a real choice between AMD and Intel these days.
      • Again, you miss one important point: pricing.

        I'm not buying any of these EPYC CPUs in the top 20 anyway. I'm not buying a Ryzen 9 either. Too expensive. Anything over a Core i7/Ryzen 7 is a niche product that most people don't care about.

        So if I'm looking at the Ryzen 5 or Core i5, both seem competitive for the price they ask. My next CPU could well be an Intel, if the i5 is cheaper on the day of my purchase. Of course, I will also consider the motherboard price. AMD still has the advantage on that side…

    • You'd be silly to buy an 11900K over the 10850K, unless you have AVX-512 workload requirements.
      • by nikclev ( 590173 ) *
        And you should pay attention to power consumption and heat numbers (if that matters to you at all) while doing AVX-512 workloads. At least one review site has noted some pretty big numbers, which, if you don't have good cooling, can mean bad things... From Anandtech: "The Core i9-11900K in our test peaks up to 296 W, showing temperatures of 104C, before coming back down to ~230 W and dropping to 4.5 GHz." (This was in AVX-512.)
        • There is absolutely no problem if you don't have glorious cooling.
          Other than the processor clocking down to stay within what it thinks is its "safe" operating zone.
          So, if you have a gaming mainboard with no "super ultra mega boost" limitations (PL2 and the like -- a simplified model of these limits is sketched after this thread), and a top air cooler or water cooler, your processor will happily run at 280+ W.
          If your mainboard has limited "time at maximum boost" settings, then no matter the cooling, your processor will boost up to maximum turbo and then slow down…

          • by Cederic ( 9623 )

            So you absolutely do need to pay attention to power consumption and heat, or you end up with a very expensive chip that's throttled down to speeds you could have achieved with something cheaper.

            Top-end cooling is noisy and expensive, and needs factoring into a purchase.
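
For background on the "time at maximum boost" settings discussed in this thread: Intel boards enforce a sustained power limit (PL1), a short-term boost limit (PL2), and a time window (Tau) applied to a moving average of package power. The Python sketch below is a simplified toy model with illustrative numbers, not Intel's actual firmware algorithm; boards that lift PL2/Tau, as many gaming boards do, simply never clamp down.

    import math

    def boost_power(requested_w, pl1=125.0, pl2=251.0, tau=56.0, dt=1.0, steps=120):
        """Package power over time under a toy PL1/PL2/Tau budget."""
        alpha = 1 - math.exp(-dt / tau)  # EWMA weight per time step
        avg = 0.0                        # exponentially weighted average power
        trace = []
        for _ in range(steps):
            # Boost to PL2 while the average is under PL1; clamp to PL1 after.
            watts = min(requested_w, pl2 if avg < pl1 else pl1)
            avg += alpha * (watts - avg)
            trace.append(watts)
        return trace

    trace = boost_power(requested_w=290.0)
    print(f"peak {max(trace):.0f} W, sustained {trace[-1]:.0f} W")
    # -> peak 251 W, sustained 125 W: full turbo until the Tau window is
    #    spent, then the chip clocks down to hold the package at PL1.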

    • And they seem competitive with AMD.

      Except for that power draw. Holy crap.
      One has to wonder just how much overclocking headroom is left on 14nm.

      • Not much, according to the GamersNexus video review https://www.youtube.com/watch?v=3n0_UcBxnpk [youtube.com], which is titled "Waste of Sand". For perspective's sake, he's talking about the lack of progress over previous-gen CPUs, heat dissipation issues, and the lack of headroom for overclocking, but also arguing that 8-core CPUs don't really serve gamers or content creators (not sure I agree with that last point, though).
        • There isn't a game being made that doesn't use all the cores it can get.
          This isn't 2010 anymore.

          • A lot of games still don't take advantage of all those cores. And even the ones that do are usually constrained by the GPU unless the CPU is particularly weak.

  • by _Shorty-dammit ( 555739 ) on Tuesday March 30, 2021 @12:32PM (#61217754)

    "It's not showing 19% IPC improvements over the last gen like Intel says it does. But at least it is showing 19% IPC improvements over the last gen like Intel says it does." Okay, then.

    • It's quite weird (Score:5, Interesting)

      by Ecuador ( 740021 ) on Tuesday March 30, 2021 @12:56PM (#61217830) Homepage

      Yeah, it's quite weird. It's not Schrödinger's CPU: if you read their article, the same-frequency improvement on single-threaded tests ranges from 16% to 38%. They are just benchmarks, but the benchmarks THEY ran easily confirm Intel's 19% claim. What they should have written is something like "Despite the 19% IPC increase, which was delivered as promised and reaches parity with rival AMD in single-threaded performance, the reduced core count leaves them lagging further behind in multi-threaded performance, performance/cost, and performance/watt."
      I guess it is impressive that they reached Ryzen-level single-core performance on their 14nm process, but that's not enough right now. They'll get there eventually...

  • Why would any Intel or AMD CPU literally impress anyone nowadays? The improvements are very... "incremental". Performance-per-watt improvements are far from great. ARM is "where it's at"; it's "The Future"(tm). Also, almost all "reporters" focus on nothing but "PERFORMANCE" (EXTREEEEM) and hardly ever mention what's really important -- performance per watt.
    • I disagree. For desktop, especially, what I am interested in is performance per dollar.

      I wouldn't want the most efficient 1W CPU. I'd rather have a 65W CPU which is "only" 5x faster, especially if it has low idle power -- my desktop CPU is idle over 99% of the time (some rough numbers are sketched at the end of this thread).

      • Indeed. And for those youngsters out there, 65W is only five watts over what used to be the power requirement of a single light bulb.

        Even if it's not impressive computing power per watt anymore, it's still a lot better than wasting that much energy to only get light and heat.

        • I believe each of the reviews I read talked about performance per watt. I don't care so much about the cost of electricity as about my ability to cool my CPU to 80C in such a way that my computer doesn't sound like a jet engine. The right CPU for me was AMD's 5800X. But even with its lower TDP of 105W, I still run it in ECO mode, which reduces the TDP to (I believe) 65W with very little performance hit. It's now easy to air cool and keep to 80C max. Anyway, my whole point here is that power draw is no…
      • Don't forget that a 24" screen on its own consumes something like 20-25W. There are diminishing returns in lowering power consumption by lowering performance. In the end you won't save much, but you'll be waiting longer for the results, staring at a screen that burns power in the CPU's stead.
        • The only place it's a real advantage is power-constrained platforms, AKA laptops. Shave off some 5-15% of power usage and you use less power and don't need as big a heatsink/spreader/fan to cool the thing down, which means the space can be used for more battery. On the desktop it is really all just pointless. The thing is, these days laptops are outselling desktops.
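
Two quick back-of-the-envelope calculations for the points raised in this thread: a mostly idle CPU costs very little to run, and racing to idle can beat a slower "efficient" CPU once the screen's draw is counted. All wattages and the electricity rate here are assumptions for illustration.

    RATE = 0.15  # $/kWh, an assumed electricity price

    def annual_cost(load_w, idle_w, duty_cycle, hours_per_day=8):
        """Yearly electricity cost of a CPU with the given load duty cycle."""
        avg_w = load_w * duty_cycle + idle_w * (1 - duty_cycle)
        return avg_w / 1000 * hours_per_day * 365 * RATE

    # A 65 W CPU that idles at 5 W and is actually loaded 1% of the time:
    print(f"CPU electricity: ${annual_cost(65, 5, 0.01):.2f}/year")   # -> ~$2.45

    def task_energy(cpu_w, seconds, screen_w=25):
        """Watt-hours to finish one task while a 25 W screen stays on."""
        return (cpu_w + screen_w) * seconds / 3600

    fast = task_energy(cpu_w=65, seconds=60)    # fast CPU: one minute
    slow = task_energy(cpu_w=15, seconds=240)   # frugal CPU, 4x slower
    print(f"fast: {fast:.1f} Wh vs slow: {slow:.1f} Wh")   # -> 1.5 vs 2.7 Wh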
    • by Areyoukiddingme ( 1289470 ) on Tuesday March 30, 2021 @12:49PM (#61217818)

      ...almost all "reporters" focus on nothing but "PERFORMANCE" (EXTREEEEM) and hardly ever mention what's really important - performance per watt.

      That is not an accident. Intel issues "guidelines" to its pet reviewers on how to present its chips' performance. Fail to follow those guidelines and you'll find yourself mysteriously missing from the list that receives free review chips. Intel marketing has been doing this a long, long time now -- they went low and stayed low while Gen X were still little kids. They're perfectly willing to use a big stick to keep "reviewers" in line.

      • Sorry, but no. That conspiracy would only work if reviewers who actually buy their chips rather than getting free samples also magically cared about performance per watt. The reality is that no one gives a shit about it for this series of CPU.

        If you want to review a laptop CPU, then yeah, let's complain about performance per watt being absent.

    • You mean like how Zen 3 processors launched 6 months ago with a 20% IPC improvement at the same power consumption? What were you expecting?
    • by UnknownSoldier ( 67820 ) on Tuesday March 30, 2021 @01:19PM (#61217950)

      > Why would any Intel or AMD CPU literally impress anyone nowadays?

      Having grown up with 8-bit 1 MHz CPUs the last decade of x86 CPUs have been BORING due to Intel shipping yet-another-incremental upgrade. Until Ryzen came along Intel literally held-up quad-core gaming by almost a decade.

      Ryzen 5000 and Threadripper have me, and other developers, VERY jazzed about CPUs again. When Intel CPUs are now considered budget gaming rigs, you know the market has shifted dramatically. The performance of 3rd-gen Ryzen is amazing bang-for-buck -- whether you are gaming, streaming, compiling, rendering, or doing HEDT work.

      > The improvements are very.. "incremental".

      Huh?? With Intel, yes, but the 1st-, 2nd-, and 3rd-gen Ryzen IPC changes brought amazing performance uplifts. Did you also forget that Intel dropped the price of the 18C/36T i9-10980XE by a whopping 50% compared to the previous i9-9980XE because of Threadripper??

      Someone joked that AMD's unofficial slogan is "We put the AMD in Amdahl's Law!" for their cores-for-cheap strategy (a quick illustration of Amdahl's Law follows this comment). It appears the business strategy of making a better product is paying off.

      > Performance per watt improvements are far from great. ARM is "where it's at"

      Traditionally,

      * ARM couldn't scale performance up,
      * x86 couldn't scale power down.

      Or at least that was the conventional wisdom. Apple's M1 silicon is a potential game changer.

      So yes, with silicon unable to reliably scale past 5 GHz -- the elephant in the room -- performance per watt has finally taken over, but it took the industry more than a decade to move past quad core.
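
Since the comment above leans on the Amdahl's Law quip: the law caps multicore speedup by the serial fraction of a workload, which is why piling on cores only pays off for parallel-friendly tasks. A minimal illustration in Python, assuming a 90% parallel workload (games are often far less parallel):

    def amdahl_speedup(p, n):
        """Amdahl's Law: speedup on n cores for parallel fraction p."""
        return 1 / ((1 - p) + p / n)

    for cores in (8, 10, 16):
        print(f"{cores} cores: {amdahl_speedup(0.9, cores):.2f}x")
    # -> 8 cores: 4.71x, 10 cores: 5.26x, 16 cores: 6.40x -- diminishing
    #    returns as the serial 10% comes to dominate the runtime.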

    • Comment removed based on user account deletion
      • Yeah. I'm running a Haswell CPU right now and, granted, I don't do very heavy things at home, but to me it's more than enough. Even Linux VMs run pretty fast.
        The only reason I'm going to upgrade in the short term is that consoles have finally caught up in CPU performance, and hence I'll need a more powerful CPU to run AAA titles soon. Compared to the 90s and early 2000s, keeping the same CPU for 7 years is a looong time, and I consider it well amortized.
    • I remember back in 1992 when I got my 486DX 50MHz; it was an upgrade from my Amstrad PC's 8086 CPU.

      It was really a major upgrade on all fronts, and you could tell the difference: things ran MUCH faster, and graphics were much better.

      Recently I replaced a Core i7 3rd-gen CPU with a Core i7 8th-gen CPU, with more RAM and extra stuff, and I was unimpressed with the money I had spent. Not that my computer was bad, but most software today really doesn't need too beefy a computer anymore, except for the latest…

      • Yup, nowadays you probably won't notice a very big difference in day-to-day tasks. I currently run a 7-year-old CPU and I'm only looking into upgrading because of games. The rest? It runs more than fine.
    • and hardly ever mention what's really important - performance per watt.

      If no one is focusing on what's "really important", it's usually because it's not at all "really important". No one buying an i9 or comparable AMD chip gives a flying fuck about the performance per watt of their CPU. They care about two things:
      1) Performance
      2) Does it work. i.e., performance per watt only becomes limiting if the watts start causing the performance to drop.

    • Why would any Intel or AMD CPU literally impress anyone nowadays? The improvements are very.. "incremental".

      Everything is incremental these days.

      Performance per watt improvements are far from great. ARM is "where it's at", it's "The Future"(tm). Also, almost all "reporters" focus on nothing but "PERFORMANCE" (EXTREEEEM) and hardly ever mention what's really important - performance per watt.

      Personally, I couldn't care less about performance per watt. PCs sit idle most of the time.

    • Because, shock horror, most people don't give a shit about performance per watt in a desktop. ARM is fantastic at performance per watt, but for most desktop workloads and gaming it is not an important factor.
      • by Cederic ( 9623 )

        They maybe don't, but they should.

        It's always a compromise. Needing 280W on top of a graphics card and everything else in the box means a beefier power supply, higher electricity costs, and extra/noisier cooling.

        You're welcome to accept those downsides, but at least make a conscious choice.

        • Bullshit. The requirements lists for most graphics cards are extremely inflated, and the CPU is a small part of the overall power consumption. Unless you are at the very top end, your standard 500 or 600W PSU will be fine, and if you are going for something more powerful, I doubt they will give a shit about the $20-50 extra they have to spend on the PSU or a better CPU cooler. Electricity costs are a rounding error, and noise will mostly come from the GPU. The CPU's noise and power consumption is one of the least…
          • by Cederic ( 9623 )

            Unless you are at the very top end your standard 500 or 600w PSU will be fine

            Yeah, I'm going to buy the most expensive consumer chip and put it in a shitty system with a PSU that fails when I play a computer game.

            Shit, https://outervision.com/power-... [outervision.com] tells me you need more than 600W for the previous-generation chip if you're adding in an RTX 3080 (not even the top end), basic storage/RAM, and a couple of USB devices.

            It tells me my current (four-year-old) PC needs 411W. That'll be the PC that's got a 750W PSU, because you do the maths, then add contingency and make sure you're not…
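
The arithmetic behind that kind of PSU sizing is simple to sketch in Python; the component wattages below are illustrative estimates, not measured values.

    # Sum estimated peak draws, then add headroom so the PSU isn't run
    # near its rating (better efficiency, less fan noise, longer life).
    components = {
        "CPU at peak boost": 250,
        "GPU (RTX 3080 class)": 320,
        "board, RAM, storage, fans": 60,
        "USB devices": 15,
    }
    load = sum(components.values())
    headroom = 1.3
    print(f"estimated load: {load} W, suggested PSU: {load * headroom:.0f} W")
    # -> estimated load: 645 W, suggested PSU: 838 W (i.e. an 850 W unit)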

  • And I have no intention of moving to an Armpple Silicon Mac when it stops getting supported.

    So, for me, even this chip is an upgrade ;-)
    And since it will actually be available to buy... all the better.

    • So what are your plans for the future? Continue running macOS "Big Sur" until you die, switch to Windows or switch to Linux?

      • So what are your plans for the future? Continue running macOS "Big Sur" until you die, switch to Windows or switch to Linux?

        Continue to run my Air 2016 and Mac mini 2018 while they're supported as Macs (i.e., getting OS updates and/or security patches), then switch them to Windows via Boot Camp, and buy a more powerful machine (Intel + Windows) for gaming.

        I'm switching not because of the M1 per se (although it does not help) but because I do not agree with the direction Apple hardware is taking (soldered SSDs and RAM, plus glued-in batteries, etc.).

        As for Linux: I used FreeBSD in '95 and Linux in '96 for my thesis. I continued for years having a Linux partition…

    • And have no intention to move to an Armpple Silicon Mac wehn it stops getting supported.

      And when do you think that is?

  • Blast from the past.

    https://www.youtube.com/watch?... [youtube.com]

    How did Intel get the last couple of gens so badly, badly wrong? Kudos to AMD for coming back with a roar.

  • by backslashdot ( 95548 ) on Tuesday March 30, 2021 @12:59PM (#61217864)

    First SpaceX's rocket blows up, now this .. what's going on with rockets lately? Aliens must really hate us.

  • I don't really see why they would bother, since last time they were able to beat AMD without creating a superior product. It seems like marketing is a much better investment than engineering, especially when you're an established player with lots of connections -- political, economic, and technical -- into the internet infrastructure.

  • by nathanh ( 1214 ) on Tuesday March 30, 2021 @03:24PM (#61218368) Homepage

    Another worthless article from the raging dumpster fire that is Ars Technica. Don't waste your time; go straight to someone who knows a thing or two: Dr. Ian Cutress at AnandTech.

    https://www.anandtech.com/show... [anandtech.com]

    • It's sad to see, but Ars really has declined precipitously in quality over the past 5 years or so. Heavy on the political news, light on the technical. That, and their pretending to be a serious "news" organization while refusing to cover their own reporter getting arrested (and convicted) for pedophilia, just makes them more of a joke than anything else.

  • by BLToday ( 1777712 ) on Tuesday March 30, 2021 @03:25PM (#61218370)

    It's a missed opportunity for Shania Twain jokes.

    "Oh-oh, you think you're special (14 ++ nm)
    Oh-oh, you think you're something else (10 nm)
    Okay, so you're a Rocket Lake

    That don't impress me much
    So you got the IPC, but have you got the cores?

    Now, now, don't get me wrong—yeah, I think you're alright
    That power draw will keep me warm on the long, cold, lonely night"

  • Impressive that Rocket Lake-S can hold out this well despite being manufactured on Intel 14nm (37.5 million transistors/mm^2 [wikipedia.org]). AMD is produced on TSMC 7nm (originally 96.5 MTr/mm^2, now 114 MTr/mm^2 [wikipedia.org]; Intel 10nm is about 100 MTr/mm^2). Apple's M1 is produced on TSMC 5nm (173 MTr/mm^2 [wikipedia.org]).

    So Rocket Lake-S is handicapped by a 3x to nearly 5x disadvantage in lithographic density (which translates into both slower speed and higher power consumption). Apple is only expected to use about 40% of TSMC's 5nm capacity…
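
The "3x to nearly 5x" figure checks out against the densities quoted above (in millions of transistors per mm^2):

    densities = {"Intel 14nm": 37.5, "Intel 10nm": 100, "TSMC 7nm": 114, "TSMC 5nm": 173}
    base = densities["Intel 14nm"]
    for node, mtr in densities.items():
        print(f"{node}: {mtr / base:.1f}x Intel 14nm")
    # -> Intel 10nm: 2.7x, TSMC 7nm: 3.0x, TSMC 5nm: 4.6x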
