
Intel's Alder Lake Reviewed: 12th Gen Core Processors Bring Fight Back To AMD (hothardware.com) 154

MojoKid writes: After months of speculation and early look teases, Intel's 12th Gen Core processors are finally ready for prime time. Today marks the embargo lift for independent reviews of Alder Lake, and it's clear Chipzilla is back and bringing the fight to chief rival AMD again. Intel 12th Gen Core processors incorporate two new CPU core designs, dubbed Efficiency (E-core) and Performance (P-core). In addition to this new hybrid core architecture, 12th Gen Core processors and the Z690 motherboard chipset platform also feature support for the latest memory and IO technologies, including PCI Express Gen 5, DDR5, Thunderbolt 4 and Wi-Fi 6E. The new Core i9-12900K features a monolithic, 16-core (24-thread) die with 8 Performance cores and 8 Efficiency cores, while the Core i5-12600K has 10 cores/16 threads, composed of 6 P-cores and 4 E-cores. Alder Lake E-cores don't support Hyper-Threading; P-cores can process two threads simultaneously while E-cores manage only one, hence the asymmetric core and thread counts.
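
Those core/thread counts follow directly from the hybrid layout: each P-core contributes two hardware threads, each E-core one. A trivial sketch of that arithmetic, using only the configurations quoted above (nothing official):

    # Thread counts for Alder Lake's hybrid layout: P-cores run two threads
    # (Hyper-Threading), E-cores run one.
    def total_threads(p_cores: int, e_cores: int) -> int:
        return p_cores * 2 + e_cores

    print(total_threads(8, 8))   # Core i9-12900K: 16 cores, 24 threads
    print(total_threads(6, 4))   # Core i5-12600K: 10 cores, 16 threads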

In the benchmarks, the 16-core Core i9-12900K doesn't sweep AMD's 16-core Ryzen 9 5950X across the board in multi-threaded tests, but it certainly competes very well and notches plenty of victories. In the lightly-threaded tests though, it's a much clearer win for Intel and gaming is an obvious strong point as well. Alder Lake's performance cores are as fast as they come. The $589 (MSRP) 16-core Core i9-12900K competes well with the $750 16-core Ryzen 9 5950X, and the $289 10-core Core i5-12600K has a lower MSRP than a $299 6-core Ryzen 5 5600X. The new Core i5's power and performance look great too, especially when you consider this $289 chip outruns Intel's previous-gen flagship Core i9-11900K more often than not, and it smokes a Ryzen 5 5600X.
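
For a quick feel of the pricing gap described above, here is a throwaway sketch using only the MSRPs and core counts quoted in this summary (street prices and per-core performance will of course differ):

    # MSRP per core, using only the figures quoted above.
    chips = {
        "Core i9-12900K": (589, 16),
        "Ryzen 9 5950X":  (750, 16),
        "Core i5-12600K": (289, 10),
        "Ryzen 5 5600X":  (299, 6),
    }
    for name, (msrp, cores) in chips.items():
        print(f"{name}: ${msrp / cores:.2f} per core")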

  • by flyingfsck ( 986395 ) on Friday November 05, 2021 @05:07AM (#61959427)
    Were these tests also done with the Windows 11 version that is inefficient on AMD?
    • by fazig ( 2909523 ) on Friday November 05, 2021 @05:18AM (#61959441)
      There are reviews that explicitly tested with W10 because of W11's unreliability.

      i9-12900k review: https://www.youtube.com/watch?... [youtube.com]
      i5-12600k review: https://www.youtube.com/watch?... [youtube.com]
      Some W11 testing to compare the results with W10 will also follow from Gamers Nexus.


      This is what the CPU market should look like, with competitors doing some actual competing. You can expect that AMD's next Zen architecture will beat Intel again until Intel makes advancements on their own.
      This is good for the consumer.
      • by AmiMoJo ( 196126 ) on Friday November 05, 2021 @08:17AM (#61959773) Homepage Journal

        The very important thing to note about these tests is that to get the maximum performance from the Intel parts you need extreme cooling. For desktops that means a high-airflow case and a large heatsink/fan (or water cooling). For laptops it means you are screwed: unless you want to lug around a small suitcase, you won't be seeing top-tier performance.

        The AMD last-gen parts are only a little behind, but use a fraction of the power and generate a fraction of the heat. With Ryzen 6 due soon I'd be more inclined to wait for that, rather than rushing out and buying an Intel CPU right away.

        As well as being a lot more efficient, Ryzen 6 will be AMD's first change of socket in many years. With Intel they change a lot more regularly so you lose the ability to upgrade your CPU by more than a generation or two.

        • The AMD last-gen parts are only a little behind, but use a fraction of the power and generate a fraction of the heat.

          In my mind, that would mean they are ahead.

        • by fazig ( 2909523 )
          It all depends on what the user needs/wants.

          From a lot of personal experience, building PCs for others, most people don't care about power consumption at all. They want processing power so they can run their games or whatever as fast and responsive as possible (for the money).
          The power bill is more like a tertiary concern.
          What most people care about is noise. This of course is a concern with CPUs that put out around 250W of heat. If you want to sustain that performance just with tower coolers and case fa
        • by Z00L00K ( 682162 )

          When I look at the figures from PassMark software, the Intel processor is good at single-thread operation but lags behind quite a bit on multi-thread operation.
          And single-thread performance is so yesterday; even though a few programs still depend on it, those programs are dinosaurs.

        • Um

          You don't know what AMD is going to release, actually, since they've been really unclear about their product roadmap for 2022. Expected releases:

          AM4: "Zen3D" - 5900X/5950X refreshes with massive L3 dice stacked on top. Will probably be a halo part to knock off the 12900k.
          AM4: B2-stepping Vermeer (5-series) refresh, possibly with the XT moniker. So think 5600XT, 5800XT, etc.

          Those are both expected by late January. Beyond that, it's a crapshoot. When is AM5 showing up with Rembrandt? We don't know! An

    • Were these tests also done with the Windows 11 version that is inefficient on AMD?

      How do you even get an inefficient Windows 11 version? If you install Windows 11 right now it will perform just fine on AMD machines; the bug was, after all, fixed within 3 days of discovery.

  • by Valgrus Thunderaxe ( 8769977 ) on Friday November 05, 2021 @05:08AM (#61959429)
    Are they fixed, or are they still selling processors with these dangerous flaws?
    • by gweihir ( 88907 )

      Are they fixed, or are they still selling processors with these dangerous flaws?

      They are probably just being swept under the rug. I wonder how many tech "journalists" Intel bribed to make that possible.

    • If they had fixed them they would be bragging about how much they have improved security, because that's SOP. Deny that a problem is serious until you address it, then brag about how you've solved a serious problem. Most people are too stupid to understand the inherent contradiction there, if they even notice.

      • by Rhipf ( 525263 )

        Has AMD fixed Meltdown? I haven't heard them crowing about improved security.

        On 8 October 2018, Intel is reported to have added hardware and firmware mitigations regarding Spectre and Meltdown vulnerabilities to its latest processors.

        https://en.wikipedia.org/wiki/... [wikipedia.org]

        If there were hardware/firmware solutions in 2018 I'm almost positive that these new CPUs at least have those fixes as well so Spectre and Meltdown shouldn't be a concern.

        With Intel's 11th Gen "Rocket Lake" processors featuring Cypress Cove cores as the 14nm backport of the Sunny Cove, curiosity got the best of me to look at the Spectre mitigation performance impact. Rocket Lake processors are not affected by Meltdown, MDS, L1TF, iTLB Multihit, SRBDS, or TSX Async Abort. However, on the Spectre front there are software-based mitigations still being applied for Spectre V1, V2, and V4 / Speculative Store Bypass (SSB).

        https://www.phoronix.com/scan.... [phoronix.com]

        So it looks like at least Meltdown was taken care of with Gen 11 CPUs so it would be really surprising if the fix wasn't also in the Gen 12 CPUs.

        The chart on this page [intel.com] indicates that both Meltdown

    • Are they fixed, or are they still selling processors with these dangerous flaws?

      A) Meltdown was fixed.
      B) Spectre is a diverse set of attacks, so only some of them have been fixed in hardware, and much has been relegated to compiler/software-based mitigations. They will need to completely redesign their microarchitecture rather than just tweak it, which means a definitive fix is likely several more generations away.

      Honestly though, Intel is bailing out slower than the water is rising: https://www.intel.com/content/... [intel.com]
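
      On Linux you can see for yourself what the kernel thinks the mitigation status of a given CPU is; a minimal sketch reading the standard sysfs entries (present on any reasonably recent kernel):

        # Print the kernel's view of CPU vulnerability mitigations (Linux only).
        from pathlib import Path

        vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
        for entry in sorted(vuln_dir.iterdir()):
            print(f"{entry.name}: {entry.read_text().strip()}")

      Per the links above, on recent Intel parts you'd expect the meltdown entry to read "Not affected" while several Spectre entries still list software mitigations.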

    • Fixed in hardware since Rocket Lake/Ice Lake-U. And in various Xeons (I think Cascade Lake-SP has them both fixed).

    • Are they fixed, or are they still selling processors with these dangerous flaws?

      Oh god I hope not. *Fixed I mean. I hope they didn't fix them. I prefer to live dangerously to get my performance. I also only have 2 locks on my front door which isn't secure against getting hit with a battering ram, and I'm sitting in front of a window where a sniper could take me out at any moment. Insane isn't it. I understand risk, and therefore live *this* dangerously, including not wanting any "dangerous" flaws fixed.

      I really hope no one from Intel reads your post, I'd hate to lose performance for a

  • by Lonewolf666 ( 259450 ) on Friday November 05, 2021 @05:29AM (#61959457)

    On one hand, the Thread Director is very new, and support may not be fully mature yet. For instance, Phoronix.com reports that there are no Alder Lake-specific optimizations for Linux yet.
    On the other hand, Windows 11 had a performance regression with Ryzen until recently. That is supposed to be fixed, but have all reviewers applied the patch?

    And then there is the power consumption. TDP (now TBP) is no longer a hard, meaningful limit. Intel suggested that reviewers use the new "maximum boost power" in their reviews, that is 241 W. Apparently the boards used all have adjustable settings for that.
    Some reviewers tested with the old limit of 125 W as well, and that gave results more in line with Ryzen's performance.
    Personally, I would like to see some tests at a power limit of 144 W, which is the so-called PPT the Ryzens are restricted to. That way the performance/power tests would be on an even playing field.
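
    For what it's worth, on Linux you can impose such a cap yourself through the powercap (RAPL) sysfs interface and rerun the benchmarks at whatever limit you like. A hedged sketch, assuming the package domain shows up as intel-rapl:0 and that constraint_0 is the long-term limit (run as root; values are in microwatts):

      # Cap the long-term package power limit via Linux powercap/RAPL.
      from pathlib import Path

      domain = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")
      limit_w = 144  # the PPT-like limit suggested above

      print("domain:", (domain / "name").read_text().strip())
      (domain / "constraint_0_power_limit_uw").write_text(str(limit_w * 1_000_000))
      new_uw = int((domain / "constraint_0_power_limit_uw").read_text())
      print("new limit:", new_uw / 1_000_000, "W")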

    • Re: (Score:3, Insightful)

      by Bert64 ( 520050 )

      Well, if you're comparing performance vs. power consumption, you should also add Apple's M1 and M1 Pro/Max to the mix.

      • That comparison is of limited value, because you would have to find software that is available as native version on both systems. Remember, the Mac uses the ARM architecture. For software that runs in emulation via Rosetta, there is the overhead of Rosetta to consider.

        • True, but if an M1 using Rosetta beats an Intel processor (it currently does, easily, with the same number of cores), that's a problem for Intel. It would be a problem even if it only matched Intel's performance under those conditions.

          I wonder if there is a way to really benchmark the low-power cores. That's tricky on the ARM Macs, at least. The idea is that my laptop would run on one low-power core while I'm typing this post, for example, and use almost no battery.

          And it seems that Mac benchmarks have the problem that they report c
          • by Rhipf ( 525263 )

            How is Apple's CPU beating Intel's really all that concerning to Intel?
            It isn't like you are going to buy an Apple machine instead of a Windows machine and then run your Windows software on it. Until Apple decides to release their CPU for anyone to purchase/use (which I would be highly surprised if they ever did), Intel isn't really competing with Apple. I suppose Apple having a faster CPU will convince some people to move to their platform, but I would hazard a guess that that migration would be quite

            • It isn't like you are going to buy an Apple machine instead of a Windows machine and then run your Windows software on it.

              Apple's Mac sales doubled when running Windows became an option. A lot of Mac users want to run software from both operating systems. In addition to wanting native Windows to run games, emulating Windows became practical once the emulator no longer had to emulate the CPU architecture. Running Windows in a VM only took a modest performance hit; it was entirely usable, unlike when emulating on PowerPC.

              ... would need to repurchase all of your programs/apps again as well.

              Not necessarily. For example, license MS Office and you can install either the Windows or the Mac version.

              • by Bert64 ( 520050 )

                MS make an ARM version of Windows which runs in a VM on the M1 chips with very little overhead. The amount of ARM software for Windows is increasing.

      • Maybe, but it's not really useful for many things. The M1 is pretty limited in the amount of RAM it supports, the amount of fast flash storage it can access and doesn't support CUDA. IOW it's a closed ecosystem laptop processor not a desktop. Even if the M1 perf/watt were 10 times better I could not in practice exchange my workstation for an M1.

        • The M1 is pretty limited in the amount of RAM it supports

          It now supports 16GB, 32GB and 64GB. That covers a lot of users.

          doesn't support CUDA

          It has its own GPU and ML coprocessor that support OpenCL.

          Nvidia eGPUs may eventually get official support.

      • by AmiMoJo ( 196126 )

        M1 and M1 Pro/Max are not very competitive at the top end of the performance curve. They are efficient for what they deliver, but they're not going to be maxing out FPS in games. Not that there are all that many games that are native ARM anyway, and Tom's Hardware recently had some major performance issues with x86-emulated games on those CPUs.

        • Apple is claiming RTX 3080 mobile levels with M1 Max. Let's assume a little fiddling with the numbers took place and back off a couple of notches on the RTX side, say RTX 3060 mobile levels are more realistic. That's quite competitive. Keep in mind RTX 3070 and 3080 are what, less than 4% on Steam? The top 35% of Steam are 1050/60, 1650/60, 2060.
          • by AmiMoJo ( 196126 )

            Their GPU numbers don't hold up, but the main problem is that their chips only support the subset of functionality needed by their own API. Everything else gets software emulated or is just unavailable. Throw in x86 emulation and it makes for a terrible gaming experience.

            • by drnb ( 2434720 )

              Their GPU numbers don't hold up, but the main problem is that their chips only support the subset of functionality needed by their own API. Everything else gets software emulated or is just unavailable. Throw in x86 emulation and it makes for a terrible gaming experience.

              I'd never consider CPU architecture emulation for gaming, unless we are talking 10 year old hardware. It barely, painfully, works for productivity apps. At least that was my experience in PowerPC days. However with Rosetta2 we don't really have emulation. We have a permanent binary to binary translation of all the code, the ARM cores are running native code. Now that code may not be as optimized as if it were natively compiled from source but I believe the translation is generally within 5-10% of recompiled

              • by Bert64 ( 520050 )

                Why would it need to work for Windows?
                Windows already runs on ARM, and already has its own Rosetta-like translator, so it's only the application software that needs translation, not the OS itself, for either Windows or macOS.

              • by AmiMoJo ( 196126 )

                Keep in mind that the GPU is sharing RAM with the CPU. It's not GDDR either. The idea that it could be anywhere near a mobile GPU with dedicated RAM is laughable.

  • by evanh ( 627108 ) on Friday November 05, 2021 @05:48AM (#61959485)

    It's notable that most reviews barely mention power consumption, if they mention it at all. Too much of an elephant?

    • by Lisandro ( 799651 ) on Friday November 05, 2021 @06:04AM (#61959515)

      Indeed. These 12th gen Intel CPUs consume twice the power of equivalent AMD Ryzen offerings, for pretty much the exact same performance.

      240W for a desktop CPU is absolutely bonkers. AMD is selling 8-core/16-thread CPUs that run at 65W with impressive specs, for Pete's sake.
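
      Put another way: at the same benchmark score, double the package power means half the performance per watt. A toy illustration (the score and wattages below are placeholders, not measurements):

        # Toy performance-per-watt comparison; numbers are placeholders.
        def perf_per_watt(score: float, watts: float) -> float:
            return score / watts

        score = 100.0
        print(perf_per_watt(score, 120))  # hypothetical CPU at 120 W
        print(perf_per_watt(score, 240))  # same score at twice the power: half the efficiency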

      • by TheDarkMaster ( 1292526 ) on Friday November 05, 2021 @06:15AM (#61959529)
        To me it is as if Intel is actually selling a processor overclocked to death from the factory.
        • To me it is as if Intel is actually selling a processor overclocked to death from the factory.

          This isn't unusual for a chipmaker. Up the wattage to compete, and anyone who disagrees is called a hater. They'll crank the heck out of a chip if they think it'll last through the warranty period.

          A tried and true model all the way back to the AMD K6-2.

        • Which kinda begs the question how long these CPUs are going to last.

          I mean, it's nice to have a fast PC, but if it croaks curiously right when the warranty expires...

          • Don't be silly, everyone would be furious if they croaked right when the warranty expires. They're probably going to croak a few weeks after the warranty expires.

          • Which kinda begs the question how long these CPUs are going to last.

            Just as long as the AMD CPUs, the systems will simply be louder on the Intel side as the fans run harder. I'm expecting Pentium 4 levels of noise. In short, we've been here before.

          • Probably long enough to become useless. Has this ever been a problem really?

            • Kinda, yes. The CPU in the server next to me has been running for about 15 years now, and I'm pretty happy that I didn't have to replace it yet because it still does what it did for the past 15 years with the same efficiency.

          • Which kinda begs the question how long these CPUs are going to last.

            I mean, it's nice to have a fast PC, but if it croaks curiously right when the warranty expires...

            CPUs have been dynamically "overclocking" themselves for most of the past decade already. No your PC won't croak when the warranty expires. Yes if you put it under load a CPU purchased in the past 10 or so years will push its own thermal envelope pretty damn far.

            I'm reminded of when AMD's Zen CPUs came out and the overclockers collectively shat themselves, "OMG my system monitor says core voltage is at 1.6V, it's going to catch fire, it's going to melt, it's going to, it's going to ... it's going to be alri

        • by AmiMoJo ( 196126 )

          Intel has put a lot of work into the mechanical design of their CPUs, in order to support dissipating this much power. Of course the stock cooler is useless, you need a massive heatsink with a couple of 120mm fans and a very high airflow case to hit those performance numbers.

          Mobile versions are going to suck.

      • by splutty ( 43475 )

        As a comparison the 3970x (32 core/64 thread threadripper) has a max TDP of 280W...

        • As another comparison, that Threadripper can do the 280W continuously, whereas these Intel chips will do it for a time period measured in milliseconds. The chips in question consume 120W, not 240W. 240W is a short-duration boost spec, or as Intel calls it, "max turbo power".

      • Most reviews point this out so I dunno what the OP is on about.

        It's worth noting though that this seems to happen under heavy load on all cores, and in other tasks Alder Lake is equally or more power efficient than the Ryzens: https://www.igorslab.de/en/int... [igorslab.de]

        And the extreme power consumption is probably because of removed power limits, where the CPU is pushed way past the point of diminishing returns just because it has the power and thermal headroom. I'd definitely like to see more testing but unless you'

      • Hmm, 240W would keep a room nice and toasty if you are living in Nunavut or Anchorage - in Phoenix, it may be a problem.
        • Considering the chip only consumes 240W for less than 1 second it wouldn't make for a very good heater. (That's a short duration turbo boost spec).

          Now imagine a beowulf cluster of these.

      • 240W for a desktop CPU is absolutely bonkers

        Yes, it would be. Fortunately it's not 240W. It's 120W. The 240W specification can't be held for longer than 10ms. Let me repeat that: the chip uses 120W (which is a bit more than the 105W AMDs it is being compared to), but the chip will *not* consume 240W even at full throttle for more than TEN MILLISECONDS. It's a load perfectly manageable with a normal heatsink.

        It's quite a bit less than Zen 2 Threadrippers or higher end Xeons which often find their way into workstations.
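
        The duration being argued about here comes down to Intel's PL1/PL2/tau scheme: the chip may draw up to PL2 as long as a moving average of package power stays below PL1, and once the average catches up it settles back to PL1. Boards can and do override these limits, so how long the burst lasts varies wildly. A simplified sketch of that averaging, with illustrative values rather than any particular board's real settings:

          # Simplified PL1/PL2/tau turbo-budget model (illustrative values only).
          pl1, pl2, tau = 125.0, 241.0, 56.0   # watts, watts, seconds
          dt, t, avg, burst_end = 0.1, 0.0, 0.0, 0.0

          while t < 300.0:
              draw = pl2 if avg < pl1 else pl1      # burst at PL2 while the average allows
              avg += (dt / tau) * (draw - avg)      # exponentially weighted moving average
              if draw == pl2:
                  burst_end = t
              t += dt

          print(f"held {pl2:.0f} W for about {burst_end:.1f} s, then settled to {pl1:.0f} W")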

    • Anandtech does. Basically Ryzen processors are limited at some 141W, and the 12th generation from Intel goes up to 240W on Performance Cores alone, 270W on Performance plus Efficiency.
      The best processor (i9-12900K) at some $600-650 wins some and loses some benchmarks against both $550 and $800 competition from AMD.
      Meanwhile, it uses up to 270W, so if you want to stay "at speed" you need a bigger/noisier cooling system.

      So, depending on your circumstances, there are cases when it actually makes sense financia

      • by AmiMoJo ( 196126 )

        Not just a bigger cooler, but a beefy PSU that can supply a lot of power on the 12V rails, and a beefy motherboard that has high end power delivery capable of sustaining 270W.

        • Not just a bigger cooler, but a beefy PSU that can supply a lot of power on the 12V rails,

          All PSUs have prioritized 12V since the Pentium IV.

          and a beefy motherboard that has high end power delivery capable of sustaining 270W.

          That's a bigger problem. Also my entire current system (which is old, and has SLI!) consumes 350W while totally pinned. 270W is a ridiculous power budget for a CPU alone.

          • Delivering that much power is easy. Dynamically adjusting voltage to meet current flow demands (remember, kids, E = IR is actually e(t) = i(t)z(t) for every value of t) so that random bits don't flip in the CPU is a lot harder.
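
            As a back-of-the-envelope illustration: even tiny parasitic resistance and inductance in the delivery path turn a fast load step into a meaningful droop on a roughly 1 V core rail. The component values below are made-up ballpark figures, not any real board's:

              # Rough voltage-droop estimate for a load step (values hypothetical).
              r_path = 0.0005      # effective path resistance, ohms
              l_path = 50e-12      # effective path inductance, henries
              di = 150.0           # load current step, amps
              dt = 1e-6            # duration of the step, seconds

              droop = di * r_path + l_path * (di / dt)   # v = i*R + L*di/dt
              print(f"droop of roughly {droop * 1000:.0f} mV")
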
    • by Junta ( 36770 )

      They do... way near the end...

      The summary is you get slightly better than Ryzen 9 Performance with Threadripper level power consumption, at a price somewhere between the two.

      So... I'm still going to pass on Alder Lake

    • Anyone else find it ironic that it's Intel that has the new chips with the most power draw? I remember not too long ago it was AMD that had a problem with requiring too much power.
      • The ancients here will remember a time when you needed to put a fan not unlike a turbofan engine (along with the relevant noise generation) on an AMD chip to keep it remotely cool enough to not cook off immediately.

        Guess Intel finally caught on. What kind of noisemaker do you have to put on these chips to keep them operable?

        • The ancients here will remember a time when you needed to put a fan not unlike a turbofan engine (along with the relevant noise generation) on an AMD chip to keep it remotely cool enough to not cook off immediately.

          We also remember when the P54C melted its socket.

        • Or when Intel had the same power hungry chips before as well. Anyone else remember having to cool a Prescott P4 with liquid magma from a geothermal well?

    • Depends. If the computer is used for real workloads, the higher electricity usage over the long term may not matter much against the benefit of finishing those workloads sooner. For gamers, they may not care if they can get more FPS while pwning someone in Fortnite.
    • One the plus side of high power consumption, CPU and GPU mining are deterred. :-)
    • That's not all: most publications concentrate on its gaming performance, hailing it the new gaming king, but there is no mention that there are dozens of games that don't work at all on these CPUs unless you boot with disabled E-cores [pcgamer.com]. Yes, it is usually due to the DRM solution, which is another reason DRM sucks, but perhaps not being able to play several popular games should be an issue when declaring the "gaming king".
      I don't play games anymore myself, so I welcome any CPU market competition no matter
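
      If you'd rather not disable the E-cores in firmware just to launch one of those titles, the usual workaround is to restrict the affected game's process to the P-cores (on Windows that's a Task Manager affinity setting). A minimal Linux-flavored sketch; the assumption that logical CPUs 0-15 map to the P-cores' threads is exactly that, an assumption, so check your own topology first:

        # Pin a process to the assumed P-core logical CPUs (Linux only).
        import os

        P_CORE_CPUS = set(range(0, 16))   # assumed: 8 P-cores x 2 threads each

        def pin_to_p_cores(pid: int) -> None:
            os.sched_setaffinity(pid, P_CORE_CPUS)

        pin_to_p_cores(os.getpid())       # or pass the game's PID instead
        print(os.sched_getaffinity(os.getpid()))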

    • For decades now, it's been the platforms that define the allowable TDP. You then make a processor SKU that meets the TDP of the platform.

      You can't just arbitrarily pick a new TDP. If you can define a new platform, then you can sell a processor SKU that meets the new TDP. Otherwise, you're either selling an SKU into a platform that can accommodate a higher TDP, leaving space for a competitor to offer higher performance, or you're exceeding the platform TDP and it will fail.

      So what's happening here is that

    • It's notable that most reviews don't or barely mention power consumption. Too much of an elephant?

      No, too much of an irrelevant. The market for these specific CPUs does not even remotely care about power consumption. They aren't going into server farms, office PCs, laptops, household desktops, or anywhere else where thermal performance is a consideration.

      Heck I'll wager a good percentage of them would have a water cooling system on them regardless of the CPU's power consumption, because that's what the target market does.

  • Comparisons are usually made in similar and unbiased environments. Here that was not the case. The OS had known issues with the AMD CPUs, and the power consumption of the two CPUs was nowhere close, with the Intel one drawing more than double the other's.
    • Also, Intel on DDR5 is almost always faster than on DDR4 (up to 40% faster in some cases), and DDR5 is much more expensive right now.

      It's a great improvement though - in the past, with all the biases, Intel still had to "carefully" "choose" its "benchmarks".

      Now it even wins some relatively common industry benchmarks (or at least sub-benchmarks), even at a $600 processor price versus the $800 Ryzen. No idea about platform costs though.

    • Linus Tech did benchmarks using a build of Windows 11 patched for the AMD performance regression, and the Intel chips still beat AMD but not by as much. They did use DDR5 memory on the Intel chips, though, so it wasn't entirely one to one.

      Still, you are comparing a brand new chip to a series of chips released about a year ago. I think AMD's next series of chips are set to be released in a couple of months, and I'm sure they'll be competitive.

  • Who'd have thunk that having more than one option at CPU would force vendors to get better products to market?

    And you don't have to be an Intel shill or fanboi to love that PCIe5 and DDR5 are finally available. How many years were we stuck on PCIe3? Love to see these companies going at it -- we all benefit.

  • ...what was the idea behind doing comparisons with a crippled AMD chip when it looks like the chip is a contender fair and square?

    Granted I've stopped trying to understand you earthlings as a general rule but you just keep flinging curve balls at me that I don't expect despite all cynicism.

  • Very few people buy systems with flagship processors. Most of them are way further down the line, with sub-$200 CPUs. I can still play AAA games on a total ~$1000 PC that I built like five plus years ago with a sub-$200 CPU, but putting that aside this processor not only needs new expensive RAM but it also needs a big expensive power supply. Its TDP is within 33% of my entire system running flat out, literally! Most people won't care about the power consumption but they will care about the cost of the hardw

  • by nagora ( 177841 ) on Friday November 05, 2021 @07:52AM (#61959705)

    Intel is dead, no matter how much "mojoKid" is paid to gush like a 12-year-old girl about their products.

    • Intel is dead, no matter how much "mojoKid" is paid to gush like a 12-year-old girl about their products.

      I wouldn't call them dead, not by any means. I also wouldn't make the ridiculous claim that they are in any way ahead of AMD right now though.
      This is just a pathetic advertisement for Intel.

  • The prices quoted for the new Alder Lake CPUs are based on quantity-1000 bulk prices. As yet AFAIK there are no official retail prices for one-off consumers. Some reviewers have made estimates of the likely markup based on previous Intel CPU releases, guessing that a retail 12900K may cost about 50 bucks more than the bulk price.

    As for the performance comparison with the current AMD high-end Ryzen CPUs, the Alder Lake P-cores (performance cores) run out-of-the-box at 5.1GHz while the AMD CPU cores run at 3.

  • In the lightly-threaded tests though, it's a much clearer win for Intel and gaming is an obvious strong point as well

    Now that's some serious power. Let me check if there's a new, cool MMORPG!

    - One from Amazon (?!?)
    - All others are old or port retreads

    What?

  • LTT did a review and they seem to be pretty open about the
    - better performance in most situations (but not all)
    - the very high power (and resulting heat)
    - the fact that the i5 ultimately seems to be a better value proposition for most users.

    https://www.youtube.com/watch?... [youtube.com]

    I understand this review was actually posted before the end of the requested embargo...maybe LTT had their reasons, but I think that's pretty disappointing.

    • Generally, flagship CPUs like the 12900K are meant for the prosumer crowd or the "I only care that it gives me 5 more FPS" crowd, who are not as concerned with power consumption. The 12600K is competitive, especially at the price, but it draws more power. I was reading a review that said that, gaming-wise, Intel only slightly beat AMD as most games are GPU-bound. The bottom line was that for the first time in years, consumers have a viable choice to make.
  • by a site called Hot Hardware.
