Intel Launches 9th Generation Core Processors; Core i9-9900K Benchmarked (hothardware.com) 130
MojoKid writes: Intel lifted the embargo today on performance results for its new Core i9-9900K 9th Gen 8-core processor. Intel claims the chip is "the best CPU for gaming" due to its high clock speeds and monolithic 8-core/16-thread design with beefier cache memory (now 16MB). The chip also has 16 lanes of on-chip PCIe connectivity, official support for dual-channel memory up to DDR4-2666, and a 95-watt TDP. Intel also introduced two other 9th Gen chips today. Intel's Core i7-9700K is also an 8-core processor, but it lacks HyperThreading, is clocked slightly lower, and has 4MB of Smart Cache disabled (12MB total). The Core i5-9600K takes things down to 6 cores / 6 threads, with a higher base clock but a lower boost clock and only 9MB of Smart Cache. In benchmark testing, the high-end Core i9-9900K's combination of Intel's latest microarchitecture and boost frequencies of up to 5GHz produced the best single-threaded performance seen from a desktop processor to date. The chip's 8 cores and 16 threads, larger cache, and higher clocks also produced some excellent multi-threaded scores that came close to catching some of Intel's many-core Core X HEDT processors in a few tests. The Core i9-9900K is a very fast processor, but it is also priced as such at $488 in 1KU quantities. That makes it about $185 to $225 pricier than AMD's Ryzen 7 2700X, which currently sells for about $304 and performs within 3% to 12% of Intel's 8-core chip, depending on workload type.
What security? (Score:5, Interesting)
Is it really going to be any faster after inevitable microcode and OS patching to address gross security flaws?
Re: (Score:1)
Re:What security? (Score:5, Insightful)
If you are buying for gaming, which this chip is aimed at, then you really don't care about the security flaws.
What rubbish. If you are buying for gaming you are online for sure, in a swamp of script kiddies that know your IP and have a vested interest in learning your passwords. If you are gaming online then you are more at risk than the general population. And of course there is the usual swarm of professional hackers. Sometimes I take a moment to watch them hitting my firewall, it's like a cloud of bugs hitting your windshield at sunset. It is that bad, take a look for yourself. I would not advise any gamer to buy an Intel rig as of today for security alone, never mind the value factors.
Re: (Score:3)
In order for someone to even exploit the flaw, they would already have to have remote access to the system.
Completely wrong. Please stop spreading dangerous disinformation. Meltdown can be exploited by Javascript, [tomshardware.com] meaning that any website you visit can end up owning any private data you have on your Intel machine. Meltdown is not just a problem for web host operators, but anyone running a browser with Javascript enabled.
Re: (Score:2, Troll)
Oh noes, shared data. Whoop de fucking do. Without knowing what that shared data is you're targeting precisely nothing. Also this has been patched against by all browsers.
Get a helmet!
Re: (Score:1)
I hope you don't work in any responsible position.
Re: (Score:2)
I do. I work in a position so responsible that I conduct actual risk assessments rather than freaking out about news articles.
Re: (Score:2)
Does the position of Intel Shill come with any responsibility?
Shill? Who is shilling for Intel? We are simply discussing security. Shills exist to promote products in which case let me help you: Buy AMD, the price performance offer is far better in every way.
Signed: An apparent Intel Shill and happy Ryzen 2700X user.
Re: (Score:2)
Slashdot: where the four stages of grief are buried under a thick coat of foamy, wanker creme.
Despite their reputation, geeks are usually well informed about the world, including worldly matters we learned about at age 19 instead of age 14. Within those parameters, there's also a sub-tribe of geeks who rapidly become unmoored from reality when reality disrupts their innate sense of law and Lego order, who never outgrow the eternal adolescence of plastic bricks.
Lego porn: it's a thing (Score:2)
One of the reasons I write on social media is that the subconscious mind doesn't convey its knowledge in straight lines.
Maybe some people don't know this, but the five stages of grief model (formally Kübler-Ross) was basically a brilliant conversation starter. Constructive ways to talk about loss, grief, and death are relatively thin on the ground. Its half-century zeitgeist tenure was well deserved.
But the model itself is far from a physical constant of the universe, so I wasn't surprised that I
Honestly they mostly just hit the firewall (Score:3)
Spectre/Meltdown are a problem because they enable a bunch of exploits that let you get out of a hypervisor and into the host OS. If you're in a data center that's a huge deal. If you're a gamer it's, well, not.
You'll notice that there's been no gaming apocalypse. No massive class action lawsuits because of lost performance. And no big exploits. No big wins
Re: (Score:3, Insightful)
To make good use of the micro code bugs you usually need root/admin. And if somebody's got that you're already boned.
And you are boned for being stupid. [tomshardware.com]
Spectre/Meltdown are a problem because they enable a bunch of exploits that let you get out of a hypervisor and into the host OS. If you're in a data center that's a huge deal. If you're a gamer it's, well, not.
The boneheadedness is strong in this one. I hope that nobody ever listens to you about anything.
Patched (Score:2)
There's still some theoretical exploits. They require incredibly precise timing and are unlikely to ever be used. Maybe if I was in a high security environment I'd worry about it. I'm playing video games. If the KGB or the CIA decides they want access to my Street Fighter V profile I'm pretty sure they'll find a way to get it with or without spectre/meltdown. Jokes on them, I suck.
Re: (Score:3)
If you are buying for gaming you are online for sure, in a swamp of script kiddies that know your IP and have a vested interest in learning your passwords.
Calm yourself. If a script kiddie is able to get your passwords using Spectre/Meltdown then you have really ballsed up your own security big time. These aren't fly-by vulnerabilities exploited like a shitty PDF bug or IE vulnerability. Spectre/Meltdown have not left the lab for very good reasons. They are highly targeted attacks that require knowledge and existing access to the machines.
If you're worried about the NSA getting access to your kiddy porn, patch.
If you're worried about giving sha
Re: (Score:2)
Re: (Score:2)
Hey, maybe someone will release a game that uses RTX.
Huh? Is there some AMD equivalent of Fox News around here somewhere that prevents you guys from getting access to facts or something?
Re: (Score:2)
I, for one, think that the whole Meltdown/Spectre nonsense is a hyper over-reaction to a most obscure vulnerability that - again!! - requires that your machine already be infected. The whole JS scripting shit is nonsense also, BTW.
Wow, no hyperthreading? (Score:2)
Wow, for a flagship chip, with the i7-9700K lacking HyperThreading Intel must *finally* be starting to be concerned about security. Guess performance isn't everything when you can get p0wned. =P
Re:Wow, no hyperthreading? (Score:5, Informative)
i7 is the new i5. i9 is the new i7. Pretty cynical of Intel. BTW, hyperthreading only speeds up Meltdown; Meltdown will still get your passwords even without hyperthreading, it just takes a bit longer. This is because of the way cache is shared between processor cores.
Re: (Score:2)
The side-channel has abysmal transfer rates, can in no way be executed in a way that doesn't impact the performance of the machine, and can be easily mitigated when used in vectors that are going to affect the average person.
Assuming worst case, where it's allowed to run while someone doesn't notice it, a program's virtual address space is *huge*.
These attacks are real, but their real-world applicability is highly hypothetical.
I think
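The "abysmal transfer rates" versus "huge address space" point above can be made concrete with back-of-the-envelope arithmetic. The leak rate below is an illustrative assumption (published side-channel PoCs report rates on roughly this order), not a measured figure:

```python
# Back-of-the-envelope: why blindly scanning a victim's address space
# via a cache side channel takes impractically long.
# LEAK_RATE is an assumed bytes/second figure for illustration only.
VA_BITS = 48                      # x86-64 virtual address width
LEAK_RATE = 500 * 1000            # assumed 500 KB/s leaked

address_space = 2 ** VA_BITS      # 256 TiB of virtual addresses
one_gib = 2 ** 30

secs_for_1gib = one_gib / LEAK_RATE
years_for_full_space = address_space / LEAK_RATE / (3600 * 24 * 365)

print(f"1 GiB at 500 KB/s: ~{secs_for_1gib / 60:.0f} minutes")
print(f"Full 48-bit space: ~{years_for_full_space:.0f} years")
```

Even a single gigabyte takes over half an hour at that assumed rate, and a full sweep of the 48-bit space is measured in decades, which is why real attacks need to know where to look rather than scan blindly.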
Also: Only 16 PCIe lanes?? (Score:1)
Is that a joke, or are they serious?
So one single decent GPU can't even be fully fed, if you would also like any other type of IO??
The Ryzen at least has 24. And Threadripper a whopping 64!!
If that does not outweigh their single core performance in practical applications, I'll eat my hat^WIntel CEO.
Oh, and any reason single core performance matters more in gaming, is solely due to developers optimizing for the low-thread-count consoles, and do shitty lazy ports later. As soon as commonly used consoles start
Too soon (Score:5, Insightful)
Re:Too soon (Score:5, Informative)
the list of different things going into the "Spectre" bucket keeps growing
Re: (Score:2)
no one gives a shit
into the spectre bucket with you too, AC
Re: (Score:2)
Right. There are supposedly some mitigations in this Coffee Lake release but I seriously doubt that they are real hardware mitigations, probably just microcode hacks that cost performance. I am highly skeptical that Intel had enough time to develop and qualify the fundamental cache circuitry changes they need to fix Meltdown properly, let alone changing all the masks.
Re: (Score:2)
Tweaks, maybe. They just did not have time to do anything major. I just don't have a whole lot of confidence they closed up Meltdown definitively. Intel will need at least a few more months to do the job properly and copy AMD in time for the Cannon Lake ramp.
Re: (Score:2)
They haven't had time to fix Spectre and Meltdown, I think I'll pass.
Incidentally, there are now some 6 different speculative execution attacks on processors. According to the test script, my brand new AMD is vulnerable to 5 of them. When I got news of this, the first thing I did was... disable the workarounds in Windows. Screw performance hits. My enemies are the script kiddies and trojan makers of the internet, not the NSA or other well-funded organisations that actually *may* have the capability to do something *useful* with these exploits.
PCIe (Score:5, Interesting)
"The chip also has 16-lanes of on-chip PCIe connectivity" - this actually sounds EXTREMELY low. And here I am, on a CPU with 40 lanes, and a chipset that provides another 5... in a system that is several years old. This sounds like a massive downgrade. Though I guess most people only populate 1 slot for the GPU nowadays, and nothing else. Consumer 10gbe isn't quite there yet. Add-on sound cards have gone by the wayside (onboard audio is still shit quality in comparison, but since people only listen to low-bit-rate streaming MP3s anyway, I guess it doesn't matter!?). The only thing I question is the NVMe craze right now, and how this chip will be able to keep up with it, since most recent drives are usually PCIe (though some are DIMM socket now as well).
Re: (Score:2)
This is standard for Intel CPUs.
16 PCIe lanes on-chip is standard for Intel and any other PCIe lanes are multiplexed off the 4x DMI Interface.
AMD no doubt fully supports this standard for "Intel" CPUs.
Re: (Score:2)
Any chip that's old by "several years" would have at best pcie2, at roughly half the bandwidth per lane of pcie3.
Your 40 pcie2 lanes could still be considered better than only 16 pcie3 lanes, but really not by very much, certainly not enough to call the I/O capability of these newer chips "EXTREMELY low".
Re:PCIe (Score:5, Informative)
Any chip that's old by "several years" would have at best pcie2, at roughly half the bandwidth per lane of pcie3.
Nonsense, CPUs with 16x 3.0 lanes were available more than 5 years ago.
https://ark.intel.com/products... [intel.com]
There is no excuse for 16 lanes in 2018.
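The generational difference argued above comes down to per-lane throughput. A quick sketch using the widely published effective rates (PCIe 2.0: 5 GT/s with 8b/10b encoding, roughly 500 MB/s per lane; PCIe 3.0: 8 GT/s with 128b/130b, roughly 985 MB/s per lane):

```python
# Approximate one-direction effective throughput per lane, after
# line-coding overhead (standard published figures).
PER_LANE_MBPS = {"pcie2": 500, "pcie3": 985}

def total_bw(gen: str, lanes: int) -> float:
    """Aggregate one-direction bandwidth in GB/s for a lane count."""
    return PER_LANE_MBPS[gen] * lanes / 1000

print(f"40x PCIe 3.0: {total_bw('pcie3', 40):.1f} GB/s")  # older 40-lane HEDT CPU
print(f"16x PCIe 3.0: {total_bw('pcie3', 16):.1f} GB/s")  # i9-9900K CPU lanes
print(f"40x PCIe 2.0: {total_bw('pcie2', 40):.1f} GB/s")
```

So a 40-lane PCIe 3.0 part carries roughly 2.5x the aggregate CPU-attached bandwidth of a 16-lane PCIe 3.0 part, which is the poster's point: the claim that "several years old" necessarily means PCIe 2.0 is what's wrong, not the bandwidth comparison.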
Re: (Score:2)
The Ryzen 2700X has 20 lanes. 1x16 and 1x4.
My Epyc system has 64 lanes per processor and dual processors (although I understand the single-processor systems get 128 lanes because they don't use 64 of them for inter-processor communication).
Re: (Score:2)
My 2014 Xeon has a lot more lanes than that, and they're PCIe 3rd-generation. The chipset splits some up into PCIe 2nd-generation slots though. I currently have two Quadros in 16x 3rd-gen slots, a 2x40Gbps Ethernet NIC in an 8x 3rd-gen slot, and a SAS controller in a 4x 2nd-gen slot. The SAS controller could use an 8x 3rd-gen slot but I don't have one spare.
Re: (Score:3)
My several year old system is indeed 40x PCIe3 straight from the CPU, with the additional 5x lanes from the north bridge being PCIe2. Nice assumption though!
Re: (Score:2)
Any chip that's old by "several years" would have at best pcie2, at roughly half the bandwidth per lane of pcie3.
PCIe 3.0 is over 8 years old. Dual GPU systems would consume all 16 lanes available which is precisely why AMD has upped the lanes to 20 for even their lower entry Ryzen chips. Gotta leave enough for storage.
Re: (Score:2)
"The chip also has 16-lanes of on-chip PCIe connectivity" - this actually sounds EXTREMELY low. And here I am, on a CPU with 40 lanes, and a chipset that provides another 5... in a system that is several years old. This sounds like a massive downgrade. Though, most people I guess only populate 1 slot for the GPU nowadays, and nothing else. Consumer 10gbe isn't quite there yet. Add-on sound cards have gone to the wayside (onboard audio is still shit quality in comparison, but since people only listen to low bit rate streaming MP3s anyways, I guess it doesnt matter!?) The only thing I question is the NVMe craze right now, and how this chip will be able to keep up with that, since most recent ones are usually PCIe (though some are DIMM socket now as well)
Actually, as I understand it, the i9 platform has 40 PCIe lanes total (16 CPU + 24 PCH). The 16 CPU lanes are dedicated to devices needing fast access to the CPU, like 16x/8x graphics card slots. The 24 chipset lanes handle other connectivity, like the M.2 slot, network interface, SATA, and other PCIe slots. How the lanes are allocated is based on the motherboard design.
The Nvidia 2080 is the first graphics card that can max out 8x PCIe 3.0 lanes, and just barely. There is only a 1% to 2% improvement when running with
Re: (Score:3)
For reference, 10gbe is ~1GB/sec. That is sustainable on burst reads from an 8x SATA drive array. I'm currently running over 20 drives in a home server with a 10gbe link back to the networking core, and my desktop has a 10gbe link to that core as well. It is trivially easy to saturate a 10gbe link nowadays.
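The ~1GB/sec figure checks out with simple arithmetic. The protocol-overhead percentage and per-drive throughput below are illustrative assumptions, not measurements of any particular setup:

```python
# 10GbE line rate vs. approximate usable payload rate.
# The 6% overhead and 180 MB/s per-HDD figures are assumptions
# chosen as typical values for illustration.
line_rate_gbps = 10
raw_bytes_per_sec = line_rate_gbps * 1e9 / 8      # 1.25 GB/s raw
effective = raw_bytes_per_sec * 0.94              # after framing/protocol overhead

hdd_bytes_per_sec = 180e6                         # assumed sustained rate per SATA HDD
drives_to_saturate = effective / hdd_bytes_per_sec

print(f"raw: {raw_bytes_per_sec / 1e9:.2f} GB/s, effective: ~{effective / 1e9:.2f} GB/s")
print(f"~{drives_to_saturate:.1f} drives at 180 MB/s each saturate the link")
```

Under those assumptions, roughly 6-7 conventional drives reading in parallel fill the pipe, consistent with the 8-drive array mentioned above.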
Re: (Score:2)
For reference, 10gbe is ~1GB/sec. That is sustainable on burst reading on a 8x SATA drive array. I'm currently running over 20 drives in a home server with 10gbe link back to the networking core, and my desktop with a 10gbe link to that core as well. It is trivially easy to saturate a 10gbe link nowadays.
If you have a RAID array and the device(s) on the other side have fast storage to handle it, then yes, you can saturate a 10gbe link. But again, you have the necessary components to make use of it. Most people who talk about wanting 10Gbps on a consumer motherboard have no idea.
Also, would you really rely on a 10Gbps chipset built-in to the motherboard vs a dedicated PCIe card that can handle the network offloading? Most consumer motherboards use the CPU for network processing. Server motherboards are a
Re: (Score:2)
24 chipset lanes handle other connectivity, like M.2 slot
And that along with Spectre/Meltdown is why you shouldn't go Intel for systems which require fast storage I/O. You'll note that Ryzens also only dedicate 16 PCIe lanes to graphics, but have an additional 4 dedicated to NVMe.
Meanwhile at AMD... (Score:1)
Threadripper offers 128 PCIe lanes and support for ECC memory.
Re: (Score:2)
Though, most people I guess only populate 1 slot for the GPU nowadays, and nothing else.
16 lanes for GPU. If you have a second GPU, which some do, that's 32 lanes already. NVMe storage is 4 lanes, and the GbE gets at least 1 lane and possibly 4 lanes. 16 lanes is a bad joke.
Re: (Score:2)
Though, most people I guess only populate 1 slot for the GPU nowadays, and nothing else.
16 lanes for GPU. If you have a second GPU, which some do, that's 32 lanes already. NVMe storage is 4 lanes, and the GbE gets at least 1 lane and possibly 4 lanes. 16 lanes is a bad joke.
No, that's not how it works. If you put two GPUs in the two 16x slots, they both downgrade to 8x. Only the newly released 2080 can saturate an 8x PCIe 3.0 slot. Tests show that a 2080 performs 1% to 2% faster in a PCIe 3.0 slot at 16x vs 8x. So while having 32 lanes would give you a bit of a graphics boost, it's not much. NVMe usually gets its lanes from the 24 PCH lanes, but it depends on the motherboard design.
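The lane-budget disagreement above can be tallied explicitly. The device list and lane counts here are a hypothetical enthusiast build using common values, not any specific board's layout:

```python
# Hypothetical lane demand for an enthusiast build vs. what the
# i9-9900K platform provides. Device lane counts are typical values,
# assumed for illustration.
devices = {
    "GPU (x16 slot)": 16,
    "NVMe SSD": 4,
    "10GbE NIC": 4,
    "USB 3.1 add-in card": 1,
}
cpu_lanes = 16    # CPU-attached lanes
pch_lanes = 24    # Z390 chipset lanes, all funneled through a DMI x4 uplink

demand = sum(devices.values())
print(f"total lane demand: {demand}")
print(f"lanes spilling onto the chipset: {max(0, demand - cpu_lanes)}")
print(f"chipset headroom: {pch_lanes - max(0, demand - cpu_lanes)} lanes")
```

Even this modest build overflows the 16 CPU lanes, so everything beyond the GPU rides the chipset and shares the DMI uplink's bandwidth, which is the crux of both sides of the argument: the lanes exist, but not all of them are equal.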
Thank god for AMD (Score:2)
Thank god for AMD. Intel faces stiff competition once again and still charges 50% more for a 10% faster CPU. Remember the days before good competition? The P66 was introduced at $1000 in 1k quantities back in '94, which is about $1800 now. I mean even the terrible P4s were being sold at a premium (ok using dubious means, but still).
Re: (Score:3)
Actually, i9-9900K is 90% more expensive than Ryzen 2700X. And Intel had to fiddle the gaming benchmarks [tomshardware.com] to make it look faster than it really is. These are on Intel's 14nm process, they were hoping to be on 10nm by now but that isn't happening until some time next year. Meanwhile Ryzen 2 on 7nm will be out while Coffee Lake is still shipping, oops. Ryzen 2 will probably put AMD even in IPC and ahead in GHz. Intel's last remaining bragging points gone. And Intel isn't going to catch up any time soo
Re: (Score:2)
Actually, i9-9900K is 90% more expensive than Ryzen 2700X. And Intel had to fiddle the gaming benchmarks [tomshardware.com] to make it look faster than it really is. These are on Intel's 14nm process, they were hoping to be on 10nm by now but that isn't happening until some time next year. Meanwhile Ryzen 2 on 7nm will be out while Coffee Lake is still shipping, oops.
Ryzen 2 (2xxx) series is based on 12nm process, which are already in the market. You are referring to Zen 2 architecture (Probably will be released as Ryzen 3 series)
Re: (Score:2)
Zen+ is Ryzen 2000 series, Zen 2 will be Ryzen 3000. It's a bit confusing. Ryzen+ means Zen+ mainstream desktop. I think AMD intended Ryzen 2 to mean Zen 2, not Zen+, but there's so much confusion about that now that it's better to stick to the thousands terminology.
Re: (Score:2)
Zen+ is Ryzen 2000 series, Zen 2 will be Ryzen 3000. It's a bit confusing. Ryzen+ means Zen+ mainstream desktop. I think AMD intended Ryzen 2 to mean Zen 2, not Zen+, but there's so much confusion about that now that it's better to stick to the thousands terminology.
Yes, perhaps you should do that, I already did (2xxx). What is clear is that the 7nm process is named Zen 2, while the product using the architecture is yet to be named
Re: (Score:2)
Yes, perhaps you should do that, I already did (2xxx).
I'm just going to have to go ahead and point out that your retort qualifies as kind of snippy, considering that you actually said "Ryzen 2 (2xxx)" which is wrong, or at best adds to the confusion.
Re: (Score:2)
Yes, perhaps you should do that, I already did (2xxx).
I'm just going to have to go ahead and point out that your retort qualifies as kind of snippy, considering that you actually said "Ryzen 2 (2xxx)" which is wrong, or at best adds to the confusion.
Well, I am kind of already using the thousands terminology that you pointed out above, and AMD's website refers to the series as 2nd Generation Ryzen Processors [amd.com], while many publications have dubbed them either Ryzen 2nd Gen, Ryzen+, or Ryzen [tomshardware.com] 2 [youtube.com]. Asking me to use the thousands terminology without addressing your own use of "Ryzen 2 on 7nm" is, well... not fair. What exists today is the Zen 2 architecture on 7nm, and whatever product uses it is not yet named
Re: (Score:2)
What exist today is Zen 2 architecture on 7nm, and whatever product that uses it is not yet named
Pretty safe bet they will call it Ryzen 3, a change from their original plan which was on the dumb side. Confusion is not helpful. BTW there already is a 5nm Zen 3 on the roadmap, I bet that gets the damnatio memoriae treatment too, they will reimagine it as Ryzen 5 (skipping Ryzen 4 as originally planned because 4 rhymes with "dead" in Chinese)
Re: (Score:2)
Oh, and another source of confusion: what is a Ryzen 3? Is that zen-3-formerly-known-as-zen-2 or is it the cheap budget PC bin?
Re: (Score:2)
Oh, and another source of confusion: what is a Ryzen 3? Is that zen-3-formerly-known-as-zen-2 or is it the cheap budget PC bin?
I think that is actually why they formally use the overcomplicated "2nd Generation Ryzen" instead of Ryzen 2 or Ryzen+. Intel does this as well. To make things worse, the first-gen mobile and APU parts were based on the 14nm Zen instead of 12nm Zen+, and were named 2xxxH and 2xxxG respectively
Of course, I meant to write "they will reimagine it as Zen 5 skipping Zen 4", there's that confusion at work. To make that seem somewhat legit, they can go with "5nm means Zen 5, right?" Then just grin and go with Zen 6 for the 3nm generation.
Or introduce yet another code name.
Re: (Score:2)
the first gen mobile and APU parts were based on the 14nm Zen instead of 12nm Zen+
The 12nm node name surely counts as one of the most egregious terminology abuses in the process wars so far. It uses all the same dimensions as 14nm but tweaks some details for better clocks and power efficiency. It really, really should be called 14nm+, but maybe they just felt a compelling need to distinguish it from Intel's unrelated 14nm. And 12nm is better than 14nm, right? And 12nm must be better than 14nm+, so that settles that. What we need to be clear on is, nm no longer means "nanometer", it means
Re: (Score:2)
Then again, if money is no object and you have the need for speed, Core i9-9900K is the CPU to buy.
Also, out of curiosity, from where are you getting this 90% more expensive from?
The 9900K has an MSRP of $488, the 2700X has an MSRP of $329.
Now, I'm no mathematician, but $488 != $625.
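The disputed "90% more expensive" claim is easy to check against both prices quoted in this thread (the $329 MSRP here and the ~$304 street price from the summary):

```python
# Premium of the i9-9900K over the Ryzen 7 2700X, using the two
# 2700X prices quoted in this thread.
i9_price = 488

for label, ryzen_price in [("MSRP $329", 329), ("street $304", 304)]:
    premium_pct = (i9_price / ryzen_price - 1) * 100
    print(f"vs {label}: ~{premium_pct:.0f}% more expensive")
```

Either way the premium lands around 48-61%, well short of 90%; a 90% premium over the $329 MSRP would indeed require a ~$625 price tag.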
Re: (Score:2)
Prices then flew up once limited supply was apparent. Such is the way of things. You're definitely right that you can't get one for MSRP right now, but you will be able to eventually.
Hey Intel 16 lanes is NOT enough (Score:2, Insightful)
Currently I have:
1 x 16 lane graphics card
1 x 4 lane USB3 controller (four independent USB controllers)
1 x 1 lane USB3 controller
As a result, the GPU is currently only able to use 8 of its 16 lanes on my circa-2013 i7. Here it is 5+ years later and NOTHING has changed.
No way will I be spending money on a new CPU with only 16 lanes.
No way will I be spending money on a new CPU without ECC memory.
No way will I be spending money on a new CPU without security bugs fixed.
No way will I be spending money on a new CPU that doe
Re: (Score:2)
Everything else is on my chipset.
Wait, yours are too. Sucks being stupid, doesn't it?
Re: (Score:2, Informative)
Please learn about the difference between PCIe3 and PCIe2
I have a 1080ti GPU that runs PCIe 3.0 @ 8x rather than 3.0 @ 16x due to the additional PCIe cards. Remove the cards and it runs 16x.
and the fact your motherboard chipset will give you more PCIe lanes.
Sure you could add a manifold to a 1" water pipe and get 4 1" pipes. It doesn't allow you to move 4x the water.
All desktop CPUs have had 16 lanes direct to the CPU since your generation CPU.
All desktop *Intel* CPUs that is. AMD processors have no such limit.
And you don't need ECC memory.
There is no basis for you to conclude anything about me or my needs. You don't even know who I am. I demand ECC memory. My next system WILL have ECC no matter what.
Had to rerun jobs that spin
Beastly Xeon W-3175X (Score:4, Interesting)
Beastly 28 core Xeon W-3175X, obviously targeted at AMD's 32 core Threadripper 2990WX, which you can buy right now on Amazon for $1,720. I'd like to know Intel's price, I guess it's not remotely close.
Note that with these top heavy core counts you always get lower clock frequency because of bus contention. Not a stopper by any means, if you have the use case. But personally I'm a lot more interested in the higher clocked 16 core AMD parts, specifically the 2950X, $900. Slightly higher cost per core but clocked about 10% higher. Boost frequency 4.4 GHz, the technical term for that is awesome.
What a ridiculous comparison (Score:4, Informative)
That's $304 per SINGLE AMD processor, versus $488 apiece only if you buy a thousand units of the Intel. Unless you're building a thousand computers this makes no sense to compare, and even then, the cost of the AMD processor goes down at those volumes too. This reveals a stupid level of bias in this article.
Re: (Score:2)
these motherboards and chips are basically an *oh crap, we fell behind*
I would say so... i9-9900K is currently "out of stock" on Amazon. Invites the question, is Intel really producing these in volume or is this just a delay tactic hoping some folks will drop their 2700x plans? I'm thinking, Intel had to dust off their 14nm fabs to produce these, do they really have the capacity? If they don't, we're going to be seeing a whole lot of "out of stock" and asking prices will probably spike above MSRP like they did for first gen Epyc. A little voice whispers to me, they don't.
Anywa
Re: (Score:2)
Right, i9-9900K currently available (maybe) from ebay scalpers for $650 - $1100.
Re: (Score:2)
Judging from the elongated production schedule and complete nonavailability of Cannon Lake, Intel must have bulldozed the first gen 10nm production line. Then they had to deal with the question of expanding 14nm production to fill the gap. That one must have been really tough, they needed to add the absolute minimum capacity at that already obsolete node. So Coffee Lake shortages would be because of trying to build less capacity, and also competing for resources with their own Kaby Lake and maybe even Skyla
I'd buy a one, but... (Score:1)
Psst, Intel... (Score:2)
Re: (Score:2)
While you are so busy developing top speed CPUs for gaming, could you remember once in a while to release something new for the few of us who still have to spend their life *working* ?!? Thank you...
Don't worry, AMD has got you covered with loads of PCIe lanes, and encrypted ECC RAM.
What compiler options ? (Score:2)
I'm not a gamer, but I suspect that games are sold and will work on both Intel & AMD CPUs but are generic binaries. This means that the vendors will have used compiler options so that they work on both, but that means that they might work faster on one. I have seen instructions generated that test which CPU & run these instructions or those ones. How much does that favour one CPU type over another ?
Help me understand the point of this (Score:2)
So my question is: who spends $580, on the CPU alone, to build a gaming PC that only plays at 1080p? I understand that 1080p is the most common gaming resolution, but for people spend
Re: (Score:2)
YAY! Now Windows can be even more bloated! (Score:2)
I can hardly wait.