AMD Making a 5 GHz 8-Core Processor At 220 Watts 271
Vigile writes "It looks like the rumors were true; AMD is going to be selling an FX-9590 processor this month that will hit frequencies as high as 5 GHz. Though originally thought to be an 8-module/16-core part, it turns out that the new CPU will have the same 4-module/8-core design found in the current lineup of FX-series processors, including the FX-8350. But, with an increase of the maximum Turbo Core speed from 4.2 GHz to 5.0 GHz, the new parts will draw quite a bit more power. You can expect the FX-9590 to need 220 watts or so to run at those speeds, and a pretty hefty cooling solution as well. Performance should closely match the recently released Intel Core i7-4770K Haswell processor, so AMD users who can handle the 2.5x increase in power consumption can finally claim performance parity."
Awesome (Score:5, Funny)
Re: (Score:2)
Re: (Score:3)
Re:Awesome (Score:5, Funny)
Re:Awesome (Score:5, Funny)
No I meant running my freezer. Hence the reason I typed:
"I always wanted to have a computer running my freezer"
instead of
"I always wanted to have a computer running IN my freezer"
Oh. Then I don't get it.
As a side note, I've always wanted to take an old mini fridge and turn it into a computer case.
Re: (Score:3, Interesting)
Re: (Score:3)
There is a type of freezer that can operate using a heat source for power [howstuffworks.com] (strange but true).
I'm guessing that's what he meant, anyway.
Re: (Score:2)
All of them work that way. Look at how a fridge works.
Re: (Score:3)
Re: (Score:2)
Yeah, I have a propane fridge and freezer; I pulled it out of an old RV at a junkyard for free :)
But it works, and I keep it for emergency purposes. A few years ago my town was without power for 5 days after a hurricane came through southern GA. So I connected the freezer/fridge duo to a propane tank and was able to keep most of my food from spoiling.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
Re: (Score:3)
I think it would work better running a heated swimming pool, or a grill. But to each their own.
Re: (Score:2)
Comment removed (Score:5, Insightful)
Re:Awesome (Score:4, Insightful)
You got that one wrong. Netburst was about deepening the pipeline to ridiculous extremes in order to ramp the clock. The new AMD story is pure clock ramp via process technology and power management. Big difference there.
Re:Awesome (Score:5, Interesting)
No, this particular story is analogous to a 2004-era story about Intel releasing a new Pentium IV at yet-higher clocks. The current story is about a clock ramp, but the overarching narrative is the same.
The Bulldozer architecture is fundamentally broken, this time due to simple negligence (mainly in management) rather than a faulty assumption. The only way to get reasonable performance from it is to clock it to high speeds, which gives very diminishing returns. Power consumption scales with the *cube* of the clock speed, so you pretty quickly run into a power/heat wall. They clocked the early ones pretty aggressively already, but at the cost of power and heat (and thus, noise). But it's the same story as the Pentium IV - the smart people are on something else.
AMD seems to be trying to put itself back together. Hopefully the PS4/Xb1 wins will give them enough of a cash flow to keep them solvent until they can get a new architecture out, or at least hammer out the IPC problems with Bulldozer. On the bright side, Intel's been distracted by ARM - they threw away a year's lead on performance to chase idle power draw, which should give AMD a bit of time to catch up on performance on the desktop.
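The cubic rule of thumb above can be sanity-checked with a quick sketch. This is a rough model, assuming voltage has to rise roughly in step with frequency near the top of the range; the numbers are illustrative, not measured:

```python
# Rough sketch of why power grows so fast with clock speed.
# Dynamic power is roughly P = C * V^2 * f; near the top of the
# frequency range, voltage must rise roughly with frequency,
# so P scales approximately with f^3.

def power_ratio(f_new, f_old, exponent=3):
    """Estimated power multiplier when clocking from f_old to f_new."""
    return (f_new / f_old) ** exponent

# Going from the FX-8350's 4.2 GHz turbo to the FX-9590's 5.0 GHz:
ratio = power_ratio(5.0, 4.2)
print(f"~{ratio:.2f}x the dynamic power")  # roughly 1.69x
```

A ~19% clock bump costing ~69% more power is consistent with the jump from a 125 W TDP part to a 220 W one once static leakage and binning are factored in.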
Re: (Score:2)
No it's not analogous to anything. It is about observing that AMD did not deepen the pipeline, therefore this story is not like the Netburst debacle.
Re: (Score:2)
Tell me, did Intel deepen the pipeline when they released the Pentium 4 HT 570J, as compared to the Pentium 4 HT 560 launched a few months before? No, they didn't, but they did increase the clock speed. Same story then as now - AMD didn't deepen the pipeline, but they are pushing the clock speeds to inadvisable levels because that's the quickest way to improve performance, and they *desperately* need to improve performance.
Look, I'm not an Intel fanboy. Even with all its problems, I'm actually still plannin
Re: (Score:3)
Re: (Score:3)
Re:Awesome (Score:5, Interesting)
On the bright side, Intel's been distracted by ARM - they threw away a year's lead on performance to chase idle power draw, which should give AMD a bit of time to catch up on performance on the desktop.
In the short term, this appears to be a good thing. In the long term, this is very bad for AMD.
The world is moving to low-powered portables. The future of consumer computing will not be on the desk or lap, but in the hand. Workstations will still use desktop chips, but Intel pretty much has that market cornered.
The low-powered, RISC space is where AMD needs to go. It doesn't necessarily have to be ARM. Instead, there's a market for low-powered x86, which is where Intel is going with Haswell. AMD needs to get ahead of the game and create something that is capable of power sipping (which obviously won't be x86), but is also capable of running legacy x86 code at reasonable speeds.
Basically, they need to create a migration path away from x86, which will never be as efficient as ARM and thus has no chance in the portable space. Yes, Intel tried that with Itanic, but they were aiming in the wrong direction (servers).
Re: (Score:3)
The low-powered, RISC space is where AMD needs to go. It doesn't necessarily have to be ARM. Instead, there's a market for low-powered x86, which is where Intel is going with Haswell. AMD needs to get ahead of the game and create something that is capable of power sipping (which obviously won't be x86)
Actually Intel has already shown they can make x86 phones on par [anandtech.com] with existing ARM phones, not market leading or anything but middle of the road. You want AMD to out-do ARM and Intel, push a new instruction set, create the compiler support and the industry momentum behind it? With a single, financially troubled company who I wouldn't bet is there five years from now? Yes, Itanic was a huge failure but Intel still makes Itaniums for anyone foolish enough to bet on that horse, AMD couldn't make any such promi
Re: (Score:2, Troll)
Power consumption scales with the *cube* of the clock speed, so you pretty quickly run into a power/heat wall.
Bullshit. There is no wall. I don't care all that much about heat/power/noise. I have a water cooling setup and I'm prepared to move to phase change if necessary. What I want is for Moore's Law to mean something again. Giving up on clock speed was a bad move on Intel's part. It's just sad that we still haven't made it to 5 GHz. This whole shift from raw performance at any price to performance per watt or even wattage walls that cannot be exceeded just sucks.
I haven't bought a new CPU since my Wolfdale Core
Comment removed (Score:5, Informative)
Re: (Score:3)
Perhaps "wall" isn't the best term - it's more of rapidly diminishing gains for rapidly increasing costs. People have hit 8GHz, but through liquid nitrogen cooling, which isn't exactly practical for consumer use.
Intel has actually been doing well at maintaining steady performance increases, except with Haswell. But instead of doing it with clock speed hikes, they've been working on IPC and ISA extensions. They've added AVX, to do 256-bit SIMD operations instead of 128-bit SSE. They've done a lot of work on
Re: (Score:2)
You're full of lint there. An AMD FX-8320 compares nicely with the i7-3770K (8K+ on the benchmarks). This also puts it in spitting range of the E3-1245 Xeon, while the 1270/75 tends to push just over 9000. Not bad for a chip that's only $150 (that's $125 less than the 1245 and over $200 less than the 1270/75 Xeon).
Re:Awesome (Score:5, Interesting)
I suggest that it is about time programmers started getting used to coding in assembly once again.
You can give up right there. The days of humans getting 100x more efficiency out of a CPU using assembler rather than a higher-level language are over. Optimizing compilers are able to devise efficiencies at a scale and level of detail that a human can't match at this point. Enterprise-level software requiring millions of lines of code is just too large to be optimized by one human writing in assembler. The speed available from out-of-order execution and the deep pipelines of consumer CPUs will be better exploited by compilers able to make better predictive arrangements of code.
Don't get me wrong. You'll always be able to find ONE "John Henry" that will be able to outcode the "stream compiler". But you can't build a world economy on one programmer. And forget about finding COMPETENT assembler programmers. The people you need to extract these kind of efficiencies are like finding prima ballerinas. Sadly, the world's economy needs more mediocre programmers to generate more working code, and more higher-level, software engineers to implement new solutions for problems addressable by a computer.
Re: (Score:3)
Re: (Score:2)
Let's get something out of the way first: that 5GHz chip will suck. Incredibly. However,
The Bulldozer architecture is fundamentally broken
is not exactly prudent phrasing. Bulldozer sucked. Period. It was obviously half-baked. Trinity is much better. If AMD can repeat their predicted 15% performance improvement on their next BD iteration, then they will have a truly good product. Yes, Intel is still better. An i3 is better than an FX-4300 and an i5 is better than an FX-8350 for most workloads. However, Intel jumped way ahead of AMD with Sandy Bridge and, si
Re: (Score:2)
So who should spend their time optimising the compilers/interpreters? Somebody has to do it. It also needs to be done at the lowest levels to make the biggest impact.
Optimising the CPU is the best place to do it.
If you go up too many levels you get the bastard child that was Itanium.
Re: (Score:3)
On one hand we have Slashdot readers saying to hell with hand-optimizing software because _THEIR_ own time is so important and the compilers/interpreters should do the job, and yet the same people complain that the chip designers should do the exact opposite.
1) There are only a few CPUs.
2) There are at least hundreds of thousands of software programs. Many of which require frequent changes as user demands change.
3) The CPUs affect the performance of all those hundreds of thousands of programs.
The CPU layout isn't going to change every few weeks/months. Whereas software often changes every few months.
So it makes more sense to have a relatively few very smart and talented people optimize the CPUs than to have many programmers "optimize" their programs and too often
Re: (Score:2)
I always wanted to have a computer running my freezer
With that kind of power consumption, I wouldn't expect it to stay a freezer for very long.
Grilled bratwurst anyone?
2013 AMD has a message for 2005 AMD (Score:5, Insightful)
The message is: You got the Megahertz myth wrong! The only myth is that Megahertz isn't important!
Oh, and all that performance-per-watt stuff? You might want to walk that back. Oh and, pull those Youtube videos where you accuse Nvidia users of being fake-pot farmers because their cards pull so much power. Sure it was funny at the time, but we'd rather not have to live that one down now.
Re: (Score:3)
The problem is there are so many ways to judge performance.
GHz is good for comparing like processors.
MIPS is good for similar instruction sets.
FLOPS is good for similar code (that uses floating point).
You then add these per watt if you want to show it off for a mobile device.
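The per-watt idea above can be sketched like this. All chip numbers here are made up for illustration, not real specs:

```python
# Sketch: turning a raw throughput metric into a per-watt
# efficiency figure, as suggested above.  The chip numbers are
# hypothetical, purely for illustration.

def per_watt(metric_value, watts):
    """Efficiency: throughput metric divided by power draw."""
    return metric_value / watts

chip_a = {"gflops": 120.0, "watts": 220.0}  # hypothetical hot chip
chip_b = {"gflops": 110.0, "watts": 84.0}   # hypothetical efficient chip

for name, chip in [("A", chip_a), ("B", chip_b)]:
    print(name, round(per_watt(chip["gflops"], chip["watts"]), 2), "GFLOPS/W")
```

A chip that loses slightly on raw throughput can still win decisively once the denominator is watts, which is the whole point for mobile and data-center buyers.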
Re: (Score:2)
> You then add these per watt if you want to show it off for a mobile device.
Or any data center.
Re: (Score:3)
Re: (Score:2)
The problem is there are so many ways to judge performance.
GHz is good for comparing like processors.
MIPS is good for similar instruction sets.
FLOPS is good for similar code (that uses floating point).
Of those, I think GHz is used way too often, while it actually has lost much of its meaning these days. For example we've had 2GHz desktop CPUs for a decade now, but the performance difference between them can be worlds apart.
Re: (Score:3)
describing cpu speed in ghz is like describing engine speed in rpm. it's technically accurate, but tells you nothing at all about what the product can actually do for you unless you're comparing two different examples of the exact same architecture.
Re: (Score:2)
You got the Megahertz myth wrong! The only myth is that Megahertz isn't important!
Tongue in cheek and all that, but....
Frankly, today both AMD and Intel are at an IPC wall nearly as much as they have been at a clock-rate wall. So yes, a faster clock rate is pretty much the only way to get more performance if your application doesn't scale with cores. Which is a shocking number of them.
The part I find interesting, is that if they can beat the haswell with this part then they probably have an IPC advantage over intel a
Re: (Score:2)
You said: "The part I find interesting, is that if they can beat the haswell with this part then they probably have an IPC advantage over intel again. Remember the top end haswell turbo boosts to 4.9Ghz."
Please re-read everything you just said very very carefully. Especially the parts about how a design with a known IPC will magically get huge IPC boosts by only increasing the core clock and power draw (hint: it won't). Please also remember that Haswell only has a 4.9GHz boost speed for incredibly small valu
Re: (Score:2)
My point was that if they can match or beat a 4.9 GHz part with a 5 GHz part, then the IPCs are going to be similar.
I didn't say the IPC changes with clock rate...
Re: (Score:2)
That's like saying if Intel increases its IGP performance by a factor of 10 then AMD will have to worry... of course it would, but the whole problem with that statement is the pesky word "if"
My 4770K is overclocked to 4.6GHz without that much tuning right now, and I guarantee it beats these new parts even in the perfectly multithreaded synthetic benchmarks that are best-case scenarios for AMD. It does that without being a space heater, and if the rumors about prices are true, the 4770K is an outright barga
Re: (Score:2)
You missed the parent's comment that Haswell only boosts up to 3.9GHz, not 4.9GHz.
If you can match a 3.9GHz part with a 5GHz part, well, that proves nothing about IPC - unless you can't beat it by a significant margin, then it proves IPC is lower.
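The IPC argument above can be put in numbers. A minimal sketch, assuming performance is roughly IPC times clock and that the two parts perform equally (the 3.9 GHz and 5.0 GHz figures come from the thread):

```python
# If perf = IPC * clock, then merely matching a rival's performance
# at a higher clock implies proportionally lower IPC.

def implied_ipc_ratio(own_clock_ghz, rival_clock_ghz):
    """IPC(own) / IPC(rival) required for equal overall performance."""
    return rival_clock_ghz / own_clock_ghz

# FX-9590 at 5.0 GHz merely matching an i7-4770K boosting to 3.9 GHz:
print(implied_ipc_ratio(5.0, 3.9))  # 0.78 -> ~22% lower IPC
```

So even the optimistic "performance parity" claim in the summary would, under this simple model, concede AMD roughly a fifth less work done per clock.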
Re: (Score:2)
TDP for this amd part: 220W
Well, it's a lot, but I think you overestimate the cooling. 120-130W is pretty common for most high-end CPUs today. And GPUs do 200+ in a double-height PCI slot.
Re: (Score:2)
I should think that would be because most high-end CPUs are only using 120 W or so....
If it's generating 220 W of heat... you're not gonna try and only cool it by 120 W.
Re: (Score:2)
Just to reply again, but my desktop PC probably runs at 100% for less than 30 hours a month. The extra 100 watts for 30 hours or so is going to be pennies. So over 3-5 years, making up even $50 in processor price difference is going to be hard to do.
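A quick back-of-envelope check of the claim above, assuming an electricity rate of $0.12/kWh (the rate is an assumption; plug in your own):

```python
# Back-of-envelope check on the electricity-cost claim above:
# 100 W of extra draw for about 30 hours a month at an assumed
# residential rate of $0.12/kWh.

def monthly_cost(extra_watts, hours_per_month, dollars_per_kwh):
    """Extra monthly electricity cost in dollars."""
    kwh = extra_watts / 1000.0 * hours_per_month
    return kwh * dollars_per_kwh

m = monthly_cost(100, 30, 0.12)
print(f"${m:.2f}/month, ${m * 12 * 5:.2f} over 5 years")
```

That works out to roughly $0.36 a month, or about $22 over five years, which supports the point: the running-cost delta alone won't erase a $50 price gap for a light-duty desktop.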
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
TDP for this amd part: 220W
I'm certainly rooting for AMD, but this part looks like a failure.
Keep in mind in addition to providing up to that 220W of power you also have to provide 220W worth of cooling. If that's really how hot this part is going to run then it's gonna need a *HUGE* heatsink, or high end watercooling setup to keep it at acceptable temps (Which at least for me is 30-40C, not the 50-70C all the manufacturers seem to accept nowadays.)
Just curious, why is 50-70C not acceptable to you? If the whole system is designed to live happily at that range, what does it matter?
Re: (Score:3)
If something can run at 70C for 100,000 hours, what's the benefit of running it at 40C? So you get more than 11 years of 24/7 use? The oldest CPU I still have is only 9 years old (a trusty Pentium M, circa 2004). You're talking Pentium 4 era.
Heat, over time, only effectively destroys electrolytic capacitors.
Re: (Score:2)
I can't see them actually succeeding with this chip. The extra cooling needed is going to kill any cost savings vs the Intel.
Re: (Score:2)
The first 4 GHz are easy.
Here are Ivy Bridge chips pushing ~220 watts to reach 4.7 GHz:
http://www.legitreviews.com/images/reviews/1924/power-consumption.jpg [legitreviews.com]
/AMD's FX-9xxx series uses a 32nm process.
/Intel's Ivy Bridge and Haswell use a 22nm process.
Re: (Score:2)
It'd be interesting to see the numbers on my own system, I suppose... I have an i5-2500k (Sandy Bridge) that I bought right when Ivy Bridge came out, which I have clocked at 4.8GHz. That's running with a Cooler Master Hyper 212+ [coolermaster.com], and doesn't exceed 65°C even when transcoding blu-ray/dvd videos to h.264... I've kept it pegged at 100% CPU for 6 days straight without exceeding that temperature....
I'd be surprised if that heatsink could provide >200W of cooling, given that it's a big radiator with a fan. To
Re: (Score:3)
Did you bother to read that graph? Try looking at the bottom where it says "Wattage At the Wall"
You must be an enormous Intel fanboy to think that they have invented technology that allows every single component in the whole computer outside of the CPU to consume zero power in highly-overclocked systems....
Big deal (Score:5, Informative)
Power6 was running at 5.0 GHz 5-6 years ago.
Re: (Score:2)
Power 6 is quite a bit more expensive per processor & system.
Power 6 is corporate, FX-9590 is power to the people.
Re: (Score:2)
House warming, certainly.
Global? Maybe.
What happened to global warming? [bbc.co.uk]
Curry: less warming than predicted. Models seem wrong [news.com.au]
Re: (Score:2)
> Power 6 was running at 5.0ghz 5-6 years ago.
And Power6 sacrificed out-of-order execution to do it. It was also only 2 cores instead of 8.
History repeating itself? (Score:3)
Why am I having flashbacks to the Pentium 4?
Re: (Score:2)
Probably because the cores aren't really cores. They're four cores that are basically hyperthreading.
Re: (Score:2)
Re: (Score:2)
It's hard not to. Intel wrote the book on "best way to screw up a microarchitecture and let your competitor gain an advantage", which they have been taking into account since dropping Netburst. Now comes AMD and follows that very same book quite closely...
"Performance should closely match" (Score:4, Insightful)
The summary suggests that the "performance should closely match the recently released Intel Core i7-4770K Haswell processor", but nothing in the article, or anything released about this chip so far, supports that. It's all just guesswork until we see some actual benchmarks from the chip.
I don't honestly expect we're going to be seeing performance parity from this chip (although I'd love it to be true). But that hasn't been AMD's selling point for me for a long time. Chances are, we're going to see a chip that breaks the 5.0 GHz barrier, under-performs relative to Intel's top end chip, but costs about half as much. That's been their game for a long time now, and I haven't seen anything that leads me to believe that this chip is changing that.
AMD slower / MHz (Score:4, Insightful)
basically, the 8-core AMD was slower performance-wise than the 4-core Intel, with the AMD running a few MHz faster
Comment removed (Score:5, Interesting)
Re: (Score:3)
The problem with the X8s (well, other than the arch; see my previous post with a link on why the BD/PD/EX platform is AMD's Netburst) is they simply cost too much to make. For every X8 that comes out with all cores functioning they probably get 2 dozen X4s or X6s thanks to bad cores, so THAT is where the bang for the buck is, although if given a choice I'd take a Deneb or Thuban over Bulldozer any day of the week.
None of them are good bang for the buck for AMD; the FX-8150/8350 is a big chip at 315 mm^2 versus 216 mm^2 for Sandy Bridge, 160 mm^2 for Ivy Bridge and 177 mm^2 for Haswell. Granted, the last two are on 22nm, but even the 32nm Sandy Bridge was way smaller than AMD's chip, which means more chips per wafer and lower defect rates. And Intel is planning to move to 14nm next year, so there's absolutely no chance of AMD closing any gap; at best they avoid widening it.
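The die sizes cited above translate into rough gross die counts on a 300 mm wafer. A crude sketch that just divides areas, ignoring edge loss and scribe lines, so treat the results as upper-bound figures for comparison only:

```python
import math

# Very rough gross-die-per-wafer estimate from the die areas cited
# above.  Real estimates must account for edge loss and scribe
# lines; this simple area division only shows the relative gap.

WAFER_DIAMETER_MM = 300

def gross_dies(die_area_mm2, wafer_diameter_mm=WAFER_DIAMETER_MM):
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

for name, area in [("FX-8150/8350 (315 mm^2)", 315),
                   ("Sandy Bridge (216 mm^2)", 216),
                   ("Ivy Bridge (160 mm^2)", 160)]:
    print(name, gross_dies(area), "dies/wafer (rough)")
```

Roughly 224 candidate dies per wafer for the FX versus about 441 for Ivy Bridge: nearly twice as many sellable chips per wafer before yield is even considered.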
Re: (Score:2, Interesting)
basically, the 8-core AMD was slower performance-wise than the 4-core Intel, with the AMD running a few MHz faster
Take all benchmarks with a grain of salt. While Intel has been generally winning for a while now, that doesn't really mean AMD is completely inferior. Between like chips there are certain things a modern AMD will out-perform Intel on, such as heavily multithreaded tasks. Intel will generally smoke AMD on single-threaded tasks, though. There is also cost: while AMD might be 10% less benchmark-happy than a like Intel chip, it is generally over 25% less expensive, and will generally run without the need to buy a new costl
Re: (Score:2)
Probably just either Intel or AMD fanboys being mad because I didn't heap unequivocal praise on whatever they like, or scorn on what they don't. I probably should have just posted "Meh. Buy what works for you."
I'll probably survive.
Re: (Score:3)
The summary suggests that the "performance should closely match the recently released Intel Core i7-4770K Haswell processor", but nothing in the article, or anything released about this chip so far, supports that. It's all just guesswork until we see some actual benchmarks from the chip.
If they're just cranking up the clock speed of an existing design, the performance should be quite predictable. The difficult-to-predict thing is the lifespan of the part. Atoms migrate faster as heat and voltage go up.
The limit on clock speed today is from heat dissipation. AMD got 8GHz out of a CPU a few years ago by cooling with liquid helium, but it's not worth the trouble.
Frying eggs with your CPU is now a feature (Score:3)
New AMD CPU, comes bundled with George Foreman grill heatsink.
Re: (Score:2)
Frying eggs with a CPU is old hat. [ncku.edu.tw]
I don't get it. (Score:2)
Re: (Score:2)
I am a programmer (of sorts) and my striped SSD is still the bottleneck.
I have clients who do geological research and the AMD GPU's do everything for them.
As a user, the only thing that ever fucks me is Adobe, Oracle, HP and the windows print spooler. And these are always flaws.
Enlighten me as to your bottlenecks.
Lets overclock a Core i7-4770K to 5GHz (Score:3)
and see how the new AMD chip compares. I assure you the i7 won't need to draw 220W to do this.
Or let's look at performance per watt at normal frequencies where, if the AMD processor really does match a 4770K in raw perf, that will mean the Intel processor will be about 2.5x better on perf / watt.
As some people have mentioned, IBM routinely clocks Power architecture processors into the 4-5GHz range AND they draw several hundred watts each. If you think that's progress, I suggest you'll want to reconsider when you see the net throughput of a dense array of low-wattage Haswells cranking out aggregate SPECcpu numbers far beyond an IBM Power 7+ processor with the same total number of watts the IBM socket draws.
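The 2.5x perf/watt figure above lines up with published TDPs: 220 W for the FX-9590 against 84 W for the i7-4770K. A trivial check, with the caveat that TDP is a thermal design figure rather than exact power draw:

```python
# Quick check of the "about 2.5x better perf/watt" figure above,
# assuming equal raw performance.  TDP is not exactly measured
# power draw, so this is only a rough comparison.

def perf_per_watt_advantage(rival_watts, own_watts):
    """How much better own perf/watt is, assuming equal performance."""
    return rival_watts / own_watts

print(round(perf_per_watt_advantage(220, 84), 2))  # ~2.62
```

So even granting AMD full performance parity, the efficiency gap the summary mentions is about 2.6x at the TDP level.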
wattage is what drove me back to intel (Score:2)
I was sitting here looking to drop a new CPU in my quad-core FX, but shit, to support the chip I needed a new power supply (650 watts now), and as it stands it's not that far from tripping a breaker (between 2 AMD computers hogging power, lamps and a TV) AND it's still a slower CPU, and it needs a replacement heatsink because the Cooler Masters that come with the chip are loud as fuck
I like my 3770k
this makes me wonder (Score:2)
just for the record (Score:3)
Re: (Score:2)
Re: (Score:3)
According to Wikipedia [wikipedia.org], AMD is worth $4.5b. Possibly more. Perhaps Apple could convince their shareholders to take less. But we'll call it $4.5b for our purposes.
You think Apple wants to spend that much money to acquire a microprocessor company? A microprocessor company that doesn't even have its own fabrication plants [wikipedia.org]? A microprocessor company that is noticeably lagging behind their main competitors: Intel and nVidia? Whatever your feelings towards AMD, you cannot refute that their market share has b
Re: (Score:2)
They already bought PA Semi. They don't need AMD.
They need to buy TSMC to get their own fab plants and get out from under Samsung's nose & thumb.
Re: (Score:2)
According to Wikipedia, AMD is worth $4.5b. Possibly more. Perhaps Apple could convince their shareholders to take less. But we'll call it $4.5b for our purposes.
That's the balance sheet, in practice the market cap is 2.8 billion - right before Christmas it was about 1.4 billion. At any rate, AMD's technology sucks at power efficiency which makes it a horrible match for all the mobile devices (iPhone, iPad, MacBook Air, MacBook Pro) that Apple wants to sell. Even trying to make "fashionable" non-mobile products like the Mac Mini, iMac or the new Mac Pro would be very much harder with an AMD processor. If you don't mind a big case, big heatsink and big fans AMD will
Re: (Score:2)
Re: (Score:2)
The Nvidia Titan GPU card, with a 7 billion transistor chip at its heart, draws an additional 236 watts [guru3d.com] when it goes from idle to full load. It's not hard to imagine 200 watts feeding into the GPU chip. Other GPU cards on that page draw even more power than the Titan. The Radeon HD 6950 CFX card drew 329 watts. It's not hard to imagine the chip at its heart drew over 220 watts.
If you want to cool a 22
Re: (Score:3)
There are CPUs that draw that much. IBM's zEC12 draws about 300 watts.
Re: (Score:2)
... because they don't have access to the advanced process technology that Intel does
That's a wrong way to put it. It's their own process, they paid for it, that's why they have access to it.
Intel's process is at least two generations ahead of everybody else because they understood a long time ago that technology alone can crush the competition, and decided to pour an insane amount of money into creating said forefront technology.
What did AMD do? Become a fabless chip maker, at the mercy of the likes of TSMC or GF...
Re: (Score:2)
"That 220w figure is not correct, that is why it is so hard to believe. There has never been a cpu with a TDP that high,"
Um, I think the Itanium II MX2 actually got higher than that, with a TDP of 260.
Re: (Score:3)
Re: (Score:2)
That makes a lot more sense and is entirely in line with some of Intel's Xeon line at 120-150W.
Re: (Score:3)
One thing that AMD has been doing quite well is total system power consumption. Intel typically beats AMD in actual CPU draw, but then loses its edge once you include the chipsets/etc.
That is not true any more, and hasn't been since Sandy Bridge was released. Since then, the total system power consumption figures have been in favour of Intel except at the extreme low-end, such as against AMD's E-350. However, if you intend, for example, to build a small server or something for your home, you're better off with a SB/IB low-power Pentium. Only a somewhat higher total power draw, especially if you slap in a passively cooled GT220 or something, and much better CPU performance (MUUUUCH better, beca
Re: (Score:2)
Re: (Score:2)
I can't believe a website as reputable as slashdot would post such utter nonsense.
Welcome to Slashdot.
Re: (Score:2)
sooo, are you saying he's wrong or just attacking him?
Re: (Score:2)
ppppffffft.
I had to install a separate air conditioner to cool my computers. It's like I bought a 100lb CPU cooler.
Re: (Score:2)
Because the P4 is out of production and toasters can't play games.
Re: (Score:2, Informative)
Because bacon.
Re: (Score:2)
Re: (Score:2)
If you put it in an ASUS Mobo, in the UEFI there are 3 buttons: Power Saving, Normal and Performance; Performance being the 'Turbo' mode. I accidentally had mine in Turbo mode when I assembled my Computer, then dropped it back to normal once I knew what was going on; it was a noticeable difference. By the way, it auto-overclocks to the Turbo mode, requiring you to only turn on Performance mode.