AMD Launches Fastest Phenom Yet, Phenom II X4 980
MojoKid writes "Although much of the buzz lately has revolved around AMD's upcoming Llano and Bulldozer-based APUs, AMD isn't done pushing the envelope with their existing processor designs. Over the last few months AMD has continued to ramp up frequencies on their current bread-and-butter Phenom II processor line-up to the point where they're now flirting with the 4GHz mark. The Phenom II X4 980 Black Edition marks the release of AMD's highest clocked processor yet. The new quad-core Phenom II X4 980 Black Edition's default clock on all four of its cores is 3.7GHz. Like previous Deneb-based Phenom II processors, the X4 980 BE sports a total of 512K of L1 cache with 2MB of L2 cache, and 6MB of shared L3 cache. Performance-wise, for under $200, the processor holds up pretty well versus others in its class and it's an easy upgrade for AM2+ and AM3 socket systems."
Wait a second... (Score:3)
I just bought a 6-core AMD chip a week ago. Where is the x6 version of this baby?
Re: (Score:3, Informative)
A 6-core is slower per core than a 4-core simply because of the thermal envelope.
The 6-core is superior if you need to use more than 4 cores at the same time.
Re: (Score:2)
isn't that what prime95 is for?
Re: (Score:2, Insightful)
Prime95, in this context, is for convincing 0v3rcl0ckz0r kiddiez that their massive overclock is stable even though it's a terrible stability test. A prime number search program is not exactly the world's best method of achieving full test coverage of a CPU, no matter what a billion leetboy forums may tell you.
Just for example, according to its webpage, prime95 only uses 32MB of memory, which means it basically runs from cache on any modern CPU. Which in turn means you're not really exercising memory acce
Re:Wait a second... (Score:5, Informative)
Eh... Prime95 is a darned sight better than a simple memory test, because it actually *does* stress the CPU and L1/L2 cache as well as the RAM. Plus it keeps track of whether the calculations are correct.
Which is the exact same tactic that you'd better take if you're going to "do scientific calculations which have to be right". You run the calculation and either you have built-in checks or you do the calculation twice, on two different machines and compare the results. (Surprise surprise, guess how Mersenne.org checks that the turned-in results are correct?)
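The "run it twice and compare" tactic described above can be sketched with a toy Lucas-Lehmer run (illustrative Python, not Prime95's actual code; GIMPS's real double-check compares independently computed residues much like this):

```python
# Sketch of the double-check idea: compute the same result twice and
# compare. The Lucas-Lehmer residue is what Mersenne.org actually
# verifies, though this naive version is nothing like Prime95's code.

def lucas_lehmer_residue(p: int) -> int:
    """Final Lucas-Lehmer residue for the Mersenne number 2^p - 1."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s

# A "double check": two independent runs must agree bit-for-bit.
first = lucas_lehmer_residue(61)
second = lucas_lehmer_residue(61)
assert first == second

# Residue 0 means 2^p - 1 is prime; 2^61 - 1 is a known Mersenne prime.
print(first == 0)  # expect True for p = 61
```

On real hardware the two runs happen on two different machines, so a flaky CPU or marginal RAM shows up as a residue mismatch.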
I've been using Prime95 ever since it came out. I've personally seen it find RAM that is slightly dodgy on timing where other tools like MemTest86 gave the RAM a free pass. In one case, the RAM was GEIL and was mislabeled with a faster CL value than it actually could handle (naughty GEIL, or it might have been counterfeit). Let Prime95 run for 24-48 hours with no errors, and you've got a pretty good assurance that there are no issues with the timings or the memory / CPU. (Doesn't do jack to test the disk / video, but there are other tools for that.)
Now, you complain that it's not a comprehensive tool. Have you *ever* seen a case where a CPU was bad / dodgy, Prime95 did not throw an error, and you caught the problem some other way? Something specifically wrong with the CPU / cache / RAM?
And frankly, there have always been those who think product X is a magic bullet. Your rant is misplaced.
Re:Wait a second... (Score:4, Interesting)
I've had prime95 catch errors on overclocks that passed -everything- else.
Know what, in every one of those instances, it was right. If I kept running at the speed that passed prime but failed everything else, I'd eventually run into random errors, sudden unrepeatable crashes, or other mysterious problems.
I've never had any issues with any overclock that passed 24 hours of prime, including distributed computing projects where they'll yell at you if you're returning bad data (i.e. aren't passing the redundancy tests).
Re: (Score:2)
Unless you still use software which runs its main loop in a single thread, with only relatively minor tasks spun off to other threads. Like, say, just about any game on the market.
Re: (Score:2)
As of half life 2, I can't think of a major PC game I've played that wasn't SMP.
Re:Wait a second... (Score:5, Informative)
The Phenom II X6 chips already run at 3.7GHz when 3 or fewer cores are in use (that's what the automatic turbo feature does), so the 980's ability to run 4 cores at 3.7GHz is only a minor improvement, since it basically has no turbo mode. The X6 will win for any concurrent workload that exercises all six cores. Intel CPUs also sport a turbo mode that works similarly.
The biggest issue w/ AMD is memory bandwidth. For some reason AMD has fallen way behind Intel in that regard. This is essentially the only reason why Intel tends to win on benchmarks.
However, you still pay a big premium for Intel, particularly Sandy-Bridge chipsets, and you pay a premium for SATA-III, whereas most AMD mobos these days already give you SATA-III @ 6GBits/sec for free. Intel knows they have the edge and they are making people pay through the nose for it.
Personally speaking the AMD Phenom II x 6 is still my favorite cpu for the price/performance and wattage consumed.
-Matt
Re: (Score:2)
+1 for you sir
Re: (Score:3, Informative)
And AMD supports ECC memory in "non-server" cpus.
Wait for Bulldozer (Score:4, Insightful)
I'll be waiting for the dust to clear with Bulldozer before I make a commitment for my next build. No reason to buy a $200 Phenom II X4 980 now when there is no application that needs that much power. If you buy a Sandy Bridge or a higher-end AM3 board/processor now, your average gamer or office worker won't be able to max it out for years -- unless he does video editing or extensive Photoshop work, or has to get his DVD rips down to a 10-minute rip vs. a 15-minute rip per feature film...
Might as well wait for the dust to clear or for prices to fall.
Re: (Score:3)
I don't know, man, $200 sounds like a steal. Last time I checked, those Intel i3s and i5s were in the same range! We live in some crazy times, IMO.
Re: (Score:2)
If $200 is a steal today, wouldn't $150 be a better deal 4 months from now? Saving 25% on one of the most expensive components of a computer that I'll have for three or four years seems like a worthwhile bargain for waiting a few months...
Re: (Score:3)
It has hyperthreading goodness and mmm have you ever had eggs fried on your processor?
Re: (Score:2)
well hell, you could just get a p4 chip for like $20, oh the savings!
It's getting into Winter down here, I could use a new space heater.
Re:Wait for Bulldozer (Score:5, Informative)
Well, also remember that Intel has something like 6 (or more) different incompatible cpu socket types in its lineup now, which means you have no real ability to upgrade in place.
AMD is all AM2+ and AM3, and all current cpus are AM3. This socket format has been around for several years. For example, I was able to upgrade all of my old AM2+ Phenom I boxes to Phenom II simply by replacing the cpu, and I can throw any cpu in AMD's lineup into my AM3 mobos. I only have one Phenom II x 6 machine right now but at least four of my boxes can accept that chip. That's a lot of upgrade potential on the cheap.
This will change, AMD can't stick with the AM3 form factor forever (I think the next gen will in fact change the socket), but generally speaking AMD has done a much better job on hardware longevity than Intel has. It isn't just a matter of the price of the cpu. I've saved thousands of dollars over the last few years by sticking with AMD.
SATA-III also matters a lot for a server now that SATA-III SSDs are in mass production. Now a single SSD can push 300-500 MBytes/sec of effectively random I/O out the door without having to resort to non-portable/custom-driver/premium-priced PCIe flash cards. Servers can easily keep gigabit pipes full now and are rapidly approaching 10GigE from storage all the way to the network.
-Matt
Re: (Score:2, Insightful)
This will change, AMD can't stick with the AM3 form factor forever (I think the next gen will in fact change the socket), but generally speaking AMD has done a much better job on hardware longevity than Intel has.
Oh yes, AMD is wonderful about keeping sockets around for a long time. The move to 64 bit CPUs only involved four (754,939,940,AM2) sockets, three (754,939,940) of which were outdated in short order.
Re: (Score:2)
939 was the single-CPU version, 940 was the dual-CPU setup and no CPU that fit in those sockets supported DDR2. Unlike Intel's chips at the time, AMD's memory controllers were *inside* the CPU. To support DDR2, they had to break compatibility at the socket level due to electrica
Re: (Score:2)
I can't help but notice that you didn't bother to compare that with however many sockets it took Intel to make a similar leap in their processor line. Which is really the point: sockets do have to change from time to time, and I can't help but notice that you're excluding the upgrades that were pin-compatible with previous sockets.
Re: (Score:3)
Bulldozer will be AM3+, but it has very good forward and backward compatibility with AM3. http://en.wikipedia.org/wiki/AM3%2B [wikipedia.org]
Re: (Score:3)
I doubt it spanks this X4. Lies.
Re: (Score:2)
It doesn't keep up with the Sandy 2500K, which can be had for around $200, depending on where you look. Then again, as the article suggests, it's better for an upgrade than a new system, so existing AMD users should be able to appreciate some tangible gains. Too late for me, though, I recently jumped ship from an Athlon X2 6000+ to the Sandy 2500K and I'm blown away by the performance.
Re: (Score:2)
Who does a CPU-only upgrade these days? I think most people just wait until it's time for a [mostly] whole new machine.
Though I don't know many people IRL who still build their own systems... I kind of get a sick satisfaction from it though, so I will continue to do so :)
AAAAanyways, cheers.
Re: (Score:2)
>>Who does a CPU-only upgrade these days?
Well, AMD is better about keeping sockets around longer than Intel, which seems to go through new ones every six weeks.
My last machine was built in Dec 2004; I upgraded the CPU (to an AMD X2 4800+) in Jan 2007, which gave it enough life to make it to April 2011 before I upgraded to Sandy Bridge.
Was easily the longest I've ever used the same motherboard, though admittedly it was on the leading edge of PCI-E and other features.
Re: (Score:2)
The problem that I run into is that even though it's the same socket as my current setup, the MB maker doesn't update the BIOS to support the new chips.
LK
Re: (Score:2)
Nobody on Intel with their new sockets every year or so, that's for sure.
I upgraded my AMD CPU not too long ago. No different than upgrading your video card.
Re: (Score:2)
Most of the time, unless you went *super* cheap on the initial CPU purchase, the mo
Re:Wait for Bulldozer (Score:4, Insightful)
Anyway - assuming that I leave my machine running for 8 hours a day on average, and the overwhelming majority of the time the CPU is at near-idle loads (i.e. consuming 65W), with electricity costing about $0.12 per kWh, I figure that it probably costs me about $24 per year in electricity. If I could shave off 1/3 of the electricity cost, I would only be saving $8 a year. After the 3 years that it takes me to make up that $25 difference, I'm probably in need of a new CPU anyway.
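The arithmetic above can be spelled out in a few lines (a sketch using the commenter's own assumed figures, not measured numbers):

```python
# Rough sketch of the electricity math in the comment above.
# All figures are the commenter's assumptions, not measurements.
watts = 65              # near-idle draw
hours_per_day = 8
rate = 0.12             # dollars per kWh

kwh_per_year = watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate
print(round(cost_per_year, 2))   # ~22.78, close to the ~$24 figure cited

savings = cost_per_year / 3      # shaving off one third of the draw
print(round(savings, 2))         # ~7.59, i.e. roughly $8/year
```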
Also - while I haven't spent much time pricing motherboards recently - when I last checked I found that AMD motherboards tend to be cheaper than Intel motherboards, and also that AMD integrated graphics were considerably stronger, allowing me to get by without a discrete graphics card. Furthermore, if I wanted to upgrade my CPU now with the latest and greatest, I would be able to do so without replacing my motherboard and buying new memory - whereas if I had bought an LGA 1156 motherboard a year ago it would now be obsolete.
In other words - I agree with you that with Intel you'll have a faster and more power efficient machine, but I'm not so sure that you'll end up saving any money.
Re: (Score:2)
Yeah, but what about heat? My last Intel CPU ran like a toaster oven and had a fan mount that was very awkward to deal with.
Re: (Score:2)
I wouldn't say the Core 2 Duo is a cool running chip. I can show you a laptop that has extensive warping on the bottom due to the heat produced by a T7400.
Re: (Score:3, Informative)
X4 - 157 to 252 watts
i5-2500 - 91 to 164 watts.
In other words, it will cost between $20 (normal use, cheap electricity) and $140 (24/7, expensive electricity) per year extra. Spending the extra $10 to get the faster i5 is a no-brainer.
I guess those with mod points don't remember the 90s (Score:2)
Depends on your principles. A) Preferring to pay more to lower your energy usage, or B) Not willing to pay your dollar to a company for doing things more than worthy of anti-trust investigations and/or anti-trust lawsuits on a regular basis. Over a year, 140 dollars is nothing even for someone in poverty (which I am, with less than 15000 a year). It's also a tiny fraction of the kWh being used from unclean energy. Most of the unclean energy you use is from the products you consume, including gas, plastics and food. There has to be something you dropped money on worth more than 140 dollars this year that was actually more useless than giving your 140 to the electric company over choosing AMD. Mine would probably be beer.
I remember when Intel had a virtual monopoly in the '90s and early 2000s. Chips were expensive. We complain about $10 on a $200 processor when 10 years ago the entry-level proc was over $400 (now they're scarcely above $50, take that with inflation). Only when AMD brought out their Athlon 64 procs did Intel start to take them seriously with a complete arch redesign (the Core series). Intel was so entrenched in its monopoly that before the Core series was released, AMD had their X2 series out. O
Re: (Score:3)
The i5 spanked the X4, for only $10 more
When I was shopping around for a new processor (around a year and a half ago), I could get myself a Phenom II X4 965, or a comparable i7 for around $50 more. The i7 trounced the Phenom on most benchmarks (though I doubt I'd ever notice in real-life computing). I was tempted, and then I realized I'd have to spend around $70 more for a comparable motherboard, and possibly have to replace my perfectly good 6GB of DDR2 with DDR3 (which at that time would have cost another $100+)... So that $70 turned into $
Re: (Score:3)
I took that $100 and put it towards a decent video card instead.
Processors really aren't that much of a big deal anymore...
It is surprising how many people do not understand this. Rather than getting a bleeding-edge CPU (and mobo+RAM to go with it), you will often see much more benefit saving some on the CPU et al. and spending a bit more on the graphics card (if you are a gamer and haven't already specified something silly in that respect) or getting better drives (a reasonable SSD won't set you back too much, and can make a much more useful difference to everyday use than spending the extra on a bleeding-edge CPU).
Re: (Score:2)
rwade said:
there is no application that needs that much power
So, just because you don't plan on buying it means that a significant portion of software simply doesn't exist? I think your logic is broken; you should look for a new one.
Re: (Score:2)
I'm not saying don't buy the power -- I think that's pretty obvious from even a careless reading of my comment.
But to clarify for you -- I'm saying, don't buy the power today for applications not available today because you can get the same amount of power in a few months for a 25% discount. Even then, the applications will probably not be there yet...
Re:Wait for Bulldozer (Score:4, Insightful)
My sense is that people who actually *use* a computer also install dozens of applications and end up with complicated and highly tailored system configurations that are time consuming to get right and time consuming to recreate on a new system.
The effort to switch to a new system tends to outweigh the performance improvement and nobody does it until the performance improvement makes it really worthwhile (say, Q6600 to a new i5 or i7).
I've found that because I end up maintaining a system for a longer period, it pays to buy power today for applications very likely to need or use it in the lifetime of the machine. Avoid premature obsolescence.
Re: (Score:3)
My primary machine (Thinkpad T61p) is almost 4 years old already (and the Tecra 9100 before that lasted 5 years). Yes, I wish it had more RAM and maybe a slightly faster video card. But instead of buying a new laptop this year, I dropped a large SSD in it instead.
Wo
Re: (Score:2)
libx264 seems to do a good job of using up all the CPU I can throw at it, though (a 1055T X6 now). Emerge does a decent job as well.
Re: (Score:3)
This [arstechnica.com] thread has some interesting information on possible BD performance.
Uh...this is 301 posts of Intel fans vs AMD fan (Score:4, Insightful)
This [arstechnica.com] thread has some interesting information on possible BD performance.
.....
This is 301 posts with back and forth that looks basically to be speculation. Prove me wrong by quoting specific statements of those that have benched the [unreleased] bulldozer. Because otherwise, this link is basically a bunch of AMD fanboys fighting against Intel fanboys. But prove me wrong...
Re: (Score:3)
The guy's trying to prove a point that Bulldozer -- which is, again, unreleased -- is looking underpowered, and he's doing it by pointing to a message board full of fanboy speculation. It's 8 pages of posts. I'm basically calling BS on the guy's suggestion that BD is looking underpowered -- frankly, no one but AMD knows anything about BD's performance.
No.
One.
At.
All.
It is all speculation...based on what? There's all this crap in here about AMD being the reason that we're not still using 2GHz Pentium4s, blah,
Re: (Score:2)
I fully acknowledge that this is largely rooted in speculation at this point, but what's come out so far isn't encouraging - the openbenchmarking results based on an engineering sample, for instance, showed performance that was not substantially improved from the Magny-Cours processor.
Re:Uh...this is 301 posts of Intel fans vs AMD fan (Score:4, Insightful)
It is likely that these sample chips are as much a test of the new 32nm fab as they are a test of the new cpu architecture, and definitely not a test of how quickly they can be clocked.
Re:Wait for Bulldozer (Score:5, Insightful)
>>I'll be waiting for the dust to clear with Bulldozer before I make a commitment for my next build.
I agree. The Phenom II line is just grossly underpowered compared to Sandy Bridge:
http://www.anandtech.com/bench/Product/288?vs=362 [anandtech.com]
The i5 2500K is in the same price range, but is substantially faster. Bulldozer ought to even out the field a bit, but then Intel will strike back with their shark-fin Boba FETs or whatever (I didn't pay much attention to the earlier article on 3D transistors.)
And then on the high-ish end, AMD has nothing to compete against the i7 2600K. And it's not really that much more expensive (+$100) for the 15% extra gain in performance. It's not like their traditional $1000 high end offerings.
Re:Wait for Bulldozer (Score:5, Interesting)
And then on the high-ish end, AMD has nothing to compete against the i7 2600K. And it's not really that much more expensive (+$100) for the 15% extra gain in performance. It's not like their traditional $1000 high end offerings.
Intel essentially skipped a cycle on the high end because they were completely uncontested anyway. The last high-end socket was LGA 1366, then we've had two midrange sockets in a row with LGA 1156 and LGA 1155. Late this year we'll finally see LGA 2011, the high end Sandy Bridge. Expect another round of $999 extreme edition processors then - with six cores, reportedly.
Re:Wait for Bulldozer (Score:4, Interesting)
And it's not really that much more expensive (+$100) for the 15% extra gain in performance.
On the x264 Pass 1 Encode test, the i7 2600K is 28% faster than the Phenom II X6 1075T, but (right now, at NewEgg) 66% more expensive.
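Dividing those two figures gives the price/performance gap directly (percentages are from the comment above; a rough sketch, not a benchmark):

```python
# Spelling out the price/performance point above. The percentages are
# the commenter's; the prices are whatever NewEgg showed at the time.
perf_ratio = 1.28    # i7 2600K vs Phenom II X6 1075T, x264 Pass 1 encode
price_ratio = 1.66   # relative street price at the time

perf_per_dollar = perf_ratio / price_ratio
print(round(perf_per_dollar, 2))  # ~0.77: ~23% less performance per dollar
```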
Since AMD and the mobo manufacturers have a track record with AM2/AM2+/AM3 of backwards compatibility via simple BIOS upgrades, I'm going to stick with them until Intel achieves parity with AMD.
Re: (Score:2)
I agree. The Phenom II line is just grossly underpowered compared to Sandy Bridge:
Also, the Phenom II line is over 2 years old. I bought a Phenom II 955 when it was first released in Oz, that was in Feb 2009.
Phenom II beat the old Core 2 Duos at the time; it stands to reason that a new arch will be competitive with Intel's new arch, not their old one.
Plus, I can stick this new proc into my old AM3 board; can't do that with an Intel board. If I wanted to upgrade from a C2D E8400, I'd need a new board. Not that I need a new proc, the 955 is still going strong, a new high end Geforce 5
Re: (Score:2)
Now suddenly you claim that AMD has to sell for those same prices because they can't compete with the 2600K? Do you often make it up as you go along?
Re: (Score:2)
Which is sort of the point. Personally, I'd like to move from my dual core up to a triple or quad core, and will next time I upgrade, but for me and my typical usage patterns, I'd be better off sticking with a quad core and getting one with faster cores than moving to 6 or 8 cores.
But if I were really into something like music production, video editing or 3d rendering, I'd probably take as many cores as I could get.
Now, as more and more software gets written with multicores in mind, I may end up getting one
Re: (Score:2)
Go record an orchestra in 24/192 and get back to me on how nothing needs something like that.
The problem is that there are -- compared to the billions of people using PCs -- relatively few uses for that much CPU power.
Heck, I'd *love* to pop for a Phenom II X6 to replace my dowdy old Athlon 64X2 4000+, but when I need some ISOs transcoded into x264, I log into my wife's PC (Athlon II X2 555) and use CLI tools over NFS to chug away at them. It gets 198% of CPU for the 22 hours/day that she doesn't use it, and 100% when she *does* use it...
bash, HandBrakeCLI & NFS FTW!!
Re: (Score:2)
Wow, such a narrow world view.
There are a lot of applications out there where single-core speed matters. And $200 is chump change for a CPU that is at the upper end of the speed range. It wasn't that many years ago that a *dual* core CPU was considered affordable once they dropped below $300 (and it was a happy day when they got below $200).
And no, you wouldn't buy this for the average office worke
Today, the complexity of numbering continues... (Score:2)
Re:Today, the complexity of numbering continues... (Score:5, Informative)
First comes the family name. For desktops, this is usually either "Athlon II" or "Phenom II". The only real difference between them is the amount of cache.
Then comes the core count - X2, X3, X4 or X6. Completely self-explanatory.
This is followed by a number that essentially stands in for the clock speed. Higher-clocked processors have higher numbers, lower-clocked processors have lower numbers.
Finally, certain processors have "Black Edition" appended, which simply means that the multiplier is unlocked, greatly easing overclocking.
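A throwaway parser makes the scheme concrete (the function and field names here are my own invention, not anything AMD publishes):

```python
# Hypothetical parser for the naming scheme laid out above. The field
# names ("family", "model", etc.) are illustrative, not AMD terminology.
import re

def parse_amd_name(name: str) -> dict:
    m = re.match(
        r"(?P<family>Athlon II|Phenom II)\s+"
        r"X(?P<cores>[2346])\s+"
        r"(?P<model>\d+)"
        r"(?P<black>\s+Black Edition)?",
        name,
    )
    if not m:
        raise ValueError(f"unrecognized name: {name}")
    return {
        "family": m.group("family"),
        "cores": int(m.group("cores")),
        "model": int(m.group("model")),      # stands in for clock speed
        "unlocked": m.group("black") is not None,
    }

info = parse_amd_name("Phenom II X4 980 Black Edition")
print(info["cores"], info["model"], info["unlocked"])  # 4 980 True
```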
Re: (Score:2)
An Athlon 64 3800+ [cpubenchmark.net] performed on par with a 3800 MHz Pentium 4 [cpubenchmark.net]
Re: (Score:2)
I think it basically just got out of hand and now n
3700 megahertz? (Score:2)
Bah.
My ten-year-old CPU does 3100 megahertz. Things have slowed dramatically since the 80s and 90s, when speeds increased from approximately 8 megahertz (1980) to 50 megahertz (1990) to 2000 megahertz (2000). If the scaling had continued, we would be at ~20,000 by now. Oh well. So much for Moore's Observation.
Re:3700 megahertz? (Score:5, Insightful)
Furthermore, there is no 10-year-old CPU that runs at 3GHz unless you did some absurd overclocking.
Re:3700 megahertz? (Score:4, Interesting)
Not exactly, but close for single-core performance. The "MHz Myth" is largely a myth itself. As this table [theknack.net] shows, per-MHz single-core performance between the infamously bad (even at the time) P4 and the current best (Core i7) has only improved by a factor of less than 2.6, since October 2004! (When the Pentium 3.6 EE was released).
Perhaps more importantly, the ratio between the most productive (per-MHz) chip from 2004 (Athlon64 2.6) and the most productive on the chart now is a mere 1.6! That's a 60% improvement in almost 7 years!
That is a joke. For reference, we went from the Pentium 100 (March 1994) to the Pentium 200 (June 1996) - approximately a 100% improvement in a little over 2 years.
So, no, improvements in instructions per cycle are not even close to keeping pace with what improvements in MHz used to give us. (And if you looked at instructions per cycle per transistor, it would be abysmal - which is another way of saying Moore's law is hardly helping single-threaded performance any more).
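Annualizing the two rates quoted above makes the gap plain (the raw factors are the commenter's; the compounding is just arithmetic):

```python
# Annualizing the two improvement rates quoted above. The raw factors
# come from the comment; the annualized form is plain arithmetic.
per_mhz_gain = 1.6          # best-chip per-MHz gain, ~2004 to ~2011
years_recent = 7
mhz_gain = 200 / 100        # Pentium 100 (Mar 1994) -> Pentium 200 (Jun 1996)
years_90s = 2.25

annual_recent = per_mhz_gain ** (1 / years_recent)
annual_90s = mhz_gain ** (1 / years_90s)
print(round((annual_recent - 1) * 100, 1))  # ~6.9% per year
print(round((annual_90s - 1) * 100, 1))     # ~36.1% per year
```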
Re: (Score:3)
I'm sure you didn't mean it quite this way, but a 60% improvement in the amount of work done per clock cycle is some pretty impressive engineering...
Re: (Score:3)
You could find a wider variety of benchmarks with results reported on a wide range of new and old CPUs if you took points/core/MHz.
Re: (Score:2)
per-MHz single-core performance between the infamously bad (even at the time) P4 and the current best (Core i7) has only improved by a factor of less than 2.6, since October 2004! (When the Pentium 3.6 EE was released).
An i7 965OC running at 4060MHz is 2x faster at single-threaded Cinebench than a P4E 670 running at 3800MHz. In 6 years, Intel achieved 100% more performance for 7% more MHz.
Nothing to sneeze at, but why the heck do you think they went multi-core? Because 4GHz is a pseudo-wall.
Multi-threaded, the i7 965OC is 8.5x faster than the P4E 670.
Re:3700 megahertz? (Score:5, Informative)
Moore's observation was about transistor count, not MHz, core count, speed, wattage, FLOPS, BogoMIPS, or anything else.
Another ho-hum processor from AMD :( (Score:2)
Is anyone else disappointed that AMD's fastest desktop processor can barely keep up with Intel's midrange Sandy Bridge Core i5 processors in most applications? Sure, AMD's processors are still a great value, but it seems like they fall further behind with their performance parts every year.
I just hope that the performance improvements for Bulldozer are all they're cracked up to be.
Re: (Score:3)
Very few people buy CPUs over $200-$300.
(I stick with AMD for a few reasons. There's never any guessing about whether an Opteron will support hardware virtualization or whether it will be disabled by the chipset/BIOS. Their product lineup is straightforward compared to Intel's, and their sockets make sen
Re: (Score:2)
That's why monopolies are bad, mkay. Every time, without fail, that AMD gets the upper hand, Intel goes and does something like bribe vendors not to carry AMD-based products or something similar. I've had a really hard time over the years finding AMD chips integrated into systems. You can usually find them at small shops, but rarely do you see more than one or two at major chains. And even at small shops a lot of them are Intel-only these days. You can usually find them without too much trouble online, but you
"... holds up pretty well"? (Score:3)
Ummm, against what, my obsolete Phenom (I) X4 9850? Funny how true fanbois can read the same review as an objective person and walk away with entirely different conclusions, eh?
The AnandTech review [anandtech.com] was even less forgiving of AMD's underdog status, and basically recommended passing and either waiting for the allegedly awesome new Bulldozer line or jumping ship for Intel. Hell, when Sandy Bridge both outperforms AND underconsumes (power), you oughtta be seriously questioning that underdog affection. I certainly am.
We are no longer chasing the Phantom x86... (Score:2, Insightful)
Read this excerpt from an AMD management blog:
"Thanks to Damon at AMD for this link to a blog from AMD's Godfrey Cheng.
We are no longer chasing the Phantom x86 Bottleneck. Our goal is to provide good headroom for video and graphics workloads, and to this effect “Llano” is designed to be successful. To be clear, AMD continues to invest in x86 performance. With our “Bulldozer” core and in future Bulldozer-based products, we are designing for faster and more efficient x86 performance; h
Re: (Score:2)
please...anything but monopoly....
Re:We are no longer chasing the Phantom x86... (Score:4, Insightful)
Alternatively, even though Intel has won, he is acknowledging that for 99.9% of people, CPU performance no longer matters as much as it used to.
In general use, for example, I see no difference between the Core i5 in my work machine and the Pentium D in my oldest home box.
Gaming? Sure, however even that is becoming more GPU constrained.
Both AMD and Intel are on notice. Hence both are putting more focus into GPUs. In terms of CPU, it won't be long before some random multi-core ARM variant offers more power than any regular user will ever need, and they absolutely kill x86 in terms of performance per watt.
The focus is no longer absolute CPU performance, it is shifting towards price, size, heat and power consumption. Computing is going mobile and finding its way into ever smaller devices. Rather than building one large box, distributed computing is the new wave (the cloud, clustering, etc).
AMD's CPU business might have a tricky path ahead, but then so does x86 in general, barring niche markets. If AMD is doomed, then Intel's traditional dominance via x86 won't be too far behind.
Re: (Score:2)
It's not uncommon. Other benchmarking sites do it too.
Re: (Score:2)
But is it also well known that increasing resolutions and knocking off all the effects does nothing to the CPU performance?
Re:Weird Benchmarks: Crysis at 800x600 resolution (Score:4, Interesting)
Once the GPU is maxed-out, there's nothing more for the CPU to do. If you're running at 30 FPS at high-res, the CPU might be at 30%. At that point, any number of different CPUs will have identical benchmark results. When you drop the load off the GPU, the CPU hits 100% usage and you can compare 150 fps to 160 fps, for example. This is a very simple and typical way to benchmark CPUs for gaming perf. Reviews and reviewers (such as myself) have been doing this for 10+ years, since the very first 3D accelerators came to the gaming market.
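The logic described above boils down to a min() of two ceilings; a toy model (with invented numbers) shows why low resolution separates the CPUs:

```python
# Toy model of the benchmarking logic described above: observed FPS is
# bounded by whichever of CPU or GPU is slower. Numbers are invented
# purely for illustration.
def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# At high resolution the GPU caps both systems at the same number...
print(observed_fps(cpu_fps=150, gpu_fps=30))   # 30
print(observed_fps(cpu_fps=160, gpu_fps=30))   # 30
# ...at 800x600 the GPU ceiling lifts and the CPUs separate:
print(observed_fps(cpu_fps=150, gpu_fps=400))  # 150
print(observed_fps(cpu_fps=160, gpu_fps=400))  # 160
```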
Re: (Score:3)
That seems like a stupid way to benchmark. It encourages people to be misinformed by thinking that they can get better frame rates by buying a faster CPU even though under real world conditions the game will be GPU bound and the CPU is irrelevant. Why not stick to benchmarking using applications that are actually CPU bound under normal usage?
Re: (Score:3)
Not stupid at all. It shows that if your video is not a factor, or if you upgrade to an adequate video card when one is available, the better CPU to buy is X.
This is common practice that has been used for at least a decade now.
Furthermore, it's a SYNTHETIC BENCHMARK. No one wants to play Crysis at 800x600, but it's a tool we can use to measure CPU performance. No one wants to buy a PC solely to sit and calculate prime numbers all day either, but StressPrime and other such stuff is used for benchmarking also.
Re: (Score:2)
Not stupid at all. It shows that if your video is not a factor or you upgrade to an adequate video card when one is available,the better cpu to buy is X.
Assuming that GPUs are available that are so fast they move the bottleneck back to the CPU, the way to do it would then be to use one of those GPUs and use normal resolutions. If even the fastest GPUs still result in the GPU as the bottleneck, just find a different benchmark. There is no lack of synthetic benchmarks that someone could use without misleading people into thinking they need a super fast CPU for a heavily GPU-bound game.
This is especially true now that integrated GPUs are becoming respectable.
Re: (Score:3)
Well yes, if such GPUs were available today, sure. They're not. However, the gaming benchmark is not useless, because it's a real-world mix of code in a typical app the CPU might be used to run. You're looking at the benchmarks expecting them to be some absolute result. They're not. You have to use your head and interpret the results, as with any experiment. A benchmark isn't a "you need to buy this CPU for this game" statement. It's a performance indication on a particular code path.
Synthetic bench
Re: (Score:2)
That seems like a stupid way to benchmark.
Why would you use a GPU-limited benchmark when comparing performance of different CPUs? That would be retarded.
Re: (Score:2)
That's the point. Find the thing people actually do which is CPU-bound, don't just fudge a GPU-limited thing until you get different numbers for different CPUs.
Re: (Score:2)
That's the point. Find the thing people actually do which is CPU-bound, don't just fudge a GPU-limited thing until you get different numbers for different CPUs.
Nothing that the average person does is CPU-bound if they have a fast CPU; most of the time it will be idling. The closest they'll get is gaming when not GPU-bound, which they may not do today, but they will when they replace their GPU in two years.
Ultimately if you want the fastest CPU then you want to run things that are CPU-bound. If you just play crappy console game ports and run them so they're GPU-bound then you'll do fine with a dual-core in most cases.
Re: (Score:2)
Nothing that the average person does is CPU-bound if they have a fast CPU
Exactly. With a trailing edge GeForce 210, an Athlon 64X2 4000 and proper drivers, even CPU hogs like Flash on Firefox don't burden the system. Mostly the system is waiting for me to type or the network to respond.