AMD Hardware

AMD Launches Fastest Phenom Yet, Phenom II X4 980

Posted by samzenpus
from the new-and-improved dept.
MojoKid writes "Although much of the buzz lately has revolved around AMD's upcoming Llano and Bulldozer-based APUs, AMD isn't done pushing the envelope with their existing processor designs. Over the last few months AMD has continued to ramp up frequencies on their current bread-and-butter Phenom II processor line-up to the point where they're now flirting with the 4GHz mark. The Phenom II X4 980 Black Edition marks the release of AMD's highest clocked processor yet. The new quad-core Phenom II X4 980 Black Edition's default clock on all four of its cores is 3.7GHz. Like previous Deneb-based Phenom II processors, the X4 980 BE sports a total of 512K of L1 cache with 2MB of L2 cache, and 6MB of shared L3 cache. Performance-wise, for under $200, the processor holds up pretty well versus others in its class and it's an easy upgrade for AM2+ and AM3 socket systems."
This discussion has been archived. No new comments can be posted.

  • by mr_stinky_britches (926212) on Wednesday May 04, 2011 @08:10PM (#36030670) Homepage Journal

    I just bought a 6-core AMD chip a week ago. Where is the x6 version of this baby?

    • Re: (Score:3, Informative)

      by Anonymous Coward

      A 6-core is slower per core than a 4-core simply because of the thermal envelope.

      A 6-core is superior if you need to use more than 4 cores at the same time.

    • Re:Wait a second... (Score:5, Informative)

      by m.dillon (147925) on Wednesday May 04, 2011 @08:59PM (#36031012) Homepage

      The Phenom II X6 chips already run at 3.7GHz when 3 or fewer cores are in use (that's what the automatic turbo feature does), so the 980's ability to run 4 cores at 3.7GHz is only a minor improvement, since it basically has no turbo mode. The X6 will win for any concurrent workload that exercises all six CPUs. Intel CPUs also sport a turbo mode that works similarly.

      The biggest issue w/ AMD is memory bandwidth. For some reason AMD has fallen way behind Intel in that regard. This is essentially the only reason why Intel tends to win on benchmarks.

      However, you still pay a big premium for Intel, particularly Sandy Bridge chipsets, and you pay a premium for SATA-III, whereas most AMD mobos these days already give you SATA-III at 6Gbit/s for free. Intel knows they have the edge and they are making people pay through the nose for it.

      Personally speaking, the AMD Phenom II X6 is still my favorite CPU for the price/performance and wattage consumed.

      -Matt

  • Wait for Bulldozer (Score:4, Insightful)

    by rwade (131726) on Wednesday May 04, 2011 @08:12PM (#36030676)

    I'll be waiting for the dust to clear with Bulldozer before I make a commitment to my next build. There's no reason to buy a $200 Phenom II X4 980 now when there is no application that needs that much power. If you buy a Sandy Bridge or a higher-end AM3 board/processor now, your average gamer or office worker won't be able to max it out for years -- unless he does video editing or extensive Photoshop work, or has to get his DVD rips down from 15 minutes to 10 minutes per feature film...

    Might as well wait for the dust to clear or for prices to fall.

    • I don't know, man, $200 sounds like a steal. Last time I checked, those Intel i3s and i5s were in the same range! We live in some crazy times, IMO.

      • by rwade (131726)

        If $200 is a steal today, wouldn't $150 be a better deal 4 months from now? Saving 25% on one of the most expensive components of a computer that I'll have for three or four years seems like a worthwhile bargain for waiting a few months...

        • by uncanny (954868)
          well hell, you could just get a p4 chip for like $20, oh the savings!
          • It has hyperthreading goodness and mmm have you ever had eggs fried on your processor?

          • by mjwx (966435)

            well hell, you could just get a p4 chip for like $20, oh the savings!

            It's getting into Winter down here, I could use a new space heater.

      • by m.dillon (147925) on Wednesday May 04, 2011 @09:13PM (#36031070) Homepage

        Well, also remember that Intel has something like 6 (or more) different incompatible CPU socket types in its lineup now, which means you have no real ability to upgrade in place.

        AMD is all AM2+ and AM3, and all current CPUs are AM3. This socket format has been around for several years. For example, I was able to upgrade all of my old AM2+ Phenom I boxes to Phenom II simply by replacing the CPU, and I can throw any CPU in AMD's lineup into my AM3 mobos. I only have one Phenom II X6 machine right now, but at least four of my boxes can accept that chip. That's a lot of upgrade potential on the cheap.

        This will change, AMD can't stick with the AM3 form factor forever (I think the next gen will in fact change the socket), but generally speaking AMD has done a much better job on hardware longevity than Intel has. It isn't just a matter of the price of the cpu. I've saved thousands of dollars over the last few years by sticking with AMD.

        SATA-III also matters a lot for a server now that SATA-III SSDs are in mass production. Now a single SSD can push 300-500 MBytes/sec of effectively random I/O out the door without having to resort to non-portable/custom-driver/premium-priced PCIe flash cards. Servers can easily keep gigabit pipes full now and are rapidly approaching 10GigE from storage all the way to the network.

        -Matt

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          This will change, AMD can't stick with the AM3 form factor forever (I think the next gen will in fact change the socket), but generally speaking AMD has done a much better job on hardware longevity than Intel has.

          Oh yes, AMD is wonderful about keeping sockets around for a long time. The move to 64-bit CPUs only involved four sockets (754, 939, 940, AM2), three of which (754, 939, 940) were outdated in short order.

          • 754 was the budget socket. No bets there. If you bought a 754-based system and expected upgrades, you did not do your homework. Budget systems are designed for people who buy a cheap machine and treat it like a black box.

            939 was the single-CPU version, 940 was the dual-CPU setup, and no CPU that fit in those sockets supported DDR2. Unlike Intel's chips at the time, AMD's memory controllers were *inside* the CPU. To support DDR2, they had to break compatibility at the socket level due to electrical changes.
          • by hedwards (940851)

            I can't help but notice that you didn't bother to compare that with however many sockets it took Intel to make a similar leap in their processor line. Which is really the point: sockets do have to change from time to time, and I can't help but notice that you're excluding the upgrades that were pin-compatible with previous sockets.

        • Bulldozer will be AM3+ but it has very good forwards and backwards compatibility with AM3. http://en.wikipedia.org/wiki/AM3%2B [wikipedia.org]

    • rwade said:
      there is no application that needs that much power

      So, just because you don't plan on buying it, a significant portion of software simply doesn't exist? I think your logic is broken; you should look for a new one.

      • by rwade (131726)

        I'm not saying don't buy the power -- I think that's pretty obvious from even a careless reading of my comment.

        But to clarify for you -- I'm saying, don't buy the power today for applications not available today because you can get the same amount of power in a few months for a 25% discount. Even then, the applications will probably not be there yet...

        • by MoonBuggy (611105)

          The chip doesn't cost that much in the first place, though. I'm not sure I'd go as far as to estimate 25% off in a few months anyway, but even if that is the drop we see, $50 is not an especially significant amount of cash to most people (not the ones in the market for a fairly high-end new machine, anyway). When I see people running out to buy $1200 'extreme edition' chips, I certainly wonder whether they need that extra few percent in performance enough to justify adding the price of an entire high-end laptop.

        • by swb (14022) on Wednesday May 04, 2011 @10:01PM (#36031306)

          My sense is that people who actually *use* a computer also install dozens of applications and end up with complicated and highly tailored system configurations that are time consuming to get right and time consuming to recreate on a new system.

          The effort to switch to a new system tends to outweigh the performance improvement and nobody does it until the performance improvement makes it really worthwhile (say, Q6600 to a new i5 or i7).

          I've found that because I end up maintaining a system for a longer period, it pays to buy power today for applications very likely to need or use it in the lifetime of the machine. Avoid premature obsolescence.

          • I'm in a similar boat; I get things running and then prefer not to migrate. Heck, unless you need raw CPU power and are still running on a dual-core, there's not much incentive for moving to a new system more often than every 4-5 years.

            My primary machine (Thinkpad T61p) is almost 4 years old already (and the Tecra 9100 before that lasted 5 years). Yes, I wish it had more RAM and maybe a slightly faster video card. But instead of buying a new laptop this year, I dropped a large SSD in it instead.

            Wo
        • by cynyr (703126)

          libx264 seems to do a good job of using up all the CPU I can throw at it (a 1055T X6 now). Emerge does a decent job as well.

    • Bulldozer is looking increasingly underpowered compared to Sandy Bridge, with some benchmarks indicating potentially worse performance per cycle than the existing K10.5 core.

      This [arstechnica.com] thread has some interesting information on possible BD performance.
      • by rwade (131726) on Wednesday May 04, 2011 @08:29PM (#36030812)

        This [arstechnica.com] thread has some interesting information on possible BD performance.

        .....

        This is 301 posts of back-and-forth that looks basically to be speculation. Prove me wrong by quoting specific statements from those who have benched the [unreleased] Bulldozer. Because otherwise, this link is basically a bunch of AMD fanboys fighting against Intel fanboys. But prove me wrong...

    • by ShakaUVM (157947) on Wednesday May 04, 2011 @08:34PM (#36030840) Homepage Journal

      >>I'll be waiting for the dust to clear with Bulldozer before I make a commitment for my next build.

      I agree. The Phenom II line is just grossly underpowered compared to Sandy Bridge:
      http://www.anandtech.com/bench/Product/288?vs=362 [anandtech.com]

      The i5 2500K is in the same price range, but is substantially faster. Bulldozer ought to even out the field a bit, but then Intel will strike back with their shark-fin Boba FETs or whatever (I didn't pay much attention to the earlier article on 3D transistors.)

      And then on the high-ish end, AMD has nothing to compete against the i7 2600K. And it's not really that much more expensive (+$100) for the 15% extra gain in performance. It's not like their traditional $1000 high end offerings.

      • by Kjella (173770) on Wednesday May 04, 2011 @09:56PM (#36031270) Homepage

        And then on the high-ish end, AMD has nothing to compete against the i7 2600K. And it's not really that much more expensive (+$100) for the 15% extra gain in performance. It's not like their traditional $1000 high end offerings.

        Intel essentially skipped a cycle on the high end because they were completely uncontested anyway. The last high-end socket was LGA 1366, then we've had two midrange sockets in a row with LGA 1156 and LGA 1155. Late this year we'll finally see LGA 2011, the high end Sandy Bridge. Expect another round of $999 extreme edition processors then - with six cores, reportedly.

      • by Nutria (679911) on Thursday May 05, 2011 @12:15AM (#36031924)

        And it's not really that much more expensive (+$100) for the 15% extra gain in performance.

        On the x264 Pass 1 Encode test, the i7 2600K is 28% faster than the Phenom II X6 1075T, but (right now, at NewEgg) 66% more expensive.

        Since AMD and the mobo manufacturers have a track record with AM2/AM2+/AM3 of backwards compatibility with simple BIOS upgrades, I'm going to stick with them until Intel achieves parity with AMD.

      • by mjwx (966435)

        I agree. The Phenom II line is just grossly underpowered compared to Sandy Bridge:

        Also, the Phenom II line is over 2 years old. I bought a Phenom II 955 when it was first released in Oz, that was in Feb 2009.

        Phenom II beat the old Core 2 Duos at the time; it stands to reason that a new arch will be competitive with Intel's new arch, not their old one.

        Plus, I can stick this new proc into my old AM3 board; can't do that with an Intel board. If I wanted to upgrade from a C2D E8400, I'd need a new board. Not that I need a new proc, the 955 is still going strong, a new high end Geforce 5

      • Anandtech uses a very odd selection of benchmarks that seemingly serves just to make Intel CPUs look good, since Intel is a major site sponsor. How else could you explain them doing stuff like using Cinebench 11.5 to develop power draw numbers but completely omitting performance data using that application, and instead using Cinebench R10 to determine CPU performance? Cinebench 11.5 performs much better on AMD's CPUs than R10 does, so if you wanted to show Intel's wares in the best light, you'd use R10. Dit
    • Believe me, there are applications that can make good practical use of that much power. Go record an orchestra in 24/192 and get back to me on how nothing needs something like that.
      • by hedwards (940851)

        Which is sort of the point. Personally, I'd like to move from my dual core up to a triple or quad core, and will next time I upgrade, but for me and my typical usage patterns, I'd be better off sticking with a quad core and getting one with faster cores than moving to 6 or 8 cores.

        But if I were really into something like music production, video editing or 3d rendering, I'd probably take as many cores as I could get.

        Now, as more and more software gets written with multicores in mind, I may end up getting one

      • by Nutria (679911)

        Go record an orchestra in 24/192 and get back to me on how nothing needs something like that.

        The problem is that there are -- compared to the billions of people using PCs -- relatively few uses for that much CPU power.

        Heck, I'd *love* to pop for a Phenom II X6 to replace my dowdy old Athlon 64X2 4000+, but when I need some ISOs transcoded into x264, I log into my wife's PC (Athlon II X2 555) and use CLI tools over NFS to chug away at them. It gets 198% of CPU for the 22 hours/day that she doesn't use it, and 100% when she *does* use it...

        bash, HandBrakeCLI & NFS FTW!!
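        The workflow above (queueing encodes on an idle box over NFS) can be sketched roughly like this; the host name and mount points are made up for illustration, and the loop only echoes each command so you can inspect it before dropping the echo to run it for real:

```shell
#!/bin/sh
# Sketch of farming x264 encodes out to an idle machine over NFS.
# "wifes-pc", /mnt/nfs/isos and /mnt/nfs/encoded are hypothetical names.
SRC=/mnt/nfs/isos          # NFS export holding the source ISOs
DST=/mnt/nfs/encoded       # output directory on the same export
for iso in "$SRC"/*.iso; do
    out="$DST/$(basename "${iso%.iso}").mp4"
    # nice -n 19 keeps the encode from starving interactive use
    echo ssh wifes-pc nice -n 19 HandBrakeCLI -i "$iso" -o "$out" -e x264
done
```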

    • No reason to buy a $200 Phenom II X4 980 now when there is no application that needs that much power.

      Wow, such a narrow world view.

      There are a lot of applications out there where single-core speed matters. And $200 is chump change for a CPU that is at the upper end of the speed range. It wasn't that many years ago that a *dual* core CPU was considered affordable once they dropped below $300 (and it was a happy day when they got below $200).

      And no, you wouldn't buy this for the average office worke
  • Ten years ago it was confusing enough, with what could be seen as reliable product CPUs from AMD at 1.4GHz here and 1.23GHz there, and name changes to meaningless marketing numbers and names. So I'll stay ignorant, simply ignore these 'breakthrough' numbers, and buy product instead of specifications.
  • Bah.

    My ten-year-old CPU does 3100 megahertz. Things have slowed dramatically since the 80s and 90s, when speeds increased from approximately 8 megahertz (1980) to 50 megahertz (1990) to 2000 megahertz (2000). If the scaling had continued, we would be at ~20,000 by now. Oh well. So much for Moore's Observation.

    • Re:3700 megahertz? (Score:5, Insightful)

      by DurendalMac (736637) on Wednesday May 04, 2011 @08:59PM (#36031008)
      So clock speed means everything when comparing different CPUs and not their raw performance. Got it.

      Furthermore, there is no 10-year-old CPU that runs at 3GHz unless you did some absurd overclocking.
      • Re:3700 megahertz? (Score:4, Interesting)

        by timeOday (582209) on Wednesday May 04, 2011 @11:32PM (#36031736)

        So clock speed means everything when comparing different CPUs and not their raw performance. Got it.

        Not exactly, but close for single-core performance. The "MHz Myth" is largely a myth itself. As this table [theknack.net] shows, per-MHz single-core performance between the infamously bad (even at the time) P4 and the current best (Core i7) has improved by a factor of less than 2.6 since October 2004 (when the Pentium 3.6 EE was released)!

        Perhaps more importantly, the ratio between the most productive (per-MHz) chip from 2004 (Athlon64 2.6) and the most productive on the chart now is a mere 1.6! That's a 60% improvement in almost 7 years!

        That is a joke. For reference, we went from the Pentium 100 (March 1994) to the Pentium 200 (June 1996) - approximately a 100% improvement in a little over 2 years.

        So, no, improvements in instructions per cycle are not even close to keeping pace with what improvements in MHz used to give us. (And if you looked at instructions per cycle per transistor, it would be abysmal - which is another way of saying Moore's law is hardly helping single-threaded performance any more).
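        For what it's worth, the annual rates implied by the figures above can be checked in a couple of lines of Python (the 1.6x-over-7-years and Pentium 100-to-200 numbers are the ones quoted in this thread, not fresh measurements):

```python
# Annualized single-core improvement implied by the figures above:
# 1.6x over ~7 years (2004-2011) vs. 2x over ~2.25 years (Pentium 100 -> 200).
per_clock_era = 1.6 ** (1 / 7) - 1      # roughly 7% per year
mhz_era = 2.0 ** (1 / 2.25) - 1         # roughly 36% per year
print(f"post-2004: {per_clock_era:.1%}/yr, mid-90s: {mhz_era:.1%}/yr")
# -> post-2004: 6.9%/yr, mid-90s: 36.1%/yr
```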

        • by AdamHaun (43173)

          I'm sure you didn't mean it quite this way, but a 60% improvement in the amount of work done per clock cycle is some pretty impressive engineering...

          • by timeOday (582209)
            Well, that's the problem... since hitting the MHz wall, it's taking more and more heroic efforts to achieve any speedup in single-core performance. (In fact if I'm not mistaken, the most-productive-per-cycle core on that chart is a couple years old.) But I agree, it's not that engineers are getting dumber or anything like that. It's getting harder, and progress has become slow.
        • Re:3700 megahertz? (Score:4, Insightful)

          by smash (1351) on Thursday May 05, 2011 @12:06AM (#36031898) Homepage Journal
          Don't forget that IPC isn't the be-all and end-all. If you're stalled due to cache misses, then IPC goes out the window. Modern CPUs have much more cache and much faster buses to main memory than we had in 2004. That is a large reason why they're faster. They also have additional instructions that can do more work per instruction - so comparing IPC from CPUs released today to CPUs released last decade is even more meaningless.
          • by timeOday (582209)
            It's not a synthetic IPC measure, it's Cinebench (ray tracing, basically). In other words, modern CPUs just aren't all that much faster, apart from having more cores. At least on that benchmark.

            You could find a wider variety of benchmarks with results reported on a wide range of new and old CPUs if you took points/core/MHz.

        • by Nutria (679911)

          per-MHz single-core performance between the infamously bad (even at the time) P4 and the current best (Core i7) has only improved by a factor of less than 2.6, since October 2004! (When the Pentium 3.6 EE was released).

          An i7 965OC running at 4060MHz is 2x faster at single-threaded Cinebench than a P4E 670 running at 3800MHz. In 6 years, Intel achieved 100% more performance for 7% more MHz.

          Nothing to sneeze at, but why the heck do you think they went multi-core? Because 4GHz is a pseudo-wall.

          Multi-threaded, the i7 965OC is 8.5x faster than the P4E 670.
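          The clock-versus-performance arithmetic above checks out (all figures are the ones quoted in this comment, not fresh benchmarks):

```python
# i7 965 OC @ 4060 MHz vs. P4E 670 @ 3800 MHz, ~2x single-threaded Cinebench,
# per the numbers quoted above.
mhz_gain = 4060 / 3800 - 1    # clock advantage
perf_gain = 2.0 - 1           # single-threaded speedup
print(f"{mhz_gain:.0%} more MHz for {perf_gain:.0%} more performance")
# -> 7% more MHz for 100% more performance
```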

    • by rubycodez (864176)
      er, your 2.0GHz P4 of August 2001 is overclocked to 3.1GHz? or are you just confused and babbling?
    • Re:3700 megahertz? (Score:5, Informative)

      by LordLimecat (1103839) on Wednesday May 04, 2011 @10:20PM (#36031394)

      Moore's observation was about transistor count, not MHz, core count, speed, wattage, FLOPS, bogomips, or anything else.

  • Is anyone else disappointed that AMD's fastest desktop processor can barely keep up with Intel's midrange Sandy Bridge Core i5 processors in most applications? Sure, AMD's processors are still a great value, but it seems like they fall further behind with their performance parts every year.

    I just hope that the performance improvements for Bulldozer are all they're cracked up to be.

    • I worry about it a little bit, but as long as they are price/performance equivalent to the Intel CPUs in the low-mid range, I don't think there's a huge issue. (Assuming that they keep making a profit.)

      Very few people buy CPUs over $200-$300.

      (I stick with AMD for a few reasons. There's never any guessing about whether an Opteron will support hardware virtualization or whether it will be disabled by the chipset/BIOS. Their product lineup is straightforward compared to Intel, and their sockets make sense.)
    • by hedwards (940851)

      That's why monopolies are bad, mkay. Every time, without fail, that AMD gets the upper hand, Intel goes and does something like bribe vendors not to carry AMD-based products or something similar. I've had a really hard time over the years finding AMD chips integrated into systems. You can usually find them at small shops, but rarely do you see more than one or two at major chains. And even at small shops a lot of them are Intel-only these days. You can usually find them without too much trouble online, but you

  • by macraig (621737) <mark.a.craigNO@SPAMgmail.com> on Wednesday May 04, 2011 @09:31PM (#36031154)

    Ummm, against what, my obsolete Phenom (I) X4 9850? Funny how true fanbois can read the same review as an objective person and walk away with entirely different conclusions, eh?

    The AnandTech review [anandtech.com] was even less forgiving of AMD's underdog status, and basically recommended passing and either waiting for the allegedly awesome new Bulldozer line or jumping ship for Intel. Hell, when Sandy Bridge both outperforms AND underconsumes (power), you oughtta be seriously questioning that underdog affection. I certainly am.

  • by Anonymous Coward

    Read this excerpt from an AMD management blog:

    "Thanks to Damon at AMD for this link to a blog from AMD's Godfrey Cheng.
    We are no longer chasing the Phantom x86 Bottleneck. Our goal is to provide good headroom for video and graphics workloads, and to this effect “Llano” is designed to be successful. To be clear, AMD continues to invest in x86 performance. With our “Bulldozer” core and in future Bulldozer-based products, we are designing for faster and more efficient x86 performance; h

    • by rubycodez (864176)
      nonsense, there are other CPU vendors for the new age of mobile computing. I for one welcome our non x86 overlords.
    • Who would be foolish enough to buy Sanders' folly? The company struggles to make a profit in the shadow of Intel's superior technology. In the US, TI, Micron, Qualcomm and Broadcom are bigger. TI has been in the business before and knows better. The other three don't do the same sort of thing, so it wouldn't be a good match. If a foreign company bought AMD, Intel would feel no (antitrust) compunction against lowering prices to the point that the new owner would lose heaps of money. That leaves PC manufacturers.
    • by smash (1351) on Thursday May 05, 2011 @12:15AM (#36031920) Homepage Journal

      Alternatively, even though Intel has won, he is acknowledging that for 99.9% of people, CPU performance no longer matters as much as it used to.

      In general use, for example, I see no difference between the Core i5 in my work machine and the Pentium D in my oldest home box.

      Gaming? Sure, however even that is becoming more GPU constrained.

      Both AMD and Intel are on notice. Hence both are putting more focus into GPUs. In terms of CPU, it won't be that long before some random multi-core ARM variant is more powerful than any regular user will ever need, and they absolutely kill x86 in terms of performance per watt.

      The focus is no longer absolute CPU performance, it is shifting towards price, size, heat and power consumption. Computing is going mobile and finding its way into ever smaller devices. Rather than building one large box, distributed computing is the new wave (the cloud, clustering, etc).

      AMD's CPU business might have a tricky path ahead, but then so does x86 in general, barring niche markets. If AMD is doomed, then Intel's traditional dominance using x86 won't be too far behind them.

      • by smash (1351)
        just one thing to add "more power than any regular user will ever need" = for the life of the box they purchase. Of course CPU requirement will continue to scale eventually, but by the time the ARM based machine they buy is no longer quick enough it will be 5+ years old and due for replacement due to wear/warranty expiry/etc.
      • by MikeURL (890801)

        I sort of agree but I also have reservations. Yes, with tablets and smartphones the world is starting to look like it moved past power hungry x86 processors.

        However, it is not inconceivable that software makers could start to come up with things that really do need all the CPU power they can get. I'm thinking about really good speech recognition or even rudimentary AI.

        If the world stays as it is then you are right--x86 will probably die a slow death (as I type on my steaming hot dual core). But if new CP

  • They released their fastest processor? Wow-- unusual for chip makers to make improvements to speed and design.
  • Show me Bulldozer and then we'll talk.
