
AMD Making a 5 GHz 8-Core Processor At 220 Watts 271

Posted by Soulskill
from the numbers-getting-bigger dept.
Vigile writes "It looks like the rumors were true; AMD is going to be selling an FX-9590 processor this month that will hit frequencies as high as 5 GHz. Though originally thought to be an 8-module/16-core part, it turns out that the new CPU will have the same 4-module/8-core design that is found on the current lineup of FX-series processors including the FX-8350. But, with an increase of the maximum Turbo Core speed from 4.2 GHz to 5.0 GHz, the new parts will draw quite a bit more power. You can expect the FX-9590 to need 220 watts or so to run at those speeds and a pretty hefty cooling solution as well. Performance should closely match the recently released Intel Core i7-4770K Haswell processor, so AMD users who can handle the 2.5x increase in power consumption can finally claim performance parity."
This discussion has been archived. No new comments can be posted.


  • Awesome (Score:5, Funny)

    by redmid17 (1217076) on Tuesday June 11, 2013 @04:25PM (#43977927)
    I always wanted to have a computer running my freezer
    • by wjcofkc (964165)
      I think you mean in your freezer...
      • by dicobalt (1536225)
        It can do both.
      • Re:Awesome (Score:5, Funny)

        by redmid17 (1217076) on Tuesday June 11, 2013 @04:38PM (#43978131)
        No I meant running my freezer. Hence the reason I typed: "I always wanted to have a computer running my freezer" instead of "I always wanted to have a computer running IN my freezer"
        • Re:Awesome (Score:5, Funny)

          by ArcherB (796902) on Tuesday June 11, 2013 @04:42PM (#43978193) Journal

          No I meant running my freezer. Hence the reason I typed:

          "I always wanted to have a computer running my freezer"

          instead of

          "I always wanted to have a computer running IN my freezer"

          Oh. Then I don't get it.

          As a side note, I've always wanted to take an old mini fridge and turn it into a computer case.

          • Re: (Score:3, Interesting)

            by trum4n (982031)
            Done it. It was fun and functional, but you have to do everything you can to keep the condensation out. Fast as hell tho!
          • by sjames (1099)

            There is a type of freezer that can operate using a heat source for power [howstuffworks.com] (strange but true).

            I'm guessing that's what he meant anyway.

            • by Lumpy (12016)

              All of them work that way. Look at how a fridge works.

        • by wjcofkc (964165)
          If your freezer is less than fifteen years old, it is most likely already being run by a computer.
          • I doubt that. The temperature measurement and the on/off switching that runs the coolant don't need even a simple microcontroller. It's probably driven by completely analog electronics. Unless we are talking about some out-of-the-ordinary deluxe freezer.
            • by wjcofkc (964165)
              Hmm... Makes me wonder if Samsung makes refrigerators. Smart Fridge, burrito please... SMART FRIDGE, BURRITO....! "Waves hands around in open air attempting to open freezer"
        • I think it would work better running a heated swimming pool, or a grill. But to each their own.

    • Re:Awesome (Score:5, Insightful)

      by hairyfeet (841228) <bassbeast1968 AT gmail DOT com> on Tuesday June 11, 2013 @05:03PM (#43978491) Journal

      I would urge those wondering WTF AMD is doing copying all the old mistakes Intel made with NetBurst to read this post by a former employee [insideris.com] who lays out exactly why this is happening: the former CEO did the usual Wall Street move of slash and burn, got a stock bounce, and cashed out.

      They are stuck with the NetBurst that is Bulldozer/Piledriver/Suckavator or whatever other names they want to give it, because the former CEO FIRED everybody who knew how to design a chip over there and replaced them with automated computer layouts, which as you can see blow through power like shit through a goose while giving worse performance per watt than the previous Stars arch.

      This isn't coming from some Intel fanboy, I own and sell nothing but AMD at the shop, but when I can no longer get Stars and Llano chips I'm gonna have to seriously look at Intel because these new designs just suuuuck. There is a good reason why you don't see Thuban chips in most benchmarks against the new chips: it's because if you match clock for clock, the Thubans and Denebs will win. That is pretty damned sad, when your old chips are actually better while using less power, but the CEO they had closed down production of all the Stars cores (again to get a stock bounce and cash out) so there really is no plan B here.

      I just hope the game console chips can give them enough operating capital to keep them afloat while hopefully the new chip designer they hired, the same one that did the Athlon64 and the Apple A6, can come up with a new design to make AMD at least kinda competitive. Until then I'll hang onto AM3+ and Stars as long as I can and then start looking at the i3s and i5s.

      • Re:Awesome (Score:4, Insightful)

        by Tough Love (215404) on Tuesday June 11, 2013 @05:26PM (#43978751)

        You got that one wrong. Netburst was about deepening the pipeline to ridiculous extremes in order to ramp the clock. The new AMD story is pure clock ramp via process technology and power management. Big difference there.

        • Re:Awesome (Score:5, Interesting)

          by gman003 (1693318) on Tuesday June 11, 2013 @05:52PM (#43978973)

          No, this particular story is analogous to a 2004-era story about Intel releasing a new Pentium IV at yet-higher clocks. The current story is about a clock ramp, but the overarching narrative is the same.

          The Bulldozer architecture is fundamentally broken, this time due to simple negligence (mainly in management) rather than a faulty assumption. The only way to get reasonable performance from it is to clock it to high speeds, which gives very diminishing returns. Power consumption scales with the *cube* of the clock speed, so you pretty quickly run into a power/heat wall. They clocked the early ones pretty aggressively already, but at the cost of power and heat (and thus, noise). But it's the same story as the Pentium IV - the smart people are on something else.
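The cube-law claim above can be sanity-checked with a back-of-the-envelope sketch. Assumptions: the standard CMOS dynamic-power model P ≈ C·V²·f, voltage scaling roughly linearly with frequency at the top of the curve, and the FX-8350's published 125 W TDP at its 4.0 GHz base clock as the reference point.

```python
# Back-of-the-envelope dynamic-power scaling: P ~ C * V^2 * f.
# If voltage must rise roughly linearly with frequency, power grows
# roughly with the cube of the clock.

def scaled_power(p_base, f_base, f_new):
    """Estimate power at f_new from a known (p_base, f_base) point,
    assuming voltage scales linearly with frequency (cube law)."""
    return p_base * (f_new / f_base) ** 3

# FX-8350: 125 W TDP at a 4.0 GHz base clock (published figures).
estimate = scaled_power(125, 4.0, 5.0)
print(f"{estimate:.0f} W")  # ~244 W -- in the ballpark of the rumored 220 W
```

The crude model lands within about 10% of the rumored 220 W figure, which is about as much agreement as a TDP-based estimate can offer.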

          AMD seems to be trying to put itself back together. Hopefully the PS4/Xb1 wins will give them enough of a cash flow to keep them solvent until they can get a new architecture out, or at least hammer out the IPC problems with Bulldozer. On the bright side, Intel's been distracted by ARM - they threw away a year's lead on performance to chase idle power draw, which should give AMD a bit of time to catch up on performance on the desktop.

          • No it's not analogous to anything. It is about observing that AMD did not deepen the pipeline, therefore this story is not like the Netburst debacle.

            • by gman003 (1693318)

              Tell me, did Intel deepen the pipeline when they released the Pentium 4 HT 570J, as compared to the Pentium 4 HT 560 launched a few months before? No, they didn't, but they did increase the clock speed. Same story then as now - AMD didn't deepen the pipeline, but they are pushing the clock speeds to inadvisable levels because that's the quickest way to improve performance, and they *desperately* need to improve performance.

              Look, I'm not an Intel fanboy. Even with all its problems, I'm actually still plannin

            • by hairyfeet (841228)

              Dude, now you are just being a douche, you really are. Your argument would be like saying "You can't compare any boat sinking to Titanic because that boat wasn't built before 1920!" and it really makes you sound like a pedantic dick, it really does.

              As another pointed out, Intel did NOT keep adding ever more pipeline stages to NetBurst, they just kept ramping up the speed because they didn't have the IPC, with 5GHz on their roadmap before they pulled the plug. Likewise AMD is ramping up the speed because they don

          • Re:Awesome (Score:5, Interesting)

            by steelfood (895457) on Tuesday June 11, 2013 @07:05PM (#43979675)

            On the bright side, Intel's been distracted by ARM - they threw away a year's lead on performance to chase idle power draw, which should give AMD a bit of time to catch up on performance on the desktop.

            In the short term, this appears to be a good thing. In the long term, this is very bad for AMD.

            The world is moving to low-powered portables. The future of consumer computing will not be on the desk or lap, but in the hand. Workstations will still use desktop chips, but Intel pretty much has that market cornered.

            The low-powered, RISC space is where AMD needs to go. It doesn't necessarily have to be ARM. Instead, there's a market for low-powered x86, which is where Intel is going with Haswell. AMD needs to get ahead of the game and create something that is capable of power sipping (which obviously won't be x86), but is also capable of running legacy x86 code at reasonable speeds.

            Basically, they need to create a migration path away from x86, which will never be as efficient as ARM and thus has no chance in the portable space. Yes, Intel tried that with Itanic, but they were aiming in the wrong direction (servers).

            • by Kjella (173770)

              The low-powered, RISC space is where AMD needs to go. It doesn't necessarily have to be ARM. Instead, there's a market for low-powered x86, which is where Intel is going with Haswell. AMD needs to get ahead of the game and create something that is capable of power sipping (which obviously won't be x86)

              Actually Intel has already shown they can make x86 phones on par [anandtech.com] with existing ARM phones, not market leading or anything but middle of the road. You want AMD to out-do ARM and Intel, push a new instruction set, create the compiler support and the industry momentum behind it? With a single, financially troubled company who I wouldn't bet is there five years from now? Yes, Itanic was a huge failure but Intel still makes Itaniums for anyone foolish enough to bet on that horse, AMD couldn't make any such promi

          • Re: (Score:2, Troll)

            by 0111 1110 (518466)

            Power consumption scales with the *cube* of the clock speed, so you pretty quickly run into a power/heat wall.

            Bullshit. There is no wall. I don't care all that much about heat/power/noise. I have a water cooling setup and I'm prepared to move to phase change if necessary. What I want is for Moore's Law to mean something again. Giving up on clock speed was a bad move on Intel's part. It's just sad that we still haven't made it to 5 Ghz. This whole shift from raw performance at any price to performance per watt or even wattage walls that cannot be exceeded just sucks.

            I haven't bought a new CPU since my Wolfdale Core

            • Re: Awesome (Score:5, Informative)

              by DigiShaman (671371) on Tuesday June 11, 2013 @07:46PM (#43979989) Homepage

              IANAP. I don't think we will ever see CPUs clocked to 10Ghz unless there's some asynchronous timing voodoo used. I'm of the understanding that the speed of light and signal propagation is the real limitation here with regards to higher frequencies.
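The propagation argument is easy to put numbers on. Even at the vacuum speed of light (an optimistic upper bound; on-die wires are far slower and RC delay dominates in practice), a signal only covers a few centimeters per cycle at these frequencies:

```python
# Upper bound on how far a signal can travel in one clock cycle,
# using the vacuum speed of light; real on-chip signals are slower.

C_MM_PER_S = 3.0e11  # speed of light, in millimeters per second

def mm_per_cycle(freq_hz):
    return C_MM_PER_S / freq_hz

print(f"{mm_per_cycle(5e9):.0f} mm at 5 GHz")   # 60 mm per cycle
print(f"{mm_per_cycle(10e9):.0f} mm at 10 GHz") # 30 mm per cycle
```

At 10 GHz the light-speed bound is already within an order of magnitude of die dimensions, and actual wire delay is far worse, which is why clock distribution gets so hard at these frequencies.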

            • by gman003 (1693318)

              Perhaps "wall" isn't the best term - it's more of rapidly diminishing gains for rapidly increasing costs. People have hit 8GHz, but through liquid nitrogen cooling, which isn't exactly practical for consumer use.

              Intel has actually been doing well at maintaining steady performance increases, except with Haswell. But instead of doing it with clock speed hikes, they've been working on IPC and ISA extensions. They've added AVX, to do 256-bit SIMD operations instead of 128-bit SSE. They've done a lot of work on

              • You're full of lint there. An AMD FX-8320 compares nicely with the i7-3770K (8K+ on the benchmarks). This also puts it in spitting range of the E3-1245 Xeon, while the 1270/75 tend to push just over 9000. Not bad for a chip that's only $150 (that's $125 less than the 1245 and over $200 less than the 1270/75 Xeon).

          • Let's get something out of the way first: that 5GHz chip will suck. Incredibly. However,

            The Bulldozer architecture is fundamentally broken

            is not exactly prudent phrasing. Bulldozer sucked. Period. It was obviously half-baked. Trinity is much better. If AMD can repeat their predicted 15% performance improvement on their next BD iteration, then they will have a truly good product. Yes, Intel is still better. An i3 is better than an FX-4300 and an i5 is better than an FX-8350 for most workloads. However, Intel jumped way ahead of AMD with Sandy Bridge and, si

    • by Applekid (993327)

      I always wanted to have a computer running my freezer

      With that kind of power consumption, I wouldn't expect it to stay being your freezer for very long.

      Grilled bratwurst anyone?

  • by CajunArson (465943) on Tuesday June 11, 2013 @04:27PM (#43977953) Journal

    The message is: You got the Megahertz myth wrong! The only myth is that Megahertz isn't important!

    Oh, and all that performance-per-watt stuff? You might want to walk that back. Oh and, pull those Youtube videos where you accuse Nvidia users of being fake-pot farmers because their cards pull so much power. Sure it was funny at the time, but we'd rather not have to live that one down now.

    • The problem is there are so many ways to judge performance.
      GHz is good for comparing like processors.
      MIPS is good for similar instruction sets.
      FLOPS is good for similar code (that uses floating point).

      You then add these per watt if you want to show it off for a mobile device.

      • > You then add these per watt if you want to show it off for a mobile device.

        Or any data center.

        • by smash (1351)
          That. Increased power = increased battery, UPS, cooling, power bills, carbon taxes and physical space to install it all. All of which are expensive. The most important criteria in a modern datacenter is performance per-watt.
      • The problem is there are so many ways to judge performance.
        GHz is good for comparing like processors.
        MIPS is good for similar instruction sets.
        FLOPS is good for similar code (that uses floating point).

        Of those, I think GHz is used way too often, while it actually has lost much of its meaning these days. For example we've had 2GHz desktop CPUs for a decade now, but the performance difference between them can be worlds apart.

        • describing cpu speed in ghz is like describing engine speed in rpm. it's technically accurate, but tells you nothing at all about what the product can actually do for you unless you're comparing two different examples of the exact same architecture.

    • by bored (40072)

      You got the Megahertz myth wrong! The only myth is that Megahertz isn't important!

      Tongue in cheek and all that, but....

      Frankly, today both AMD and Intel are at an IPC wall nearly as much as they have been at a clock-rate wall. So yes, a faster clock rate is pretty much the only way to get performance if your application doesn't scale with cores. Which is a shocking number of them.

      The part I find interesting, is that if they can beat the haswell with this part then they probably have an IPC advantage over intel a

      • You said: "The part I find interesting, is that if they can beat the haswell with this part then they probably have an IPC advantage over intel again. Remember the top end haswell turbo boosts to 4.9Ghz."

        Please re-read everything you just said very very carefully. Especially the parts about how a design with a known IPC will magically get huge IPC boosts by only increasing the core clock and power draw (hint: it won't). Please also remember that Haswell only has a 4.9GHz boost speed for incredibly small valu

        • by bored (40072)

          My point was that if they can match or beat a 4.9 GHz part with a 5 GHz part, then the IPCs are going to be similar.

          I didn't say the IPC changes with clock rate...

          • That's like saying if Intel increases its IGP performance by a factor of 10 then AMD will have to worry... of course it would, but the whole problem with that statement is the pesky word "if"

            My 4770K is overclocked to 4.6GHz without that much tuning right now, and I guarantee it beats these new parts even in the perfectly multithreaded synthetic benchmarks that are best-case scenarios for AMD. It does that without being a space heater, and if the rumors about prices are true, the 4770K is an outright barga

          • You missed the parent's comment that Haswell only boosts up to 3.9GHz, not 4.9GHz.
            If you can match a 3.9GHz part with a 5GHz part, well, that proves nothing about IPC - unless you can't beat it by a significant margin, then it proves IPC is lower.
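The arithmetic behind that inference is just perf = IPC × frequency, so at equal performance the IPC ratio is the inverse of the clock ratio. A tiny sketch (the 5.0 and 3.9 GHz figures are the ones from the thread above):

```python
# At equal delivered performance, relative IPC is the inverse of the
# clock ratio, since perf = IPC * frequency.

def relative_ipc(f_a, f_b):
    """IPC of chip A relative to chip B, assuming equal performance."""
    return f_b / f_a

# A 5.0 GHz part that only matches a 3.9 GHz part:
print(f"{relative_ipc(5.0, 3.9):.2f}")  # 0.78 -> roughly 22% lower IPC
```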

    • by TubeSteak (669689)

      The first 4 GHz are easy.

      Here's Ivy Bridge chips pushing ~220 Watts to reach 4.7 GHz
      http://www.legitreviews.com/images/reviews/1924/power-consumption.jpg [legitreviews.com]

      /AMD's FX-9xxx series uses a 32nm process.
      /Intel's Ivy Bridge and Haswell use a 22nm process.

      • It'd be interesting to see the numbers on my own system, I suppose... I have an i5-2500k (Sandy Bridge) that I bought right when Ivy Bridge came out, which I have clocked at 4.8GHz. That's running with a Cooler Master Hyper 212+ [coolermaster.com], and doesn't exceed 65°C even when transcoding Blu-ray/DVD videos to h.264... I've kept it pegged at 100% CPU for 6 days straight without exceeding that temperature....

        I'd be surprised if that heatsink could provide >200W of cooling, given that it's a big radiator with a fan. To

      • Did you bother to read that graph? Try looking at the bottom where it says "Wattage At the Wall"

        You must be an enormous Intel fanboy to think that they have invented technology that allows every single component in the whole computer outside of the CPU to consume zero power in highly-overclocked systems....

  • Big deal (Score:5, Informative)

    by Anonymous Coward on Tuesday June 11, 2013 @04:27PM (#43977959)

    POWER6 was running at 5.0 GHz 5-6 years ago.

  • by Carnildo (712617) on Tuesday June 11, 2013 @04:34PM (#43978067) Homepage Journal

    Why am I having flashbacks to the Pentium 4?

    • Probably because the cores aren't really cores. They're four cores that are basically hyperthreading.

    • It's hard not to. Intel wrote the book on "best way to screw up a microarchitecture and let your competitor gain an advantage", which they have been taking into account since dropping NetBurst. Now comes AMD and follows that very same book quite closely...

  • by somarilnos (2532726) on Tuesday June 11, 2013 @04:39PM (#43978133)

    The summary suggests that the "performance should closely match the recently released Intel Core i7-4770K Haswell processor", but nothing in the article, or anything released about this chip so far, supports that. It's all just guesswork until we see some actual benchmarks from the chip.

    I don't honestly expect we're going to be seeing performance parity from this chip (although I'd love it to be true). But that hasn't been AMD's selling point for me for a long time. Chances are, we're going to see a chip that breaks the 5.0 GHz barrier, under-performs relative to Intel's top end chip, but costs about half as much. That's been their game for a long time now, and I haven't seen anything that leads me to believe that this chip is changing that.

    • AMD slower / MHz (Score:4, Insightful)

      by SpaceManFlip (2720507) on Tuesday June 11, 2013 @04:45PM (#43978235)
      you're probably right - I was slightly shocked recently when I compared the performance benchmarks of an 8-core AMD to a 4-core Intel. I saw the 8-core on sale for about $179 and thought "wow!" but then I was more like "wow...." after seeing the benches.

      basically, the 8-core AMD was slower performance-wise than the 4-core Intel, with the AMD running a few MHz faster

      • Re:AMD slower / MHz (Score:5, Interesting)

        by hairyfeet (841228) <bassbeast1968 AT gmail DOT com> on Tuesday June 11, 2013 @05:28PM (#43978765) Journal

        Yeah but how much was the 4 core Intel? And you can probably buy that 8 core for $150 or less now if you watch the sales. I'm running the Thuban X6 and what did 6 cores cost me? $105 shipped, if you compare like to like the only chip I could get from Intel at $105 was the Pentium Dual core which the X6 outperforms so in that case the bang for the buck squarely landed in the AMD camp.

        The problem with the X8s (well other than the arch, see my previous post with a link on why the BD/PD/EX platform is AMD's netburst) is they simply cost too much to make, for every X8 that comes out with all functioning cores they probably get 2 dozen X4s or X6s thanks to bad cores so THAT is where the bang for the buck is, although if given a choice I'd take a Deneb or Thuban over Bulldozer any day of the week.

        But if you are strictly wanting the most bang for your bucks and like most of us don't have unlimited budgets the best bets would probably be the Athlon X4 for $67 [tigerdirect.com] although for an extra $8 I'd probably go for the Phenom II X4 for $75 [tigerdirect.com] and for more than 4 cores the best bang is probably the FX6100 for $99 [tigerdirect.com] or the Phenom II 1035T X6 for $106 [tigerdirect.com]. I think in the benches the Thuban beats the FX6100 but both are good deals. Nice thing about the 1035T is I have one and have sold several and with a low end gaming board like the Asrock boards they have a hell of a lot of OCing room, before deciding I didn't want to deal with the temps I had mine up to nearly 3GHz with a turbocore of nearly 3.5GHz. I probably could have gone higher with a better cooler but my apt gets hot enough as it is without adding a major OC to my system.

        As you can see though you can still get crazy cheap deals on the AMD side if you just know where to look. These chips have more than enough power to do anything your average person is gonna want to do with a PC; heck, my youngest is gaming on a 3.4 GHz Athlon X3 and is quite happy with the performance, and with my 1035T I can game AND do a transcode AND burn a DVD at the same time with no slowdowns, so I would say I'm getting my $105 out of it.

        • by Kjella (173770)

          The problem with the X8s (well other than the arch, see my previous post with a link on why the BD/PD/EX platform is AMD's netburst) is they simply cost too much to make, for every X8 that comes out with all functioning core they probably get 2 dozen X4s or X6s thanks to bad cores so THAT is where the bang for the buck is, although if given a choice I'd take a Deneb or Thuban over Bulldozer any day of the week.

          None of them are good bang for the buck for AMD, the FX-8150/8350 is a big chip of 315 mm^2 versus 216 mm^2 for Sandy Bridge, 160 mm^2 for Ivy Bridge and 177 mm^2 for Haswell. Granted the last two are on 22nm but even the 32nm Sandy Bridge was way smaller than AMD's chip, which means more chips per wafer and lower defect rates. And Intel is planning to move to 14nm next year, so there's absolutely no chance of AMD closing any gap, at best they avoid widening it.
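The die-size point can be made concrete with a rough dies-per-wafer and yield estimate. Caveats: the defect density and the edge-loss-free wafer math below are illustrative assumptions, not foundry figures; only the die areas are the ones cited above.

```python
import math

# Rough dies-per-wafer and yield comparison using a simple Poisson
# defect model: yield = exp(-D * A). Bigger dies mean fewer candidates
# per wafer AND a higher chance each one catches a defect.

WAFER_DIAMETER_MM = 300
DEFECT_DENSITY = 0.001  # defects per mm^2 -- assumed, for illustration

def dies_per_wafer(area_mm2):
    # Crude estimate ignoring edge losses and scribe lines
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / area_mm2)

def good_dies(area_mm2):
    return int(dies_per_wafer(area_mm2) * math.exp(-DEFECT_DENSITY * area_mm2))

for name, area in [("FX-8150/8350", 315), ("Sandy Bridge", 216), ("Haswell", 177)]:
    print(name, good_dies(area))
```

Even with identical defect densities, the 315 mm² die yields far fewer good chips per wafer than the smaller Intel dies, which is the cost disadvantage the comment describes.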

      • Re: (Score:2, Interesting)

        by Omestes (471991)

        basically, the 8-core AMD was slower performance-wise than the 4-core Intel, with the AMD running a few MHz faster

        Take all benchmarks with a grain of salt. While Intel has been generally winning for awhile now, that doesn't really mean AMD is completely inferior. With like chips there are certain things a modern AMD will out-perform Intel on, such as single threaded tasks. Intel will generally smoke AMD on multithreaded tasks, though. There is also cost, while AMD might be 10% less benchmark happy than a like Intel chip, it generally is over 25% less expensive, and will generally run without need to buy a new costl

    • by Animats (122034)

      The summary suggests that the "performance should closely match the recently released Intel Core i7-4770K Haswell processor", but nothing in the article, or anything released about this chip so far, supports that. It's all just guesswork until we see some actual benchmarks from the chip.

      If they're just cranking up the clock speed of an existing design, the performance should be quite predictable. The difficult-to-predict thing is the lifespan of the part. Atoms migrate faster as heat and voltage go up.

      The limit on clock speed today is from heat dissipation. AMD got 8GHz out of a CPU a few years ago by cooling with liquid helium, but it's not worth the trouble.

  • by sinij (911942) on Tuesday June 11, 2013 @05:05PM (#43978509) Journal
    Frying eggs with your CPU is now a feature.

    New AMD CPU, comes bundled with George Foreman grill heatsink.
  • When did the CPU become a bottleneck? Is there a new version of Java or Flash I haven't got yet?
  • and see how the new AMD chip compares. I assure you the i7 won't need to draw 220W to do this.
    Or let's look at performance per watt at normal frequencies where, if the AMD processor really does match a 4770K in raw perf, that will mean the Intel processor will be about 2.5x better on perf / watt.
    As some people have mentioned, IBM routinely clocks Power architecture processors into the 4-5GHz range AND they draw several hundred watts each. If you think that's progress, I suggest you'll want to reconsider when you see the net throughput of a dense array of low-wattage Haswells cranking out aggregate SPECcpu numbers far beyond an IBM Power 7+ processor with the same total number of watts the IBM socket draws.

  • I was sitting here looking to drop a new CPU in my quad-core FX, but shit, to support the chip I needed a new power supply (650 watts now), and as it stands it's not that far away from tripping a breaker (between 2 AMD computers hogging power, lamps and a TV), AND it's still a slower CPU, and it needs a replacement heatsink because the Cooler Masters that come with the chip are loud as fuck

    I like my 3770k

  • A lot of the new Vishera chips can be overclocked to 5 GHz on air cooling. Even AMD's own promotional and marketing materials say that word for word. So I'm wondering, do chips really draw that much power when you overclock them by 20% or so? I would have thought wattage would only go up by the exact mathematical increase in clock speed. Like 10% more speed = 10% more watts, roughly. Does it really go up exponentially-ish like this with other chips?
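Under the standard CMOS dynamic-power model P ≈ C·V²·f, the answer is: roughly linear if the voltage stays put, much steeper if the overclock needs a voltage bump. A sketch (the 10% voltage increase is an illustrative assumption, not a measured figure):

```python
# Dynamic power in CMOS: P = C * V^2 * f. At a fixed voltage, a 20%
# overclock costs ~20% more power; if it also needs more voltage to
# stay stable, the V^2 term compounds the increase.

def power_ratio(f_ratio, v_ratio=1.0):
    return f_ratio * v_ratio ** 2

print(f"{power_ratio(1.2):.2f}x")       # 1.20x: clock bump only
print(f"{power_ratio(1.2, 1.1):.2f}x")  # 1.45x: clock plus 10% more voltage
```

This is why "5 GHz on air" overclocks and a 220 W factory rating can coexist: a chip that can hold the clock at stock-ish voltage draws far less than one binned and guaranteed for it with headroom.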
  • by slashmydots (2189826) on Tuesday June 11, 2013 @09:30PM (#43980783)
    In case any of you forgot, I'm pretty sure Intel's LGA 1366 (the 2011-socket precursor) i7 Extreme Edition chips ran at 185 watts. Their current ones are 130 watts. 220 beats all that, but it's not like Intel never upped the power handling for a forced stable overclock and called it a new chip without really changing much, if anything, in the infrastructure. It's basically like buying a factory-superclocked graphics card: you're paying for better power handling and guaranteed pre-done overclocking.
