AMD Hardware

AMD Starts Shipping First Bulldozer CPU

MrSeb writes "After an awfully long wait, AMD has finally begun shipment of its Bulldozer-based Interlagos (Opteron 6200) server-oriented CPU. If you believe AMD's PR bots, it is the world's first 16-core x86 processor. Unfortunately, and possibly because of reports that AMD is struggling to clock its Bulldozer cores to speeds that are competitive with Intel's Core i7, there's no word of the 8-core desktop-targeted Zambezi CPU. If AMD doesn't move quickly, Intel's Sandy Bridge-E will beat Zambezi to market and AMD will lose any edge that it might have."
  • Sandy Bridge-E (Score:5, Insightful)

    by Anonymous Coward on Wednesday September 07, 2011 @07:52AM (#37325622)

    If AMD doesn't watch out, their mainline $200 processor will be made obsolete by Intel's $1000 EXTREME CPUs!

    • Re:Sandy Bridge-E (Score:4, Insightful)

      by CajunArson ( 465943 ) on Wednesday September 07, 2011 @08:01AM (#37325680) Journal

      You may laugh, but think of it this way: if that $1000 Gulftown CPU from March 2010 can still beat an 8-core Bulldozer that comes out 19 months later, then you would only have to realize a marginal benefit of about $1.75 per day for the "overpriced" Gulftown chip to have been economically worth buying (and that includes the extra cost of Intel motherboards, which tend to be more expensive, and the extra RAM for a triple-channel configuration; see the quick check after this comment). Never mind the fact that 6-core chips have been sold for $600 for some time as well. I can think of a bunch of professional applications that can easily show a $1.75/day benefit from the extra cores. Maybe not for playing games, but for a lot of real applications.

        Bulldozer should beat the consumer-level SB chips at perfectly threaded integer benchmarks, but it remains a very open question whether it will be able to beat the almost two-year-old Gulftowns at the same tasks, and it is an almost foregone conclusion that it won't beat the 6-core SB-E chips at those tasks. Factor in the 315 mm^2 die size of EVERY Bulldozer (not just the 8-core ones, but the cheap 4-core ones too, since AMD only has one die design) and the immaturity of AMD's 32nm process, and things could be expensive for AMD on the desktop. That's why it makes sense to ship the server chips first, where AMD has some hope of getting higher ASPs.
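A quick back-of-the-envelope check of the $1.75-per-day figure above, assuming (as the comment does) a roughly $1000 platform premium amortised over the 19-month head start; these are the commenter's numbers, not official pricing:

```python
# Amortising an assumed ~$1000 Gulftown platform premium (CPU + pricier board
# + triple-channel RAM, per the comment above) over its ~19-month head start.
platform_premium_usd = 1000
head_start_days = 19 * 30            # roughly March 2010 to October 2011

print(f"Break-even benefit: ${platform_premium_usd / head_start_days:.2f} per day")  # ~$1.75
```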

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        I've (honestly) just been asked what our expected budget requirements are for hardware for the next year. Please inform me where I can go to use the patented Intel time travelling technology so that I can retroactively use things before I decide to purchase them.

        • Please inform me where I can go to use the patented Intel time travelling technology

          Haha, an anti-Intel snarky comment on Slashdot... way to speak truth to power. What's funny is that if I had actually told you in 2010 that I had a time machine that could bring you the fastest CPU AMD would have available at the end of 2011, and you only had to pay a few hundred bucks for my service, you'd probably jump at the chance... which is exactly what Intel effectively did with Gulftown.

        • by Guspaz ( 556486 )

          Extreme or not, Intel's price range has really gone down with Sandy Bridge; their highest-priced chip (and quite possibly the single fastest consumer CPU on the market) is about $300ish. And I believe their highest-priced Sandy Bridge Xeon is only $600ish.

      • Re:Sandy Bridge-E (Score:5, Interesting)

        by sgt scrub ( 869860 ) <saintium@nOSPaM.yahoo.com> on Wednesday September 07, 2011 @08:32AM (#37325930)

        the immaturity of AMD's 32nm process and things could be expensive for AMD on the desktop.

        That is true, and, as you point out, for the desktop. Machines in a data center are cooled, so the number of cores is a better measure of functionality there. If you build machines that run multiple VMs, which is usually the case, that cheaper 6200 will not only outperform Intel's Gulftown but will more likely be preferred when adding machines to the data center, even over the SB-E chips. If AMD can get a better footing in the "cloud" infrastructure, they might make enough money to move to a process smaller than 32nm, which is REALLY what they need to do.

        • by afidel ( 530433 )
          It really depends. Even with lots of VMs, single-core performance can matter. We have a lot of varied workloads, but I'm much more likely to have a single core in one VM pegged than I am to have any host be processor-bound (my entire production cluster averaged ~50k MHz of Gulftown processor time at peak load over the last month, out of almost 250k available).
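For context, the MHz-based figures quoted above work out roughly as follows (round numbers taken from the comment; actual capacity depends on the cluster):

```python
# Cluster CPU headroom from vSphere-style MHz counters, using the comment's
# round numbers: ~50,000 MHz used at peak out of ~250,000 MHz available.
peak_used_mhz = 50_000
capacity_mhz = 250_000

print(f"Peak cluster CPU utilisation: {peak_used_mhz / capacity_mhz:.0%}")  # ~20%
```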
      • by Targon ( 17348 )

        Could be, could be, but there is no information out there about how production Bulldozer processors will perform once AMD starts pumping out consumer chips. AMD may have been forced to wait this long to release Bulldozer just to deal with those 32nm process issues. I am not saying that Bulldozer will beat SB, but on the flip side, AMD machines have been selling well in the $500-and-under range up to this point, so Bulldozer SHOULD help.

        The average consumer doesn't need a LOT of processing power, so if the At

        • My understanding is that AMD is cheaper but normally requires more power and more cores than a comparable Intel chip. Electricity costs then offset any price difference. Where AMD has an advantage is that their motherboards make it easier to upgrade the CPU in the future. I, however, tend to buy a new computer rather than upgrade my old ones, so for me that's not an advantage.
      • But let's stop pretending that Intel's highest end chip is the only one to talk about. Yes, Intel has, for a long time, had a chip for people with more money than sense. They make an ultra high end chip for $1000 that is only a tiny improvement over the one below it. It is for sale to people who buy for bragging rights, more than anything else. That would be the i7-990X right now. 3.46GHz, 6 cores.

        However, right below that is a chip with nearly the same performance at around half the cost. Right now that is the i7

    • by aliquis ( 678370 )

      If AMD doesn't move quickly, Intel's Sandy Bridge-E will beat Zambezi to market and AMD will lose any edge that it might have.

      Ok, so nothing will be lost?

      (No, I'm not trolling or flaming; personally I would get a new machine to play Starcraft II, and AFAIK SCII only seems to use two cores. AMD themselves have claimed their chip (best consumer chip?) would be similar to the 2600K in performance. The 2600K is quad-core vs. octo-core for the AMD chip. Considering the workload, the 2600K will still outperform the AMD by a lot. Also, with socket 2011 we're talking quad-channel memory instead of dual-channel, and even back in the 1156 vs. 1366 days and with SCII o

      • by Bigbutt ( 65939 )

        Funny, that's why I went dual-core vs quad-core when I built my game machine (specifically for Starcraft II).

        [John]

    • There's no evidence to indicate that AMD's "mainline" $200 CPU will be much better than the existing "mainline" $200 2500K that's out right now... Just because Intel offers chips at a higher range than AMD doesn't mean that AMD automatically beats Intel at everything below the highest range. When the 2500K first came out it was priced lower than AMD chips that were substantially slower... AMD "corrected" the price-to-performance ratio by slashing its own prices, which didn't do too much to help its profitability

      • There's no evidence to indicate that AMD's "mainline" $200 CPU will be much better than the existing "mainline" $200 2500K that's out right now

        There is some, depending on your application, of course. If computer chess analysis is your thing, you would see benchmark results like these [hardwarecanucks.com], where the $189 Phenom II X6 1100T beats the $219 Intel 2500K.

        So AMD already has CPUs that are price/performance competitive; surely Bulldozer shouldn't be worse in terms of price/performance.

        • Yes, but that's not really comparing apples to apples. A 6-core 125W processor should beat out a 4-core 95W processor. Yes, the AMD is cheaper, which is good, but it also draws more power, so the price difference is offset by electricity costs.
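A rough sketch of how far electricity costs actually offset the price gap, assuming worst-case 24/7 full load at the rated TDPs and an illustrative $0.12/kWh rate (both assumptions, not measurements):

```python
# Yearly energy cost of a 30 W TDP gap (125 W vs 95 W), assuming 24/7 full load
# and an assumed $0.12/kWh electricity price. Real desktops idle most of the
# time, so the true gap is smaller than this upper bound.
tdp_gap_watts = 125 - 95
price_per_kwh = 0.12

extra_kwh_per_year = tdp_gap_watts * 24 * 365 / 1000      # ~263 kWh
print(f"Upper bound: about ${extra_kwh_per_year * price_per_kwh:.0f} per year extra")  # ~$32
```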
          • And in the vast majority of tasks the X6 is slower, including some very multithreaded ones:

            http://www.anandtech.com/bench/Product/203?vs=363 [anandtech.com]

              • It isn't a question of whether you can find a benchmark where the AMD processor does better; it is a question of how the AMD and Intel processors compare overall. That Anand benchmark shows the answer is: not well. For example, the i5 is ahead in x264 encoding. OK, well, that is a completely parallel activity; it will use all cores 100%. So given that the AMD processor has a slight clock-speed advantage and two more cores, it should stomp the i5, being 50% faster. Instead the i5 bests it slightly.

              Also an

      • Re:Sandy Bridge-E (Score:5, Insightful)

        by bhcompy ( 1877290 ) on Wednesday September 07, 2011 @08:33AM (#37325940)
        That $200 is still a fuzzy number given the motherboard prices. You can get a very good AMD chipset (880/890) for $100 and have all the latest features (USB3, SATA3, etc.) while being forward compatible for quite a while (manufacturer-dependent, but Bulldozer (AM3+) is compatible with AM3 chipsets with BIOS updates). With Intel, you're still paying more for the equivalent, and next year you'll need to get another motherboard to upgrade that processor because of constant socket changes.
          • Huh? You can get $70-80 Intel boards with the feature set you're describing, and next year's Intel CPU (Ivy Bridge) will still use LGA1155. Notably, the whole "AMD boards carry on working with future CPUs" thing is a myth – they work for one generation, at most. Bulldozer chips will not work in AM3 boards, only AM3+. Similarly, the current crop of Phenom IIs will not work in AM2 boards.

            • Well, what do you expect? That they would make a Bulldozer with no memory controller and a 100MHz bus so you can drop it into Socket 7 motherboards? The idea is that you can buy a motherboard, then a year or two later get the next-generation processor and drop it in. Beyond that it doesn't even make any sense -- the bottleneck stops being the CPU and becomes the fact that the older socket is using two channels of DDR2 instead of four channels of DDR3, etc., which is the whole reason that sockets change in the first place.
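To put numbers on the bandwidth point above, here is the theoretical-peak arithmetic for two illustrative configurations (dual-channel DDR2-800 vs quad-channel DDR3-1600; the exact speed grades are platform-dependent assumptions):

```python
# Theoretical peak memory bandwidth: channels * transfer rate (MT/s) * 8 bytes per transfer.
def peak_gb_per_s(channels: int, mt_per_s: int, bus_bytes: int = 8) -> float:
    return channels * mt_per_s * bus_bytes / 1000

old = peak_gb_per_s(channels=2, mt_per_s=800)    # dual-channel DDR2-800  -> 12.8 GB/s
new = peak_gb_per_s(channels=4, mt_per_s=1600)   # quad-channel DDR3-1600 -> 51.2 GB/s
print(f"{old:.1f} GB/s vs {new:.1f} GB/s ({new / old:.0f}x the peak bandwidth)")
```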

            • Agreed, it doesn't make much sense to keep the same socket for many years. This is one of many reasons why the above "zomg, intel changes sockets all the time" mantra is bollocks.

              Intel releases a chip on one socket, then does a die shrink on the same socket. They then move to a new architecture with a new socket. This is not very different from what AMD is doing, and it gets you the same guarantee of being able to upgrade your CPU in a year or two if you really want to on your now-a-bit-out-of-date board.

        • Don't forget that those cheaper AMD motherboards also usually include a decent (not great) integrated video card that can run circles around any embedded Intel graphics (for cheap PCs that you might want to play some games on).

    • Yeah... because AMD's $200 X6 1100T isn't already beaten by Intel's $190 i5 2400... oh wait, yes it is.

      • On what applications? Post benchmarks...

        • http://www.anandtech.com/bench/Product/203?vs=363 [anandtech.com]

          Let's see here, according to this:

          --All Sysmark tests
          --Photoshop CS4
          --DivX encoding
          --x264's first pass
          --Media Encoder 9
          --3dsmax r9
          --Cinebench single and multi-threaded
          --Blender
          --Every single game tested

          Not what one would call a trivial list. Oh and it uses less power while doing it.

      • by h4rr4r ( 612664 )

        So in your universe motherboards are free?
        List price of CPU and board.

        • Okay then...
          X6 1100T - $200
          cheap AMD mobo - $50
          mid-range mobo that supports SATA 6Gb/s and USB 3.0 - $100

          i5 2400 - $190
          cheap Intel mobo - $50
          mid-range mobo that supports SATA 6Gb/s and USB 3.0 - $65
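Totting up the commenter's rough figures for the two comparable mid-range builds (these are the prices quoted above, not current list prices):

```python
# Platform totals using the prices listed above (commenter's figures).
amd_total = 200 + 100      # X6 1100T + mid-range AM3 board with SATA 6Gb/s and USB 3.0
intel_total = 190 + 65     # i5 2400 + mid-range LGA1155 board with SATA 6Gb/s and USB 3.0

print(f"AMD build: ${amd_total}, Intel build: ${intel_total}")  # $300 vs $255
```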

      • And AMD has more motherboard choice, with more PCIe lanes on some of the boards, while on Intel you have to take some of the x16 lanes for video or use the x4 DMI bus to fit in USB 3.0, SATA, and other stuff.

        • Except that USB3 and SATA 6Gb/s are all on the chipset, so you're bullshitting. What are you, as a desktop user (note, server boards have QPI links and hence more PCIe bandwidth), actually going to do with that extra PCIe bandwidth?

            • Wrong. There is absolutely no USB 3.0 support on the P67 and H67 (Sandy Bridge) chipsets, and they only support two SATA 6Gb/s ports.

              Furthermore, only one PCIe 2.0 x16 link is supported by this generation of chipsets.

              The fact that you are not aware of how stunted the Sandy Bridge chipset is with regard to I/O is telling... fanboy much?
            • The issue seems to be that on Intel's mid- to low-end CPUs the motherboards are limited in the number of PCIe lanes available, and often the second PCIe x16 slot is locked to x4, meaning that dual GPUs aren't a good option... so you have to spend a lot more on the Intel side of the fence. IMHO a faster single GPU is probably the better buy on the Intel platform at the moment. In either case, we've already hit the "fast enough" stage for most people... I've recommended E-350 based laptops to a lot of people, as they
              • But on the low-end boards you have to cut into the video PCIe lanes just to fit USB 3.0 in, and even if it moves to the chipset, the x4 link from CPU to chipset will have the network, sound, SATA, USB, and x1 PCIe slots all running over it, and to get more PCIe lanes you have to get a high-end i7 CPU.

                With AMD you can get a low-end to mid-range CPU and get boards with lots of PCIe I/O; even the low-end chipsets have it.

                On an Intel board, using an x4 CableCARD tuner eats up a lot of the PCIe I/O.

    • Great - my CPU will start generating kernel messages telling me that it's the SON OF THE TECHNOLOGY MINISTER OF NIGERIA and that it has 25 GIGAFLOPS of PROCESSING that it needs help smuggling out from behind seven proxies, and I can have 25% of it in return for overclocking the CPU by 10%.

  • by Chrisq ( 894406 ) on Wednesday September 07, 2011 @07:55AM (#37325634)
    What will be interesting is the price/performance ratio compared to the Intel chips. This chip will typically be used in server farms, and there price/performance will be at least as important as raw power - though obviously there is an overhead in running more servers. AMD has usually been ahead of Intel, and it still is on most mid-range and low-end chips, but it has started to fall behind at the high end [cpubenchmark.net].
    • AMD has usually been ahead of Intel?? At the low-end price point, maybe. For under $100, AMD has usually offered the more competitive options. But rarely has AMD really been ahead of Intel in any other market segment.

      The Athlon XP was more or less on par with the P4, P4A, and P4B. I bought into AMD's architecture at the time because it offered a bit more bang for the buck and excellent companion chipsets for gamers and enthusiasts (nForce and nForce 2). They couldn't stand up to the P4Cs, so in comes the At
      • by afidel ( 530433 )
        Only if you're looking at the desktop market; in the server market Intel really couldn't compete until Nehalem and the introduction of QPI, at least for most of my big workloads.
      • I would go so far as to say, especially since the ATI buyout and the use of their GPUs in integrated video chipsets, that they have been ahead for low/mid-range desktops and laptops where integrated graphics are used. Even now, the AMD chipsets' graphics are better than the Intel gfx glued into some i3/i5 solutions. It really depends on your use, and how you look at them as a whole. For a mid-range graphics workhorse for gaming, you can get in a bit cheaper with a high-end AMD CPU and a mid-range mobo. Since Intel's
  • If AMD can kick out a believable press release stating that they'll have the 8-way chips out in a reasonable amount of time, then they'll be fine.

  • by serviscope_minor ( 664417 ) on Wednesday September 07, 2011 @08:12AM (#37325764) Journal

    Does anyone know the FPU performance of these things?

    So, comparing a 16-"core" 'dozer to a 12-core Magny-Cours:

    The number of parallel integer (and memory addressing) threads has gone up from 12 to 16.

    The number of FPUs has dropped from 12 to 8.

    The new FPUs are now twice as wide with the AVX instructions.

    So, two threads share one wider FPU now. If it's hard keeping an FPU full, then this should make better use of the hardware. It seems that if your code does well for parallel, scalar FPU work already, then there may be a performance drop.

    If you have trouble filling the FPU for scalar work, then this should give better utilisation of less hardware. There's a possible performance increase if your utilisation is currently under 67% (rough numbers in the sketch after this comment). Since the two core units can feed the FPU independently, there is a little latency hiding now. This could help even if there are two completely independent processes using the FPU at the same time.

    I suppose the reasoning is that there is often fine-grained parallelism to be had, and that the problems of extracting fine-grained parallelism and of keeping the FPU full are often independent. So AVX will improve performance there.

    So, it seems that the peak FPU performance has increased in the ratio of 16/12.

    The actual performance could be all over the place. It will be interesting to see.

    The other thing is that these are now single chips with 8 Bulldozer modules and 16-ish cores. Perhaps AMD will go and make MCMs again like before, giving 32-ish cores per socket :)
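One way to read the 16/12 and 67% figures in the comment above, counting throughput in "old scalar FPU" units. This is my reading of the comment's reasoning, not AMD data; it assumes each new FPU is twice as wide and shared by two cores:

```python
# Magny-Cours: 12 cores, one 1x-wide FPU each. Interlagos (per the comment):
# 16 cores, 8 FPUs, each 2x wide and shared by a pair of cores.
old_fpus, old_width = 12, 1
new_fpus, new_width = 8, 2

# Peak (fully vectorised, AVX) throughput ratio: 16 units vs 12 units.
print(f"Peak FP throughput ratio: {new_fpus * new_width / (old_fpus * old_width):.2f}x")  # ~1.33x

# Break-even for purely scalar FP code: 12 threads each busying their FPU a
# fraction u of the time generate 12*u FPU-equivalents of demand, which still
# fits into 8 FP units when 12*u <= 8, i.e. u <= 2/3.
print(f"Scalar FP work fits in 8 units when utilisation <= {new_fpus / old_fpus:.0%}")  # ~67%
```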

    • Does anyone know the FPU performance of these things?

      No.

      Or at least nobody who can talk.

      Next question?

      :-) Sorry, but what kind of answer did you really expect? :-)

      • Yep, that is the old problem of choosing processors, which never really went away.

        You simply can't know what you are buying before you go out and buy one of each to test. And, of course, if you are buying just one or two machines there is no reason at all to test anything.

        You can read the specs and go with the processor that is more likely to fit your work, but any detail could change things.

  • by sgt scrub ( 869860 ) <saintium@nOSPaM.yahoo.com> on Wednesday September 07, 2011 @08:16AM (#37325816)

    Unfortunately, and possibly because of reports that AMD is struggling to clock its Bulldozer cores to speeds that are competitive with Intel's Core i7, there's no word of the 8-core desktop-targeted Zambezi CPU.

    If you increase the clock on the CPU, you have to cool it. Shrinking the die reduces the amount of cooling that needs to be done. AMD is not able to shrink its die. Yet.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Yeah, Intel's vast capital reserves mean they typically have a generational lead in process tech, and get the increased efficiency / decreased temps of a die shrink "for free". AMD have to out-innovate them just to produce an equivalent CPU, let alone a better one. Unfair, but that's how (near-monopolistic) business works I guess.

    • It's more complicated than that. As you shrink the die size, you have to fight all sorts of things that bleed current through paths it's not intended to take. As you increase frequency, you have to increase voltage to make it work. As you increase voltage, you increase the bleed-through. The better you are at fabbing a particular die size, the less bleed-through you have and the more you can crank up the voltage and frequency. That's what they were talking about when they discuss AMD's problems getting their
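For a first-order sense of why pushing voltage and frequency is costly, here is the textbook CMOS dynamic-power relation (P ≈ α·C·V²·f). The +10% clock / +5% voltage numbers are purely illustrative, not Bulldozer figures, and leakage (the "bleed-through" above) comes on top of this:

```python
# Dynamic power scales roughly with frequency and the square of voltage,
# holding switching activity (alpha) and capacitance (C) fixed.
def relative_dynamic_power(freq_scale: float, voltage_scale: float) -> float:
    return freq_scale * voltage_scale ** 2

bump = relative_dynamic_power(freq_scale=1.10, voltage_scale=1.05)
print(f"+10% clock at +5% voltage: ~{(bump - 1) * 100:.0f}% more dynamic power")  # ~21%
```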

  • by gman003 ( 1693318 ) on Wednesday September 07, 2011 @11:22AM (#37328590)

    I've never really considered AMD the manufacturer to look towards when looking for high-performance stuff. In my mind, at least, they're the "dirt cheap and good enough" side - I bought a triple-core Phenom for about the price of a low-end Core 2 Duo a year or two back. They've always had the best performance per dollar. Sometimes, yeah, they did even have the best absolute performance, but Intel's back in the lead again.

    High performance just isn't a very profitable market segment. Gamers and high-end servers will buy it, but that's not where the big market is. The big market is desktops and laptops - stuff where a 4GHz six-core processor is overkill. A business machine will work fine with half that - and with AMD's price advantage, they've been moving in on business machines and desktops. Supercomputers might also be enough to sustain the company - they buy by the thousands, and AMD's power efficiency and multi-core design have usually been attractive to the few in that business. There, performance per core isn't nearly as important as cores per watt.

    That said, I'm not surprised that AMD is (supposedly) having issues meeting their targeted clock rates. Pre-release info pegged the top desktop processor at 4.2GHz - a record for an x86 processor. The last to get close to that were the last few Pentium 4 HTs at 3.8GHz. AMD's top processor to date has only reached 3.7GHz (the Phenom II X4 980 BE), and that was after years of refining their process. AMD set their sights too high, and is having problems for it.

"Once they go up, who cares where they come down? That's not my department." -- Werner von Braun

Working...