
AMD Next-Gen Kaveri APU Shipments Slip To 2014

MojoKid writes "The story around AMD's upcoming Kaveri continues to evolve, but it's increasingly clear that AMD's 3rd generation APU won't be available for retail purchase this year. If you recall, AMD initially promised that Kaveri would be available during 2013 and even published roadmaps earlier in May that show the chip shipping in the beginning of the fourth quarter. What the company is saying now is that while Kaveri will ship to manufacturers in late 2013, it won't actually hit shelves until 2014. The reason Kaveri was late taping out, according to sources, was that AMD kept the chip back to put some additional polish on its performance. Unlike Piledriver, which we knew would be a minor tweak to the core Bulldozer architecture, Steamroller, Kaveri's core architecture, is the first serious overhaul to that hardware. That means it's AMD's first chance to really fix things. Piledriver delivered improved clock speeds and power consumption, but CPU efficiency barely budged compared to 'Dozer. Steamroller needs to deliver on that front."
This discussion has been archived. No new comments can be posted.


  • I don't like Intel's lack of PCIe lanes in the i5 and lower-range i7.

    Also, where are the next-gen higher-end i7s with QPI?

    • by JDG1980 ( 2438906 ) on Saturday August 31, 2013 @07:19PM (#44727429)

      Steamroller FX chips aren't even on the roadmap at this time. That doesn't mean they will never come, but AMD is clearly prioritizing its APUs over enthusiast-oriented chips right now.

      • Comment removed based on user account deletion
        • by mjwx ( 966435 )

          I also really can't blame them in this matter. As I've been saying for years, and as I've seen time and time again with my own two eyes, CPUs passed "good enough" several releases ago; even the Phenom II-based chips and first-gen Bulldozers really are overkill for all but a handful of users.

          This.

          I have a Phenom II 955 and the only problem I have is that the motherboard doesn't support SSDs properly. The CPU handles new games fine; I've upgraded my graphics card and RAM, bought two SSDs (thinking about buying a third so I can use my 512 GB SSD exclusively for game installs; the old 256 GB went into the laptop) but not the CPU. The only thing at the moment that would force me to upgrade my CPU is the motherboard (I think I can still get an AM3-compatible board though). If I had to buy a new CP

        • by hazydave ( 96747 )

          Presumably the console chips are already in production. AMD's been fabless for quite some time, so there shouldn't be any reason production is an issue -- and looking at console sales over the years, it's not as if they're ramping up for a new iPhone, even given that both new consoles are using AMD chips.

          They ought to have plenty of resources available for new chips if they have a rationale for making them. Low end users rarely clamor for a hot new processor, only a cheaper chip with the same hotness. And

    • If you want lots of PCIe lanes, the Intel Core i7-4820K is coming somewhat soon.

    • I agree: since I'm going to get a dedicated graphics card anyway, I'd much prefer some more focus on the CPU. That being said, an APU could open new possibilities for using OpenCL and the like without bogging down the main CPU cores or competing with graphics for the GPU's precious shader units. Now I'm intrigued - I wonder if this arrangement would actually work...

      • by hazydave ( 96747 )

        They need to scale the GPU part of the APU to deliver a decent level of OpenCL performance. But AMD has one big advantage -- a big problem with OpenCL is latency -- there's overhead in compiling the OpenCL code, overhead in communicating with the GPU, etc. So you typically see CPU use drop in OpenCL applications, and some of the CPU that is used is "wasted" on OpenCL overhead. So, in short, as CPU speed increases, you need a much faster or better-architected GPU to make that extra processing worthwhile.

        Bu
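That break-even point can be made concrete with a trivial sketch (the function name and timings below are assumptions for illustration, not measurements):

```python
def offload_worthwhile(cpu_time, gpu_time, dispatch_overhead):
    # Offloading to the GPU pays off only if the GPU run plus the fixed
    # compile/dispatch/transfer overhead beats just staying on the CPU.
    return gpu_time + dispatch_overhead < cpu_time

# A slow CPU makes offload attractive...
print(offload_worthwhile(cpu_time=10.0, gpu_time=2.0, dispatch_overhead=1.0))  # True
# ...but as the CPU gets faster, the same GPU plus overhead stops being worth it.
print(offload_worthwhile(cpu_time=1.0, gpu_time=0.5, dispatch_overhead=1.0))   # False
```

This is the sense in which a faster CPU raises the bar for the GPU: as `cpu_time` shrinks, the fixed overhead dominates and only a much faster GPU still wins.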

  • An unnecessarily overpowered chip will be delayed, so more of the hardware features no one asked for will be delivered to a market that usually works in symbiosis with the Microsoft inefficiency treadmill but is now being destroyed entirely by that same company.

    • Re:oh noooo (Score:5, Insightful)

      by Ambassador Kosh ( 18352 ) on Saturday August 31, 2013 @07:56PM (#44727637)

      I would love to see what these chips do for engineering simulations. In simulation software there is a lot of back and forth between parts that can be done on a GPU for a huge performance gain and parts that work best on a CPU. The problem is that you mostly end up running them CPU-only, because the overhead of handing off to the GPU and getting a result back is so high. Kaveri is the first chip I know of that can do a zero-copy transfer between the GPU and CPU. It may not be great for all apps, but it should be AMAZING for engineering sims if they are modified to take advantage of it.

      Some of the papers I read found that on simulations large enough for the GPU to make a difference you could get a 50x performance increase; theoretically it should have been around 200x or so, but the overhead of loading and retrieving the data was still very large.
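The gap between 200x and 50x falls out of a small Amdahl-style model (the fractions below are illustrative assumptions, not numbers from the papers):

```python
def effective_speedup(parallel_frac, kernel_speedup, transfer_frac):
    """Amdahl-style model of GPU offload.

    parallel_frac:  fraction of original runtime the GPU accelerates
    kernel_speedup: raw speedup of that portion on the GPU
    transfer_frac:  copy-in/copy-out overhead as a fraction of original runtime
    """
    serial = 1.0 - parallel_frac
    new_time = serial + parallel_frac / kernel_speedup + transfer_frac
    return 1.0 / new_time

# A ~200x kernel can land near ~50x once the transfer overhead is paid...
print(round(effective_speedup(0.999, 200, 0.014)))  # 50
# ...and with zero-copy (transfer_frac -> 0) most of it comes back.
print(round(effective_speedup(0.999, 200, 0.0)))    # 167
```

This is why zero-copy matters: the kernel itself doesn't get faster, but the fixed transfer tax on every handoff disappears.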

      • Looks like the developers you work with haven't discovered asynchronous operation and the principle of overlapping communication and computation.

        • In a simulation there are a lot of steps where you have some linear calculations, and then some other calculations where thousands of things can run in parallel -- but the next linear calculation depends on the parallel result. Async makes no sense for that. The next linear calculation can't run until the previous parallel calculation has finished. Think of some steps as simple linear equations and the next step as a set of codependent equations that need to be converged. Doing that in parallel is vastly faster
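The alternating structure described above can be sketched as a strict dependency chain (the step functions here are hypothetical stand-ins, just to show why the phases cannot be overlapped):

```python
def run_simulation(steps, serial_step, parallel_step, state):
    # Each serial update needs the result of the previous parallel solve,
    # and each parallel solve needs the latest serial update -- a strict
    # chain, so neither phase can run ahead asynchronously.
    for _ in range(steps):
        state = serial_step(state)    # best on the CPU
        state = parallel_step(state)  # offloadable, but the next step blocks on it
    return state

# Toy stand-ins for the two phases:
result = run_simulation(3, lambda s: s + 1, lambda s: s * 2, 0)
print(result)  # ((((0+1)*2)+1)*2+1)*2 = 14
```

Overlapping communication and computation only helps when there is independent work to overlap; here every handoff sits on the critical path, which is why per-handoff latency (and zero-copy) dominates.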

    • Re: (Score:1, Insightful)

      An unnecessarily overpowered chip will be delayed, so more of the hardware features no one asked for will be delivered to a market that usually works in symbiosis with the Microsoft inefficiency treadmill but is now being destroyed entirely by that same company.

      Why don't you wait for it to come out before criticizing it? AMD is in trouble, and it would hurt us all, including the Intel fanboys reading these comments, if AMD were gone.

      I noticed Intel is already raising prices on the newer i7s for no other reason than that they think there is no competition, as everyone bashes them on every tech website.

      Steamroller is actually slower than the older Phenom II per clock cycle and is a crappy chip. That is true. I still own a phenom II but my crappy mobo is showing its ag

  • I like AMD now for budget builds. I loved AMD when they were smacking Intel around. However, this line of chip names has to end soon. There are only so many cool-sounding pieces of heavy construction equipment left.

    The New Phenom VIII x16, based on Suction Excavator technology!
    or
    From the new Skip Loader core comes the AMD Skippy x8!
    or
    Our new Pipelayer core provides all the oomph you need to penetrate difficult projects.

    These just don't have the same ring.

  • by Muerte2 ( 121747 ) on Saturday August 31, 2013 @07:37PM (#44727547) Homepage

    The AMD APUs really are a great melding of price and performance. Sure, Intel has faster CPUs, but they're also more than twice the price! The highest-end APU is $150, and the highest i7 is $340. The i7 will have higher CPU performance, but most games aren't CPU-bound, they're GPU-bound. The AMD APUs have decent GPUs. They won't replace your high-end GPU if you're playing Battlefield at 1080p, but if you're a mid-level gamer they perform great. Plus you can always add a decent GPU for $150 and you're still less than that i7-4770!

    • Comparing most expensive chips isn't fair or useful. Intel's most expensive chips can cost a lot more because AMD doesn't have anything competitive.

      An AMD FX-8350 costs $200. In Intel land, the i5-3570 is the closest price equivalent, at about $215. Intel's lead is so large that even a previous-generation part from their lineup is approximately equal in performance to AMD's current models. Which of the two is faster depends on the benchmark [techreport.com].

      At the $100 end of the market, there are a few really cheap models where A

      • The FX-8350 is $180; I just bought one. That is, the price difference is 20%, but the comparable motherboard was about $40 cheaper with AMD. In the $180-220 range of processors, $75 is nothing to sneeze at.
        • by Osgeld ( 1900440 )

          you will have to spend that extra $75 on cooling and a power supply, though, so you really end up spending about the same for a lower-performing chip

          that's kind of what happened to me. I haven't had an Intel system since the Pentium 1; I went to upgrade / make a new box, so I was getting a new motherboard and video card anyway. Both the AMD and Intel Gigabyte-brand motherboards were 80-ish bucks, and the 8-core AMD was cheaper, but knowing from my quad core I instantly needed a $40 fan because the OEM fans work, but sound like

          • The boxed processor I bought (FX4300, quad @ 3.8GHz, OC'd to 4.1GHz) is using the stock AMD HSF it came with. It usually runs around 42C and gets up to maybe 53-54C under a load that pegs all 4 cores (movie encoding, which I often do).

            At no point is the CPU fan ever even audible. The only thing I can hear, other than some HDD whine, is the PSU fan on my Antec Neo Eco 520.

            And how are you getting that the AMD processor uses 220W more than the Intel? The Bulldozer TDP is 125W (AMD FX-8350 Vishera 4.0GHz).

            • by Osgeld ( 1900440 )

              The AMD-branded Cooler Master that came with mine in the AMD box never had a problem keeping things cool, but it was so damn loud my wife complained about it from across the apartment, two rooms down the hall.

              TDP has little to do with how much power a CPU sucks out of the supply; the 8-core at 4.2GHz under full load will suck almost 300 watts.

              my 4170 quad sucks down 198 watts and my old ATI 6870 sucks 247 under full load, so without thinking about motherboard, fans, optical disks, shit plugged into the por

              • Something's wrong with your wattage figures; it sounds like you're measuring the whole system's power use while stressing the CPU with a burn program, and likewise for the GPU (with something like FurMark).
                The point still stands, but to me the more annoying part is paying the electric bill.

                To min/max the game, what you need to do is build an Intel system with a lowest-end mobo (around $50), an Intel 3770 or 4770 or 4570, a good 300W or 350W PSU, and the stock cooler, maxed out at 16GB. You can't run your 4170 on a low en

                • by Osgeld ( 1900440 )

                  agreed, that's why my most recent build was an i7 3770K. In the middle-to-high end AMD just can't wrangle the numbers, and in the low end no one cares what's in their $299 Walmart machine.

                  • I think an FX6300 is half decent, especially if you want to play with virtualisation with VT-d. You had the worst of the bunch with that FX 4100.

            • The boxed processor I bought (FX4300, quad @ 3.8ghz OC'd to 4.1ghz) is using the stock AMD HSF that it came with. It usually runs around 42C and gets up to maybe 53-54C under a load that pegs all 4 cores (movie encoding, which I often do).

              Here's a dirty little secret about the "temperatures" that are reported by AMD (and probably Intel) processors. They are *not* calibrated against anything. You can't say that the processor runs at 53-54C with any certainty (it can be +/- 10% or more when compared to a
      • According to Tom's Hardware, a Core i3 can fucking beat that 8-core, especially in Skyrim and Crysis.

        FYI, I am typing this on an AMD Phenom II, sadly, as I am not an Intel troll. The FX really is a crappy chip and there is no sense trying to defend it, as the people who play games or do any graphics work use dedicated graphics anyway. The Intel integrated graphics are fine for office work and web browsing in this day and age.

        Here is hoping this next generation one fixes the problems.

        • Are there really that many people playing Skyrim and Crysis? Particularly playing those games at low graphics settings in order not to be GPU-limited? A lot of my clients still have a Core 2 Duo or Phenom II and don't need more power. Even worse, a lot of their employees still have a P4 at home and see no reason to upgrade. Also, considering the Xbox One and the PS4 both have an AMD processor (and not a fast one), it's kind of obvious there is very little use for a Core i7 for most people, even for gaming.

          Perso

        • by Osgeld ( 1900440 )

          the FX is pretty garbage. I have an FX4170; it's noticeably better than my old Phenom II 720 and it can outrun an i5 in daily operations, but it can't hold a candle to the i7.

          Most disappointing AMD chip I have ever bought.

    • by tuppe666 ( 904118 ) on Saturday August 31, 2013 @08:54PM (#44727905)

      The AMD APUs really are a great melding of price vs performance....

      Even though I loathe the 70% gross margin Intel insists on, they have 20-30 people working on Linux drivers http://www.phoronix.com/scan.php?page=news_item&px=MTI5MTI [phoronix.com] vs. 5 from AMD. There is more than one way to measure bang for buck. That said, when I buy a separate graphics card it will be AMD.

      • Wait a second. You know the AMD drivers are for shit, but you're going to willingly choose AMD? The company that pays lip service to open source, but just trickles out information slowly so that the free driver will always suck and never support some of even their old hardware, like R690M?

        Why would you pay for abuse? As long as AMD is being a collection of asshats about video drivers, giving them money is just voting for asshattery.

        All my CPUs have been AMD for ages now but every time I try an ATI GPU I win

      • by Mashdar ( 876825 )
        This is why I just bought an nVidia video card. :( I would prefer AMD products if they had competitive drivers and performance... Plus, there is a hardware reason for preferring nVidia on Linux if you use Wine to play Windows games: you can disable the shaders. This is impossible on ATI cards. :( I've liked AMD as a name since my K6-2, but my game machine is still Intel and nVidia. My HTPC and laptop are a different story.
    • by Osgeld ( 1900440 )

      I went to upgrade my quad-core FX last year; in order to get the 8-core I needed a new fan (because the ones AMD ships are a joke ... they keep it cool, running at 6k RPM and loud as a jet) and a new power supply.

      By the time I bought those two things I was at the price of a 3770K and still didn't get close in performance .. bought Intel.

      Until AMD can get their power-to-performance ratio in line they are just not worth looking at in the mid to upper end, and no one cares about the low end, go get yourself a $99 off l

    • by aliquis ( 678370 )

      Throw in a motherboard and RAM, and possibly a graphics card, and picking that CPU vs. the Intel one makes less of a difference.

      For me personally, I need to get a new HDD and PSU, want to get a case, and need a new monitor too.

      Reason to pick AMD? None.

    • by yenic ( 2679649 )

      The AMD APUs really are a great melding of price and performance. Sure, Intel has faster CPUs, but they're also more than twice the price! The highest-end APU is $150, and the highest i7 is $340. The i7 will have higher CPU performance, but most games aren't CPU-bound, they're GPU-bound. The AMD APUs have decent GPUs. They won't replace your high-end GPU if you're playing Battlefield at 1080p, but if you're a mid-level gamer they perform great. Plus you can always add a decent GPU for $150 and you're still less than that i7-4770!

      This is why my next machine will be an AMD APU. While I have a standalone card now, if it dies I'd likely just move to using the APU alone. I don't think it'd present a major problem, especially whenever it is I upgrade. They're only getting better.

    • 1) AMD's highest-end APU is not $150, it's $330. With tiers all the way down.
      2) If GPU-bound, you can get an i5 for $100 less, or an i3 for even less, and put the difference towards a GPU.
      3) Games are also mostly limited to 1 core, more or less, making about 7 of AMD's cores more or less useless in this regard.

      AMD are not great price vs. performance. AMD ARE good on price at the low end. If you are building a basic machine on the cheap, AMD is your chip right now (or a business server). However if you wish to use it for any

  • This has been known for two or three months, and even then it was not a big surprise, as the original target was "H2 2013" with no commitment.

  • How about a decent Linux driver for once.
  • by aristotle-dude ( 626586 ) on Sunday September 01, 2013 @01:29AM (#44729135)
    • Furthermore, "apu" means help as a standalone word, or helping/auxiliary as a prefix. So "apuprosessori" would mean a coprocessor.

      Also, to nitpick a little, "kaveri" is more like a buddy, or even a (random) guy, as opposed to a close/true friend.

      • Indeed. "Ystävä" can be used for a close friend.

        But anyway, "Kaveri APU chip" quite literally says "helper buddy chip" -- the name of this AMD product sounds incredibly cute to the Finnish ear. Even though Kaveri is actually a river.

        As a sidenote, Roccat [roccat.org] is a German company that makes PC peripherals carrying Finnish names.
