AMD A10 Kaveri APU Details Emerge, Combining Steamroller and Graphics Core Next
MojoKid writes "There's a great deal riding on the launch of AMD's next-generation Kaveri APU. The new chip will be the first processor from AMD to incorporate significant architectural changes to the Bulldozer core AMD launched two years ago and the first chip to use a graphics core derived from AMD's GCN (Graphics Core Next) architecture. A strong Kaveri launch could give AMD back some momentum in the enthusiast business. Details are emerging that point to a Kaveri APU that's coming in hot — possibly a little hotter than some of us anticipated. Kaveri's Steamroller CPU core separates some of the core functions that Bulldozer unified and should substantially improve the chip's front-end execution. Unlike Piledriver, which could only decode four instructions per module per cycle (and topped out at eight instructions for a quad-core APU), Steamroller can decode four instructions per core, or 16 instructions for a quad-core APU. The A10-7850K will offer a 512-core GPU while the A10-7700K will be a 384-core part. Again, GPU clock speeds have come down, from 844MHz on the A10-6800K to 720MHz on the new A10-7850K, but the drop should be offset by the gains from moving to AMD's GCN architecture."
Phenom II instead of Bulldozer (Score:1)
Re: (Score:2, Insightful)
I wish I had mod points.
Proud Phenom II user here. Awesome chip and I still use it on the gaming platform ... from 2010?!
I like the 6 core and only 10% slower than an icore 7! ... 1st generation icore 7 from 2010. :-(
It is obsolete and I see no purpose in replacing my system with another one like it. It is like making a multicore 486 at 3GHz. It would not hold a candle to anything made in the last 15 years.
AMD needs to do an Intel: dump Netburst 2.0. Intel failed with the Pentium 4, but after 3 years the Core
Re: (Score:1)
They are called "Core i7" not "iCore7".
Re:Phenom II instead of Bulldozer (Score:5, Funny)
Re: (Score:2)
Re: (Score:3, Insightful)
I run a Core 2 Duo on an 8-year-old motherboard, with a GTX 460 (it was originally paired with an 8800GT, which I pensioned off), and I will guarantee you my PC outperforms most PCs sold today, gaming-wise.
The Core2Duo was a good chip for its time, but current Intels outperform it by a wide margin. I'm pretty sure that even current AMDs beat it, despite their Bulldozer mis-design. Likewise, the GTX460 will be beaten by modern cards.
If you are talking about Intel PCs that use only integrated graphics, your claim might be true. But gamers usually understand that they need a discrete GPU ;-)
i dont believe you (Score:2)
I had a Core 2 E5300, and I replaced it with a new Q6600 from eBay, dirt cheap. Yes, more cores and a bit hotter, but more cores is more flops.
I'll be looking for an even faster Q9550, as it's close to i7s but way cheaper.
Yes, we can buy full PCs for $300+ that give you the latest i7s running way faster.
But reusing old Qxxxs on older motherboards is close enough when it costs less than 3 pizzas.
Re: (Score:1)
But like the Phenom II, it is obsolete compared to today's CPUs.
Actually the Phenom II is a step up, and an Athlon 64 FX would be the AMD equivalent of your setup. Fine for light work, but I would not want to purchase one today if I needed a new computer, unless I were broke, and then I would buy it used at the Salvation Army and just wipe it.
Only games, compiling, and HD video editing require anything newer than a 2006-era machine, which is why XP just won't die already! First it was multitasking you need a $3000 machine
What's the GPU for? (Score:4, Interesting)
Re: (Score:1, Flamebait)
Catch up with Intel?
AMD is creaming Intel in this area. Intel's graphics SUCK. They are as fast as 2006-era graphics and make game developers pull their hair out and scream more than web developers do over IE 6, because they need so many workarounds for such poor performance.
The new 5000 series is only 5 years obsolete from what I hear. But Intel likes it this way, as they want people to think it is 1995 all over again and buy beefy, overpriced CPUs for better fps instead of upgrading a video card.
AMD has made some
Re:What's the GPU for? (Score:5, Informative)
Iris Pro is on par with the 650M for gaming, and the 650M isn't even 2 years old.
The Iris Pro is on a $500 part, only made possible by bolting expensive eDRAM onto a processor that would otherwise cost $300.
The mind boggles at how people think that this is boasting material.
Re:What's the GPU for? (Score:4, Informative)
> AMD is creaming Intel in this area. Intel's graphics SUCK. They are as fast as 2006 era graphics
Nope.
http://www.notebookcheck.net/Intel-Iris-Pro-Graphics-5200.90965.0.html
"Thanks to the fast eDRAM memory, the average performance of the Iris Pro is only about 15 percent behind the dedicated mid-range cards GeForce GT 650M and GT 750M. This makes the Iris Pro even faster than AMDs Radeon HD 8650G and the fastest integrated GPU of 2013. Therefore, many current games can be played fluently in medium or high detail settings and a resolution of 1366 x 768 pixels. In some older or less demanding titels like Diablo III or Fifa 2013, even higher resolutions or quality settings may be possible."
Re: (Score:2)
Re: (Score:2)
Iris Pro is something I have never seen in any mainstream laptop, and once it comes out in, say, a Dell or Samsung, it's gonna cost an arm and a leg, because Intel's high-end GPUs come with high-end CPU combos (like the i7). The great thing about AMD's A10 + HD 8650G combo is that you can buy a $600 HP or Lenovo with it today to get about the same level of gaming performance. If Kaveri improves upon the A10 without increasing the price or power consumption, it will be a winner as far as budget gaming on laptop
Re: (Score:2)
They're up to mid-range 2009 level.
The Iris 5200 is about the same performance as a discrete GT 240, which is what the GP said they have.
Re: (Score:2)
Re: (Score:2)
For everyone that doesn't play games.
If you don't play games, why do you care about 3D performance?
Sure, you might do CAD or similar 3D work, but then you can afford a real GPU.
Re: (Score:2)
If you don't play games, why do you care about 3D performance?
Photoshop [tomshardware.com], for example? Or anything using OpenCL?
Re: (Score:2)
Re: (Score:3)
Laptops? While I'd love to see a nice, low cost CPU/GPU combo that can hang with my (rather meager) Athlon X2 6000+ and GT 240, I'm still running pretty low end gear. If this is targeted at enthusiasts they're just going to replace it with a card...
Basically it's a CPU + GPU bundle that only takes up the size of the CPU. It's not meant for the hardcore gamers, just pragmatists who are looking for value and simplicity. Like every company, AMD has a product lineup -- different products are marketed in different ways (although AMD is not always as clear about the matter as it could be). For the price, these chips are usually pretty good values.
Re: (Score:2)
Re: (Score:2)
You are all over the place. You wonder what the GPU is for, then state that you actually will love this very product because it's a low-cost CPU/GPU combo, but then specifically name your "rather meager" rig that is even slower than the last generation of APUs in both CPU and GPU performance (i.e., your rig is the thing that cannot hang), and finish the whole thing off hypothesizing that AMD might in fact be targeting "enthusiasts."
Are you some sort of discordian
Re: (Score:2)
I believe if you have a discrete GPU based on the same architecture (GCN in this case), you can use both simultaneously for a small speed boost, or switch between them depending on load (so your 250W video card doesn't need to spin its fans up just to render the desktop).
There's also some consideration for using the integrated GPU for lower-latency GPGPU stuff while using the discrete GPU for rendering. I don't think that's actually used in anything yet, but I'm not actually using an APU in any of my machines.
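For what it's worth, here is a minimal sketch of how that split can be set up from an OpenCL host program: enumerate the GPUs and prefer the one that reports host-unified memory (normally the integrated/APU GPU) for latency-sensitive compute, leaving the discrete card to render. Illustrative only; it assumes a single OpenCL platform exposing both GPUs, and error handling is omitted.

#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platform;
    clGetPlatformIDs(1, &platform, NULL);

    cl_device_id devices[8];
    cl_uint num_devices = 0;
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &num_devices);

    cl_device_id integrated = NULL;
    for (cl_uint i = 0; i < num_devices; ++i) {
        cl_bool unified = CL_FALSE;
        char name[256] = {0};
        clGetDeviceInfo(devices[i], CL_DEVICE_HOST_UNIFIED_MEMORY,
                        sizeof(unified), &unified, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("GPU %u: %s (host-unified memory: %s)\n",
               i, name, unified ? "yes" : "no");
        if (unified && integrated == NULL)
            integrated = devices[i];   /* candidate for low-latency GPGPU work */
    }
    /* A context and queue would then be created on 'integrated' for compute,
       while the discrete GPU keeps doing the rendering. */
    return 0;
}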
Laptops (Score:2)
Exactly. That's why the big deal with Intel's Haswell was basically "consumes a lot less power", the rest was incremental and a few added instructions for the future. AMD seems to have the same tech analysts as Netcraft crying "The Desktop is dying, the desktop is dying!"
If you plan to own anything that is a desktop, then anything like this from AMD or Intel, which can be replaced with something that is TWICE as fast using the cheapest $50 dedicated video card, makes the advances absolutely meaningless.
In fa
Re: (Score:2)
Re: (Score:2)
I kind of wonder about this too. No matter how low-end your desktop system is, as long as you have a modern CPU, even in say Celeron range, you can always pop into it a 100 dollar ATI video card (check Tom's hardware's latest recommendations) and it should run circles around those AMD APU's with integrated graphics. Now AMD is supposedly shooting at the market for these $100 video cards. That is, they seem to imply that this APU will make cheap video cards unnecessary. It will certainly be interesting to lo
Re: GCN goodies (Score:2)
I think that Kaveri could become a very compelling choice for HTPC and even gaming. You could easily build an entry-level Steam machine with this, and because there is no discrete GPU, you could do a really small form factor with good airflow.
An audio server that uses TrueAudio is another intriguing option.
There are even fanless cabinets that will take up to 95-watt CPUs, like this one:
http://www.fanlesstech.com/2013/11/asktech-nt-zeno3.html [fanlesstech.com]
I also have a noob question. Can Kaveri or even the existing A10 chips be used in CrossFire mode? Meaning integrated graphics crossfired with a discrete GPU. Does it even make sense to do something like this? For example, it could be a good upgrade path.
Re: (Score:2)
"I also have a noob question. Can kaveri or even the existing a10 chips be used in crossfire mode? Meaning integrated graphics crossfired with a discrete GPU. Does it even make sense to do something like this? For example, it could be a good upgrade path."
Yes and it should even automatically set it up for you.
Re: GCN goodies (Score:2)
Apologies for the double post. The submit button didn't respond for several seconds so I clicked it again. In fact, I think I did it thrice.
OpenCL 2.0 support? (Score:1)
Will this new AMD architecture support OpenCL 2.0?
Re: (Score:2)
the problem is... (Score:1)
"enthusiasts" don't give a rat's tail about on-board graphics.. so strip that shit out and give us an unlocked processor for less coin. tyvm.
Re: (Score:2)
so strip that shit out
It will be much harder to have the GPU cache coherent with the CPU if that "shit" is stripped out. It is this advance, far more than anything else, which makes this architecture hold promise. There's now a crazy amount of arithmetic performance available (far, far more than even the most expensive Intel chips) but without the usual high-latency (and pain in the ass) trip to and from a graphics card.
That "shit" will make the GPU suitable for a substantially broader range of problems s
Re: (Score:2)
To be fair, most of the PC market is budget. We the enthusiasts are the minority. This thing will probably play Starcraft 2, Crysis 3, Battlefield Whatever, BioShock Infinite Squared, etc... well enough for someone who doesn't mind 35 fps on an HD monitor. If you want 90 fps on a 4K monitor, you'll have to move up to Core i5 + mid level or better discrete graphics card.
Re: (Score:2)
I know I am in the minority, but it's cool to have mid-level gaming capability on portables without having to pay an arm and a leg for it. On the desktop... I agree. The market for people who insist on gaming on a budget PC but refuse to put in at least a $100-120 video card is kind of small.
Dual graphics (Score:1)
Kaveri looks good for a budget gaming PC, but I think they are being a bit optimistic about the "dual graphics" feature. This is where you pair the iGPU with a dGPU to get better performance. AMD has never been able to get this feature to work properly. All it does is create "runt" frames, which make the FPS look higher without giving any visual improvement.
http://www.tomshardware.com/reviews/dual-graphics-crossfire-benchmark,3583.html [tomshardware.com]
Kaveri is a poor version of the Xbox One chip (Score:2, Insightful)
Kaveri should be properly compared to the chips in the PS4 and Xbone. As such, it can be said that Kaveri is significantly poorer than either.
-Kaveri is shader (graphics) weak compared to the Xbone, which itself is VERY weak compared to the PS4.
-Kaveri should be roughly CPU equivalent (multi-threaded) to the CPU power of either console
-Kaveri is memory bandwidth weak compared to the Xbone, which itself is VERY bandwidth weak compared to the PS4
-Kaveri is a generation ahead of the Xbone in HSA/hUMA concepts,
Kaveri is much better as PC chip (Score:2)
- Single-thread performance matters much more than multi-thread performance, and Kaveri has almost twice the single-thread performance of the Xbone and PS4 chips.
- Memory bandwidth is expensive. You either need a wide and expensive bus, or expensive low-capacity graphics DRAM which needs soldering and means you are limited to 4 GiB of memory (with the highest-capacity GDDR chips out there), with zero possibility of upgrading it later, or both (and MAYBE get 8 GiB of soldered memory). Though there have been rumours that Kaveri might support GDDR5, for configurations with only 4 GiB of soldered memory.
Re: (Score:1)
- Memory bandwidth is expensive. You either need a wide and expensive bus, or expensive low-capacity graphics DRAM which needs soldering and means you are limited to 4 GiB of memory (with the highest-capacity GDDR chips out there), with zero possibility of upgrading it later, or both (and MAYBE get 8 GiB of soldered memory). Though there have been rumours that Kaveri might support GDDR5, for configurations with only 4 GiB of soldered memory.
In general (not necessarily relating to Kaveri as-is), 8 GiB of fast, soldered memory as in the PS4 would make sense for a PC.
The current APUs are seriously bandwidth starved. In reviews where a Phenom II with a discrete graphics card is pitted against an APU with similar clock speed and number of graphics cores, the Phenom II usually wins (except benchmarks that don't use the GPU much). Overclocking the memory helps the APU some, which is further evidence.
With PS4 style memory that problem could be solved,
Re: (Score:2)
Because my computer is a Phenom II, this might be the first time I add RAM to an existing PC.
Re: (Score:2)
Cache Money! (Score:2)
I could be wrong, but it had little to do with AMD and more to do with MS specifications.
The only difference between the graphics cores on the Xbox One and PS4 is that the PS4 uses newer DDR5 memory, while the Xbox uses DDR3. The Xbox tried to compensate for the slower memory by adding additional cache on die; however, this takes up physical real estate, which forced them to use a couple fewer cores (in exchange for faster memory handling). To simply say one is faster/better than the other is a bit misleading.
The reaso
Re: (Score:2)
OK, I admit I didn't read too carefully. Thought you were just comparing the Xbox and PS4 situation.
However it is likely for the exact same reason. When is DDR5 coming out? Can you actually buy some? No, you cannot. Why design and release something you cannot use?
Reminds me of the funny motherboard with two slots, one for one kind of DDR vs. another. I have no doubt they have another version all ready for "release" once DDR5 becomes viable and commonplace.
Article is crap and misses biggest feature! (Score:5, Interesting)
This is the chip that unites the CPU and GPU into one programming model with unified memory addressing. Heterogeneous System Architecture (HSA) and heterogeneous Uniform Memory Access (hUMA) are the nice buzzword acronyms that AMD came up with, but it basically removes the latency from accessing GPU resources and makes memory sharing between the CPU cores and GPU cores copy-free. You can now dispatch instructions to the GPU cores almost as easily and as quickly as you do to the basic ALU/FPU/SSE units of the CPU. (A rough code sketch of the copy-free part follows below the links.)
Will software be written to take advantage of this though?
Will Intel eventually support it on their stuff?
Ars article on the new architecture. [arstechnica.com]
Anandtech article on the Kaveri release. [anandtech.com]
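As a rough illustration of the copy-free part, here is how it looks through OpenCL 2.0 shared virtual memory, which is one standard way this kind of unified addressing gets exposed (this is not AMD's own HSA runtime). Fragment only: it assumes a context, queue and kernel set up as in the discrete-GPU sketch earlier in the thread, a kernel that takes a single __global float* argument, and a device that reports fine-grained SVM support.

/* CPU and GPU touch the very same allocation; no clEnqueueWriteBuffer or
   clEnqueueReadBuffer, and no staging copy. */
size_t n = 1 << 20;
float *data = clSVMAlloc(ctx, CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
                         n * sizeof(float), 0);

for (size_t i = 0; i < n; ++i)        /* CPU writes the buffer directly */
    data[i] = (float)i;

clSetKernelArgSVMPointer(kernel, 0, data);   /* GPU sees the same pointer */
size_t global = n;
clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global, NULL, 0, NULL, NULL);
clFinish(queue);

printf("data[10] = %f\n", data[10]);  /* CPU reads the result in place */
clSVMFree(ctx, data);

Whether shipping drivers expose exactly this on Kaveri is a separate question, which is really what the two questions above are asking.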
Re: (Score:2)
1. The 386 had a discrete FPU, called the 387
2. The 486 integrated the FPU, and all subsequ
Re: (Score:2)
The x86 external FPU started with the Intel 8087 [wikipedia.org] which was launched during 1980. The 8087 was the FPU for the 8086, the first generation of x86. The 80286 followed the same logic using an external FPU, 80287.
The 386 was the first to integrate the FPU onto the CPU die in the DX line of 386's. The 386SX was a 386 without the FPU which depending on the computer/motherboard could be upgraded with a 387 coprocessor.
So:
386DX = 386+FPU
386SX = 386, no FPU
The 486 also followed the same logic offering a DX or SX vers
Re:Article is crap and misses biggest feature! (Score:4, Insightful)
Your history is rather off. The 386 never had an integrated FPU. 386 DX had a 32-bit bus. The 386 SX had a 16-bit bus for cost saving measures. The 486 DX was the one with the integrated FPU, and that was the first to include the FPU by default. The 486 SX had the FPU fused off.
Re: (Score:2)
Ah shit, you're right. I forgot that the 386 didn't have an FPU and was confused by the 486SX/DX nomenclature.
Thinking back, my father had two 386s at work. One, a 386DX, was for CAD, and now that I think of it, it had a Cyrix "Fast Math" 387 FPU. Interesting thing was it had a slot, which was two 8-bit ISA slots end-to-end, that was a 32-bit expansion slot. Wasn't populated, but was interesting. He also had a 386SX which was used for word processing and accounting/payroll. Later on we had two 486s.
Re: (Score:2)
Actually, it should be the driver's job to support this.
When an app asks to copy something to the GPU, it asks the GPU driver, which can use that zero-copy/remap magic and tell the app it's done.
So yes, it should be supported out of the box if the drivers support it right.
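For what it's worth, one hook that already exists in OpenCL for this kind of driver-side zero copy is the CL_MEM_USE_HOST_PTR flag, which asks the driver to use (or remap) the application's own allocation rather than copy it; whether a copy is really avoided is entirely up to the driver, which is the parent's point. The fragment below is a sketch only, with the context and queue assumed and error checks dropped; the page-sized alignment is just a conventional precaution, not something the spec guarantees will trigger zero copy.

#include <stdlib.h>   /* aligned_alloc (C11) */

/* e.g. a 4096x4096 RGBA texture the app already holds in its own memory */
size_t bytes = (size_t)4096 * 4096 * 4;
unsigned char *pixels = aligned_alloc(4096, bytes);
/* ... fill 'pixels' ... */

/* Ask the driver to wrap the existing allocation instead of copying it. */
cl_mem img = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_USE_HOST_PTR,
                            bytes, pixels, NULL);

/* Update through map/unmap rather than explicit writes; on an APU with a
   cooperating driver this can be a page-table remap rather than a memcpy. */
void *mapped = clEnqueueMapBuffer(queue, img, CL_TRUE, CL_MAP_WRITE,
                                  0, bytes, 0, NULL, NULL, NULL);
/* ... modify 'mapped' in place ... */
clEnqueueUnmapMemObject(queue, img, mapped, 0, NULL, NULL);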
Re: (Score:1)
Games will almost certainly make use of uniform memory for loading textures faster. That feature will make it much easier to implement "mega-textures".
still crushing Intel (Score:1)
Now let's look at Intel's solution for a basic gaming or HD v
Re: (Score:2)
Power consumption.
I really like AMD (in fact, all my computers since 1999 -- except for an old iMac -- have been AMD-based), but I really, really wish I could get a (socketed, not embedded) AMD APU with less than 65W TDP (ideally, it should be something like 10-30W).
I hate that when I ask people in forums "what's the lowest power consumption solution for MythTV with commercial detection and/or MP4 reencoding?" the answer is "buy Intel."
Re: (Score:1)
Secondly, if it's not a laptop not many people really care. DVRs sort of make sense because of the actual heat though.
Re: (Score:2)
But wait, there's more! Their 6-core non-APU chip blows away an i3 and some i5 processors while costing almost half as much.
Wow! AMD only need six cores to beat an Intel dual-core! They're totally crushing Intel, baby!
Back in the real world, if what you're saying is true, AMD wouldn't be forced to sell these chips at bargain-basement prices. I'm thinking of using one to replace my old Athlon X2 system, but only because it's cheap.
Re: (Score:1)
Plus, the i5 is a quad core. The FX6300 gets a passmark rating of around 6400. The i5-3450 gets around 6450 so they're basically the same speed.
The FX is $119 and the i5 is $190.
The FX has a max TDP
Re: (Score:2)
If AMD can get X performance for Y price and Intel can't beat them, that's who everyone will buy.
Except they don't, because AMD can't compete with Intel at anything other than the low end. Which they've traditionally done by selling big chips at low prices where the margins can't be good.
Plus, the i5 is a quad core.
You were gloating about a six-core AMD beating the i3, which is a dual core with hyperthreading. That you consider that an achievement shows how far behind AMD are right now.
The FX6300 gets a passmark rating of around 6400. The i5-3450 gets around 6450 so they're basically the same speed.
In a purely synthetic benchmark.
The FX has a max TDP of 95W and the i5 is 77W and their minimum power states are almost identical.
And my i7 has a 75W TDP.
AMD has a price per performance passmark ratio of 55.63 and Intel's is 12.36. 55.63 beats every Intel chip in existence as well.
So why don't AMD triple their price? They'd still beat Intel on price/performance for ever
Re: (Score:1)
It hurts me to see AMD like this.
I am typing this on a Phenom II now. Not a bad chip at the time, several years ago: it could hold a candle to the i5s and i7s with just a 10% performance deficit, but was less than half the price and had virtualization support for VMware and 6 cores. I run VMware and several apps at once, so it was a reasonable compromise for me and budget friendly.
But today I would not buy an AMD chip :-(
I would buy a GPU chip which is about it as those are very competitive. I wish AMD w
Re: (Score:2)
Eventually, maybe 5, 10, or 15 years out, I expect Intel's competition to be high end ARM chips. But for now, AMD is it. If we the consumers let AMD fold, we had better be satisfied with buying used desktop processors because I fully expect new ones to double in price per performance compared to what they are today, just because nothing will be available as an alternative.
Re: (Score:1)
Dude look at the 200,000,000 XP installations still running!
x86 is here to stay forever. Windows RT failed and it is a competitive cycle where ARM can't compete.
These XP users also show there is no need to ever upgrade anymore. They work. Why change to something that does the same thing as what they already have?!
Chips no longer double in performance as we hit limits in physics :-(
AMD is loud and needs a big fan. An i3 is just as competitive, sadly, unless you really hit every darn core with an app. My Phenom II
Re: (Score:2)
I don't expect Android to dominate consumer operating systems next year, or five years from now. But I can readily believe that 15 years from now Microsoft consumer operating systems will be in a decline, and so w
Re: (Score:1)
Not worth it for a 20% boost.
I upgraded to an ATI 7850 and noticed a significant boost.
My goal was a 5-year plan, so 2015 is when I will upgrade. I got an SSD, an ATI 7850, and 16 gigs of higher-speed RAM. Yes, my processor is 2.6GHz, but that is the only part left, and the T edition is a little bit faster.
In the old days I would upgrade at a 100% performance increase. Today it is I/O. I am sure Tomb Raider would run fine under medium to high on it if I disable an effect or two. No biggie for the extra cost.
Re: (Score:2)
You're also forgetting that processors are practically a non-issue these days. If you had an i7 system with a 1TB drive and an Athlon X2 AM3 Regor 260 system with an SSD, the AMD system would feel faster doing just about anything realistic like web browsing and opening software. Intel fanboys are just buying high performance chips t
Core questions (Score:2)
Can some of these cores work on game AI whilst others handle graphics, or do they all have to work on one task at once? Could they do game AI at all? And can programmers program for GPU cores cross-platform, or is it still one brand at a time?
APU nifty (Score:2)
The way this
Bottom line... (Score:2)
How does it compare in per-core performance to Intel chips? Everything else is just meaningless techno-babble.
Re: (Score:2)
Their last set of GPU names were pacific islands.
Re: (Score:3, Interesting)
If "Kaveri" is what you are referring to, it also happens to be the name of a river in South India.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re:Why all the polish namings? (Score:5, Informative)
"Kaveri" in finnish means "pal" as in a friend.
Which is actually pretty fitting for the chip that has CPU and GPU on one die.
Re: (Score:3)
For several years now, AMD has been using island names [wikipedia.org] for the internal GPU names (with the type/location of the islands used to group families) and river names [wikipedia.org] for the CPUs/APUs. Kaveri [wikipedia.org] is a river in India... that just by luck is also a word in Polish and in Scandinavia.
Re:Once burnt, twice shy (Score:4, Insightful)
Dunno about you, but I ain't gonna be excited by AMD's offerings anymore, after what they dished out to us on their Bulldozer rollout.
For more than a year before Bulldozer came into being, they told us that Bulldozer was gonna be revolutionary - they hyped Bulldozer so much that many forums were filled with people who just couldn't wait to get their hands on it.
Did you think the same thing about Intel after the Pentium4 too?
Re: (Score:2)
Did you think the same thing about Intel after the Pentium4 too?
This is starting to be ancient history, but as I remember it, Intel was pushing the PIII hard right up until the launch of the PIV; they were never in the "please hold out a little longer, please don't buy an AMD, our PIV is going to be twice as fast and give free blowjobs" mode. Of course they did keep pushing it after everyone knew it was a dud; after all, that's what they had to sell, much like AMD now. It's pre-launch when all you get are "leaks" that are really plants, PR statements and astroturf/fanboy hype b
Re:Once burnt, twice shy (Score:5, Insightful)
So did you stop believing in Intel after their bugged Pentiums rolled off the line? ARM only from now on, until they screw something up?
Just because a company has a product that flopped doesn't mean they won't ever produce anything good again. While it's fine to not be excited until it's actually hit shelves, writing them off for the end of time seems a bit premature.
Re: (Score:2)
So did you stop believing in Intel after their bugged Pentiums rolled off the line?
The Pentium had a bug that was fixed. Bulldozer was just a horribly flawed design that didn't come close to what it was supposed to be.
Re: (Score:3)
AC wrote : They love to gag on Polish sausages.
You wrote : Dunno about you, but I ain't gonna be excited by AMD's offerings anymore, and some other bumpf.
How exactly were you replying to the AC?
I know policy is to stick your comment as high as possible, if possible... but you replied to someone who said "They love to gag on Polish sausages". You must have known that you weren't actually replying to them.