AMD Launches New Mobile APU Lineup, Kabini Gets Tested
An anonymous reader writes "While everyone was glued to the Xbox One announcement, Nvidia GeForce GTX 780 launch, and Intel's pre-Haswell frenzy, it seems that AMD's launch was overlooked. On Wednesday, AMD launched its latest line of mobile APUs, codenamed Temash, Kabini, and Richland. Temash is targeted towards smaller touchscreen-based devices such as tablets and the various Windows 8 hybrid devices, and comes in dual-core A4 and A6 flavors. Kabini chips are intended for the low-end notebook market, and come in quad-core A4 and A6 models along with a dual-core E2. Richland includes quad-core A8 and A10 models, and is meant for higher-end notebooks — MSI is already on board for the A10-5750M in their GX series of gaming notebooks. All three new APUs feature AMD HD 8000-series graphics. Tom's Hardware got a prototype notebook featuring the new quad-core A4-5000 with Radeon HD 8300 graphics, and benchmarked it versus a Pentium B960-based Acer Aspire V3 and a Core-i3-based HP Pavilion Sleekbook 15. While Kabini proves more efficient and features more powerful graphics than the Pentium, it comes up short in CPU-heavy tasks. What's more, the Core-i3 matches the A4-5000 in power efficiency while its HD 4000 graphics completely outpace the APU."
Re: (Score:2)
No, rather the AMD launch *was* the XBox One announcement
Re:Heh (Score:4, Interesting)
You guys are ridiculous.
What AMD has here is a successor to Brazos, and the primary competitor is Atom. Which it runs rings around, might I add. It also equals or beats an Ivy Bridge based Pentium in all measures except single threaded performance, partially due to Kabini not having a turbo function.
Say what you will, but AMD has a clear winner in the low cost ultra mobile market at the moment.
Re: (Score:3)
What AMD has here is a successor to Brazos, and the primary competitor is Atom.
So AMD says, but Tom's Hardware disagrees:
So what about the Core i3-3217U, a 17 W processor? Surely that one is a more virile competitor, and not much more expensive than the Pentium. Core i3's on-die HD Graphics 4000 engine with its 16 EUs stomps all over the A4's 128 ALUs, despite the backing of AMD's capable Graphics Core Next architecture. Now, AMD claims that Kabini isn't meant to go up against Core i3. But we found notebooks with this exact CPU selling for as little as $360 on Newegg. It may turn out that the free market doesn't let AMD choose which Intel-based platforms its Kabini-based APUs contend with.
The cheapest laptop I could find on Newegg was $250, so there's a good $100 range where Atoms, Celerons, Pentiums, and AMD are battling it out; that's not much, really.
It also equals or beats an Ivy Bridge based Pentium in all measures except single threaded performance
Which is likely the part that matters in these laptops. I mean, if you're trying to use these for serious number crunching, you are using the wrong tool for the job. And it's not just that the single-threaded performance is poor; it's horrible. AnandTech compared it to an i7-3517U [anandtech.com], which is totally unfair
Re: (Score:2)
So AMD says, but Tom's Hardware disagrees:
They disagree, but they are outright lying about the power envelope comparison, even though the actual numbers are right there in the article. That's because they're lying liars who fellate intel regularly.
It also equals or beats an Ivy Bridge based Pentium in all measures except single threaded performance
Which is likely the part that matters in these laptops.
Uh, what? That makes less than no sense. These are budget laptops.
I mean if you're trying to use these for serious number crunching you are using the wrong tool for the job.
OK, so you said it matters, then you said it's the wrong tool for the job, which means it doesn't matter. Make up your fucking mind, if you have one and you're not just an echo of Tom's which jumped the shark (and on intel's dick) ages ago.
Re: (Score:2, Insightful)
"But we found notebooks with this exact CPU selling for as little as $360 on Newegg."
They found one notebook, which is a $650 model, on a temporary sale for $360. The cheapest i3 notebook with the CPU they are comparing, not on sale, is $525, and it's a shitty one.
The cheapest B960 laptop is also $400, which puts it quite a bit above the $300-$350 Atom models that this will be competing with. Maybe they should have compared it with the standard $300 laptop and
Re: (Score:2)
You misread his post. He was asserting that single-threaded performance is what matters on these laptops, because no one is going to use them for big number-crunching tasks that can actually use multiple cores effectively. He's correct. The Pentium beats the Brazos at single-threaded performance, and is therefore a better chip for this kind of task.
Re:Heh (Score:4, Insightful)
You misread his post.
Your failure to understand the argument does not constitute a failure on my part to comprehend his comment.
He was asserting that single threaded performance is what matters on these laptops, because no one is going to use them for big number crunching tasks that can actually use multiple cores effectively. He's correct.
No, he's completely wrong. What do you imagine typical users need single-thread performance for? Most users need it only for games, and poorly-written ones at that. PC games that require single-thread processing power are now vanishingly rare, thanks at least in part to the influence of the tri-core Xbox 360 and the overwhelming tide of console-to-PC ports. Everything else the user typically does that requires much CPU is already multithreaded. Most things the user does require virtually no CPU.
Running a GUI, editing files: I literally did these things on machines with single-digit MHz clock speeds, and when they were less responsive than today's applications, it was only because of disk access times. And these tasks are multithreaded today, because they are built on multithreaded libraries. Take a look at the programs running on a typical Windows machine today; virtually all of them have a crapload of threads. Windows makes thread creation cheap in the way that Unix makes process creation cheap, not least because Windows is heavily multithreaded itself. And we are talking about what the majority of users will do with this hardware, which means running Windows, playing the occasional game, and watching cat videos on YouTube.
Aside from games, the only times that most users use much CPU is during video encoding or possibly decoding, both of which are aggressively multithreaded and often even GPU accelerated, or while using graphics or video editing applications which are also typically heavily multithreaded, and have been for years. In short, practically no typical user actually needs serious single thread performance any more — what they need is good multithreaded performance, so that their computer can do a million pointless things behind the scenes without causing their cat videos to skip.
The Pentium beats the Brazos at single threaded performance, therefore, is a better chip for this kind of task.
The Pentium is only better than the new AMD cores we're talking about at the kind of task that people who buy APUs don't do. Thus, while your statement is factual, it is also irrelevant.
Re: (Score:3)
I would go one step further and say that the majority of users need neither better single-threaded performance nor better multi-threaded performance. They just need newer hardware that
Re: (Score:2)
And good app responsiveness typically requires only two cores, give or take: one to offload the minor background tasks so that they don't get backed up too far behind the foreground processing, and one to handle the foreground app's processing needs. Beyond two cores, the benefits start to fall off pretty rapidly.
Uh, no. I started my multicore life with an Athlon 64 X2, then moved up to a Phenom II X3, and now I'm on a Phenom II X6. Every time I do something that requires multicore, I use all the cores. I'm not I/O-limited, because I have an SSD and because I use an AMD processor with a proper bus design.
I can perceive very little difference in responsiveness between my current-generation MacBook Pro (4-core 2.7 GHz Core i7) and my circa 2007 black MacBook (2-core 2.16 GHz Core 2 Duo) except in CPU-hungry apps like Photoshop
First, here's zero dollars, get yourself a real OS. Second, if you were expecting to perceive the difference all the time, you don't understand how this stuff works.
The big difference that faster single-core performance gets you, assuming all other things are equal, is better battery life: being able to crank through the background tasks in less time means the CPU is idle longer.
Right, that would be true if the long-running tasks of
Re: (Score:1)
The Intel chips also have a big edge in audio and video encoding. But that is likely in part because of their market dominance.
Why, you ask? Encoding applications contain critical tight loops, and how well a processor performs that specific sequence of instructions can make a huge difference in speed. Those inner loops are sometimes even written as hand-tuned assembler code, one of the few places where assembler is still relevant other than embedded systems programming. Because Intel's CPUs hold most of the mark
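To illustrate the kind of inner loop in question, here is a hypothetical baseline in portable C (not code from any real encoder); real encoders replace exactly this sort of loop with per-CPU, hand-tuned SIMD assembly:

    // Hypothetical example of a "critical tight loop" of the kind encoders
    // hand-tune; this portable version is the baseline the compiler sees.
    // How well the target CPU schedules and vectorizes it decides the speed.
    #include <stdint.h>
    #include <stdio.h>

    static void add_saturate_u8(uint8_t *dst, const uint8_t *a,
                                const uint8_t *b, int n) {
        for (int i = 0; i < n; ++i) {
            int s = a[i] + b[i];
            dst[i] = (uint8_t)(s > 255 ? 255 : s);  // clamp to 8 bits
        }
    }

    int main(void) {
        uint8_t a[4] = {200, 10, 255, 0};
        uint8_t b[4] = {100, 10, 255, 0};
        uint8_t out[4];
        add_saturate_u8(out, a, b, 4);
        printf("%d %d\n", out[0], out[1]);  // 255 (saturated), 20
        return 0;
    }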
Re: (Score:2)
It offers something in between Atom and i3. The problem is that even though it offers more computational power, it still targets the same market as Atom, and it does so with worse power efficiency.
Those who trade computational power for battery life will mostly choose Atom, and those looking for more computational power will use Celerons and i3s.
hUMA (Score:4, Informative)
heterogeneous Uniform Memory Access [arstechnica.com] is really what one should be paying attention to. With that tech in both of the upcoming consoles, and major developer support to go with them, Intel had better watch out.
Re:hUMA (Score:4, Interesting)
I'm sure that Intel will happily let AMD do all the heavy lifting and then just license the tech when it becomes ready for prime time. If AMD can get just a couple of killer apps out of its HSA initiative, they stand a decent chance of once again being the tail that wags the dog.
Re: (Score:3)
Re:hUMA (Score:5, Insightful)
But even more puzzling to me is why both MSFT and Sony picked the absolute WEAKEST CHIP that AMD sells for their flagships...what the fuck?
Because of exactly what parent said:
AMD can provide unified memory (hUMA) with a decent GPU and a decent CPU on the same die. Intel cannot; Nvidia cannot.
hUMA will not make your PC faster in general, but it provides a feature that even a PC with 20 GeForce Titans does not have: latency-free data exchange between CPU and GPU.
It will make GPU processing more feasible especially on a small scale. I can't give you an example from gaming, but I can give you an example from my own expertise. When we simulate big proteins, we do it on a GPU. However, for small proteins, the latency overhead simply kills us. Processing on the GPU would be faster, but we need to copy back and forth all the time. We don't need faster GPUs, we need faster transfers. With hUMA: no problem.
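To make the transfer-overhead point concrete, here is a minimal CUDA sketch of the discrete-GPU case; the kernel, system size, and iteration count are illustrative assumptions, not the poster's actual simulation code:

    // A minimal CUDA sketch of the transfer-bound case described above.
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <cstdlib>

    __global__ void step(float *pos, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) pos[i] += 0.001f;  // stand-in for one integration step
    }

    int main() {
        const int n = 2000;  // a "small protein": tiny per-step workload
        const size_t bytes = n * sizeof(float);
        float *h = (float *)malloc(bytes), *d;
        cudaMalloc(&d, bytes);
        for (int i = 0; i < n; ++i) h[i] = 0.0f;

        for (int iter = 0; iter < 10000; ++iter) {
            // On a discrete GPU, every step pays two PCIe transfers, which
            // can cost more than the kernel itself at this small size.
            cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);
            step<<<(n + 255) / 256, 256>>>(d, n);
            cudaMemcpy(h, d, bytes, cudaMemcpyDeviceToHost);
            // ...CPU-side analysis touches h[] here, forcing the round trip...
        }
        printf("%f\n", h[0]);
        cudaFree(d);
        free(h);
        return 0;
    }

With shared CPU/GPU memory, the two cudaMemcpy calls inside the loop simply disappear; on current CUDA hardware you can approximate this with cudaMallocManaged, and on a hUMA part the data would never move at all.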
Re: (Score:1)
Do you need a special compiler to pass integer and floating-point objects around? MMX? Unless you are doing assembler work, DirectX and WDDM take care of much of that for you. The days of using assembly for games are coming to an end as frameworks and APIs with compilers take over.
Re: (Score:2)
Basically, what AMD is doing is swapping its shitty old FPU design for the performance offered by the new ATI FPU design. It's not completely done yet, but that's what I saw when the first announcements were made about the Fusion/Llano chips.
I give it another two generations before they have things worked out well enough to begin beating Intel at their own game, and they still have the ATI division to push the envelope even further without regard to the TDP of the on-die units. As to the single thread performance, tu
Re: (Score:2)
Re: (Score:1)
Well, if gaming works with the cheaper, newer FPU from the GPU portion, it is more than an Atom.
I would like to see the PC return to being where the cutting edge is. The last console I bought was a Wii (I do not like the PlayStation), so maybe I do not care, and laugh?
But the economic reality is that more people in the US are poor than ever, Europe is in recession, corporations are under more pressure for profits, and China and India are new markets where there is less disposable income, even in these economic times.
Re: (Score:2)
Re:hUMA (Score:5, Informative)
I can give you an example in gaming: TWICE THE WORLD GEOMETRY. The data has to be loaded from persistent storage or network into main RAM, then that same exact data must be shoved over into the GPU in batches to be rendered on demand. With hUMA I don't have to have a copy on the GPU and a copy in main memory -- just one copy. That means TWICE the geometry with the same amount of total RAM.
Furthermore, physics is great on the GPU; I can parallelize the hell out of that. However, triggering sound effects and updating network state via a read-back buffer is a horribly slow hack. hUMA means the GPU can actually be used to update gamestate that matters, instead of just non-gameplay-affecting things like particle effects. Logic can be triggered much more easily, and coarse-grain physics data can be read back at will for network synchronization. Client-side prediction (latency compensation) also becomes a lot cheaper.
I can get a crap load of fine structural detail rendering and acting to physics right now on discrete GPUs, but the problem is when I want any of that to actually mean anything in terms of gameplay, I have to read back the data to the CPU side. hUMA utterly destroys the barriers preventing all sorts of RAM intensive gameplay. Hell, even weighted logic trees for AI can be processed on the GPU instead of only on the CPU, and we'll have the RAM budget to spare because we don't need two copies of EVERYTHING in memory all of a sudden. That means larger more complex (read: smarter) AI, and lots more of them.
Folks really don't realize how horrible the current bottleneck is. You want a world that's fully destructible down to the pixel (atomic voxel), with models that actually have meat under the skin and rebar in the walls, with different physical properties so that you can freeze a door then shatter it, pour corrosive acid on the hinge, or create reactive armored structures on the fly by throwing some metal plate atop explosives atop the concrete bunker... Yeah, we can do all that on the GPU right now. However, without hUMA, on the CPU logic side of things the GPU is seen as a huge powerful black box: we put the equations and bits of inputs in, amazing stuff happens, but we can't actually tell what's going on except through a very tiny output signal (the RAM transfer bottleneck), so we can't really act on all the cool stuff going on. Right now that means we have to make all the cool GPU stuff unimportant to gameplay, like embers that burn and blow about but can't burn you, or drapes that flutter in the breeze but can't be used to strangle someone or be tied together to make an escape rope; unless we planned all that out in advance.
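A minimal CUDA sketch of both ideas (one shared copy of the geometry, and CPU game logic reading GPU physics results without a staged readback), using managed/unified memory as a stand-in for hUMA; the names and toy kernel are hypothetical, not from any real engine:

    #include <cuda_runtime.h>
    #include <cstdio>

    struct Vertex { float x, y, z; };

    __global__ void simulate(Vertex *world, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) world[i].y -= 0.01f;  // toy "physics": everything sags
    }

    int main() {
        const int n = 1 << 20;
        Vertex *world;  // ONE copy of the world geometry, not host + device
        cudaMallocManaged(&world, n * sizeof(Vertex));
        for (int i = 0; i < n; ++i) world[i] = {0.0f, 10.0f, 0.0f};

        simulate<<<(n + 255) / 256, 256>>>(world, n);
        cudaDeviceSynchronize();  // make the GPU's writes visible to the CPU

        // Game logic reads the same allocation directly: no cudaMemcpy, no
        // staging buffer. (On non-hUMA hardware the driver migrates pages
        // behind the scenes; on hUMA there would be nothing to migrate.)
        if (world[42].y < 10.0f) printf("trigger: geometry moved, update AI\n");

        cudaFree(world);
        return 0;
    }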
Re: (Score:2)
Mod points and cookies for this fine explanation. Thank you.
Re: (Score:2)
I'll put it another way.
hUMA may slow down the GPU's raw execution speed and make contention between CPU and GPU access more tetchy, but it makes interaction between the game logic and presentation much more flexible. Doing this with DDR5 was a painful compromise, I am sure, but DDR3 would have made these systems cost three times more than they will for the same memory load-out.
TL;DR: hUMA gives developers a much more flexible and faster way to share resources between the GPU and CPU than the PCI pipes do.
Re: (Score:2)
erm, swap DDR5 and DDR3... really... I do know the difference. >_<
Re: (Score:2)
Re: (Score:2)
For gaming on a single GPU _ANY_ quad-core shows almost NO difference.
Obviously there are some exceptions such as Civ V which are heavily CPU bound, but 90% of all games are GPU bound.
http://www.anandtech.com/show/6934/choosing-a-gaming-cpu-single-multigpu-at-1440p [anandtech.com]
Re: (Score:1)
Price!
Microsoft has lost over a billion dollars on the Xbox over the last 12 years. Only during the last 2 or 3 did they break even and start to make money. The reason (besides investing in the Zune) is that console makers sell each unit at a loss and hope to make it up in games sold, or when the technology comes down in price so the consoles become cheaper to make towards the end of their life cycle.
The goal of the company is to raise the share price. With the stock price about the same for the last 10 years, investors are pissed a
Re: (Score:2)
For crying out loud (Score:3, Informative)
On Wednesday, AMD launched it's latest line of mobile APUs, codenamed Temash, Kabini, and Richland.
Should be:
On Wednesday, AMD launched its latest line of mobile APUs, codenamed Temash, Kabini, and Richland.
Re: (Score:3)
Re: (Score:1)
Re: (Score:2)
Didn't even notice the difference, and I'm a native speaker. Guess I'm so used to people getting it wrong that I no longer see it and automatically supply the proper inflection when reading. Guess that's the reason I tend to use "its" incorrectly as often as I do.
Re: (Score:2)
Wrong form of "its".
At least they didn't try to add an apostrophe after the U in "APUs".
Re: (Score:2)
whoosh's
Re: (Score:1)
Re: (Score:2)
As a cyberneticist, I have worked for years to create a machine intelligence system capable of reading (OCR), comprehending (lexical structure), and performing basic actions based on the meanings it extracts from these. Over millions of generations of algorithmic evolution, it finally has a very tiny fraction of the intelligence an average human does. When my AIs talk to each other, they only draw attention to protocol failures where they cannot truly discern what the other end meant. They don't lock
Re: (Score:1)
here
http://www.tomshardware.com/reviews/kabini-a4-5000-review,3518-7.html [tomshardware.com]
Tom's tested them with F1 and Skyrim. The HD 4000 paired with an i3 beat the Kabini machine's minimum framerate by 50%.
Price & power consumption (Score:4, Interesting)
What's more, the Core-i3 matches the A4-5000 in power efficiency while its HD 4000 graphics completely outpace the APU.
Sure. Unless you're using the damn CPU at full speed.
What I'd be more interested to know, though, is how expensive A4-5000 CPUs are. Do they cost as much as the Core i3-3217U?
Re:Price & power consumption (Score:4, Informative)
Under $70. The highest-spec embedded Kabini part is $72, so we can expect retail to be a bit below that.
Intel officially prices the i3-3217U at $225, but somehow I think that's not the actual price it's sold at.
Re:Price & power consumption (Score:5, Informative)
amd cant compete on power consumption
... and that's exactly why AMD's CPU's power consumption in this article is lower. Now tell me, were you always this bad at math, or did it occur after an accident?
Re: (Score:2)
Was that installation carried out by Mr. Smith in the Matrix? If so, it's no wonder you got a defective replacement.
Re: (Score:1)
Look at Newegg: the cheapest laptop with a Core i3-3217U is $534.99, while laptops with Brazos are typically $400 or under (Kabini replaces Brazos, so I would say the price will be about the same).
Oh, what's your definition of "matches"? (Score:1)
"What's more, the Core-i3 matches the A4-5000 in power efficiency while its HD 4000 graphics completely outpace the APU."
http://www.tomshardware.com/reviews/kabini-a4-5000-review,3518-13.html [tomshardware.com]
While gaming, the 17W i3 consumes nearly twice as much power as the 15W Kabini: 35W vs. 20W. Intel's ULV TDP ratings are an absolute joke.
Re: (Score:2)
A 50% increase in battery life is a 25% reduction in power consumption, not a 50% reduction. So they're cutting their TDP by a quarter.
Re: (Score:2)
It's a 33% decrease in systemwide power consumption, of which the CPU is only part. So if their statement is accurate, it means that they're cutting their TDP by significantly more than 33%, per Amdahl's law.
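Worked through explicitly (the 50% battery-life improvement is the claim under discussion; the CPU's share of system power is an illustrative assumption):

    battery life ∝ 1 / P_system
    1.5 × battery life  =>  P_new = P_old / 1.5 ≈ 0.67 × P_old   (a 33% system-wide cut)

    If the CPU draws only half of P_old (assumed) and nothing else changes:
    0.67 × P_old = 0.5 × P_old × (1 − r) + 0.5 × P_old  =>  r ≈ 0.67
    i.e. the CPU itself must shed roughly two thirds of its own power.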
Re: (Score:2)
"While gaming"....sigh. Who cares.
If you don't care, you shouldn't be trying to make a point.
If you read the quote in the grandparent post, you'll see that the wording makes it sound like the i3 matches the A4's power consumption while in games. Perhaps you're as good at English as you are at math.
Re: (Score:1)
Re: (Score:1)
who the fuck games on an i3, it's a facebook computer
Re: (Score:3)
Do not underestimate the demands of poorly-coded flash facebook games.
Re: (Score:2)
Re: (Score:2)
An i3 is a perfectly good CPU for casual gaming. Hell, I've been known to game on my laptop's Sandy Bridge Celeron U3600 1.2GHz dual core... it's not a hardcore gaming system, but it is quite usable when I'm not at home to use my desktop. There are quite a few games that will run quite acceptably on it, including most of my Steam library. (Civ5 is a hog, but that game is always CPU-heavy, and running it under Wine on a Celeron is painful).
The AMD system, apparently, won't be as good or even usable, in that
Re: (Score:1)
The CPU hasn't been the bottleneck for games since the 20th century; it has almost always been graphics since the first 3D cards came into existence.
The particular linked CPU is the Atom-competitor version of its APU, not a Core i3 competitor. Besides, I would take this CPU over a Core i3 for consumer use. Notice how your cell phone is all smooth when you move the page up and down with your finger? On your competitor's it gets choppy, right? That is because the GPU is in the CPU on your phone, so for small data no laten
Re: (Score:1)
This laptop [newegg.com] is about 3 times faster than what you are using, just in CPU. Graphics would blow it away as well. Surely you aren't going to claim that less than an inch of size makes it "another class", are you? They are both "thin and light".
Re: (Score:2)
Oh is it?
I did my whole PhD CS simulation project (hundreds of thousands of agents, with machine learning, discrete-event methods, and what not) on my Celeron G630 computer. I run heavy software like Matlab on the same PC. My previous PC was an AMD with a 4400+ (MHz-equivalent) rating.
It is still my main PC. If even an i3 is required for your Facebook things, you are doing something wrong.
Re: (Score:2)
and none of that has to react in realtime
Re: (Score:1)
Re: (Score:2)
well yea, those are 10 year old games
Re: (Score:2)
In the price range the A4s come in, Intel doesn't have any competitive chips. Not a single one at all.
Re: (Score:2)
In the price range the A4s come in, Intel doesn't have any competitive chips. Not a single one at all.
In the mobile sphere, where something like the A4 is most likely to actually be used (since they're touting the power consumption), you can easily find $400 laptops with Intel i3 in them. Unless the AMD offering produces laptops in the sub-$300 range without sacrificing things like having a real keyboard or a screen larger than a netbook, then that price point is irrelevant: the manufacturers will happily eat the increased profit, and you the consumer will end up paying the same at the till.
As regards your
Re: (Score:2)
In the mobile sphere, where something like the A4 is most likely to actually be used (since they're touting the power consumption), you can easily find $400 laptops with Intel i3 in them.
Why did you just pick $400?
Answer: Because that's what you have to pay for the Intel solution.
Is this important?
Answer: Only if the AMD solution you are comparing against also costs $400.
So, did you justify your argument?
Answer: No, because you never once mentioned the price of AMD solutions, nor went through the effort to see exactly what AMD solutions were available in the same price range and compare the performance of those equally priced devices with the precious i3 that you are drooling on.
Re: (Score:1)
matches power consumption? (Score:5, Informative)
What's more, the Core-i3 matches the A4-5000 in power efficiency while its HD 4000 graphics completely outpace the APU.
Has anyone bothered looking at the benchmarks? The overall system power consumption when games were run was 20 watts for AMD and 35 watts for the Core i3.
By my calculation, that's 75% more power consumption than AMD's. Intel hardly "matches" anything...
AMD was still at least 3 watts less power-hungry in every other benchmark, too...
Re: (Score:1)
If I'm gaming on my laptop, I don't do it on battery. If I'm mobile, while 3W will make a difference in the long-run, it won't make anywhere near as big a difference as turning down the screen brightness will.
Ultimately it comes down to price for the average consumer... and while the Intel offering is more expensive on paper, at the retail point of sale, I expect that the AMD offering will end up being the same as the Intel offering in the low-end laptop market: you can already get i3-based laptops for $400
Re: (Score:3, Insightful)
None of the benchmarks have made an apples-to-apples comparison. Either they compare a 35W Pentium to the 15W Kabini, or it's an expensive Core i3/i5.
The Core i3-3217U only appears in laptops costing more than $500. Kabini replaces Brazos, which typically appears in cheap (sub-$400) laptops.
Re: (Score:2)
AMD will be the new favorite. (Score:1)
AMD will be the new favorite. Their APUs are cheap, give the most bang for the buck, and are space- and power-efficient. The majority of desktop users in the low- to mid-range segment will find what they need in the A-series, and with the upcoming Kaveri even a few high-end users may consider ditching the expensive Intel chip and the big dedicated graphics board.
Re:AMD should move into other areas (Score:4, Insightful)