
AMD Launches New Mobile APU Lineup, Kabini Gets Tested

An anonymous reader writes "While everyone was glued to the Xbox One announcement, the Nvidia GeForce GTX 780 launch, and Intel's pre-Haswell frenzy, it seems that AMD's launch was overlooked. On Wednesday, AMD launched its latest line of mobile APUs, codenamed Temash, Kabini, and Richland. Temash is targeted at smaller touchscreen devices such as tablets and the various Windows 8 hybrids, and comes in dual-core A4 and A6 flavors. Kabini chips are intended for the low-end notebook market, and come in quad-core A4 and A6 models along with a dual-core E2. Richland includes quad-core A8 and A10 models and is meant for higher-end notebooks — MSI is already on board with the A10-5750M in its GX series of gaming notebooks. All three new APU lines feature AMD HD 8000-series graphics. Tom's Hardware got a prototype notebook featuring the new quad-core A4-5000 with Radeon HD 8300 graphics and benchmarked it against a Pentium B960-based Acer Aspire V3 and a Core i3-based HP Pavilion Sleekbook 15. While Kabini proves more efficient and features more powerful graphics than the Pentium, it comes up short in CPU-heavy tasks. What's more, the Core i3 matches the A4-5000 in power efficiency while its HD 4000 graphics completely outpace the APU."
  • hUMA (Score:4, Informative)

    by Anonymous Coward on Friday May 24, 2013 @10:53PM (#43819157)

    heterogeneous Uniform Memory Access [arstechnica.com] is really what one should be paying attention to. With that tech in both of the upcoming consoles and major support behind it, Intel had better watch out.

    • Re:hUMA (Score:4, Interesting)

      by rrhal ( 88665 ) on Friday May 24, 2013 @11:45PM (#43819361)

      I'm sure that Intel will happily let AMD do all the heavy lifting and then just license the tech when it becomes ready for prime time. If AMD can get just a couple of killer apps out of its HSA initiative, they stand a decent chance of once again being the tail that wags the dog.

    • Comment removed based on user account deletion
      • Re:hUMA (Score:5, Insightful)

        by tstrunk ( 2562139 ) on Saturday May 25, 2013 @06:19AM (#43820499)

        But even more puzzling to me is why both MSFT and Sony picked the absolute WEAKEST CHIP that AMD sells for their flagships...what the fuck?

        Because of exactly what parent said:
        AMD can provide unified memory (hUMA) with a decent GPU and a decent CPU on the same die. Intel cannot; Nvidia cannot.
        hUMA will not make your PC faster in general, but it will provide a feature that even a PC with 20 GeForce Titans does not have: latency-free data exchange between the CPU and GPU.

        It will make GPU processing more feasible, especially on a small scale. I can't give you an example from gaming, but I can give you one from my own expertise. When we simulate big proteins, we do it on a GPU. For small proteins, however, the latency overhead simply kills us: processing on the GPU would be faster, but we need to copy data back and forth all the time. We don't need faster GPUs; we need faster transfers. With hUMA: no problem.
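
        To make that round trip concrete, here is a minimal CUDA sketch of the pattern described above. The kernel, the workload size, and all names are invented for illustration, and cudaMallocManaged (CUDA's unified memory) only approximates in the driver what hUMA would do in hardware:

        #include <cstdio>
        #include <cstdlib>
        #include <cuda_runtime.h>

        // Toy stand-in for one simulation step.
        __global__ void step(float *v, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) v[i] *= 2.0f;
        }

        int main() {
            const int n = 1024;                    // "small protein"-sized problem
            const size_t bytes = n * sizeof(float);

            // Explicit-copy path: pay a transfer in each direction every step.
            // For a workload this small, the two memcpys dominate the kernel time.
            float *host = (float *)calloc(n, sizeof(float));
            float *dev;
            cudaMalloc(&dev, bytes);
            cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);   // upload
            step<<<(n + 255) / 256, 256>>>(dev, n);
            cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);   // download

            // Unified-memory path: one allocation visible to both CPU and GPU,
            // with no per-step memcpy in the source at all.
            float *shared;
            cudaMallocManaged(&shared, bytes);
            step<<<(n + 255) / 256, 256>>>(shared, n);
            cudaDeviceSynchronize();               // CPU may now read `shared` directly

            printf("first element: %f\n", shared[0]);
            cudaFree(dev); cudaFree(shared); free(host);
            return 0;
        }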

        • Re:hUMA (Score:5, Informative)

          by VortexCortex ( 1117377 ) <VortexCortex AT ... trograde DOT com> on Saturday May 25, 2013 @12:21PM (#43822225)

          I can give you an example in gaming: TWICE THE WORLD GEOMETRY. The data has to be loaded from persistent storage or the network into main RAM, and then that exact same data must be shoved over to the GPU in batches to be rendered on demand. With hUMA I don't have to have a copy on the GPU and a copy in main memory -- just one copy. That means TWICE the geometry in the same amount of total RAM.

          Furthermore, physics is great on the GPU; I can parallelize the hell out of it. However, triggering sound effects and updating network state via a read-back buffer is a horribly slow hack. hUMA means the GPU can actually be used to update gamestate that matters -- instead of just non-gameplay-affecting things like particle effects. Logic can be triggered much more easily, and coarse-grained physics data can be read back at will for network synchronization. Client-side prediction (latency compensation) also becomes a lot cheaper.

          I can get a crapload of fine structural detail rendering and reacting to physics right now on discrete GPUs, but the problem is that when I want any of it to actually mean anything in terms of gameplay, I have to read the data back to the CPU side. hUMA utterly destroys the barriers preventing all sorts of RAM-intensive gameplay. Hell, even weighted logic trees for AI can be processed on the GPU instead of only on the CPU, and we'll have the RAM budget to spare because we suddenly don't need two copies of EVERYTHING in memory. That means larger, more complex (read: smarter) AI, and lots more of them.

          Folks really don't realize how horrible the current bottleneck is. You want a world that's fully destructible down to the pixel (atomic voxel), with models that actually have meat under the skin and rebar in the walls, and with different physical properties, so that you can freeze a door and then shatter it, pour corrosive acid on the hinges, or create reactive armored structures on the fly by throwing a metal plate atop explosives atop a concrete bunker... Yeah, we can do all that on the GPU right now.

          However, without hUMA, the CPU logic side sees the GPU as a huge, powerful black box: we put the equations and bits of input in, amazing stuff happens, but we can't actually tell what's going on except through a very tiny output signal -- the RAM transfer bottleneck -- so we can't really act on all the cool stuff going on. Right now that means we have to make all the cool GPU stuff unimportant to gameplay: embers that burn and blow about but can't burn you, or drapes that flutter in the breeze but can't be used to strangle someone or be tied together into an escape rope, unless we planned all that out in advance.
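
          (A hedged sketch of the "one copy instead of two" point, again in CUDA: everything below is invented for illustration, and on a discrete card managed memory still migrates pages behind the scenes, whereas a true hUMA APU has the CPU and GPU addressing the very same DRAM.)

          #include <cuda_runtime.h>

          struct Vertex { float x, y, z; };

          // Toy physics step running on the GPU over the shared world data.
          __global__ void applyGravity(Vertex *w, int n, float dt) {
              int i = blockIdx.x * blockDim.x + threadIdx.x;
              if (i < n) w[i].y -= 9.8f * dt;
          }

          int main() {
              const int nVerts = 1 << 20;

              // Traditional split: one copy of the geometry in system RAM for
              // game logic, a second copy in VRAM for rendering, re-uploaded
              // after every edit. Unified view instead: a single allocation
              // that both gameplay code (CPU) and kernels (GPU) read and write.
              Vertex *world;
              cudaMallocManaged(&world, nVerts * sizeof(Vertex));
              for (int i = 0; i < nVerts; ++i)
                  world[i] = Vertex{0.0f, 1.0f, 0.0f};    // CPU writes it directly

              applyGravity<<<(nVerts + 255) / 256, 256>>>(world, nVerts, 0.016f);
              cudaDeviceSynchronize();

              // Game logic inspects post-physics state with no read-back buffer,
              // e.g. to trigger a sound or queue a network update.
              bool doorMoved = (world[0].y < 1.0f);       // illustrative check
              (void)doorMoved;

              cudaFree(world);
              return 0;
          }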

          • Mod points and cookies for this fine explanation. Thank you.

          • I'll put it another way.

            hUMA may slow down the GPU's raw execution speed and make contention between CPU and GPU accesses more tetchy, but it makes interaction between the game logic and the presentation much more flexible. Doing this with GDDR5 was a painful compromise, I am sure, but DDR3 would have made these systems cost three times more than they will for the same memory load-out.

            TL;DR: hUMA gives developers a much more flexible and faster way to share resources between the GPU and CPU than the PCIe pipes do.

        • Comment removed based on user account deletion
      • Price!

        Microsoft lost over a billion dollars on the Xbox across the last 12 years; only in the last 2 or 3 did it break even and start to make money. The reason (besides investing in the Zune) is that console makers sell each unit at a loss and hope to make it up on games sold, or once the technology drops in price and the consoles become cheaper to make toward the end of their life cycles.

        The goal of the company is to raise the share price. With the stock price about the same for the last 10 years, investors are pissed a

  • For crying out loud (Score:3, Informative)

    by Anonymous Coward on Friday May 24, 2013 @10:59PM (#43819177)

    On Wednesday, AMD launched it's latest line of mobile APUs, codenamed Temash, Kabini, and Richland.

    Should be:

    On Wednesday, AMD launched its latest line of mobile APUs, codenamed Temash, Kabini, and Richland.

    • What's the difference between those?
      • by SeaFox ( 739806 )

        Wrong form of "its".

        At least they didn't try to add an apostrophe after the U in "APUs".

    • You are noise in the signal. This kind of bullshit nit-picking helps no one. His message was perfectly clear from his context, WHICH MATTERS MORE THAN RAW GRAMMAR.
    • As a cyberneticist, I have worked for years to create a machine-intelligence system capable of reading (OCR), comprehending (lexical structure), and performing basic actions based on the meanings it extracts. Over millions of generations of algorithmic evolution it finally has a very tiny fraction of the intelligence an average human does. When my AIs talk to each other, they only draw attention to protocol failures where they cannot truly discern what the other end meant. They don't lock

  • by WaroDaBeast ( 1211048 ) on Saturday May 25, 2013 @01:34AM (#43819733)

    What's more, the Core i3 matches the A4-5000 in power efficiency while its HD 4000 graphics completely outpace the APU.

    Sure. Unless you're using the damn CPU at full speed.

    What I'd be more interested to know, though, is how expensive A4-5000 chips are. Do they cost as much as the Core i3-3217U?

  • "What's more, the Core-i3 matches the A4-5000 in power efficiency while its HD 4000 graphics completely outpace the APU."

    http://www.tomshardware.com/reviews/kabini-a4-5000-review,3518-13.html [tomshardware.com]

    While gaming, the 17W i3 consumes nearly twice as much power as the 15W Kabini: 35W vs. 20W. Intel's ULV TDP ratings are an absolute joke.

    • by Osgeld ( 1900440 )

      who the fuck games on an i3? It's a Facebook computer.

      • Do not underestimate the demands of poorly coded Flash Facebook games.

        • by Zuriel ( 1760072 )
          Nothing like opening some shitty little Flash game and hearing your i7's fan spin up to full speed.
      • An i3 is a perfectly good CPU for casual gaming. Hell, I've been known to game on my laptop's Sandy Bridge Celeron U3600, a 1.2GHz dual-core... it's not a hardcore gaming system, but it's quite usable when I'm not at home to use my desktop. Quite a few games run acceptably on it, including most of my Steam library. (Civ5 is a hog, but that game is always CPU-heavy, and running it under Wine on a Celeron is painful.)

        The AMD system, apparently, won't be as good or even usable, in that

        • The CPU hasn't been the bottleneck for games since the 20th century. It has almost always been graphics, ever since the first 3D cards came into existence.

          The particular linked CPU is the Atom-competitor version of AMD's APU line, not a Core i3 competitor. Besides, I would take this CPU over a Core i3 for consumer use. Notice how your cell phone is all smooth when you move the page up and down with your finger? On your computer it gets choppy, right? That is because the GPU is in the CPU on your phone, so for small data there's no laten

        • Not sure you realize this, but your laptop is about 5% slower than an Atom D2700. You are using a netbook, dude; it's just a really big, heavy, expensive netbook.

          This laptop [newegg.com] is about 3 times faster than what you are using, just in CPU; the graphics would blow it away as well. Surely you aren't going to claim that less than an inch of size makes it "another class", are you? They are both "thin and light".
      • by wmac1 ( 2478314 )

        Oh is it?

        I did my whole PhD simulation project in CS (hundreds of thousands of agents, with machine learning, discrete-event methods, and whatnot) on my Celeron G630 computer. I run heavy software like Matlab on the same PC. My previous PC was an AMD 4400 MHz equivalent.

        It is still my main PC. If even an i3 is required for your Facebook things, you are doing something wrong.

      • I've gamed on the Pentium variety to pass time on the train, mostly Deus Ex and Age of Empires Online, which it handled pretty well.
    • In addition, all these folks are trying to justify their AMD hate with these A4 benchmarks, when the A4 is the lowest end of these new chips, and none of these haters ever want to talk price.

      In the price range the A4s come in, Intel doesn't have any competitive chips. Not a single one at all.
      • In the price range the A4s come in, Intel doesn't have any competitive chips. Not a single one at all.

        In the mobile sphere, where something like the A4 is most likely to actually be used (since they're touting the power consumption), you can easily find $400 laptops with an Intel i3 in them. Unless the AMD offering produces laptops in the sub-$300 range without sacrificing things like a real keyboard or a screen larger than a netbook's, that price point is irrelevant: the manufacturers will happily eat the increased profit, and you the consumer will end up paying the same at the till.

        As regards your

          • In the mobile sphere, where something like the A4 is most likely to actually be used (since they're touting the power consumption), you can easily find $400 laptops with an Intel i3 in them.

          Why did you just pick $400?

          Answer: Because that's what you have to pay for the Intel solution.

          Is this important?

          Answer: Only if the AMD solution you are comparing against also costs $400.

          So, did you justify your argument?

          Answer: No, because you never once mentioned the price of AMD solutions, nor went to the effort of seeing exactly what AMD solutions were available in the same price range and comparing the performance of those equally priced devices with the precious i3 you are drooling over.

      • by Flodis ( 998453 )
        Yeah. I trust Tom's Hardware's tests about as far as I can throw them. They were notoriously Intel-biased during the NetBurst era. Just Google some old tests. It's hilarious.
  • by Luke_22 ( 1296823 ) on Saturday May 25, 2013 @04:16AM (#43820167)

    What's more, the Core i3 matches the A4-5000 in power efficiency while its HD 4000 graphics completely outpace the APU.

    Has anyone bothered looking at the benchmarks? The overall system power consumption when games were run was 20 watts for AMD and 35 watts for the Core i3.
    By my calculation, that's 75% more power consumption than AMD ((35 - 20) / 20 = 0.75). Intel hardly "matches" anything...

    AMD was still at least 3 watts less power-hungry in every other benchmark, too...

    • If I'm gaming on my laptop, I don't do it on battery. And if I'm mobile, while 3W will make a difference in the long run, it won't make anywhere near as big a difference as turning down the screen brightness will.

      Ultimately it comes down to price for the average consumer... and while the Intel offering is more expensive on paper, at the retail point of sale I expect the AMD offering will end up costing the same as the Intel offering in the low-end laptop market: you can already get i3-based laptops for $400

    • Re: (Score:3, Insightful)

      by edxwelch ( 600979 )

      None of the benchmarks have made an apples-to-apples comparison: either they compare a 35W Pentium to the 15W Kabini, or it's an expensive Core i3/i5.
      The Core i3-3217U only appears in laptops costing more than $500, while Kabini replaces Brazos, which typically appears in cheap (sub-$400) laptops.

    • You also have to consider how much processing you can accomplish with the same amount of energy. If they're evenly matched in gaming performance, then that 20 watts is great. If Intel is much faster, then it might even out or swing the other way.
  • by Anonymous Coward

    AMD will be the new favorite. Their APUs are cheap, give the most bang for the buck, and are space- and power-efficient. The majority of desktop users in the low- to mid-range segments will find what they need in the A-series, and with the upcoming Kaveri, even a few high-end users may consider ditching the expensive Intel chip and the big dedicated graphics board.
