AMD Hardware

AMD Trinity A10-4600M Processor Launched, Tested

MojoKid writes "AMD lifted the veil on its new Trinity A-Series mobile processor architecture today. Trinity reportedly delivers much-needed CPU performance gains in IPC (instructions per cycle), along with more of AMD's traditional strength in gaming and multimedia horsepower via an enhanced, second-generation integrated Radeon HD graphics engine. AMD's A10-4600M quad-core chip comprises 1.3B transistors, with a CPU base clock of 2.3GHz and Turbo Core speeds of up to 3.2GHz. The on-board Radeon HD 7660G graphics core packs 384 Radeon Stream Processors clocked at 497MHz base and 686MHz Turbo. In the benchmarks, AMD's new Trinity A10 chip outpaces Intel's Ivy Bridge for gaming but can't hold a candle to it for standard compute workloads or video transcoding."
This discussion has been archived. No new comments can be posted.

  • by asliarun ( 636603 ) on Tuesday May 15, 2012 @04:23PM (#40009457)

    That's really all that matters. I've always been an AMD fan, but if they can't pull out the same performance for less or equal price, they're done.

    IMO, Trinity is a truly compelling offering from AMD after a long, long time. Yes, it trades lower CPU int/float performance for higher GPU performance compared to Ivy Bridge, but this tradeoff makes it a very attractive choice for someone who wants a cheap to mid-priced laptop with decent performance and decent battery life that still lets you play the latest games at low-detail settings. It's hitting the sweet spot for laptops as far as I'm concerned. I'm also fairly sure it will be priced about a hundred bucks cheaper than a comparable Ivy Bridge; that's how AMD has traditionally competed. Hats off to AMD for getting their CPU performance to somewhat competitive levels while still maintaining a GPU lead over Ivy Bridge's massively improved graphics. All this while they're still at 32nm and Ivy Bridge is at 22nm.

    Having said that, what I'm equally excited about is the hope that Intel will come up with Bay Trail, their 22nm Atom, which I strongly suspect will feature a graphics core similar to the one in Ivy Bridge. Intel has always led with performance and stability, not power efficiency and price, so they need to create something that genuinely beats the ARM designs, at least in the tablet space if not in the cellphone space.

  • by mrjatsun ( 543322 ) on Tuesday May 15, 2012 @04:31PM (#40009569)

    > Ivy Bridge and Llano actually ended up 'tied'

    Yes, but Llano is the *old* AMD processor ;-) Check the reviews for the performance of an HD 4000 vs. Trinity.

  • by Kjella ( 173770 ) on Tuesday May 15, 2012 @04:35PM (#40009625) Homepage

    Well, they'll sell them at the prices that they sell at; it's not like a CPU ever has a negative margin. The question is whether that's good enough in the long run to keep making new designs and break even, particularly as Intel is making a ton of money on processors that AMD can't compete against. Their Ivy Bridge processors should cost about 75% of a Sandy Bridge to make but sell for 98% of the price. Intel now has huge margins because AMD can't keep the pressure up, and it's not really helping AMD to surrender the high end, because that only gives Intel a bigger war chest.

    This launch is okay: it's all-around much better than Llano and keeps a fair pace with Intel, but it obviously tops out if you want CPU performance. What will be interesting to see is next year, when Intel will have both a completely new architecture for the Atom and its best process technology. Then I fear AMD may be fighting the two-front war again, on both the high and low end. Right now the Atom is a little too gimped to actually threaten AMD's offerings. I expect Intel just wants AMD crippled, not killed, though, to avoid antitrust regulation, so I think they'll be around while Intel makes all the money.

  • by obarthelemy ( 160321 ) on Tuesday May 15, 2012 @04:38PM (#40009679)

    Built myself a PC to play WoW 3 months ago. Went with the high-end Llano; no discrete graphics required. An Intel setup would have required a graphics card, a larger case (vs. a mini-ITX board), and more money. For most users who are also *casual* gamers (not hard-core), AMD's CPU/GPU balance saves the cost of a graphics card while providing sufficient CPU power.

  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday May 15, 2012 @04:49PM (#40009811) Homepage Journal

    Speaking of all-in-ones, an all-in-one AMD chip would be a dandy basis for a games console. If not one from Microsoft (which has no particular need for x86), then it would perhaps be a good match for Valve. Public distaste for Sony is at an all-time high, but is it enough to unseat them? Etc., etc.

    If I could have a 16-core Phenom II, though, that would be pretty awesome. I could drop it right into my current machine. I'd pay $100 for even eight cores, let alone sixteen.

  • by Lumpy ( 12016 ) on Tuesday May 15, 2012 @05:18PM (#40010199) Homepage

    Sorry, but the 8-core FX kicks the crud out of a quad-core i7 at the same clock speed. I actually USE a PC for video editing, rendering, and 3D rendering, and the new 8-core machine with one FX processor is kicking the arse of the i7 machine.

    Granted, I'm actually using multithreaded software, unlike most people, but saying that the i7 is the end-all of computing performance is not true.
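    The multithreaded-software caveat above is the crux: extra cores only pay off when the workload is actually split into independent parallel chunks, as rendering workloads typically are. A minimal sketch of that pattern, using Python's standard `concurrent.futures` with a hypothetical `render_tile` function standing in for real per-tile work:

    ```python
    # Sketch: fan a CPU-bound "render" workload out across all cores,
    # the pattern that lets an 8-core chip beat a faster quad-core.
    # render_tile is a hypothetical stand-in for expensive per-tile work.
    from concurrent.futures import ProcessPoolExecutor
    import os

    def render_tile(tile_id):
        # Deterministic busy-work standing in for rendering one tile.
        total = 0
        for i in range(100_000):
            total += (tile_id * i) % 97
        return total

    def render_frame(num_tiles=32):
        # With N cores and enough tiles, wall-clock time approaches
        # 1/N of the serial time, since tiles are independent.
        with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
            return list(pool.map(render_tile, range(num_tiles)))

    if __name__ == "__main__":
        frame = render_frame()
        print(len(frame))  # 32 tiles rendered
    ```

    Single-threaded software, by contrast, runs on one core no matter how many are available, which is why per-core speed still decides those benchmarks.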

  • by Lumpy ( 12016 ) on Tuesday May 15, 2012 @05:22PM (#40010261) Homepage

    "people under 30 who really dont do anything with their computers but websurf don't use towers. tablets and notebooks. small notebooks."

    Fixed that for you. Every person I know under 30 who actually uses a computer has a tower. They need to do things like render 3D GFX for static images or movies, high-end photography, and video production. Even the CAD/CAM geeks have a tower.

    I know plenty of under-30 professionals who actually use a computer to the point that they need a tower. It seems you don't; you might want to hang around smarter people.

  • by bored ( 40072 ) on Tuesday May 15, 2012 @09:24PM (#40012315)

    They have a second niche, much more directly focused on price: compute-light, memory-heavy server applications (since you can populate your sockets with AMD CPUs for less, and the number of DIMMs you get is roughly proportional to the number of active sockets).

    I haven't tried AMD's latest server machines, but if they are even half as good as the old ones, they are a _MUCH_ better deal. My six-year-old DL585 G2 is actually faster at every single thing it gets used for than the much newer Westmere machines we have been buying. The problem is that Intel charges an absolute fortune for high-clocked chips, so we end up with 1.8 or 2.2GHz Westmere machines, and their single-thread performance is abysmal compared to the much older 3.2GHz AMD machine. Our application scales nicely but quickly becomes I/O bound, so both machines get basically the same throughput; the AMD machine, however, has much lower overall latency, which is why it gets much better results in our benchmarks.

    So, in theory we could get an Intel machine that kicks the crap out of the AMD machine, but it's going to cost us 5x as much (from ~$5k to ~$25k). So we buy the cheap ones, and they get their ass handed to them by a six-year-old machine that cost $5k when it was new.
