AMD Trinity APUs Stack Up Well To Intel's Core 3

Barence writes "AMD's APUs combine a processor and graphics core on the same chip. Its latest Trinity chips are more powerful than ever, thanks to current-generation Radeon graphics and the same processing cores as AMD's full-fat FX processors. They're designed to take down Intel's Core i3 chips, and the first application and gaming benchmarks are out. With a slight edge in application benchmarks and a much larger one in games, they're a genuine alternative to the Core i3." MojoKid writes with Hot Hardware's review, which also says the new AMD systems "[look] solid in gaming and multimedia benchmarks," writing "the CPU cores clock in at 3.8GHz / 4.2GHz for the A10-5800K and 3.6GHz / 3.9GHz for the A8-5600K, taking into account base and maximum turbo speeds, while the graphics cores scale up to 800MHz for the top A10 chip."
  • Wow (Score:5, Funny)

    by binarylarry ( 1338699 ) on Thursday September 27, 2012 @09:23AM (#41477411)

    AMD is finally competitive with Intel's lowest end offerings again!

    Yay!

    • The advantage of AMD chips recently has been avoiding intel integrated graphics. I know you CAN get most intel chips without it, but it was a real anti-selling point for me.

      • Re:Wow (Score:5, Insightful)

        by h4rr4r ( 612664 ) on Thursday September 27, 2012 @09:35AM (#41477523)

        You know you can just not use it right?
        Why bother looking for a chip without it?

        Heck, these days it is even usable and has good open drivers.

      • Re:Wow (Score:5, Insightful)

        by Skarecrow77 ( 1714214 ) on Thursday September 27, 2012 @09:44AM (#41477619)

        Ironic statement, since the main selling point of the chip being reviewed here is its integrated graphics.

        Which I find just silly, really. These are fine chips to build a PC for your little cousin who surfs the web and maybe plays World of Warcraft. For any real build, integrated graphics, for all their advancements, still read like:
        Intel: "Our new HD4000 graphics are nearly as fast as a mainstream card from 8 years ago!"
        AMD: "HAH, our new chip's graphics cores are as fast as a mainstream card from 6 years ago! we're two years of obsolecense better!"

        Even a modern $100 dedicated card will wallop either of these chips' solutions.

        • by h4rr4r ( 612664 )

          For 90% of folks either of these is good enough.
          I have played portal 2 on my macbook air using the Sandy Bridge graphics. It was fine.

          Very few folks care about dedicated graphics cards these days.

          I have one machine that has one; it cost $100 and that is it. I might buy another if Steam for Linux ever launches.

          • Re: (Score:3, Insightful)

            No one cares about dedicated graphics cards.... unless they play games.

            I don't know what region you're from where all the gamers just use the onboard GPU that comes with their mobo.

            • by h4rr4r ( 612664 )

              That is why I said 90%, the other 10% are the gamers.

            • No one cares about dedicated graphics cards.... unless they play games.

              Or do cuda-enabled research.

          • Re:Wow (Score:4, Interesting)

            by Skarecrow77 ( 1714214 ) on Thursday September 27, 2012 @10:06AM (#41477945)

            Pretty much until the sandy bridge era, integrated graphics were completely unusable for gaming, and they are still years behind dedicated cards.

            Your statement that "for 90% of folks either of these is good enough" is true, but misleading. It is true that the extent of desktop/laptop gaming most people are interested in maxes out at Farmville (or whatever the new Facebook gaming trend is, I certainly don't pay attention), and they do their gaming on their phone, tablet or console.

            These articles, however, are written for the community that builds its own PCs, or at the very least is quite picky about what is inside its machines. You don't read these articles unless you care about such things. From that perspective, for the majority of the target audience of TFA's links, the graphics performance of either brand is hardly good enough for any sort of main machine build.

            • by tyrione ( 134248 )
              I will add that if Zynga got off its ass and ported their games to HTML5/WebGL, and/or used OpenCL via Adobe's Flash, people would be glad to have a dedicated GPGPU. The current Zynga games are eating nearly 2GB of RAM, and these games are dirt simple and shouldn't come close to the CPU/RAM demands they currently impose.
          • by RMingin ( 985478 )

            By using Portal 2, you have demonstrated only how very well-coded and well-optimized Portal 2 is. It runs fine on just about anything that can make triangles. Recently I watched Portal 2 running at high resolution and high apparent quality on a GeForce 8600 GT. Those are ancient, and weren't good even when they were new.

            • by h4rr4r ( 612664 )

              So what was good when they were new?
              I had one and it seemed up to the task for every game that was new at that time.

              • by RMingin ( 985478 )

                I have Sandy Bridge in my laptop and desktop, and an Ivy Bridge desktop for a project, and I'm not disputing that their graphics are quite good, particularly for an integrated chip.

                I just meant that proclaiming Portal 2 performance wasn't going to get the traction you were looking for.

                • by h4rr4r ( 612664 )

                  Why not?

                  Portal is far more graphically intense than any game 90% of people are ever going to play on their PC. WoW and Farmville are more likely the targets for this and integrated covers that fine.

              • That was kind of an awkward transitional moment in GPU development, where the midrange parts were married to memory that couldn't do the cores justice, but the high-end parts are still half-decent for low-to-middling resolutions and detail settings today. The 8800GT and Radeon 2900XT can still get the job done (though the latter card will heat your house nearly as well as a hot Pentium D...). I had an 8600GTS until recently, and it wasn't half-bad, but also wasn't much more than an incremental step above
              • Comment removed based on user account deletion
        • by PRMan ( 959735 )
            The HD 4000 is probably better than you think. It plays most modern games adequately--not with top rendering for all features on a 2560x1920 monitor--but better than any console.
          • Re:Wow (Score:4, Interesting)

            by Skarecrow77 ( 1714214 ) on Thursday September 27, 2012 @10:20AM (#41478159)

            I don't doubt that it works. I have a previous version of integrated intel graphics (yes I am aware of the advancements of the HD2000/3000/4000 series in comparison) on this laptop, and -can- game with the settings turned down... way down.

            That said, I think my (somewhat cynical) "we are as good as a 6 year old card!" comments are pretty appropriate. Tom's Hardware [tomshardware.com] ranks the HD4000 roughly on par with the Nvidia 6800 Ultra (released in 2004) or the 8600GT (released in 2006).

            The 8600GT was a fine midrange card, and can still run today's games, albeit at reduced resolution and detail. If all you're looking for is the ability to run a game, period, these chips will work, but I can't really say they'd do much better than a console (the PS3 GPU is essentially an Nvidia 7800 GTX, and the 360 GPU is similar, only with unified shaders), and again they don't hold a candle to even modest dedicated cards today.

            In a laptop, I might be interested. On the desktop, which is what the chips being reviewed are for, I can't see much use for these things when it comes to gaming (which, again, is their big selling point right now). If you're building a desktop machine you expect to do any gaming on, and the extra $100 for, say, a GTS 450 or something like that is a budget breaker, maybe you should be saving up an extra month.

            • That said, I think my (somewhat cynical) "we are as good as a 6 year old card!" comments are pretty appropriate. Tom's Hardware ranks the HD4000 roughly on par with the Nvidia 6800 Ultra (released in 2004) or the 8600GT (released in 2006).

              The AMD Trinity in this review scored nearly double the HD4000.

          • Comment removed based on user account deletion
        • Hell, even a $50 card is massively better. I don't play games, but the performance improvements I have seen in GIS and cartography work from going from integrated or onboard graphics to a discrete ~$50 card have always been impressive.
        • Comment removed based on user account deletion
      • Re: (Score:2, Flamebait)

        by Targon ( 17348 )

        Considering the piss-poor quality of Intel-based machines in the $500-and-under range, while AMD-based machines in that range do tend to have higher-quality components, buying an Intel-based machine is like putting a Ferrari engine into a Yugo. Yeah, it may be faster, but the overall ownership experience will be shorter and more prone to failure. Obviously, going to a higher-end Intel machine would result in a better experience, but at the low end, Intel based machines have a much h

        • How true is that really? Celeron, Atom and Core machines seem very reliable to me, Atoms in particular. I've seen many of them very badly abused and they hold up. I think it's just the uh, I always get this wrong, square-cube law? Ah yes, that's the one. The low-end machines are smaller and therefore less prone to damage from the same force, and at the same time, produce less force when dropped. I remember the Pentium III machines as being pretty durable too, but unfortunately many of them came with ATI Rag

      • Comment removed based on user account deletion
        • All I can say is that I've been burned by good specs that somehow manage to lack critical(to me) graphics functions, like supporting modern shader models.

        • I've got a similar "low end" system: Core i3, 6 GB RAM, and of course HD2000 graphics. It's not the greatest currently, but it gets you by. Compared to what I've been using for the past 8 years (a P4 2.8GHz with 2GB RAM), it totally screams. I've been revisiting Quake 3, Portal 2, Half-Life 2, Left 4 Dead, and (heh) Return to Castle Wolfenstein with that setup. Yep, mostly older games, but those are from when I gamed a lot. It's definitely impressive for an integrated solution, but I really don't game
    • AMD's graphics blow Intel's away, so which is better: a faster CPU, or a slower CPU with much better video?

    • Re:Wow (Score:5, Insightful)

      by pushing-robot ( 1037830 ) on Thursday September 27, 2012 @09:41AM (#41477599)

      Or, more accurately, AMD's integrated video is better than Intel's integrated video (seriously, that's all they tested!).
      And these AMD chips still double the system power consumption [hothardware.com] over their Intel counterparts.

      So if you're part of the subset of gamers that morally object to dedicated video cards but still enjoy noisy fans and high electricity bills, AMD has a product just for you! Woo!

      • Re:Wow (Score:5, Insightful)

        by h4rr4r ( 612664 ) on Thursday September 27, 2012 @09:48AM (#41477673)

        You are actually bitching about less power than a light bulb used to use?

        At worst it looks like ~60 watts more on the two higher-end units. How low-power is the monitor if that constitutes doubling the power? I am betting "total system" in this little test ignores the monitor.

        Oh noes, tens of dollars more per year in electricity! The HORRORS! However will I afford such an extravagance, which costs almost as much per year as two drinks at the bar?

        If they are within 100 watts I would call it a wash and be far more interested in computing power per dollar of upfront cost. AMD has traditionally done very well in that test and only started failing it very recently.
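
        For a sense of scale, here is a back-of-the-envelope sketch of that "tens of dollars a year" figure. The ~60 W delta comes from the post above; the hours of use per day and the electricity rate are assumed values for illustration, not measurements:

```c
/* Hypothetical sketch: annual electricity cost of a ~60 W extra draw.
 * The wattage delta is taken from the comment above; usage hours and
 * the price per kWh are assumptions chosen purely for illustration. */
#include <stdio.h>

int main(void)
{
    double extra_watts   = 60.0;   /* worst-case extra draw vs. the Intel system */
    double hours_per_day = 8.0;    /* assumed daily usage */
    double usd_per_kwh   = 0.12;   /* assumed electricity rate */

    /* Convert watts * hours * days into kilowatt-hours, then into dollars. */
    double kwh_per_year = extra_watts * hours_per_day * 365.0 / 1000.0;
    printf("extra cost: ~$%.0f per year\n", kwh_per_year * usd_per_kwh);
    return 0;
}
```

        Under those assumptions it works out to roughly $20 per year; heavier use or pricier electricity scales it proportionally.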

        • I'd agree from the perspective of a low-budget desktop machine (unless you need things to run quietly), but for a laptop it's a different story. A 10 watt TDP difference on a mobile processor can make the difference between a cool machine and one that overheats, depending on the design of it. The Ivy Bridge in my current laptop hardly gets warm ever under load, whereas the Core 2 Duo/X1650 combo in my previous one would overheat to the point of shutting down. Plus, the Intel chips offer a lot smoother Linux
          • by h4rr4r ( 612664 )

            These are not laptop chips. The AMD laptop ones are actually lower power.

            Yes, the Linux drivers are much better. For built-in graphics, that is what I go with, for that reason.

    • Re:Wow (Score:4, Interesting)

      by beelsebob ( 529313 ) on Thursday September 27, 2012 @09:51AM (#41477709)

      Except that none of the benchmarks actually cover CPU speed, because AMD have put all the reviewers under NDA until the chip is released. That rather suggests they haven't caught up; they're just showing off the better IGP, which no one playing games will use anyway, and which anyone not playing games won't give a shit about.

      • Re:Wow (Score:4, Insightful)

        by fuzzyfuzzyfungus ( 1223518 ) on Thursday September 27, 2012 @10:21AM (#41478189) Journal

        Except that none of the benchmarks actually cover CPU speed, because AMD have put all the reviewers under NDA until the chip is released. That rather suggests they haven't caught up; they're just showing off the better IGP, which no one playing games will use anyway, and which anyone not playing games won't give a shit about.

        I'm not hugely sanguine about AMD's prospects (unfortunately, it isn't going to be pretty if the world is divided between x86s priced like it's still 1995 and weedy ARM lockdown boxes, so it would be nice if AMD could survive and keep Intel in check); but there is one factor that makes IGPs much more of a big deal than they used to be:

        Laptops. Back in the day, when laptops were actually expensive, the bog-standard 'family computer from best buy, chosen by idiots and sold by morons on commission' would be a desktop of some flavor. Unless the system was terminally cheap and nasty and entirely lacked an AGP/PCIe slot, it didn't matter what IGP it had, because if little Timmy or Suzy decided they wanted to do some gaming, they'd just buy a graphics card and pop it in.

        Now, it's increasingly likely that the family computer will be a laptop (or occasionally an all-in-one or other non-mini-tower design) of equally unexciting quality but substantially lower upgradeability. If you want graphics, you either use what you bought or you buy a whole new computer (or just a console).

        This makes the fact that some, but not all, IGPs can actually run reasonably contemporary games (especially at the shitty 1366x768 that a cheap laptop will almost certainly be displaying) much more important to some buyers and to the PC market generally.

    • In fairness, i3 isn't Intel's lowest end offering, there's still Pentium and Celeron which are lower than that. It's kind of mid-range...
  • One down.....two to go!

  • by Laglorden ( 87845 ) on Thursday September 27, 2012 @09:39AM (#41477579) Journal

    AMD has apparently forbidden testers from writing about CPU performance.

    In their NDA contract, it's specified:

    "In previewing x86 applications, without providing hard numbers until October [something], we are hoping that you will be able to convey what is most important to the end-user which is what the experience of using the system is like. As one of the foremost evaluators of technology, you are in a unique position to draw educated comparisons and conclusions based on real-world experience with the platform,"

    and

    "The topics which you must be held for the October [sometime], 2012 embargo lift are
            - Overclocking
            - Pricing
            - Non game benchmarks"

    So the reviews coming out are only from sources that have decided to go along with those "guidelines". In other words, they're not complete; I would say extremely biased.

    • by h4rr4r ( 612664 )

      All prerelease info is like this, same with any reviewer who got the part for free.

      What we really need is a Consumer Reports of computer hardware: buy the parts only from normal vendors and don't take advertising.

  • by IYagami ( 136831 ) on Thursday September 27, 2012 @09:39AM (#41477585)

    AMD allowed websites to publish a preview of the benchmarks before the estimated date if they only focused on graphics performance. This is an unfair move by AMD.

    Read http://techreport.com/blog/23638/amd-attempts-to-shape-review-content-with-staged-release-of-info [techreport.com] for more details

    (maybe in a couple of weeks you will find that AMD Trinity APUs have abysmal x86 performance compared to Intel CPUs)

    Disclaimer: I own a laptop with an AMD cpu inside

    • by Targon ( 17348 ) on Thursday September 27, 2012 @09:53AM (#41477741)

      In this day and age, CPU performance matters less, and overall performance is the thing people look for. A quad-core 1.5GHz is easily enough for your average home user's day-to-day use, and at that point, GPU power for things like full-screen YouTube or Netflix video becomes a bit more of a concern. We WILL have to wait and see what the performance numbers come in at, but a 10% bump in CPU performance over the last generation is expected from AMD.

      • by PRMan ( 959735 )
        This. I built my wife a machine with an i3 but also with 8GB RAM and an SSD. It has Intel HD 4000 graphics. It screams. Unless you are ripping MP3s, editing video or compiling Chrome, your CPU is easily able to handle any task instantly anyway. And my wife plays Facebook-style games, which are smooth and fast on an Intel HD 4000 anyway.
        • I don't even really think MP3s qualify as demanding these days; a single core Atom can manage MP3 encoding at high quality settings at better than real-time, and even though most available encoders aren't multithreaded an awful lot of people will be limited by the speed of their optical drive before their CPUs really come into the picture. DVD and Blu-ray transcoding have stepped into the role of multimedia-centric CPU flogger in its stead. That - and scientific computing - are what's leading me to consid
      • I am a trifle surprised that AMD is trying to stage-manage the CPU performance benchmarking (since everybody who cares already has an informed guess based on the last model, and in the absence of information pessimists are simply going to assume that the part is bloody dire, so actual benchmarks could hardly make things worse); but it is lovely how it is practically impossible to buy a non-netbook with a CPU too weak for general purposes.

        The big killer seems to be disk I/O (well, that and the gigantic bottleneck

    • by h4rr4r ( 612664 )

      This is what everyone does.
      Any test with early parts or free parts is rigged; don't trust them. Either the test is rigged or, very commonly, the part is.

      This is not limited to computer parts; car reviews are often of cars specially set up for the reviewers. Lambo brings two cars to every review: one set up for going fast in a straight line and one for cornering work. If you dare mention this, or use the cars in ways they are not set up for and print it, you will never review another Lambo without buying it or borrow

    • by Sloppy ( 14984 )

      Actually, that article says you have to focus on gaming performance, not graphics. So: bring on the Dwarf Fortress benchmarks!

  • Is "Core 3" the official name of the Haswell packages?

    Un-fucking-believable.
    • by Matimus ( 598096 )
      The article is talking about Ivy Bridge, which would be the 3rd-generation Core. The "i3" loosely represents the performance SKU, "i3" being the low end. Haswell will probably be marketed as the 4th-generation Core.
  • Still consuming 140-150 watts at peak load vs. Intel's ~90. Good to see the graphics numbers coming up, though.

    The reason I highlight power is that the integrated graphics power could be a huge advantage in a low-end laptop. As long as it doesn't kill battery life.
  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday September 27, 2012 @09:55AM (#41477767) Homepage Journal

    But does it run linux worth a damn [phoronix.com]? Inquiring minds want to know. I got boned by buying an Athlon 64 L110/R690M machine for which proper Linux support was never forthcoming. Now I want to see power saving and the graphics driver work before I give AMD money for more empty promises about Linux support.

    • I'm running Ubuntu 12.04 without incident on the A8-3870 (the previous Llano architecture). Ubuntu + XBMC in a small shoebox mini-ITX enclosure is working great as an inexpensive HTPC for my home.

      Best,

    • For the last few years I have only been buying Intel hardware because it just works out of the box with all Linux distros. Is this AMD thing going to work out of the box in Linux?
      No, I'm not going to take time to download and install drivers. That crap is for M$ users. Yeah, yeah, I know Intel graphics are not the fastest thing out there. Save it for someone who cares. The Intel graphics are fast enough for the games that I write and play.

  • by sinij ( 911942 ) on Thursday September 27, 2012 @10:06AM (#41477963)
    I admit, I am one of the last few ideologues in PC gaming. I would never consider an AMD graphics card due to shitty drivers, and I would never consider an Intel CPU due to socket shenanigans. Yes, I am actually one of the rare few people who upgrades CPUs and cares about socket backwards compatibility.

    My current gaming rig uses a Zambezi 8-core AMD CPU; it's still adequate but it shows its age. I am disappointed AMD hasn't come up with an upgrade, but I can wait.

    My last gaming rig lasted me over 4 years and counting. I started with an Athlon X2 and ended with a Phenom II X4. It is still in use as a media PC, and still capable of gaming.

    Maybe it is dumb luck, but every AMD chip I had was running cool, overclocked well and lasted. Every Intel chip I owned didn't overclock well and had problems staying cool.
    • by jandrese ( 485 )
      For what it's worth, my gaming rig is going on 6 years old now and still runs most games at high settings. For game playing these days, you can pretty much get away with upgrading your machine at the same time the most popular consoles get upgraded, because everybody targets the consoles anyway. You can build a rig that is orders of magnitude faster than a PS3 or a 360, but most of that power will be wasted because the games still ship with crappy low-resolution textures, limited use of new graphics featu
    • Maybe it is dumb luck, but every AMD chip I had was running cool, overclocked well and lasted. Every Intel chip I owned didn't overclock well and had problems staying cool.

      I never had a chip that overclocked well until my Phenom II X3 720, which went from 2.8 to 3.2 on air, that was a nice free bump. Very reliable. But now I have a 2.7 GHz X6 (I don't play the latest games, but I do compile software and I do like to watch video at the same time) and it doesn't seem to have the same headroom.

      I figure it's just luck for the most part, since everyone I knew had better luck overclocking their intel chips than I did. I got my first K6 up just one step, and my first P2, too.

  • I was just wondering if the quality of the video drivers has improved at all since ATI was rebranded to AMD.
    ATI was notorious for how awful its video drivers were. My current laptop has a Mobility Radeon X1400. Whenever I play a video that uses Overlay, there is about a 2% chance that it will hard-freeze the system. I don't think I've ever seen anything like that on an Intel or Nvidia graphics product.
    I also sometimes get system-stopping delays that are several seconds long when running 3D games, it seem

    • by 0xA ( 71424 )
      Better? Yes but still not very good.
    • I was just wondering if the quality of the video drivers has improved at all since ATI was rebranded to AMD.

      No, ATI's drivers are still shit. But now, nVidia's drivers are also shit, so they look better by comparison. As it stands, Intel is the leader in working graphics.

      • by jandrese ( 485 )
        Intel is the leader in graphics drivers unless you actually try to play a game on them; then they're in dead last place, behind even AMD/ATI.
        • I don't have any personal experience but pretty much everyone else says that the latest greatest integrated intel graphics are tolerably credible. I've even heard them compared to my 240GT, and if that's true then frankly they're fantastic, if not the most powerful thing around. Not everyone needs SLI and greater than 1080p to be happy.

  • Article states, "They're designed to take down Intel's Core i3 chips, and the first application and gaming benchmarks are out."
    i3s are meant for basic desktop use and doing your homework, not a gaming rig. So they are saying, "Hey, our new chip is just as crappy at games as the i3"... Brilliant marketing.
    • by 0123456 ( 636235 )

      i3s are meant for basic desktop use and doing your homework, not a gaming rig.

      My laptop has a two-year-old i5 that, I believe, is basically just an i3 with turbo mode (i.e. it's a dual-core with hyperthreading rather than a real quad, and the CPU benchmarks are almost identical to a similarly clocked i3, since turbo mode is switched off under heavy load). It plays every game I've thrown at it so far on medium to high settings, limited by the GPU, not the CPU.

  • The Intel Core i3 is a dual-core chip. They're comparing that to an AMD quad-core chip. Sure, AMD's graphics are better than Intel's - you know, since they bought ATI. But AMD had better look out: Intel is now a full process node ahead of the entire industry, and might catch up in graphics performance just by widening the process gap and throwing more transistors at it.
  • from the 2011 Symposium on Application Accelerators in High-Performance Computing (http://dl.acm.org/citation.cfm?id=2060321/)

    "Depending on the benchmark, our results show that Fusion produces a 1.7 to 6.0-fold improvement in the data-transfer time, when compared to a discrete GPU. In turn, this improvement in data-transfer performance can significantly enhance application performance. For example, running a reduction benchmark on AMD Fusion with its mere 80 GPU cores improves performance by 3.5-fold over t
