AMD Trinity APUs Stack Up Well To Intel's Core i3

Barence writes "AMD's APUs combine a processor and graphics core on the same chip. Its latest Trinity chips are more powerful than ever, thanks to current-generation Radeon graphics and the same processing cores as AMD's full-fat FX processors. They're designed to take down Intel's Core i3 chips, and the first application and gaming benchmarks are out. With a slight improvement in application benchmarks and a much bigger one in games, they're a genuine alternative to the Core i3." MojoKid writes with Hot Hardware's review, which says the new AMD chips "[look] solid in gaming and multimedia benchmarks," noting that "the CPU cores clock in at 3.8GHz / 4.2GHz for the A10-5800K and 3.6GHz / 3.9GHz for the A8-5600K, taking into account base and maximum turbo speeds, while the graphics cores scale up to 800MHz for the top A10 chip."
  • Re:Wow (Score:5, Insightful)

    by h4rr4r ( 612664 ) on Thursday September 27, 2012 @10:35AM (#41477523)

    You know you can just not use it, right?
    Why bother looking for a chip without it?

    Heck, these days it is even usable and has good open drivers.

  • Re:Wow (Score:5, Insightful)

    by pushing-robot ( 1037830 ) on Thursday September 27, 2012 @10:41AM (#41477599)

    Or, more accurately, AMD's integrated video is better than Intel's integrated video (seriously, that's all they tested!).
    And these AMD chips still double the system power consumption [hothardware.com] over their Intel counterparts.

    So if you're part of the subset of gamers who morally object to dedicated video cards but still enjoy noisy fans and high electricity bills, AMD has a product just for you! Woo!

  • Re:Wow (Score:5, Insightful)

    by Skarecrow77 ( 1714214 ) on Thursday September 27, 2012 @10:44AM (#41477619)

    Ironic statement, since the main selling point of the chip being reviewed here is its integrated graphics.

    Which I find just silly, really. These are fine chips for building a PC for your little cousin who surfs the web and maybe plays World of Warcraft. For any real build, integrated graphics, for all their advancements, still read like:
    Intel: "Our new HD4000 graphics are nearly as fast as a mainstream card from 8 years ago!"
    AMD: "HAH, our new chip's graphics cores are as fast as a mainstream card from 6 years ago! We're two years of obsolescence better!"

    Even a $100 modern dedicated card will wallop either of these chips' solutions.

  • Re:Wow (Score:5, Insightful)

    by h4rr4r ( 612664 ) on Thursday September 27, 2012 @10:48AM (#41477673)

    You are actually bitching about less power than a light bulb used to use?

    At worst it looks like ~60 watts on the two higher-end units. How low-power is the monitor if that constitutes doubling the power? I am betting "total system" in this little test ignores the monitor.

    Oh noes, tens of dollars more per year in electricity! The HORRORS! However will I afford such an extravagance, which costs about as much per year as two drinks at the bar?

    If they are within 100 watts, I would call it a wash and be far more interested in computing power per dollar of upfront cost. AMD has traditionally done very well in that test and only started failing it very recently.
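
    As a rough sanity check of the "tens of dollars per year" figure, here is a minimal Python sketch. The ~60 W delta comes from the comment above; the hours of use per day and the $0.12/kWh electricity price are assumptions, so substitute your own numbers.

        # Back-of-the-envelope cost of a ~60 W power-consumption delta.
        # Assumptions (not from the article): 4 hours of use per day,
        # electricity at $0.12 per kWh.
        delta_watts = 60
        hours_per_day = 4
        price_per_kwh = 0.12  # USD

        kwh_per_year = delta_watts / 1000 * hours_per_day * 365
        cost_per_year = kwh_per_year * price_per_kwh
        print(f"~{kwh_per_year:.0f} kWh/year, about ${cost_per_year:.2f}/year")
        # Prints roughly 88 kWh/year, about $10.51/year; even running 24/7
        # the delta is only about $63/year at this rate.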

  • by Targon ( 17348 ) on Thursday September 27, 2012 @10:53AM (#41477741)

    In this day and age, CPU performance means less and overall performance is what people look for. A 1.5GHz quad-core is easily enough for the average home user's day-to-day tasks, and at that point GPU power for things like full-screen YouTube or Netflix videos becomes a bit more of a concern. We WILL have to wait and see where the performance numbers come in, but a 10% bump in CPU performance over AMD's last generation is expected.

  • Re:Wow (Score:4, Insightful)

    by h4rr4r ( 612664 ) on Thursday September 27, 2012 @10:56AM (#41477787)

    Why would they not compare their new entry-level CPU to their competitor's entry-level CPU?

    These CPUs are designed to be priced against the i3; of course they should be compared to the i3.

    You do realize that an NVIDIA card will work just fine in a computer with an AMD or Intel CPU, right?

  • Re:Wow (Score:3, Insightful)

    by binarylarry ( 1338699 ) on Thursday September 27, 2012 @10:58AM (#41477819)

    No one cares about dedicated graphics cards.... unless they play games.

    I don't know what region you're from where all the gamers just use the onboard GPU that comes with their mobo.

  • Re:Wow (Score:5, Insightful)

    by characterZer0 ( 138196 ) on Thursday September 27, 2012 @11:00AM (#41477849)

    I gave up on ATI's drivers too and bought a new laptop with an nVidia card. The state of the drivers is so pathetic that the laptop will not even boot nine times out of ten unless I disable the discrete card and use the integrated Intel GPU, because otherwise Optimus screws everything up. I will take occasionally buggy ATI over completely non-functional nVidia next time.

  • Re:Wow (Score:5, Insightful)

    by TheLink ( 130905 ) on Thursday September 27, 2012 @11:06AM (#41477957) Journal
    Does it really degrade performance? I've had motherboards with Intel graphics, and I just plug an ATI/nVidia video card into them, install the drivers, and it seems to work. Then if the video card fails (which does happen) I have the Intel graphics to fall back on, so I can still use the PC for normal desktop stuff even if I can't play games that require higher graphics performance.
  • Re:Wow (Score:4, Insightful)

    by fuzzyfuzzyfungus ( 1223518 ) on Thursday September 27, 2012 @11:21AM (#41478189) Journal

    Except that none of the benchmarks actually cover CPU speed, because AMD have put all the reviewers under NDA until the chip is released. That rather suggests they haven't caught up; they're just showing off the better IGP, which no one playing games will use anyway, and which anyone not playing games won't give a shit about.

    I'm not hugely sanguine about AMD's prospects (unfortunately, it isn't going to be pretty if the world is divided between x86s priced like it's still 1995 and weedy ARM lockdown boxes, so it would be nice if AMD could survive and keep Intel in check); but there is one factor that makes IGPs much more of a big deal than they used to be:

    Laptops. Back in the day, when laptops were actually expensive, the bog-standard 'family computer from Best Buy, chosen by idiots and sold by morons on commission' would be a desktop of some flavor. Unless the system was terminally cheap and nasty and entirely lacked an AGP/PCIe slot, it didn't matter what IGP it had, because if little Timmy or Suzy decided they wanted to do some gaming, they'd just buy a graphics card and pop it in.

    Now, it's increasingly likely that the family computer will be a laptop (or occasionally an all-in-one or other non-mini-tower) of equally unexciting quality but substantially lower upgradeability. If you want graphics, you either use what you bought or you buy a whole new computer (or just a console).

    This makes the fact that some, but not all, IGPs can actually run reasonably contemporary games (especially at the shitty 1366x768 that a cheap laptop will almost certainly be displaying) much more important to some buyers and to the PC market generally.

  • Re:Wow (Score:4, Insightful)

    by tlhIngan ( 30335 ) <[ten.frow] [ta] [todhsals]> on Thursday September 27, 2012 @11:52AM (#41478557)

    Or use Windows or possibly Gnome... or do OpenCL or OpenGL programming... or-

    The list goes on. The fact that people are still selling craptacular integrated video chipsets in this day and age saddens me greatly. Guys, it's 2012...pony up for a dedicated video card with dedicated video ram. Quit trying to save a buck or two on a component you really don't want to be cheap on.

    Well, I think you can do OpenCL on Intel HD 3xxx/4xxx chips these days (a quick way to check is sketched after this comment). At least Apple seems to: on the retina MBP they have a custom shader to handle the scaling from the double-size framebuffer down to native panel size (if you're running one of the higher-than-half-size modes, e.g. 1920x1200), so that when you switch between GPUs you don't notice it happening the way you would if the panel were driven natively.

    As for why integrated graphics - easy: price. The customer sees $500 laptops, and they end up demanding cheap laptops. Think of all those /. arguments where "Apple is expensive! Their laptops start at $1000 when everyone else's start at $500!"

    Hell, we call PCs (desktops and laptops) costing over $1000 "premium" nowadays. Expensive even, when we're constantly inundated with sub-$500 laptops and PCs.

    It's why netbooks died quickly after the launch of the iPad (no manufacturer wanted to build no-profit PCs, and tablets at $500 were far more profitable), why you can get i7 laptops with integrated graphics and 1366x768 screens, and why "ultrabooks" costing $1000+ (with high-res screens!) seem to be the ones everyone's dumping money into, etc.

    The race to the bottom has led manufacturers to focus on what everyone says they should look for in a PC - GHz (more is better) and GB (more is better, once for RAM and once for HDD). Which means stuff like graphics and screen resolution (two of the most expensive parts) gets ignored and skimped on because consumers don't care.

    Hell, a retina MBP fully tricked out costs under $4000, which only half a decade ago would've been considered normal for a high-end PC. These days that puts it basically in the top-end, "for 1%ers only" category.
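
    The parent's claim that OpenCL runs on Intel HD 3xxx/4xxx graphics is easy to verify on a given machine. Here is a minimal sketch using the pyopencl bindings; it assumes pyopencl and a vendor OpenCL runtime are installed, and it simply lists the OpenCL platforms and any GPU devices they expose.

        # Enumerate OpenCL platforms and GPU devices to see whether the
        # integrated GPU (e.g. Intel HD 3000/4000) exposes an OpenCL driver.
        # Assumes the pyopencl package and a vendor OpenCL runtime are installed.
        import pyopencl as cl

        for platform in cl.get_platforms():
            print(f"Platform: {platform.name} ({platform.version})")
            try:
                gpus = platform.get_devices(device_type=cl.device_type.GPU)
            except cl.Error:
                gpus = []  # this platform reports no GPU devices
            for dev in gpus:
                print(f"  GPU: {dev.name}, "
                      f"{dev.max_compute_units} compute units, "
                      f"{dev.global_mem_size // (1024 ** 2)} MB global memory")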

  • Re:Wow (Score:4, Insightful)

    by spire3661 ( 1038968 ) on Thursday September 27, 2012 @12:27PM (#41478943) Journal
    We have LONG passed the point where you need the latest fire-breathing hardware to be a 'gamer'. People who go SLI, exotic cooling, etc. are HARDWARE junkies first; it's not the defining characteristic of a gamer. Back in the day we built exotic hardware out of necessity to play games, not out of hardware adulation. I have 4 'hand assembled' Sandy/Ivy Bridge systems, and only one has a real video card in it. The rest handle graphics just fine. All of them play TF2 '2fort' @ 1080p no problem. I'm not saying it's the best image or experience, but it runs well and is cheap.
  • Re:Wow (Score:5, Insightful)

    by petermgreen ( 876956 ) <plugwash@nOSpam.p10link.net> on Thursday September 27, 2012 @01:14PM (#41479505) Homepage

    Because it's adding heat for a part you're not using

    Personally I doubt a graphics core that is turned off draws any significant power, particularly compared to the massive gulf in performance per watt between Intel and AMD at the moment. You could buy a Xeon chip where the graphics core is lasered off rather than merely disabled in software, but I doubt it's worth the extra cost to do so.

    and sucking up die space?

    Meh, what does that matter to me as the user? Yes, a slightly smaller die is perhaps a little cheaper to make, but we all know price is only loosely tied to cost anyway, and it's not like a smaller die means a smaller total area taken up by the processor. The package and heatsink are already many times bigger than the die.

  • Re:Wow (Score:4, Insightful)

    by Kjella ( 173770 ) on Thursday September 27, 2012 @02:34PM (#41480665) Homepage

    Hell, a retina MBP fully tricked out costs under $4000, which only half a decade ago would've been considered normal for a high-end PC. These days that puts it basically in the top-end, "for 1%ers only" category.

    Sorry, but $4000 wasn't anything like normal even for a high-end PC in 2007; that'd be a "1%er" PC with a $999 Intel Core 2 Extreme CPU and dual $599 nVidia GeForce 8800 GTXs ($999 + 2 x $599 is about $2,200, leaving plenty of cash for the rest). I think you'd have to go back to the 90s, and probably the early rather than the late 90s, to find prices like that.

  • Re:Wow (Score:4, Insightful)

    by Ironhandx ( 1762146 ) on Thursday September 27, 2012 @07:15PM (#41483621)

    My kingdom for a mod point or 10...

    I tried Linux years ago on my Radeon 7500 and couldn't figure out why the hell everyone was bitching about the ATI drivers. I've had Linux installed on pretty much every generation of ATI card since (save that I skipped the HD2xxx and HD3xxx series, plus the Xxxx-series card was an X600 All-In-Wonder, which ran fantastic, actually better than on Windows) and I still haven't had problems.

    The only caveat is that I usually upgrade a generation, or close to a generation, behind; by then Linux driver support is in place. By the same token, however, I have 2 laptops here with nVidia mobile GFX in them that have the same damn problems the above user described. I'll get the things working great for about 2 minutes, and then another update will hit and fuck the whole thing up again.

    My plan was to leave my desktops on Windows for gaming and run Linux on my HTPCs and laptops, because I don't need those to run all the latest games. I was sorely mistaken >_< I do have an E350 laptop that, beyond some sound issues (common to pretty much everything I've ever installed a Linux distro on, except that one X600 AIW that for some crazy reason worked fantastic with TV out), works great.

    On top of that, I like my hardware to fail when it fails, not throw random artifacts, bugs, the works, and make me spend 3-5 fucking hours until I finally figure out that the video card is overheating, but instead of crashing the computer or rebooting itself it's throwing the whole system for a loop.

    When people come to me with mysterious hardware-related computer problems, I now start with whatever nVidia parts are installed to diagnose it. It's not always correct, but it's saved me enough time checking other shit that it's now my standard practice.

"A car is just a big purse on wheels." -- Johanna Reynolds

Working...