AMD Hardware

AMD Brings New Desktop Chips Down To 65W

crookedvulture writes "AMD's new Llano-powered A-series APUs have had a difficult birth on the desktop. The first chips were saddled with a 100W power rating, making them look rather unattractive next to Intel's 65W parts. Now, AMD has rolled out a 65W version of Llano that's nearly as fast as its 100W predecessor despite drawing considerably less power under load. This A8-3800 APU doesn't skimp on integrated graphics, which is key to Llano's appeal. If you're not going to be using the built-in Radeon, the value proposition of AMD's latest desktop APUs looks a little suspect."
  • by Joe_Dragon ( 2206452 ) on Tuesday October 04, 2011 @04:42PM (#37604844)

    as you get more for your $ than with an Intel board.

    • Re: (Score:1, Informative)

      Performance desktop user here... Let me know when they start beating out the i5-2500K or i7-2600K CPU performance-wise (even if the chip is more expensive!). I've got my i7-2600K running at 4.4 GHz stable without playing with the voltages or running turbine-aircraft-engine coolers (as a matter of fact, the PC is almost silent). I can't think of any features I am missing on my P67 rev2 board which would make me trade in the performance of the CPU I have, either. I love this chip!

      I used to like AMD quite a lot (P4 e
      • by blair1q ( 305137 )

        Let me know when they start beating out the i5-2500K or i7-2600K

        They may never do that, if they keep getting all excited about a part that has less than half the performance.

        Looking at the price-performance chart from the summary, it's clear the i5-2500K is leaps and bounds better than any other chip currently available.

        Of course, people here are more performance- than price-conscious, so those i7-##0X jobs off to the right are drooltastic. Especially when you check on Pricewatch and find them for $200 less than TechReport is listing.

        My home box is getting unstable in

        • Wait a minute... How the heck is your CPU the most important component to upgrade? Seriously, get the 3rd or 4th fastest processor. Do you know how much time your CPU sits idle? Sure, an SSD will help some, but put in more RAM, a better video card... heck, get a nicer, bigger monitor so you can see more physical desktop... but the CPU?? Unless you do video encoding for a living you will never know the difference.

          • by swalve ( 1980968 )
            It's not how long it spends idle, it is how long it spends near 100% usage. As long as you can peg it, you can benefit from more processor.
        • Since you seem to care vaguely about stability, you might want to know that none of Intel's current desktop chipsets support ECC memory. I'm speccing a system right now and it looks like I'm going to have to buy a non-Intel chip.

          PS. If you still want Intel, the Xeon E3 1200 series is the closest thing to a desktop chip with ECC support.
        • Of course, people here are more performance- than price-conscious, so those i7-##0X jobs off to the right are drooltastic.

          Except they aren't, because while they beat the 2600K in highly multithreaded tasks, they lose badly to it in tasks with four or fewer active threads. As far as I can tell, most desktop tasks have four or fewer active threads. Plus, by buying them you would be buying into a dying platform. So unless I really needed the features of the LGA1366/X58 platform, I would probably not buy one at this point.

          And since there are rumors that Intel is flattening out its roadmap (no sense overspending when the competition is as lame as it is), anything built today will remain ego-boosting for longer than usual.

          Today you have a choice between LGA1155, which has the Sandy Bridge cores and native SATA 6G but is a mainstream platform

      • by dbIII ( 701233 )
        Performance server user here - that 48-core Supermicro AMD system from well over a year ago buries any of those Intel systems you are talking about. Intel now has parts with fewer cores but higher clock speeds at around three times the price.
        Everything under serious development is being written as multithreaded if it isn't already. A fast core is pointless if it's switching context all the time to run something that would be on another core if you had more cores.
        • that 48-core Supermicro AMD system from well over a year

          I love those machines. They are frankly awesome and astonishingly cheap and dense compared to the competition. The funny thing is that they beat most of the real specialist high density crazy-servers on CPU grunt per U and completely bury them in price. They're also similar in power (worse if you believe the vendor information, which I don't). The huge system image (mine are configured for 256G) is also really nice for certain kinds of problem.

          I've rec

          • by dbIII ( 701233 )
            One nice thing I could do with it (I only have one): when I had a braindead application that ran like crap because it was doing its sorting on disk, I just fed it a 20GB ramdisk and got better than 100x the performance. Of course, a sane application would check how much memory was available before resorting to disk.
            Fairly stupid software licencing practices made the thing far more cost-effective than a cluster (even shifting the licences onto the cluster of 8-core machines I have would cost more than the 4
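            A minimal sketch of the ramdisk trick described above, assuming a Linux host with root access; the mount point, the size, and the TMPDIR redirection are illustrative placeholders rather than anything the poster specified:

                import os
                import subprocess

                # Create a RAM-backed scratch area for an application that insists on
                # sorting on disk. Requires root; size and mount point are placeholders.
                os.makedirs("/mnt/ramdisk", exist_ok=True)
                subprocess.run(
                    ["mount", "-t", "tmpfs", "-o", "size=20g", "tmpfs", "/mnt/ramdisk"],
                    check=True,
                )
                # Then point the application's temp/sort directory at the ramdisk,
                # e.g. by launching it with TMPDIR=/mnt/ramdisk or via its own config.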
    • by geekoid ( 135745 )

      For the price? Yeah, and Yugos are better cars than Mercedes, for the price.

      • Well, a better analogy would be a Corvette vs. a Ferrari. The Corvette isn't quite as fast and doesn't do quite as well in the corners... but a Corvette is a lot more accessible to most people... and for everyday things it is more than enough and cheaper to maintain.

        My Phenom II cost me $125 and isn't much lower-performing than some of the newer $300 i5s (<5% difference), and smokes some of the older $300+ i7s. And the socket is forward-compatible with many of the newer Phenoms and Bulldozer coming out in a few mont
        • Your Phenom II (I'm guessing a 965 at $125) isn't much lower-performing than some of the newer $300 (wait, no, $190 for an i5 2400) i5s (it's more like 30% compared to an i5 2400 generally http://www.anandtech.com/bench/Product/102?vs=363 [anandtech.com]), but neither is an i3 2100. In fact, the i3 2100 will beat your CPU silly in most places (http://www.anandtech.com/bench/Product/102?vs=289), and costs $125 too ;)

    • Comment removed based on user account deletion
      • Go run Firefox on these with lots of addons and then tell me it is a good chip. :-)

        That should be the end-all of benchmarks.

        • Comment removed based on user account deletion
        • by Sloppy ( 14984 )

          "Low-end" CPUs are definitely underrated. I don't know exactly what addons you're talking about, and no doubt there are some good ones that make a low-end CPU insufferable, but ..

          We've got an ION (Atom 330+Nvidia 9400) (which a Brazos easily beats) in a box that somehow perversely turned out to be the most-used machine in the house. I did not plan for that; it was an accident. It was originally just intended for MythTV (where all I cared about was 1. must decode video 2. minimize total wattage), and ION

  • "If" (Score:5, Insightful)

    by Baloroth ( 2370816 ) on Tuesday October 04, 2011 @04:45PM (#37604892)

    The whole point of these chips is the built-in Radeon, whether it's for GPU or GPGPU performance. I'm not even sure why you would compare it solely as a processor, and I'm quite sure that isn't a fair or reasonable comparison. Nor is it one that anyone who might actually buy a Llano wants to make. For high performance, you'll get a dedicated card anyway. Anyone looking at this will use the integrated Radeon; that's the point.

    • Get a Llano. Get a 6xxx Radeon. Then get ANOTHER 6xxx Radeon. You've got 3-way CrossFire.

      You were speaking of performance?
      • CrossFire generally doesn't scale well enough to make 3-way CrossFire worthwhile when you have two mid-to-high-end cards and one slower GPU. Also, the fastest discrete GPU that supports the hybrid CrossFire right now is the 6670, I believe.

        PS: I'm a Llano owner (A4-3400).
    • But if you look at the benchmarks, the Intel i3 beats the AMD in almost every gaming benchmark too. So what does the AMD chip have to offer if its supposed on-die GPU can't even beat Intel's?
      • Link?

        In gaming benchmarks I can put a $2,000 Core i7 Xeon with an integrated graphics chip up against a $499 Dell with just an i3 but a Radeon 6950 thrown in. Guess which computer will trounce the benchmarks by a very large margin?

        The GPU is what is important in gaming and regular desktop usage, with accelerated HTML5 browsing and Metro around the corner. The CPU is less important. Also, like another Slashdotter posted, you can always add a dedicated card and then CrossFire it with the CPU/GPU :-D ... now tha

        • The Xeons don't have the high-performance Sandy Bridge GPU, so that's not terribly surprising. Only the desktop chips have the new Intel GPU.

          • Actually, an E3 Xeon ending with a 5 in its model number has an HD 3000.

            • I've been looking at the Xeon E3s, and Intel's knowledge base seems to indicate they lack the hardware GPU features.

              For instance, look at the E3 1270 (link) [intel.com]. Under "Graphics specs", it says "no" to all of the graphics features, including "processor graphics".

              I've been looking at these closely for the last few weeks, and it seems you specifically need a separate GPU chipset on the motherboard to handle the graphics, as the CPU will not do it.

        • Comment removed based on user account deletion
      • by Pulzar ( 81031 )

        In those tests, the i3 is being tested with a discrete graphics card, compared to the AMD with the same discrete card. Basically, it's a CPU-vs-CPU test, which is pretty ridiculous because both are targeted at users who will not buy discrete cards...

        The actual i3-vs-A8 tests with their respective integrated graphics come later in the article, here: http://techreport.com/articles.x/21730/8 [techreport.com]. The results aren't even close - the AMD is more than playable, the i3 is not.

      • by cynyr ( 703126 )

        Find me a game that makes use of more than 2 cores...

        Or better yet, do a "while re-encoding this 1080p source (link) using these ffmpeg/libx264 settings (link) on n-1 cores, here is the FPS of ${GAME}", or even simply "we started a virus scan and then decided to play ${GAME}".

        Can we please move past the single- and dual-threaded benchmarks? Go look at the x264 encode times using all the cores for both chips, I'll wait... Yep, the AMD wins at a given price point. I don't know about you, but I usually have $X to s
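        A rough sketch of the "encode while you game" test proposed above, assuming an ffmpeg build with libx264 is installed; the file names and preset are placeholders:

            import os
            import subprocess

            # Leave one core free for the game; give the remaining n-1 to the x264 encode.
            threads = max(1, (os.cpu_count() or 2) - 1)

            subprocess.run(
                [
                    "ffmpeg", "-i", "source_1080p.mkv",   # placeholder input
                    "-c:v", "libx264", "-preset", "medium",
                    "-threads", str(threads),             # cap the encoder at n-1 threads
                    "-c:a", "copy",
                    "output_1080p_x264.mkv",              # placeholder output
                ],
                check=True,
            )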

    • Seriously. For the average consumer who surfs the web and watches movies, built-in GPUs are more than adequate. They can even play casual games. The reduction in power puts it on par with the Core i5 now in most cases; however, Intel has lower-power Core i3s and at least one lower-power Core i5.
      • Low-end Llano notebooks are able to play StarCraft 2.
      • I disagree. Integrated GPUs have such high RAM latency that even hi-def videos have trouble keeping up.

        IE 10 PPR and Firefox 7 show a big difference in performance depending on the GPU for sites and ads that take 100% CPU utilization. Metro will show this when Microsoft adds iOS-style graphical effects. Llano may not be super fast, but it is light-years ahead of regular GPUs because it is integrated with the CPU's RAM controller. If you are on a computer with a decent dedicated video card, fire up IE 9 (I know, blasphemy h

    • The thing I find strange about Llano is... who wants a low-end Radeon but can't make do with an HD 2000 or HD 3000? I can't think of anyone who actually wants a "real" graphics chip but doesn't want a *real* graphics chip on the desktop.

      They look great for laptops, at low power usage, but for desktop... really no.

      • by Sloppy ( 14984 )

        Who wants a low-end Radeon but can't make do with an HD 2000 or HD 3000?

        That's my thinking too, but it turns out there is an answer. The niche I see for Llano is where someone is looking at the absolute dollars spent on the machine, combined with having some minimum standard of performance for both the GPU and the CPU. That is, someone who doesn't want pre-Sandy Bridge Intel integrated graphics (the i965 isn't enough even if the CPU is) or a weak CPU (ION's Atom isn't enough even though the Nvidia 9400 is), so bu

  • by unity100 ( 970058 ) on Tuesday October 04, 2011 @04:45PM (#37604900) Homepage Journal
    http://pente.hubpages.com/hub/AMD-Fusion-APU-Processor-Specifications [hubpages.com]

    For it's possible to play StarCraft 2 with that shit, even on a low-end portable, if it has the Llano.

    In a desktop, you can even CrossFire it with its equivalent 6xxx card, reaching major performance for a ridiculous price.

    If you went the traditional route, you would need to get the CPU, then get a separate 6xxx-equivalent card, and then one more to do the CrossFire.

    Llano parts give you one good CPU and one good graphics card in one shot, and in the future they will be upgradeable: you will be able to upgrade both the CPU and the 'graphics card' of your rig by upgrading just one piece of hardware.
    • A question regarding the Crossfire capability: does it automatically enable in, say, a laptop (specifically, an ASUS K53TA) with an A4 APU and a Radeon 6550? Or is it actually the part where ATI Control asks me which graphics core it should use for a given application?

      • Re: (Score:3, Informative)

        by unity100 ( 970058 )
        If the board you have is CrossFire-capable and the generations match each other (it has to be in the XXYY range and the first XXes must match, from what I know, but exceptions are possible), ATI Catalyst Control Center will see that you have a CrossFire possibility and may auto-enable it. You may enable CrossFire, or disable it. With Windows 7 and the Vision Control Center more customizations may be possible; however, if you consider that hardware acceleration is even used for web page rendering in Firefox, you would
  • With OpenCL 1.1 throughout, and more applications leveraging it even on Linux, it's rather clear that once the desktop environments (KDE, GNOME) catch up in some respects with OS X on using OpenCL for GPGPU, you will be using all that power without even knowing it. Games most certainly will be using it. GIMP, Blender, Inkscape and more are rolling it into their products.
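    As a minimal sketch of the first step any GPGPU-aware application takes, here is how OpenCL device discovery looks through the pyopencl bindings (assumed to be installed); a Llano's on-die Radeon would show up as one of the listed devices:

        import pyopencl as cl

        # Enumerate OpenCL platforms and the devices they expose; an application
        # would pick a GPU device here before offloading work to it.
        for platform in cl.get_platforms():
            print("Platform:", platform.name)
            for device in platform.get_devices():
                print("  Device:", device.name)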
    • There's that, but there are also dual GPUs, which have been around for a while. I think Apple has offered dual-GPU laptops for years now, where the big one only gets tapped for GPU-intensive use, saving battery power.

      A desktop isn't as sensitive to power use as a laptop is, but you could still conceivably cut down on the electrical bill and cooling costs.

    • Because the Linux drivers are no good on this. I've got a 3650 with Fedora 15, and most of the stuff works under Linux 3.0, but the video on my display is shifted up and to the left for no good reason, and tinkering with modelines didn't move the picture at all. I'm still using the CPU, but I put my Nvidia card back in so I could use my display.

      • by tyrione ( 134248 )
        Compositing and leveraging the GPGPU is up to the desktop environment and the application. Only OS X has OpenCL/GCD system-wide and app-wide, thanks to the compositor, Quartz, and the WindowServer all leveraging OpenCL natively and working with a fully accelerated OpenGL 3.2 environment in 10.7. Linux is still sucking hind tit with OpenGL 1.4. It's only with KDE 4.7.x that OpenGL ES 2.0 bits are now being leveraged. At that rate it'll take years for Linux to catch up. Hopefully, with X moving to Wayland, the gap won
  • by Anonymous Coward on Tuesday October 04, 2011 @05:04PM (#37605190)

    Not sure if I'm supposed to spill the beans on this, but I'm an AC, dammit. I'm in their focus-group thing, and apparently they're working real hard on a Crossfire-like solution right now so your "free" on-chip GPU isn't being wasted if you throw down for a discrete card. They haven't been making much words about this, though. Odd.

    • Re: (Score:1, Flamebait)

      by geekoid ( 135745 )

      Ah yes, posting AC dissolved all obligations~

      What an untrustworthy piece of shit you are.

    • by ackthpt ( 218170 )

      Not sure if I'm supposed to spill the beans on this, but I'm an AC, dammit. I'm in their focus-group thing, and apparently they're working real hard on a Crossfire-like solution right now so your "free" on-chip GPU isn't being wasted if you throw down for a discrete card. They haven't been making much words about this, though. Odd.

      I figured they were trying to fly under the RADAR until they got to some point.

      • by Zuriel ( 1760072 )

        I figured they were trying to fly under the RADAR until they got to some point.

        Google says they aren't doing a very [softpedia.com] good [wikipedia.org] job [softpedia.com].

    • This is already known. I read about it a few weeks ago. You didn't spill any beans I'm afraid.

    • Well, considering that I was wondering if you could do precisely this (and if not, why the hell not) with the Llano, I don't think this will be considered "leaking" any information. Well, that and they demoed a scaling programming language for using multiple GPUs several months ago, which is basically a similar idea. AMD could make massive inroads on Intel if they can get such a system working well.
    • by dabadab ( 126782 )

      Sorry, are you from the past?... The Dual Graphics option for Llano has been in the news since, well, basically since the existence of Llano has been known. It has also been featured in basically all the Llano reviews (like this one [anandtech.com] from June), so I am not sure what you mean by "not making much words about this".

    • > They haven't been making much words about this, though. Odd.
      Probably because it only works on DX10/11 games. On DX9 games it actually makes them run *slower*.

  • Forget 3D, what I'd like to know is how good the video codec support is under Linux. Specifically, de-interlacing and pulldown of 1080i video for MPEG-2, H.264 and VC-1? I'd really like to dump my Windows box, but so far the very best de-interlacing - both quality and coverage - seems to be with Nvidia under Windows

    • For Linux, it appears that support is rather new. An SDK for XvBA [wikipedia.org] was released by AMD in February 2011. VAAPI (Intel) and VDPAU (Nvidia) have been out longer, comparatively.
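      One hedged way to check what the VAAPI path actually advertises on a given box is to parse the output of the vainfo utility (from libva's tools, assumed to be installed); profile names such as VAProfileH264High or VAProfileVC1Advanced indicate hardware decode support for the codecs asked about above:

          import subprocess

          # List which decode profiles the VAAPI driver reports for MPEG-2, H.264 and VC-1.
          result = subprocess.run(["vainfo"], capture_output=True, text=True)
          profiles = [line.strip() for line in result.stdout.splitlines()
                      if line.strip().startswith("VAProfile")]
          for codec in ("MPEG2", "H264", "VC1"):
              hits = [p for p in profiles if codec in p]
              print(codec, "->", hits if hits else "no hardware support reported")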
      • My personal experience has been that with Nvidia parts, their proprietary driver "just works" under Linux, even on occasions when it can't identify the part itself.

        With ATi/AMD... not so much; more often than not, trying to install the proprietary driver is like pulling teeth out of a pitbull's mouth. Even if I get it to install, it only sort-of-kind-of works. Trying to uninstall it is downright insane.

        I don't know why ATi/AMD suck this hard, or why it's so much effort to get anything they made to work. But frankl

        • I ran Fedora 14 before the GNOME 3 fiasco and then switched to Windows 7. My system had an ATI 5750 and used Fedora's ATI proprietary drivers, and it worked fine. Fluid animations, great video support at 1080p, and it never crashed. It was a very stable system. I do admit I did not do gaming or CAD on it; I dual-booted to Windows 7 to run WoW or anything like that.

          AMD has higher-quality hardware and better cards in my opinion. Its drivers are always so-so and conservative compared to Nvidia's. I have had 2 nvidi

          • I have had 2 nvidia chipsets and cards fail within the last 5

            I just sold my 4-year-old dual-slot Sapphire 3870 to my friend's sister's family, and they are playing The Sims 3 as a family with that card.

        • With ATi/AMD... not so much; more often than not, trying to install the proprietary driver is like pulling teeth out of a pitbull's mouth. Even if I get it to install, it only sort-of-kind-of works. Trying to uninstall it is downright insane.

          Having recently switched to an ATI card, using Ubuntu, to these observations I say LOL no, yeah pretty much, and LOL no.

          Installation is simple. System -> Additional Drivers -> Enable ATI proprietary drivers -> Reboot (this part sucks, but oh well).

          Removal is the same procedure except the button says "Disable" instead of "Enable". There is absolutely nothing insane about it at all.

          Now as far as the "works" part, that's a different issue... It mostly works, and when it works it works excellently, but

      • by Zuriel ( 1760072 )

        There's an SDK out *now*, but they're late to the party. No one's really interested in implementing a *third* API, so XvBA only gets used via the VAAPI --> XvBA wrapper. There's also a VAAPI --> VDPAU wrapper and direct VAAPI support for Intel IGPs, so the competition seems to be between VDPAU for its relative maturity and polish and VAAPI for its wide support.

        I don't believe VAAPI has *any* hardware-based deinterlacing yet.

        On an unrelated note, why are we still doing interlacing on 1080p LCD panels?

        • by cynyr ( 703126 )

          I was wondering the same myself. I tend to go for the 720p stuff over the 1080i for the same reasons. It scales up nicely.

        • by Tacvek ( 948259 )

          1080i@50Hz (i.e. 50 "half-frames" per second) effectively encodes more visual data than either 1080p@25Hz, or stretched 540p@50Hz.

          It encodes more than 1080p@25Hz by including information sampled twice as frequently, leading to smoother motion.

          It encodes more than stretched 540p@50Hz by way of discriminating between twice the number of vertical lines, and thus providing twice the apparent vertical resolution on still (or slow moving) objects.

          Obviously it provides less visual data than 1080p@50Hz, but 1080i@5
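          A back-of-the-envelope check of the raw sample rates involved (illustrative arithmetic only): all three formats push the same number of luma samples per second, so 1080i@50Hz is best read as a trade-off between the other two rather than extra data:

              # Raw luma samples per second for the three formats discussed above.
              p1080_25 = 1920 * 1080 * 25   # 1080p @ 25 frames/s -> 51,840,000 samples/s
              i1080_50 = 1920 * 540 * 50    # 1080i @ 50 fields/s -> 51,840,000 samples/s
              p540_50  = 1920 * 540 * 50    # 540p  @ 50 frames/s -> 51,840,000 samples/s
              assert p1080_25 == i1080_50 == p540_50
              # Same raw rate: 1080i spends it on full vertical detail for static content
              # and on 50 Hz temporal sampling for motion, which is the trade-off above.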

  • For a good dedicated graphics card, you can expect to burn at least an extra 150 watts under load.

    If a desktop chip delivers the equivalent of such a card built in, and its entire power consumption is 100 watts, that is NOTHING compared to a separate CPU + separate graphics card combo. Such a system would draw at minimum 200 watts, so 100 watts compared to that is nothing.

    Looked at in that light, 65-watt consumption becomes something phenomenal.
    • I also really don't see what the big deal is about TDP - all that determines is what sort of HSF you use. The important thing is the idle power because that is where the CPU sits most of the time.

  • I threw in my lot with a Bulldozer-capable board and two discrete Radeon cards for my most recent upgrade. Now I am thinking that if I had gone the Llano route and shoved in another 6xxx card, the performance would be much better as things currently stand.
  • http://www.newegg.com/Product/Product.aspx?Item=N82E16819103942 [newegg.com]

    It is one 4-core CPU and one decent graphics card in one package, and it's just $139. You would need to shell out $139 just for a decent graphics card if you went with an external one.

    And great reviews:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16819103942 [newegg.com]
  • by Cajun Hell ( 725246 ) on Tuesday October 04, 2011 @10:34PM (#37608096) Homepage Journal

    Is this a joke? The integrated graphics are the whole fucking point! If you don't want 'em, you can get a Phenom II (or maybe even an Athlon II) that uses less power and runs faster.

    If you don't use it as a car, the Honda Civic isn't really all that great a value, comparing slightly unfavorably to Stone Ruination IPA in most video compression benchmarks.

  • I think those AMD 45W quad-cores (6??e series) were pretty cool (both metaphorically and literally). Intel rarely makes big desktop processors in such a low TDP range.
