AMD Brings New Desktop Chips Down To 65W
crookedvulture writes "AMD's new Llano-powered A-series APUs have had a difficult birth on the desktop. The first chips were saddled with a 100W power rating, making them look rather unattractive next to Intel's 65W parts. Now, AMD has rolled out a 65W version of Llano that's nearly as fast as its 100W predecessor despite drawing considerably less power under load. This A8-3800 APU doesn't skimp on integrated graphics, which is key to Llano's appeal. If you're not going to be using the built-in Radeon, the value proposition of AMD's latest desktop APUs looks a little suspect."
AMD also has better MBs for the price (Score:3)
as you get more for your $ than with an Intel board.
Re: (Score:1, Informative)
I used to like AMD quite a lot (P4 e
Re: (Score:3)
Let me know when they start beating out the i5-2500K or i7-2600K
They may never do that, if they keep getting all excited about a part that has less than half the performance.
Looking at the price-performance chart from the summary, it's clear the i5-2500K is leaps and bounds better than any other chip currently available.
Of course, people here are more performance- than price-conscious, so those i7-##0X jobs off to the right are drooltastic. Especially when you check on Pricewatch and find them for $200 less than TechReport is listing.
My home box is getting unstable in
Re: (Score:2)
Wait a minute... how is your CPU the most important component upgrade? Seriously, get the 3rd or 4th fastest processor. Do you know how much time your CPU sits idle? Sure, an SSD will help some, but put in more RAM, a better video card... heck, get a nicer, bigger monitor so you can see more physical desktop... but CPU? Unless you do video encoding for a living you will never know the difference
Re: (Score:2)
Re: (Score:2)
PS. If you still want an Intel, the Xeon E3 1200 series is the closest ECC-capable platform to a desktop chipset.
Re: (Score:2)
Of course, people here are more performance- than price-conscious, so those i7-##0X jobs off to the right are drooltastic.
Except they aren't, because while they beat the 2600K in highly multithreaded tasks, they lose badly to it in tasks with four or fewer active threads. As far as I can tell, most desktop tasks have four or fewer active threads. Plus, by buying them you would be buying into a dying platform. Therefore, unless I really needed the features of the LGA1366/X58 platform, I would probably not buy one at this point.
And since there are rumors that Intel is flattening out its roadmap (no sense overspending when the competition is as lame as they are), anything built today will remain egoboosting for longer than the usual.
Today you have a choice between LGA1155, which has the Sandy Bridge cores and native SATA 6Gbps, but it's a mainstream platform
idiot. (Score:1)
I also really don't see the market for AMD's APU offerings. I think part of the reason that Intel invests so little in the integrated graphics on its desktop chips is precisely because there is so little interest beyond basic 2D, HTPC, and very light gaming usage. For anything else, users are going to buy a dedicated graphics card.
If you are unable to see that, then don't go deciding what is troll and what is not.
I just had to advise a couple of guild members because they had to upgrade their outdated hardware in order to be able to play SWTOR with full settings. They can't buy desktops due to mobility requirements, and they don't have the finances to shell out on a high-end gaming laptop.
There is that market for AMD's APU offerings. A low-end notebook can play StarCraft 2. The situation won't be too different on the desktop - people will rea
Re: (Score:1)
Not just for games, either.
If you play with IE 9 and the IE 10/Windows 8 preview, you will see fluid scrolling when you hit the up or down arrow keys, with no flickering. Go to www.google.com, do a generic video search, and try it if you have a nice card. Firefox 7 is catching up with video acceleration, but it has slow scrolling and will soon have fast scrolling. Chrome will eventually have fast scrolling with fully accelerated HTML5 canvas too. This is where it will matter for AMD's APUs.
Average users and n
portable llano in a tablet (Score:1)
Games on your HTPC (Score:2)
While the integrated GPUs crush what Intel has to offer, they're still massively inferior to a mid to high-end dedicated graphics card. Yet, for 2D applications and HTPCs, the Intel onboard graphics are perfectly adequate.
True for non-gaming HTPCs. But if you want to plug in a gamepad and play some games on your HTPC that aren't MAME, you'll need something stronger than Intel's "Graphics My Ass" integrated GPU. That's why I recommend AMD for value systems: you're sure to end up with GeForce or Radeon graphics, which helps in case you do end up wanting to add a little gaming on the side.
Re: (Score:2)
can you give me some examples of games you can play on your HTPC with a gamepad that aren't MAME or any other emulated game console?
When I asked the same question, I got answers that I've collected here [pineight.com]. Also Lockjaw Tetromino Game [pineight.com] works with a gamepad, but that's simple enough to run well even on Intel.
Re: (Score:2)
Re: (Score:1)
Re: (Score:3)
Everything under serious development is being written as multithreaded if it isn't already. A fast core is pointless if it's switching context all the time to run something that would be on another core if you had more cores.
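The point about spreading work across cores can be sketched with a toy example (hypothetical numbers and chunking; this shows the pattern of dividing CPU-bound work among workers, not a benchmark - and note Python threads won't truly parallelize this because of the GIL, so swap in a process pool, or any language with real threads, for actual multicore speedup):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    # CPU-bound work for one chunk: sum an integer range.
    lo, hi = bounds
    return sum(range(lo, hi))

n = 1_000_000
step = 250_000
chunks = [(i, min(i + step, n)) for i in range(0, n, step)]

# Four workers: with enough cores, each chunk can run on its own core
# instead of being context-switched onto a single fast one.
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total)  # same answer as the serial sum
```

The split/merge structure is the same whether the workers are threads, processes, or cores; the win comes when each chunk gets its own core.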
Re: (Score:2)
that 48 core supermicro AMD system from well over a year
I love those machines. They are frankly awesome and astonishingly cheap and dense compared to the competition. The funny thing is that they beat most of the real specialist high density crazy-servers on CPU grunt per U and completely bury them in price. They're also similar in power (worse if you believe the vendor information, which I don't). The huge system image (mine are configured for 256G) is also really nice for certain kinds of problem.
I've rec
Re: (Score:2)
Fairly stupid software licensing practices made the thing far more cost-effective than a cluster (even shifting the licences onto the cluster of 8-core machines I have would cost more than the 4
Re: (Score:1)
For the price? Yeah, Yugos are better cars than Mercedes, for the price.
Re: (Score:2)
Okay then...
Pentium G840 - $88.99
ASRock H61M-GE LGA1155 Intel H61 1PCI-E16 1PCI-E1 2PCI SATA2 VGA GBLAN Motherboard - $62.99
Team Elite 8GB 2x4GB PC3-10666 DDR3-1333 9-9-9-24 Dual Channel Memory Kit - $39.99
That puts me $6 cheaper, and as you can see, even if you succeed in unlocking your cores, it beats your system 9 times out of 10. If you don't, well, it demolishes your system:
http://www.anandtech.com/bench/Product/188?vs=405 [anandtech.com]
http://www.anandtech.com/bench/Product/121?vs=405 [anandtech.com]
Re: (Score:2)
My Phenom II cost me $125 and isn't much lower performing than some of the newer $300 i5s (<5% difference), and smokes some of the older $300+ i7s. And the socket is forward compatible with many of the newer Phenoms and Bulldozer coming out in a few mont
Re: (Score:2)
Your phenom II (I'm guessing a 965 at $125) isn't much lower performing than some of the newer $300 (wait, no $190 for an i5 2400) i5s (it's more like 30% compared to an i5 2400 generally http://www.anandtech.com/bench/Product/102?vs=363 [anandtech.com]), but neither is an i3 2100. In fact, so much so that the i3 2100 will beat your CPU silly most places (http://www.anandtech.com/bench/Product/102?vs=289), and costs $125 too ;)
Re: (Score:3)
Re: (Score:1)
Go run Firefox on these with lots of addons and then tell me it is a good chip. :-)
That should be the be-all and end-all of benchmarks
Re: (Score:2)
Re: (Score:2)
"Low-end" CPUs are definitely underrated. I don't know exactly what addons you're talking about, and no doubt there are some good ones that make a low-end CPU insufferable, but ..
We've got an ION (Atom 330+Nvidia 9400) (which a Brazos easily beats) in a box that somehow perversely turned out to be the most-used machine in the house. I did not plan for that; it was an accident. It was originally just intended for MythTV (where all I cared about was 1. must decode video 2. minimize total wattage), and ION
'security features' (Score:1)
Anyone who vies for such 'features' translates into 'moron' in my dictionary.
Re: (Score:1)
"If" (Score:5, Insightful)
The whole point of these chips is the built in Radeon, whether it's for GPU or GPGPU performance. I'm not even sure why you would compare it solely as a processor, and I'm quite sure that isn't a fair or reasonable comparison. Nor one anyone wants to make (who might actually buy a Llano). For high performance, you'll get a dedicated card anyways. Anyone looking at this will use the integrated Radeon, that's the point.
nonononono (Score:1)
you were speaking of performance ?
Re: (Score:2)
PS: I'm a Llano owner (A4-3400).
Re: (Score:2)
Re: (Score:1)
Link?
In gaming benchmarks, I can get a $2,000 Core i7 Xeon with an integrated graphics chip and then set up a $499 Dell with just an i3 but throw in a Radeon 6950. Guess which computer will trounce the benchmarks by a very large margin?
The GPU is what is important in gaming and regular desktop usage, with accelerated HTML5 browsing and Metro around the corner. The CPU is less important. Also, like another Slashdotter posted, you can always add a dedicated card and then Crossfire it with the CPU/GPU :-D ... now tha
Re: (Score:2)
The Xeons don't have the high-performance Sandy Bridge GPU, so that's not terribly surprising. Only the desktop chips have the new Intel GPU.
Re: (Score:2)
Actually, an E3 Xeon ending with a 5 in its model number has an HD 3000.
Re: (Score:2)
I've been looking at the Xeon E3s, and Intel's knowledge base seems to indicate they lack the hardware GPU features.
For instance, look at the E3 1270 (link) [intel.com]. Under "Graphics specs", it says "no" to all of the graphics features, including "processor graphics".
I've been looking at these closely for the last few weeks, and it seems you specifically need a separate GPU chipset on the motherboard to handle the graphics, as the CPU will not do it.
Re: (Score:2)
Reading fail. You are right, the Xeon E3 xxx5 processors do have HD graphics. Thanks for the tip.
Re: (Score:2)
Re: (Score:3)
In those tests, i3 is being tested with external graphics, compared to AMD with the same external graphics. Basically, it's a CPU vs CPU test. Which is pretty ridiculous because they are both targeted to users who will not buy external cards...
The actual i3 vs A8 tests with their associated graphics are tested later in the article here: http://techreport.com/articles.x/21730/8 [techreport.com]. The results aren't even close - AMD is more than playable, i3 is not.
Re: (Score:2)
Find me a game that makes use of more than two cores...
Or better yet, do a "while re-encoding this 1080p source (link) using these ffmpeg/libx264 settings (link) on n-1 cores, here is the FPS of ${GAME}", or even simply "we started a virus scan and then decided to play ${GAME}".
Can we please move past the single- and dual-threaded benchmarks? Go look at the x264 encode times using all the cores for both chips; I'll wait... yep, the AMD wins at a given price point. I don't know about you, but I usually have $X to s
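The "encode while gaming" load suggested above could be sketched like this (a hypothetical setup: `source_1080p.mkv` and the specific flags are placeholders, and the command assumes an ffmpeg build with libx264):

```shell
#!/bin/sh
# Reserve one core for the game; give the rest to the encoder.
THREADS=$(( $(nproc) - 1 ))

# Null-muxed x264 encode as a pure CPU load (nothing is written to disk).
# source_1080p.mkv is a placeholder for whatever test clip you use.
CMD="ffmpeg -i source_1080p.mkv -c:v libx264 -preset medium -threads $THREADS -f null -"
echo "Background load: $CMD"

# In a real run you would launch the encode in the background ($CMD &),
# then start the game and record its frame rate with your usual tool.
```

The `-f null -` output throws the encoded frames away, so the test measures CPU contention rather than disk speed.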
Re: (Score:2)
casual ? (Score:1)
Re: (Score:1)
I disagree. Integrated GPUs have such high RAM latency that even hi-def videos have trouble keeping up.
The IE 10 platform preview and Firefox 7 show a big difference in performance depending on the GPU for sites and ads that take 100% CPU utilization. Metro will show this when Microsoft adds iOS-style graphical effects. Llano may not be super fast, but it is light-years ahead of regular integrated GPUs because it is integrated with the CPU's RAM controller. If you are on a computer with a decent dedicated video card, fire up IE 9 (I know blaspheme h
Re: (Score:2)
The thing I find strange about Llano is... who wants a low-end Radeon but can't make do with an HD 2000 or HD 3000? I can't think of anyone who actually wants a "real" graphics chip but doesn't want a *real* graphics chip on the desktop.
They look great for laptops, at low power usage, but for desktop... really no.
Re: (Score:2)
That's my thinking too, but it turns out there is an answer. The niche I see for Llano is where someone is looking at the absolute dollars spent on the machine, combined with having some minimum standard of performance for both the GPU and the CPU. That is, someone doesn't want pre-Sandy Bridge Intel integrated graphics (the i965 isn't enough even if the CPU is) or a weak CPU (ION's Atom isn't enough even though the Nvidia 9400 is), so bu
You WILL use the built-in radeon (Score:3, Insightful)
For it's possible to play StarCraft 2 with that shit, even on a low-end portable, if it has the Llano.
In a desktop, you can even Crossfire it with its equivalent 6xxx card, thereby reaching major performance for a ridiculous price.
If you went the traditional route, you would need to get the CPU, then get a separate 6xxx-equivalent card, and then one more to do the Crossfire.
Llano parts give you one good CPU and one good graphics card in one shot, and in the future they will be upgradeable: you will be able to upgrade both the CPU and 'graphics card' of your rig by upgrading just one piece of hardware.
Re: (Score:2)
A question regarding the Crossfire capability: does it automatically enable in, say, a laptop (specifically, an ASUS K53TA) with an A4 APU and a Radeon 6550? Or is it actually the part where ATI Control asks me which graphics core it should use for a given application?
Re: (Score:3, Informative)
Re: (Score:1)
As a result I'm doing mighty good with 2 x 5670s Crossfired, running shit in 1920x1200 resolution at full detail settings in DX 11.
Why the hell wouldn't you use the GPU in the CPU? (Score:2)
Re: (Score:2)
There's that, but there's also dual GPUs which have been around for a while. I think Apple has offered dual GPU laptops for years now, where the big one would only get tapped for GPU intensive use, saving battery power.
A desktop isn't as sensitive to power use as a laptop is, but you could still conceivably cut down on the electrical bill and cooling costs.
Re: (Score:2)
Because the Linux drivers are no good on this. I've got a 3650 with Fedora 15, and most of the stuff works under Linux 3.0, but the video on my display is shifted up and left for no good reason, and tinkering with modelines didn't move the picture at all. I'm still using the CPU, but I put my nVidia card back in so I could use my display.
Re: (Score:2)
Re: (Score:2)
Too right. I've been to Llano, and AMD picked the right name for a podunk part.
Using the built-in Radeon (Score:4, Informative)
Not sure if I'm supposed to spill the beans on this, but I'm an AC, dammit. I'm in their focus-group thing, and apparently they're working real hard on a Crossfire-like solution right now so your "free" on-chip GPU isn't being wasted if you throw down for a discrete card. They haven't been making much words about this, though. Odd.
Re: (Score:1, Flamebait)
Ah yes, posting AC dissolved all obligations~
What an untrustworthy piece of shit you are.
Re: (Score:2)
It's a bummer this was modded down, I'm inclined to agree. If you promise to keep a secret, you should keep your promise.
Re: (Score:2)
A focus group member isn't part of the design or marketing committee after all.
Re: (Score:2)
Oh, well if he's unsure of his obligations then he's off the hook.
Impressive rebuttal, bro.
Re: (Score:2)
Re: (Score:2)
Not sure if I'm supposed to spill the beans on this, but I'm an AC, dammit. I'm in their focus-group thing, and apparently they're working real hard on a Crossfire-like solution right now so your "free" on-chip GPU isn't being wasted if you throw down for a discrete card. They haven't been making much words about this, though. Odd.
I figured they were trying to fly under the RADAR until they got to some point.
Re: (Score:1)
I figured they were trying to fly under the RADAR until they got to some point.
Google says they aren't doing a very [softpedia.com] good [wikipedia.org] job [softpedia.com].
Re: (Score:2)
This is already known. I read about it a few weeks ago. You didn't spill any beans I'm afraid.
Re: (Score:2)
Re: (Score:2)
Sorry, are you from the past?... The Dual Graphics option for Llano has been in the news basically since the existence of Llano was known. It has also been featured in basically all the Llano reviews (like this one [anandtech.com] from June), so I am not sure what you mean by "not making much words about this"
Re: (Score:1)
> They haven't been making much words about this, though. Odd.
Probably because it only works in DX10/11 games. In DX9 games it actually makes them run *slower*.
What about video codec support under linux? (Score:2)
Forget 3D; what I'd like to know is how good the video codec support is under Linux. Specifically, de-interlacing and pulldown of 1080i video for MPEG-2, H.264, and VC-1. I'd really like to dump my Windows box, but so far the very best de-interlacing - both quality and coverage - seems to be with nVidia under Windows
Re: (Score:2)
ATi/AMD still lagging looooong way behind (Score:2)
My personal experience has been that with nVidia parts, their proprietary driver "just works" under Linux, on occasions when it can't even identify the part itself.
With ATi/AMD... not so much; more often than not, trying to install the proprietary driver is like pulling teeth out of a pitbull's mouth. Even if I get it to install, it only sort-of-kind-of works. Trying to uninstall it is downright insane.
I don't know why ATi/AMD suck this hard, or why it's so much effort to get anything they made to work. But frankl
Re: (Score:1)
I ran Fedora 14 before the GNOME 3 fiasco and then switched to Windows 7. My system had an ATI 5750 and used Fedora's ATI proprietary drivers, and it worked fine. Fluid animations, great video support at 1080p, and it never crashed. It was a very stable system. I do admit I did not do gaming or CAD on it. I dual-booted to Windows 7 to run WoW or anything like that.
AMD has higher quality hardware and better cards, in my opinion. Its drivers are always so-so and conservative compared to Nvidia's. I have had 2 nvidi
whoa (Score:1)
I have had 2 nvidia chipsets and cards fail within the last 5
I just sold my 4-year-old Sapphire 3870 dual-slot card to my friend's sister's family, and they are playing The Sims 3 as a family with that card.
Installation is fine, it's the stability (Score:2)
With ATi/AMD... not so much; more often than not, trying to install the proprietary driver is like pulling teeth out of a pitbull's mouth. Even if I get it to install, it only sort-of-kind-of works. Trying to uninstall it is downright insane.
Having recently switched to an ATI card, using Ubuntu, to these observations I say LOL no, yeah pretty much, and LOL no.
Installation is simple: System -> Additional Drivers -> Enable ATI proprietary drivers -> Reboot (this part sucks, but oh well).
Removal is the same procedure except the button says "Disable" instead of "Enable". There is absolutely nothing insane about it at all.
Now as far as the "works" part, that's a different issue... It mostly works, and when it works it works excellently, but
Re: (Score:1)
There's an SDK out *now*, but they're late to the party. No one's really interested in implementing a *third* API, so XvBA only gets used via the VAAPI --> XvBA wrapper. There's also a VAAPI --> VDPAU wrapper and direct VAAPI support for Intel IGPs, so the competition seems to be between VDPAU for its relative maturity and polish and VAAPI for its wide support.
I don't believe VAAPI has *any* hardware-based deinterlacing yet.
On an unrelated note, why are we still doing interlacing on 1080p LCD panels?
Re: (Score:2)
FYI, interlaced VC-1 probably won't even work at all under Linux, regardless of hardware. There is something about the bitstream (versus progressive VC-1) that requires unwritten code. Last time I tried feeding an interlaced VC-1 stream to ffmpeg, just to demux (not re-encode) from m2ts to mkv, ffmpeg errored out saying it doesn't handle interlaced VC-1 at all.
So have you tried using, say, VLC with VDPAU? VLC claims to use "mostly" its own mux/demux.
Re: (Score:2)
I was wondering the same myself. I tend to go for the 720p stuff over the 1080i for the same reasons. It scales up nicely.
Re: (Score:2)
1080i@50Hz (i.e. 50 "half-frames" per second) effectively encodes more visual data than either 1080p@25Hz, or stretched 540p@50Hz.
It encodes more than 1080p 25Hz by including information sampled twice as frequently, leading to smoother motion.
It encodes more than stretched 540p@50Hz by way of discriminating between twice the number of vertical lines, and thus providing twice the apparent vertical resolution on still (or slow moving) objects.
Obviously it provides less visual data than 1080p@50Hz, but 1080i@5
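The trade-off above can be made concrete with some back-of-the-envelope arithmetic (a sketch of the bookkeeping only): 1080i@50, 1080p@25, and stretched 540p@50 all deliver the same raw number of scan lines per second; interlacing just spends that budget differently between temporal and vertical resolution, while true 1080p@50 doubles it.

```python
def lines_per_second(height, rate_hz, interlaced=False):
    # Each interlaced field carries only half the frame's lines.
    lines_per_picture = height // 2 if interlaced else height
    return lines_per_picture * rate_hz

i1080_50 = lines_per_second(1080, 50, interlaced=True)  # 540 lines x 50 fields
p1080_25 = lines_per_second(1080, 25)                   # 1080 lines x 25 frames
p540_50  = lines_per_second(540, 50)                    # 540 lines x 50 frames
p1080_50 = lines_per_second(1080, 50)                   # full progressive 50Hz

print(i1080_50, p1080_25, p540_50, p1080_50)  # 27000 27000 27000 54000
```

Same 27000 lines/second budget for the first three; 1080i@50 splits it so you get the temporal rate of 540p@50 and, for still content, the vertical detail of 1080p@25.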
Actually this is bullshit (Score:1)
If a desktop chip sports such a dedicated GPU and its entire power consumption is 100 watts, that is NOTHING compared to a separate CPU + separate graphics card combo. Such a system would draw at minimum 200 watts, so 100 watts compared to that is nothing.
If you look at it in that light, 65-watt consumption becomes something phenomenal.
gah gah gah gah (Score:1)
Re: (Score:1)
Agreed, this is total bullshit (Score:2)
I also really don't see what the big deal is about TDP; all that determines is what sort of HSF you use. The important thing is the idle power, because that is where the CPU sits most of the time.
Actually this made me think again (Score:1)
Its just $139 (Score:1)
It is one 4-core CPU and one decent graphics card in one package, and it's just $139. You would need to shell out $139 just for a decent graphics card if you went external.
And great reviews:
http://www.newegg.com/Product/Product.aspx?Item=N82E16819103942 [newegg.com]
Looks a little suspect?! (Score:3)
Is this a joke? The integrated graphics are the whole fucking point! If you don't want 'em, you can get a Phenom II (or maybe even an Athlon II) that uses less power and runs faster.
If you don't use it as a car, the Honda Civic isn't really all that great a value, comparing slightly unfavorably to Stone Ruination IPA in most video compression benchmarks.
45W (Score:2)