AMD Trinity A10-4600M Processor Launched, Tested
MojoKid writes "AMD lifted the veil on their new Trinity A-Series mobile processor architecture today. Trinity reportedly offers much-needed CPU performance gains in IPC (instructions per cycle), along with more of AMD's traditional strength in gaming and multimedia horsepower via an enhanced second-generation integrated Radeon HD graphics engine. AMD's A10-4600M quad-core chip is built from 1.3B transistors, with a CPU base clock of 2.3GHz and Turbo Core speeds of up to 3.2GHz. The on-board Radeon HD 7660G graphics core comprises 384 Radeon Stream Processor cores clocked at 497MHz base and 686MHz Turbo. In the benchmarks, AMD's new Trinity A10 chip outpaces Intel's Ivy Bridge for gaming but can't hold a candle to it for standard compute workloads or video transcoding."
But will it stand up against Intel? (Score:4, Insightful)
That's really all that matters. I've always been an AMD fan, but if they can't deliver the same performance for an equal or lower price, they're done.
Re:But will it stand up against Intel? (Score:5, Insightful)
Re: (Score:3, Interesting)
Sorry, but the 8-core FX kicks the crud out of a quad-core i7 at the same clock speed. I actually USE a PC for video editing, rendering, and 3D rendering, and the new 8-core machine with one FX processor is kicking the arse of the i7 machine.
Granted, I'm actually using multithreaded software, unlike most people, but saying that the i7 is the end-all of computing performance is not true.
Re: (Score:2)
Sorry, but the 8-core FX kicks the crud out of a quad-core i7 at the same clock speed. I actually USE a PC for video editing, rendering, and 3D rendering, and the new 8-core machine with one FX processor is kicking the arse of the i7 machine.
Mind telling us what applications you use? Because the 3.6 GHz FX-8150 loses [anandtech.com] to the 3.5 GHz i7-3770K in all of these (or 4.2 GHz vs 3.9 GHz if you want to compare turbo speeds), sometimes massively:
SYSMark 2012 - Video Creation
SYSMark 2012 - 3D Modeling
DivX encode
x264 encode - first and second pass
Windows Media Encoder 9
3dsmax (7/7 benchmarks)
CineBench R10 (single and multithreaded)
POV-ray SMP benchmark
Blender Character Render
Of course if you take the slower FX-8120 and compare it to the same clockspeed i5-
Re: (Score:3)
It beats Intel (presumably more costly Intel, too) in gaming easily,
thanks to Intel's shitty GPU.
No surprises there, then.
Re: (Score:2)
What about when I use a more powerful, discrete graphics card?
Re: (Score:2, Insightful)
Good luck cramming that into a tablet or 9" laptop.
People under 30 don't use towers. Tablets and notebooks. Small notebooks.
Re: (Score:2)
people under 30 don't use towers.
They do when their employer points at a cubicle and says, "Sit there. Use that PC."
Employers would prefer GMA or discrete (Score:3)
Re: (Score:3)
An employer that provides a tower can go Intel.
I disagree with this. The typical office user (non-engineer, non-programmer, non-graphics designer, non-audio/video designer... but basically a web client operator, e-paper pusher, email reader, calendar checker, etc.) would be fine with a machine about as powerful as today's most powerful smartphones. I never understood, when it's time to replace aging hardware for the run-of-the-mill office worker, why companies always tend to go for the middle/top-of-the-line boxes when it ends up being so much more power
Re:But will it stand up against Intel? (Score:4, Interesting)
"people under 30 who really dont do anything with their computers but websurf don't use towers. tablets and notebooks. small notebooks."
Fixed that for you. Every person I know under 30 that actually uses a computer has a tower. they need to do things like Render 3d GFX for static images or movies, high end photography, video production. even the CAD/CAM geeks have a tower.
I know plenty of under 30 professionals that actually use a computer to the point that they need a tower, It seems you don't, you might want to hang around smarter people.
Re: (Score:2)
Web development, app development, graphics editing, audio editing. You have a circular argument -- or a No True Scotsman if you prefer that term.
The list of computing tasks for which a powerful desktop machine is necessary is vastly smaller than the list of computing tasks, as evidenced by hardware sales. This trend is increasing. At some point large computers will be both expensive and rare, and I for one won't mind that if it means the end of fixing desktops. Users can send their tablets back to the manuf
not me...though technically not under 30 (Score:3)
I'm a professional software developer. I have an i5 laptop with built-in graphics, 8GB of memory, a couple of external displays, and a gigabit link to 2TB of NAS. Why would I need a tower?
I don't game much anymore, and when I do, most of it is on my tablet anyway. My laptop is perfectly respectable for doing office work, compiling large amounts of code, doing photography work, and hobbyist CAD work in SketchUp. It decodes high-def video mostly in hardware with minimal overhead.
I have no desire for gaming
Re: (Score:2)
Swap that i5 out for a high-end 8-core AMD or quad-core i7, bump up to 16GB of RAM and an SSD, and see how much faster things will compile... It does mean that you will have to have a highly parallel build system, but that's the price you pay.
Re: (Score:3)
You value your time? A good desktop can blow away a laptop CPU. I think that will change as we get new AMD and Intel parts, since they no longer care about performance so much as targeting mobile devices.
For the software I build, an older desktop finishes 20 minutes earlier than the laptop. Both are AMD systems. I think you need to qualify what type of software development you do. It doesn't matter much if you write PHP code or small web apps in any language. Anything of real substance requires some oomph.
Re: (Score:3)
I'm under thirty, and have a desktop* as my main computer... I'm not ready to cut out the desktop for a laptop yet. I can't ever seem to get enough grunt in a laptop for even a 50% markup over a home built desktop.
*as long as an AMD Phenom II X6 1055T in an Asus mini-ITX board in a SilverStone SG05 counts as "desktop".
Re: (Score:2)
Or Linux, where ATI performance suffers compared to Nvidia
I've been exclusively AMD+Nvidia since the K6-2 & Riva TNT2 days, but my next mobo will be Intel.
Re: (Score:3)
Re: (Score:2)
All I've ever read is how buggy the ATI/AMD drivers are, how support lags severely behind for kernels and cards, and how slow 3D is. OTOH, the NVIDIA driver always supports the current stable kernel and cards.
If I were making an XBMC box (which I wouldn't, since the Linux-based Iomega 35045 is only $105), then I'd have an Atom CPU and a GeForce 210, and I'd install the binary driver and VDPAU libraries so that it will offload video decoding.
Re: (Score:2)
So will the GeForce 210 do DTS-HD Master Audio over HDMI, and does it have enough grunt for high-bitrate 1080p H.264 streams (35-40Mbps)?
Re: (Score:2)
DTS master audio over HDMI
Dunno.
does it have enough grunt for high bit rate 1080P h264 streams
According to this [wikipedia.org] page, yes it can. However, I can't confirm it since I don't have any Blu-ray disks. (High-bitrate MP4s ripped from DVDs look perfectly fine on my 32" LCD, so I see no reason to buy BDs or a BD player.)
What I do know is that mplayer *never* breaks a sweat while playing "high RF" (maybe that's a term specific to Handbrake) MP4s.
Re: (Score:2)
I have two PCs at home with discrete GPU cards. Both have a similar processor, the same amount and kind of memory, and very similar motherboards. One has an NVIDIA card, the other has an AMD. You can't tell the difference by using them; when running a physics simulation I found the AMD computer faster (although it has a slightly slower CPU), and when installing Debian the AMD GPU just worked (it still took some setup for the physics simulation), while I had to install the NVIDIA one by hand (ok, just ch
Re: (Score:2)
Re: (Score:2)
I wish someone would come up with a driver that would allow A-series APUs to run Linux reliably without graphical issues.
Which is why my next CPU will be Intel, since I don't want to pay for something I won't use.. Not that I'll need one for quite a while. 4 CPUs and 8GB RAM is just... enough.
And to think that I was happy with a KayPro IV, Borland Pascal, Wordstar, an Anchor Signalman modem and a Star Gemini 10X printer...
"Stability and transparency are important", then radeonhd drivers are awesome.
Maybe I don't push the envelope, but the nvidia blob has been stable for me for ages.
Regarding transparency, sure I want it, but I accept that ideological purity is impossible to achieve, so I take 3/4 of
Re: (Score:2)
Go Intel in that case.
"outpaces Intel's Ivy Bridge for gaming"? (Score:1)
No it doesn't. The summary says it does, then links to an article that says this:
Re:"outpaces Intel's Ivy Bridge for gaming"? (Score:5, Interesting)
> Ivy Bridge and Llano actually ended up 'tied
Yes, but Llano is the *old* AMD processor ;-) Check the reviews for performance of a HD 4000 vs a Trinity.
Re:"outpaces Intel's Ivy Bridge for gaming"? (Score:4, Informative)
So, AMD has the lead on average FPS, but it's now small enough that Intel wins in a few cases. AMD's integrated GPU is still a little better normally, but it's not a slam dunk any more.
Re: (Score:2)
> So, AMD has the lead on average FPS, but it's now small enough that Intel wins in a few cases. AMD's integrated GPU is still a little better normally, but it's not a slam dunk any more.
It's curious that this is the case for mobile, but on the desktop the HD 4000 is beaten by Llano by a large margin:
http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/15 [anandtech.com]
Re: (Score:3)
> AMD has the lead on average FPS, but it's now small enough that Intel wins in a few cases
Not really. Intel does win in a couple of cases and is close in some others. Most of those are older, CPU-bound games. For Civ 5, AMD is close to 100% faster. A lot of the games that I looked at were ~40% faster (e.g. StarCraft 2). E.g.:
http://www.pcper.com/reviews/Mobile/AMD-Trinity-Mobile-Review-Trying-Cut-Ivy/Performance-Synthetic-3D-Real-World-Gaming [pcper.com]
http [anandtech.com]
Re: (Score:1)
The Ivy Bridge (Asus N56VM) is $1200 (MSRP) to $1300 on some preorder sites. Pre-launch marketing (heh) claimed that a Trinity lappy might be $600-700. Who knows, though.
Re:"outpaces Intel's Ivy Bridge for gaming"? (Score:4, Insightful)
AMD's integrated GPU advantage is gone.
That's also compared to the more expensive i7 part. There was no i5 or i3 comparison.
Re: (Score:2)
Right, but can ICC target an FX-8150 instead of an i7? Hmm, no? Right...
On Linux that leaves you with generic x86_64 or hardware-optimized builds...
Re: (Score:3)
Assuming we're not including a discrete graphics card: if you want gaming performance, AMD wins. If you want video encoding or photo editing performance, Intel wins. For most people who have PCs, it doesn't matter, because the CPU and graphics are already fast enough for anything they're going to do on them.
Personally, I'm going with an Ivy Bridge, nVidia 680 GTX combo. If I was going for a single chip solution, I would probably go with AMD.
Re: (Score:3)
Personally, I'm going with an Ivy Bridge, nVidia 680 GTX combo. If I was going for a single chip solution, I would probably go with AMD.
I bet if you were going for an energy-efficient solution, you'd probably also go with AMD... unless you didn't mind embarrassing performance and feeling like it's the year 2000 all over again.
Re: (Score:2)
My next rig is looking like Intel (as soon as they start giving me SATA3- and USB3-only boards), on mini-ITX. AMD seems not to care about that form factor at all. Granted, in either case it will have an NVIDIA GPU, because I like working graphics in Wine and Linux.
Re:But will it stand up against Intel? (Score:5, Interesting)
That's really all that matters. I've always been an AMD fan, but if they can't deliver the same performance for an equal or lower price, they're done.
IMO, the Trinity is a truly compelling offering from AMD after a long, long time. Yes, it trades lower CPU int/float performance for higher GPU performance when compared to Ivy Bridge, but this tradeoff makes it a very attractive choice for someone who wants a cheap to mid-priced laptop that gives decent performance and decent battery life while still letting you play the latest bunch of games at low-detail settings. It's hitting the sweet spot for laptops as far as I am concerned. I'm also fairly sure it will be priced about a hundred bucks cheaper than a comparable Ivy Bridge - that's how AMD has traditionally competed. Hats off to AMD for getting their CPU performance to somewhat competitive levels while still maintaining their lead over the massively improved GPU of Ivy Bridge. All this while they're still at 32nm and Ivy Bridge is at 22nm.
Having said that, what I am equally excited about is the hope that Intel will come up with Bay Trail, their 22nm Atom, which I strongly suspect will feature a graphics core similar to the one in Ivy Bridge. Intel has always led with performance and stability, not with power efficiency and price, so they need to create something that genuinely beats the ARM designs, at least in the tablet space if not in the cellphone space.
Re: (Score:3)
Intel has always led with performance and stability, not with power efficiency and price,
I have to take issue with "not with power efficiency". I know you were making a comparison with ARM, but it's misleading to say they aren't power efficient. They are -- in the size class they've mostly been competing in.
I have to take issue with your taking issue. Yes, Intel's energy efficient options are indeed energy efficient. But they are also quite ridiculously anemic in computing power compared to AMD's and especially to ARM's 'green' offerings when it comes to energy efficiency and performance. I'm just going to come out and say it... the Atom is all hype. Yes... low power... but also low in performance and low in everything except price.
Re: (Score:2)
I'm just going to come out and say it... the Atom is all hype. Yes... low power... but also low in performance and low in everything except price.
When the Atom came out it was a dirt-cheap CPU for the "any CPU is good enough" market. I wouldn't buy one now, after AMD came out with Fusion, but between 2008 and 2011 it did okay and was certainly not "all hype".
Re: (Score:2)
I'm just going to come out and say it... the Atom is all hype. Yes... low power... but also low in performance and low in everything except price.
When the Atom came out it was a dirt-cheap CPU for the "any CPU is good enough" market. I wouldn't buy one now, after AMD came out with Fusion, but between 2008 and 2011 it did okay and was certainly not "all hype".
Maybe it was dirt cheap compared to non-green high-end processors, but not compared to non-green processors of identical processing power, which I estimate put it on par with processors that were new six years prior to its release and which, by the time the Atom was released, were ridiculously dirt cheap. That makes the Atom, at the very least, half-hype (because it is an energy-efficient processor... my complaint is that its processing power is laughably anemic).
Re:But will it stand up against Intel? (Score:4, Interesting)
Well, they'll sell them at the prices that they sell at, it's not like a CPU ever has a negative margin. The question is if that's good enough in the long run to keep making new designs and break even. Particularly as Intel is making a ton of money on processors that AMD can't compete against. Their Ivy Bridge processors should cost about 75% of a Sandy Bridge but sell for 98% of the price. Intel now has huge margins because AMD can't keep the pressure up, it's not really helping AMD to surrender the high end because it only gives Intel a bigger war chest.
This launch is okay; it's all-around much better than Llano and keeps a fair pace with Intel, but it obviously tops out if you want CPU performance. What will be interesting to see is next year, when Intel will have both a completely new architecture for the Atom and will be on their best process technology. Then I fear AMD may be fighting the two-front war again, on both the high and low end. Right now the Atom is a little too gimped to actually threaten AMD's offerings. I expect Intel just wants AMD crippled, not killed, though, to avoid antitrust regulation, so I think they'll be around while Intel makes all the money.
Trinity Launched, Tested (Score:5, Funny)
Looks like Price/Performance win over Intel (Score:4, Insightful)
I've seen a lot of reviews of various laptops that have missed the most important metric in this competition - Price!
What's been common in all the reviews is that only the very top-end Intel "integrated" (no separate, discrete GPU) solutions have been competitive with the new Fusion products. We're talking mobile i7s. I don't know if you've priced laptops lately, but i7s are only found in expensive, high-end systems.
The Fusion APUs are nowhere near that expensive. Price-wise, they should be compared to i3s or "Pentium" mobile CPUs, where they will win quite handily!
It turns out that AMD's 'APU' solutions have been very popular with low-end device makers, and AMD sells them by the boatload. What's impressed me, however, is how much Intel has improved their GPU in Ivy Bridge. It's always been garbage before, but now it's starting to be something you could call 'low end'.
Re: (Score:2)
Gaming laptops don't use integrated graphics. Compute laptops (which is the other use for an i7) usually don't need high-performance graphics.
Hmm, sorry, but I've just been looking at CFD stuff (CFDesign), and that i7 looks really nice (or a dual Xeon, but that's very spendy), and yep, you need the GPU as well, since you have to set up a 3D CAD model before you run the sim... it's too bad that the package we got doesn't seem to use the GPU yet.
Re: (Score:2)
Gaming laptops don't use integrated graphics.
They don't because integrated graphics are historically horrible. Llano and Trinity are beginning to change that. They offer near midrange performance at a much lower price and better battery life.
Compute laptops (which is the other use for an i7) usually don't need high-performance graphics.
Please read up on that thing... You know, GPGPU and all that jazz. The way nVidia and AMD are promoting it in apps (and even Intel with Quicksync), they're beginning to make a difference between a system without a (decent) GPU and a system that has one.
In all honesty.... (Score:1)
I don't transcode, and my Excel sheets aren't that complicated. I suspect that most people are like me; we do basic work and play a game or two. I play TF2 on my laptop; it's a 3-year-old laptop with a new SSD, and it plays fine. I can't think of the last time I was truly CPU limited. I've been GPU limited since Crysis. I can't play that beyond the low detail level.
Re: (Score:2)
I got my wife an Acer 10.6-inch thing somewhere between a nettop and a laptop. She loves it. That little AMD E-350 CPU pulls 9 watts of power, so this little thing has great battery life (about 7 hours for our usage). It plays video fine, since it has a decent (not great) video chip built into the CPU. No heat, no loud fans that kick on all the time; she really digs it. Not bad for a $350 laptop at Costco.
Re: (Score:2)
It will play pretty much anything I throw at it fine, it's nearly silent, and quite cheap to build (~$300). It can even play video while copying 2 streams (via HDHomeRun) since it's my DVR too.
The only thing that pisses me off, and is not really AMD's fault, is Netflix support. It runs on Silverlight and the E-350 really struggles with it.
The only 'fix' is to configure Netflix to send at its lowest quality (bitrate) setting.
Re: (Score:2)
Same thing on my Atom. Netflix is apparently doing all the decoding in software, and the thing drops frames like crazy even plugged into the wall in standard-def mode. A lot of the time it's not really even watchable once the stream bitrate goes up due to action sequences or whatnot.
Re: (Score:2)
Re: (Score:2)
My E-350 system is fanless. Not even a case fan.
A mixed bag (Score:4)
From what I've read, on CPU tasks it's between an i3 and an i5. An i3 is "fast enough" for most general use, so I think that's pretty good. On GPU tasks, it's significantly faster than Intel's integrated chipsets, knocking on the door of respectable gaming performance if not walking into the room.
If you're doing CPU tasks, you really want the i7. If you're doing hard core gaming, you're also going to want the latest generation video card, even if it's an entry model. If your budget is less than $700 and you still want to play video games, Trinity is a good compromise. I think it's perfect for college students.
Re: (Score:2)
Honestly, since ditching my desktop, I've been loving my A-series A8-based laptop (upgraded from an A4). I get respectable gaming performance, and it's perfectly fine for my music and media creation, although I will say that if I were a music and media pro I'd probably fork out the dollars for a real rig. It does everything I need it to do decently, the price was certainly right, and for anyone looking in the $500 laptop market who needs some graphics ability and isn't crunching a lot of numbers (i.e.,
Re: (Score:2)
Re: (Score:2)
My experience is that hardware of that vintage runs into issues playing back flash video nowadays. Which is more Adobe's fault, but they don't seem to care. You could try upgrading the GPU, but the AGP bus limits your choices. Kind of a shame, as it'll be powe
Gaming Laptop (Score:1)
HTPC (Score:2)
So far I have seen no mention of it, but would this not make a great HTPC platform?
Very low powered CPU but a tank of a GPU sounds great to me... Especially when your box is idling.
Any thoughts from someone more knowledgeable? I'm still like 5 generations behind, running an AMD X2 5200+.
Re:HTPC (Score:5, Informative)
Some might say that Intel Atom solutions are price competitive with the A-series, but Atom solutions, just like AMD's low-powered E-series lineup, really only work well for an HTPC as long as 100% of your needed video codecs use GPU acceleration. If the Atom is good enough, then an E-series at the same price will be a bit better as well. It's hard to guarantee that all the codecs you will be using will be GPU accelerated, especially if you are running a Linux distro, so the E-series and Atoms are not really a solution that I recommend.
Re: (Score:2)
It's hard to guarantee that all the codecs you will be using will be GPU accelerated, especially if you are running a Linux distro, so the E-series and Atoms are not really a solution that I recommend.
I do use Linux. Gentoo, to be precise. I am not sure what you mean when referring to Linux there. Could you please clarify?
Do you have any recommendations, as I am looking at building an HTPC? No capture or anything fancy; I have a huge media library on my main PC that I will mount over NFS. I had an idea to build a "retro" system into an old VHS player I have shoved in the basement. Space is very adequate; I could probably pack a full ATX PSU in there with plenty of room to spare with an ITX board. Looking to sp
That's a bigun! (Score:2)
Re: (Score:1)
Reminds me of the AMD Tombstone, a weird-ass 48-bit CPU they got all ready to make and then ditched at the last minute in the late nineties. AMD has a habit of making some very strange CPUs. Hopefully this one will see some success.
Re: (Score:2, Informative)
You mean AMD TwoStone, right?
Tombstone was the "joke" name people in AMD management gave it, for obvious reasons.
Re: (Score:1)
Oh, I see where I went wrong, I completely pulled it out of my ass.
Hey! I did not, what are you talking about!
Yes you did, you dirty little harlot!
What did you just call me?!!
A dirty little harlot.
Ok, but you are still stupid.
What the fuck are you talking about and who are you?
I'm Anonymous Coward, who the hell are you?
You can't be Anonymous Coward cause I am.
No you aren't.
Yes I am.
No you aren't.
Yes I am.
No you aren't.
Yes I am.
No you aren't.
Yes I am.
No you aren't.
Yes I am.
No you aren't.
Yes I am.
No you aren'
Re: (Score:2, Flamebait)
You'll never know.
Or wait, does that mean I'll never know?
Hold on now, what the fuck is going on?
Hell if I know.
Who the fuck are you?
I already told you that, I'm Anonymous Coward.
And I already told you that you aren't, I'm Anonymous Coward.
Bullshit, there is no way you are Anonymous Coward because I am.
Wait, what?
Re: (Score:1)
ROFL!!!! oops.
Anyway, it's really annoying reading a thread of just ACs replying to each other. We have no idea who is who and who is making which argument. Just create a damn account already.
Re:AMD is done and gone... (Score:5, Insightful)
They used to be able to beat Intel in the Athlon days. Now they are hopelessly far behind, and dumping huge, hot graphics cores into their chips is putting them further and further behind. Focus on cheap compute with unlockable cores, AMD, not stupid graphics cores which do nothing for the CPU. A 16-core Phenom II at $100 will sell much better than this insane graphics + CPU crap.
That is pretty much the exact opposite of a good plan for AMD (as much as I would like cheap compute...). Since Intel has a process advantage, and presently has a superior x86 compute core architecture, they can almost certainly beat AMD on production cost for chips of a given level of punch. Trying to compete on price with somebody kicking out chips a process node ahead of you just isn't a good plan. Unless they really fuck it up, or their yields tank horribly or similar, they'll be able to beat you on production cost every time. Intel has little to gain by cutting its own margins in order to chase AMD down a hole (since lower margins are bad, and killing AMD would mean becoming antitrust scrutiny case #1 for the indefinite future...); but there isn't any architectural barrier to their doing so.
Since Intel has comparatively worthless GPU designs, tacking GPUs onto CPU dice is a way for AMD to offer something that Intel cannot (and at a price lower than a discrete CPU + discrete GPU, without totally cutting their own throat), and also happens to go well with today's enthusiasm for laptops and all-in-ones. They have a second niche, much more directly focused on price, in compute-light, memory-heavy server applications (since you can populate your sockets with AMD CPUs for less, and the number of DIMMs you get is roughly proportional to the number of sockets you have active); but competing on price isn't good for your margins.
With an inferior process and a weaker x86 design, gunning directly for the compute performance crown would just be asking for a whupping from Intel.
Re:AMD is done and gone... (Score:5, Interesting)
Speaking of all-in-ones, an all-in-one AMD chip would be a dandy basis for a games console. If not one from Microsoft (who has no particular need for x86) then it would perhaps be a good match for Valve. Public distaste for Sony is at an all-time high, but is it enough to unseat them? etc etc.
If I could have a 16-core Phenom II, though, that would be pretty awesome. I could drop it right into my current machine. I'd pay $100 for even eight cores, let alone sixteen.
Re: (Score:2)
AMD is also in a nice position to do more work in the ultraportable (think tablets, maybe phones in the future) market. Intel has repeatedly dropped the ball with the Atom architecture, especially with their garbage implementations of PowerVR graphics cores. AMD may be skittish about competing with Atom and the various ARM architectures, but the risk could be worth the payoff.
Beating ARM performance is trivial, though performance per watt is a much more difficult task, especially using an x86 instruction set. Beating
Re:AMD is done and gone... (Score:5, Interesting)
They have a second niche, much more directly focused on price, in compute-light, memory-heavy server applications(since you can populate your sockets with AMD CPUs for less and the number of DIMMs you get is roughly proportional to the number of sockets you have active)
I haven't tried AMD's latest server machines, but if they are even half as good as the old ones, they are a _MUCH_ better deal. My six-year-old DL585 G2 is actually faster on every single thing it gets used for than the much newer Westmere machines we have been buying. The problem is that Intel is charging an absolute fortune for chips clocked fast, so we end up with 1.8 or 2.2GHz Westmere machines, and their single-thread performance is abysmal compared to the much older 3.2GHz AMD machine. Our application scales nicely but quickly becomes IO-bound, so both machines basically get the same throughput, but the AMD machine has much lower overall latency. This results in it actually getting much better benchmarks in our tests.
So, in theory we could get an Intel that kicks the crap out of the AMD machine, but it's going to cost us 5x as much (from ~$5k to ~$25k). So we buy the cheap ones, and they get their ass handed to them by a six-year-old machine that cost $5k when it was new.
Re: (Score:2)
Yep, at the higher end AMD offers much better bang for your buck (on both single-threaded and parallel speeds), maybe at the cost of some extra electricity and heat (I was never able to calculate that, so I don't know who wins).
Re: (Score:3)
Since Intel has comparatively worthless GPU designs...
Let's not forget what power hogs Intel's little heat factories are, and how anemic their low-power chips are compared to AMD's energy-efficient offerings.
Anti-trust (Score:2)
"Intel has little to gain by cutting its own margins in order to chase AMD down a hole(since lower margins are bad, and killing AMD would mean becoming antitrust scrutiny case..."
This. Look at the history of Intel and AMD, and it becomes absolutely apparent that Intel is aware of the danger of landing in the government's antitrust sights. They have always left just enough room at the bottom end for AMD to barely survive. When AMD gets uppity (like in the Athlon days), Intel pulls out the stops for a couple
Re: (Score:2)
Re: (Score:2)
The thing is, a cheap Intel CPU + a cheap dedicated GPU is faster and cheaper than anything AMD can provide.
The thing is, this new CPU+GPU is already dead.
There is a reason that AMD is releasing these things as laptop parts before they get to the desktop parts, and that's because while you are quite right about desktop use cases, for laptops these things are very persuasive. Near discrete performance (in Trinity's case, VERY near discrete mobile performance) for much better battery life and a much lower price than a discrete solution.
Re: (Score:2)
Did you just seriously insinuate that Intel's GPUs are better than AMD's?
So Intel has a better GPU than this?
http://www.newegg.com/Product/Product.aspx?Item=N82E16814161399 [newegg.com]
I think you might have meant that Intel's best GPUs are starting to compete with AMD's bottom-rung GPUs.
Re:AMD is done and gone... (Score:5, Funny)
Log the fuck in so I can be sure I'm talking to the same moron who posted "Since ivy bridge intels GPU now equals anything from AMD."
Re:AMD is done and gone... (Score:4, Informative)
Okay - so we're talking about AMD having 384 stream processors per die in the 7660G, vs... 16 execution units for Intel in the HD 4000. Not even the same game, never mind the same ballpark.
Sorry, but AMD wins this round. And although the average Joe hasn't yet realized it, the "number of cores" war has turned a corner, in that the CPU has already started serving merely as an "overseer" of massive numbers of GPU SPs/CUs. If you do, specifically and exclusively, transaction processing - the CPU still wins. In scientific computing, cryptography, signal analysis, physical simulations, CAD, and yes, even gaming - no one cares whether you have a 12-way Xeon or an AMD Geode; it matters that you have an AMD 59/69/79xx (and yes, I do mean AMD - despite their overall gaming performance, for GPU computing NVIDIA doesn't even come close, though the uber-expensive Tesla does at least get to share the playing field).
/ Note that the recent Slashdot article on media transcoding dealt specifically with mass-market solutions using hacked-up shader routines, not optimized OpenCL kernels.
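For anyone who hasn't looked at GPU compute code, here's a minimal sketch of what an OpenCL C kernel looks like (my own illustration, not taken from TFA or from any benchmark mentioned above; the host-side code that builds and enqueues it is omitted):

    // Minimal OpenCL C kernel: one work-item per array element computes
    // y[i] = a * x[i] + y[i]. The GPU schedules thousands of these
    // work-items across its stream processors / execution units, while
    // the CPU just queues the work and waits for the result.
    __kernel void saxpy(const float a,
                        __global const float *x,
                        __global float *y)
    {
        size_t i = get_global_id(0);   // global index of this work-item
        y[i] = a * x[i] + y[i];
    }

The per-element work is trivial; what decides the winner is how many of these work-items the part can run at once and how fast it can feed them from memory, which is why stream-processor count and memory bandwidth dominate this kind of workload.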
Re: (Score:3)
Re: (Score:2)
All it has to do to keep me happy is to keep trailing along behind Intel at a close enough distance to run all the same software at an acceptable level of performance, and meanwhile at far less cost. When I built my Phenom II system I could get more CPU at the same price but the motherboard cost $200 instead of $100... and I got a Gigabyte board with all the IO I can eat. Intel chips with good performance are incompatible with my ethic of not spending more than a hundred bucks on any one part. And it looks
Re:AMD is done and gone... (Score:4, Informative)
They aren't competitive though. You keep missing the point. Intel's Sandy Bridge/Ivy Bridge integrated GPUs basically do video playback on laptops at a suitable level. They cannot play any games made within the last 2-3 years beyond the most basic settings. The A-series processors, by comparison, can play the newest games at relatively low settings, and the new Trinity-based models can do it at reasonable settings. With the newest A10 laptops starting at prices around $600 for a 17" laptop, that's quite competitive, since the first laptops with dedicated NVIDIA/AMD graphics that can hold a candle to them start around $800-900. For small ultrabooks, it's going to be harder to justify using Intel when the A10 will do it all faster and just as thin. In other words, AMD has a serious contender in the mobile market for gaming and cost-effectiveness.
The problem remains that Intel holds the cards with mainstream OEMs and will continue to keep the A-series processors out of the big sellers' hands, because mobile is becoming their bread and butter.
Re: (Score:3, Informative)
Well, the benchmarks disagree with you. The HD 4000 IGP in the Ivy Bridge processors is a DX11 part that can run recent games at low to medium settings quite well. The Anandtech review [anandtech.com], for example, shows that in some games like Batman: Arkham City, Dirt 3, and Skyrim, the HD 4000 even outperforms this new AMD APU. It loses in the other 4 games tested, but it's still competitive. I'm only talking about gaming performance here, not video decoding, where Intel wins by a large margin. Since Sandy Bridge, Intel GPUs have sto
Re: (Score:3)
Don't forget that a CPU with HD4000 graphics is in a different price class, at least $100 more.
Re: (Score:2)
Re:AMD is done and gone... (Score:4, Interesting)
Built myself a PC to play WoW 3 months ago. Went with the high-end Llano; no discrete graphics required. An Intel setup would have required a graphics card, a larger case (this is a mini-ITX MB), and more money. For most users who are also *casual* gamers (not hard-core), AMD's CPU/GPU balance saves a graphics card while providing sufficient CPU power.
Re: (Score:2)
Even pretty old stuff is good, I have the last nVidia IGP without CUDA support and it's good enough to do 1080p with XBMC. This story would be cooler if I could remember which IGP it is... 9400 or something. Of course that's nVidia and this is AMD but I guess there's some hope the drivers will work since they're pretty much betting the farm on this one.
Re: (Score:2)
Re: (Score:2)
1920x1080 video decoding is not impressive.
Good thing I wasn't trying to impress anyone.
Of course, it gets pretty choppy trying to run compiz on my 3520x1200 X screen.
Too bad you weren't trying to impress anyone, because you failed.
My point was that it does what I need to do, and it's old and integrated. And most users don't need anything more than that, though faster is usually nicer.
Re: (Score:3, Insightful)
Exactly. These articles and benchmarks are a joke. The Intel CPUs are so far ahead, in performance and value, that I can't help but feel embarrassed for AMD.
Without AMD, you clueless retard, you would have to pay 5 times the price for an Intel CPU. You should thank them for providing competition instead of dissing their products.
Re:AMD is done and gone... (Score:5, Insightful)
Appreciating competition is not mutually exclusive with being critical of the competition's quality.
Re: (Score:2)
Both your and the GP's quotes can actually be true at the same time...
Re: (Score:3)
> The Intel CPUs are so far ahead, in performance and value, that I can't help but feel embarrassed for AMD.
Not so. The Intel traditional CPU is faster, but the AMD integrated GPU is faster.
For AMD's pure-CPU parts, they seem competitively priced to me (ie: cheap).
Agreed (Score:2)
I just built a 3.3GHz (slightly overclocked) quad-core AMD system with 16GB. Got the motherboard, CPU, graphics card (MSI GTX 570 2GB, also overclocked), and memory for $600. The damn thing can compile practically anything from scratch in no time flat. I play all the latest games at max detail settings. The system is fast as hell all around; if I had an SSD it would just be ridiculous. Why the hell would I want an Intel chip again?
Re: (Score:2)
Forgot to mention, that cost includes the power supply as well. Case is a free beige box from a '99 era AMD K6 machine someone was throwing away at the dump.
Re: (Score:2)
The Intel CPUs are so far ahead, in performance and value
"Performance"? Sure. "Value". Not so sure. If you want a top Intel CPU you'd better be prepared to pay big $$$ for it.
Re: (Score:2)
I'm going to retract that.
I just looked at the latest CPU hierarchy charts [tomshardware.com] and the conclusion is: "...we're almost-shockingly left without an AMD CPU to recommend at any price point".
Strange times, indeed.
Re: (Score:2)
Re: (Score:2, Insightful)
Please realize that AMD and Intel TDPs cannot be compared apples to apples. My understanding, the last time I looked, was that Intel was a bit optimistic about how low their TDPs were.
Re: (Score:2)
Re: (Score:2)
I think this really depends on what the server is doing. More cores is actually really good for a web server. Most languages used for web applications don't favor multithreading for a single request. In fact, it often doesn't make sense to do it. But handling 16 requests at the same time for medium to large sites is very useful.
Similarly, for certain types of database use it is better to have multiple cores. Queries run on a distinct core on many RDBMS so as long as there are no locking issues, you can
Re: (Score:3)