AMD FX-8350 Review: Does Piledriver Fix Bulldozer's Flaws?
An anonymous reader writes "AMD just officially took the wraps off Vishera, its next generation of FX processors. Vishera is Piledriver-based like the recently-released Trinity APUs, and the successor to last year's Bulldozer CPU architecture. The octo-core flagship FX-8350 runs at 4.0 GHz and is listed for just $195. The 8350 is followed by the 3.5 GHz FX-8320 at $169. Hexa-core and quad-core parts are also launching, at $132 and $122, respectively. So how does Vishera stack up to Intel's lineup? The answer to that isn't so simple. The FX-8350 can't even beat Intel's previous-generation Core i5-2550K in single-threaded applications, yet it comes very close to matching the much more expensive ($330), current-gen Core i7-3770K in multi-threaded workloads. Vishera's weak point, however, is power efficiency. On average, the FX-8350 uses about 50 W more than the i7-3770K. Intel aside, the Piledriver-based FX-8350 is a whole lot better than last year's Bulldozer-based FX-8150, which debuted at $235. While some of this has to do with performance improvements, the fact that AMD is asking $40 less this time around certainly doesn't hurt either. At under $200, AMD finally gives the enthusiast builder something to think about, albeit on the low-end."
Reviews are available at plenty of other hardware sites, too. Pick your favorite: PC Perspective, Tech Report, Extreme Tech, Hot Hardware, AnandTech, and [H]ard|OCP.
How about idle?? (Score:5, Interesting)
90+% of my CPU is idle time.
How much power does the new chip use at idle and how does that compare to Intel?
50W at the top end means about $25/yr if I were running it 24/7. But since a typical desktop is mostly idle, what is the power difference there??
And yes, I don't care about single-thread performance as much as I care about multithread performance. Single-thread performance has been good enough for the desktop for almost a decade, and the only CPU-intensive task I do is running those pesky `make -j X` commands. No, not emerging world or silly things like that ;)
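For what it's worth, that $25/yr figure only checks out at a fairly cheap electricity rate; here's the back-of-envelope math as a sketch (the rates and duty cycles below are illustrative assumptions, not figures from the article):

```python
# Back-of-envelope: yearly cost of an extra 50 W of CPU power draw.
# Electricity rates are assumptions; US residential rates around 2012
# ranged from roughly $0.06 to $0.17 per kWh depending on the state.
def yearly_cost(extra_watts, cents_per_kwh, hours_per_day=24):
    """Dollars per year for drawing extra_watts during hours_per_day."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * cents_per_kwh / 100

print(f"${yearly_cost(50, 6):.2f}/yr")      # 24/7 at $0.06/kWh -> $26.28/yr
print(f"${yearly_cost(50, 12, 4):.2f}/yr")  # 4 h/day at $0.12/kWh -> $8.76/yr
```

So the "$25/yr" claim assumes both 24/7 load and cheap power; at typical rates with a mostly-idle desktop, the load-power gap matters far less, which is exactly the parent's point about idle consumption.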
Re:How about idle?? (Score:5, Insightful)
I agree about multithreaded performance being the important thing moving forward.
Regarding power consumption, the AnandTech review [anandtech.com] puts total system power consumption for the Vishera systems tested at 12-13W more than Ivy Bridge. Scroll to the bottom of the page for the chart. The bar and line graphs at the top of the page are misleading -- they put the x-axis at 50W, not 0W.
If you are concerned about power consumption, find a 100W light bulb in your house and replace it with a CFL. You will see greater energy savings.
Re: (Score:3, Informative)
So true. AMD isn't competitive energy-wise in any way anymore, but the desktop is probably the only place where it doesn't matter. When you're thinking mobile, saving energy is a priority. On huge server farms, small relative gains can mean a tangible difference in absolute numbers. But on desktops, the difference is about a light bulb at load, and hardly anything when idle. And with PSUs under 350W becoming increasingly harder to find, an FX-8350-based system only gobbles about 200W at its most intense.
Re:How about idle?? (Score:4, Insightful)
I think the biggest factor for your home desktop is noise - it takes a lot more airflow to remove 125W of heat than 77W of heat. In Anandtech's tests he actually measures 195W versus 120W total system power consumption. Sure it might not matter much if you plan to put a noisy 200W+ graphics card or two in it, but for non-gamer use I'd say that's pretty significant.
Re: (Score:2)
True. Especially for hotter cities. Here, my undervolted, underclocked 95W Athlon II already manages to annoy me with its 3500+ RPM fan (it goes to about 5000 RPM at full load and clock; it's madness). Maybe AMD should start putting coolers with 120mm fans on their boxed processors. If they plan on selling such behemoths, that would provide more airflow and less noise at the same time. Fitting that on a motherboard would likely be a problem, though. Water cooling would be good, but it's much too expensive.
Re:How about idle?? (Score:5, Insightful)
This is what I use on my Athlon 2, works perfectly, is very quiet, and it's rather old now so you could probably pick a used one up pretty cheap.
Re: (Score:2)
Get a better fan.
Re:How about idle?? (Score:5, Informative)
I think the biggest factor for your home desktop is noise - it takes a lot more airflow to remove 125W of heat than 77W of heat.
Larger fans with slower rotational speed.
Re: (Score:2)
If you are concerned about power consumption, find 100W lightbulb in your house. Replace with CFL. You will have greater energy saving.
Surely you already did that 3 years ago, so now you're preparing for the switch to LED?
Re:How about idle?? (Score:4, Insightful)
"Surely you already did that 3 years ago, so now you're preparing for the switch to LED?"
Why? So I can get less light output for the same watts used, but instead of spending $8.95 per bulb I get to spend $39.99?
LED is a joke for home lighting; only fools are buying it right now. CFL is still way more efficient.
Re: (Score:2)
Why? so I can get less light output for the same watts used but instead of spending $8.95 per bulb I get to spend $39.99?
LED can come much closer to proper full-spectrum light than CFL ever will. CFL is just a stopgap technology we have to deal with until LED gets there.
Also, it is possible to make LED spotlights to handle that strange modern trend of building lots of spotlights into ceilings. CFL cannot do that.
Re: (Score:2)
What?!? There are CFLs with a CRI of 95, so where is the LED bulb that competes? Oh, that's right, they're [earthled.com] the same [kinoflo.com], because they use the same phosphor-based system to create full-spectrum light.
As far as use in cans goes, both LEDs and CFLs have the same problem with overheating ballasts; unless you have DC power run to your LED can light, you'll run into the same power limit and a similar lumen ceiling that you do with CFLs in that application.
Re: (Score:2)
If you have a lightsource and an appropriately curved reflector, you have a spotlight. This can definitely be done with CFLs. I'm pretty sure of that, since my house has those ceiling spotlight mounts (originally with halogens, but replaced with CFLs when we bought the house.)
Re: (Score:2)
You do realize they last like 20 years, right?
Please explain why you think someone would be foolish to buy a longer lived and nearly as efficient device?
Re: (Score:3, Insightful)
You live in an apartment and don't plan to be there for 20 years?
I imagine for a lot of people, dumping $40 into each light socket is a losing proposition, and a winner for the landlord (who I am sure would greatly appreciate the gift).
Re:How about idle?? (Score:5, Insightful)
Re: (Score:2)
Really? Which ones have lasted 20 years? Because I have found none that do. Note: I have worked for an LED bulb distributor, and the return rate is nasty high. They last about 1-2 years on average, and warranty replacement rates are high as hell on them.
Re: (Score:2)
Now, let's expand and point out something this article got wrong. They say you would only have to replace a CFL three times over the course of an LED's lifespan. In my experience, this is at least an order of magnitude on the low side. I have dimmers in
Re:LED is a joke (Score:2)
Re: (Score:2)
We only use incandescent heaters for illumination in the studio. Stratocaster pickups hum plenty already.
Re: (Score:2)
And spend more money than the cost of the bulb to keep it lit, and I get to replace them all once a year.
I loved CFLs, but LEDs are all I buy now. Why bother changing lightbulbs?
Re: (Score:2)
Same here. I used to buy CFLs, but switched to LEDs because they never burn out. Now I have a single circuit in my whole house, with a single UPS (300W) for lighting. If I lose power, all the lights stay on for about 10 minutes. Very useful at times. I have yet to have an LED fail me, and my whole house has been LED since 2009.
Do I think I'll make my money back someday? Maybe, maybe not. But I'm glad to have given money to a technology that needs to become more common a
Re: (Score:3, Insightful)
LEDs running on AC will fail for the same reason most CFLs fail: the ballast.
Re: (Score:2)
There are many areas of the USA where your cost per kWh goes up depending on how much you use per month. There are even places where you can get seasonal or time-of-use metering. The latter is great for those who produce their own solar power, since rates are highest during midday in the summer, which is exactly when you're producing more power than you use. A prime location for that is California. :)
Re:How about idle?? (Score:4, Insightful)
Because the incandescent ban is for the old, out-of-date crap that sucked and worked better as a heater than a light bulb. Halogen bulbs are the replacement. Who did you get your education on the ban from? Because they did not know what they were talking about; you should stop listening to the uneducated news networks.
Re:How about idle?? (Score:5, Insightful)
I play games maybe 1h 30m a day on average. My 5 year old dual-core E6750 overclocked at 3.2 GHz handles most of them gracefully, but there are some new releases which require more processing power. However, in choosing a new platform, I'm mostly looking at TDP, not from a consumption perspective, but heat dissipation. I hate having to use a noisy cooler.
My current CPU has a TDP of 65W with a Scythe Ninja 1 as its cooler, and the fan usually stays at 0% when the CPU is idling. While gaming, I can't figure out whether it makes noise, because my GPU cooling system makes enough noise to cover it. And I'd like to keep it that way when I pick my new CPU.
You're saying the graphs are misleading. No, they're not, if one has half a brain. Now, looking at the hard numbers, the power consumption difference is about 100W. The i5 3570K draws about 98W, and Zambezi and Vishera (who the fuck names these things?) draw around 200W. If you put TWO i5s one on top of the other, they barely reach ONE AMD CPU's power consumption. Thanks, but things DO look bad for AMD. I'll just have to pass.
Re:Wattage costs (Score:2)
More heat equals louder fans, and more dust on the vents.
The limousine tax imposed by our ISP overlords is a couple of orders of magnitude more painful.
Comment removed (Score:5, Informative)
Re: (Score:2)
Compared to the Phenom II at 45 nm, I guess that Piledriver is finally faster. There were a few benchmarks where a Phenom II X6 could beat a Bulldozer, but IIRC only by a few percent, which is not enough to beat Piledriver in the same tests.
Now I still wonder how a Phenom II at 32 nm would have performed. That hypothetical chip might still embarrass Piledriver ;-)
Re: (Score:2)
Umm.. 12W difference is for an IDLE system. At load the difference is clearly closer to 80-100W.
No shit, Sherlock. Look at comment subject line.
Maybe for a desktop this doesn't matter. But given the decline of desktops and consumers moving to laptops (and even more mobile devices), these results are downright TERRIBLE for AMD.
If you think desktop chips (AMD or Intel) are used in laptops, you are an idiot.
Re:How about idle?? (Score:5, Informative)
Idle power seems pretty competitive with Intel's Core offerings. Anand found little difference and attributed it to their selection of a power-hungry motherboard.
So they are not dead (Score:2)
Re: (Score:2)
I'm a gamer too, and I'm actually interested, mainly because the FX-4300 now seems to be a fierce competitor to Intel's i3 while costing quite a bit less. The FX-8xxx still sucks, but this is a major improvement for AMD in the mainstream segment. They were losing to cheaper Pentiums with the FX-4100; it was embarrassing.
Re: (Score:3)
Actually, check out the AnandTech review. They do it on Windows 8 with the new scheduler... the difference in performance of the FX series from Windows 7 to Windows 8 is fucking mind-boggling. The FX-8350 actually trounces all but the higher-end i5s and gives the i7 a run for its money in several games. For sub-$200, if you're getting Windows 8 (which I apparently will be now), you can't get anything that's even close performance-wise.
Re: (Score:3)
You do realize that most of your processing occurs via your gpu in a game right...?
Re: (Score:2)
Most AI / game processing can be handled on a single-core 2 GHz processor, so why do we need multi-core for gaming? ... ... ...
Windows background processes. Yep, it's as lame as that.
Re: (Score:2)
Something like PhysX rendering gets kicked over to your CPU if your GPU can't do it... which at this point in time puts it in the bottom bracket of GPUs. But that's beside the point; even with crappy coding, CPUs are leagues and leagues ahead of any bad practices programmers may deploy. Read below for why multi-core.
Re:So they are not dead (Score:4)
You use a low resolution so that the bottleneck becomes the CPU and not the GPU.
Re: (Score:2)
Guild Wars 2 seems to be CPU-limited as well; no graphics setting change other than supersampling makes more than 1fps difference on my system.
Re: (Score:2)
That's probably your GPU not being challenged; GPUs have exceeded game requirements in recent years by quite a wide margin (also, you failed to mention your CPU load)... There does seem to be a bunch of confusion on the issue, so here's a great link on the subject: http://www.thinkdigit.com/forum/hardware-q/154674-what-role-cpu-gpu-relation-gaming.html [thinkdigit.com]
Now, going back to my previous statement: unless you run a P4, you're probably fine on the CPU, and the GPU is what does the heavy lifting.
Re: (Score:2)
It certainly will make things more interesting - an increased number of cores is the way to go today. Most applications and operating systems will benefit from multiple cores - even though some applications may benefit from additional tuning.
However, most bread-and-butter applications today run well on even the cheap processors on the market; it's only when you start to look at things like gaming and advanced processing that you really benefit from a faster processor. Most computers will be a lot fa
Re: (Score:2)
I miss the FX series... we may never have gotten the i7 if not for that war between intel and amd, but AMD never answered...
Re: (Score:2)
Not dead, but if they're having to sell a chip with twice as many transistors as an i7 for two thirds of the price, they must be spraying blood all over the room.
Twice as many transistors? No, it's 1.2B versus Ivy Bridge's 1.4B - but the die size is almost double, 315mm^2 to 160mm^2. That is both because Intel is on a 22nm process and because they have higher transistor density, Intel's 32nm Sandy Bridge has 1.16B transistors but is only 216mm^2. I guess die size is one of the many things AMD didn't have the time or resources to optimize for and yes they're going to hurt selling these chips for $200 and down.
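A quick density calculation from those same figures (treating the quoted transistor counts and die sizes as approximate) shows the gap even at the same 32nm node:

```python
# Transistor density implied by the die sizes and transistor counts
# quoted above: (millions of transistors, die area in mm^2).
chips = {
    "Vishera (32nm)":      (1200, 315),
    "Ivy Bridge (22nm)":   (1400, 160),
    "Sandy Bridge (32nm)": (1160, 216),
}
for name, (mtrans, area) in chips.items():
    print(f"{name}: {mtrans / area:.1f} M transistors/mm^2")
# Vishera ~3.8 M/mm^2 vs Sandy Bridge ~5.4 M/mm^2 on the same node,
# and Ivy Bridge ~8.8 M/mm^2 on 22nm.
```

So even setting the process-node advantage aside, Intel packs roughly 40% more transistors into each mm^2 than AMD does at 32nm, which is why the Vishera die ends up so large (and presumably expensive to make) for its transistor count.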
Lowers barrier to entry (Score:5, Interesting)
I put together an 8-way, 32GB machine (no local storage) for $400 to play with ESXi. Courtesy of the freebie VMware download and a reasonably priced 8-way machine, I can get into some pretty serious VM work without spending a ton of dough. I don't need massive performance for a test lab.
Re: (Score:3, Interesting)
Get an SSD.
Local storage is a must for performance; iSCSI cannot hold a candle to local SSDs. In a lab you won't need to share the storage with multiple machines anyway.
Re: (Score:2, Informative)
This is hyperbole. If what you're doing is mostly CPU- or memory-intensive and requires very little disk activity, having fast local storage isn't going to help much, if at all.
Besides, apparently it isn't a must for the grandparent as he stated he doesn't "need massive performance for a test lab."
Don't get me wrong, using an SSD to provide storage for a handful of VMs is a great idea (massive read/write IOPs), but it isn't necessary.
Re: (Score:2)
Without iSCSI you can't really use shared storage, which means 90% of the features of ESXi can't be used. Kind of dampens the whole "for a lab" thing.
SSDs ARE quite sweet for VMs. I'd recommend setting up a VM that serves out a local SSD as iSCSI over an internal ESXi storage network -- that's actually how things were done during my VCP training. I believe they were using FreeNAS (MIGHT have been OpenFiler) to serve up iSCSI and NFS targets. It's a little buggy but sufficient for a lab.
Re:Lowers barrier to entry (Score:5, Informative)
Same here. I built a Bulldozer machine for compiling projects in VMs last year and it works very nicely. If Intel had had a CPU with ECC memory and hardware virtualization support at a reasonable price I would probably have bought it, but I would have needed at least a $500 Xeon for that, with a more expensive motherboard, and I wouldn't be able to overclock it. For the same performance I have now I would probably have needed a $1k CPU.
Re:Lowers barrier to entry (Score:4, Interesting)
I will keep buying AMD as long as they are cheaper and "good enough", if only to keep some competition alive.
Still running a quad-core AMD gaming machine at home as well, and it is still playing everything I throw at it.
Re: (Score:2)
8-way would mean 8 sockets on the motherboard, like http://www.supermicro.com/products/system/5U/5086/SYS-5086B-TRF.cfm [supermicro.com] .
Re: (Score:2)
I've always looked at it as number of CPUs (physical chips), number of cores (total, on all CPUs), number of threads (total cores, with HT or not etc).
-way was a socket thing for me, but perhaps it's not.
Here's the problem... (Score:5, Informative)
These chips "excel" at big, heavily threaded workloads. Which is to say that they can beat similarly priced Intel chips that are simply up-clocked laptop parts. Move up to a hyperthreaded 3770K (still a laptop part) and Vishera usually loses. Overclock that 3770K to be on-par with the Vishera clocks while still using massively less power than Vishera and the 3770K wins practically every benchmark.
Unfortunately, if you *really* care about those workloads (as in money is on the line) then Intel has the LGA-2011 parts that are in a completely different universe than Vishera, including using less total power and being much much better at performance/watt to boot. I'm not even talking about the $1000 chips either, I'm talking about the sub $300 3820 that wins nearly every multi-threaded benchmark, not to mention that $500 3930K that wins every one by a wide margin.
So if you want to play games (which is what 90% of people on Slashdot really care about): Intel is price competitive with AMD and you'll have a lower-power system to boot. If you *really* care about heavily-multithreaded workloads: Intel is price competitive because the initial purchase price turns into a rounding error compared to the potential performance upside and long-term power savings you get with Intel.
Vishera is definitely better than Bulldozer, but AMD still has a long long way to go in this space.
Re: (Score:2)
GPU's benchmark game performance...
Re: (Score:3)
GPUs are very important for games, which is why an Ivy Bridge with an on-die PCIe 3.0 controller is going to do better at keeping next-generation GPUs running full-tilt than the PCIe 2.0 controller on the Northbridge of an AM3+ motherboard.
Re: (Score:2, Insightful)
AMD has never been about pure performance; it's all bang for the buck. You can buy an AMD system for much less than an Intel one, get a motherboard that has a lot more connectivity than the equivalent Intel board for less, AND get true 2 x PCIe 16x (while Intel forces you onto an LGA2011 board, which is much costlier). The tradeoff? You'll get a machine with a CPU that performs maybe 10-20% worse in benchmarks than the Intel equivalent. But seriously, who cares in 2012? Most games are GPU-starved, so you're much bett
Re:Here's the problem... (Score:4, Insightful)
You are missing an important point when it comes to "money is on the line". No one in their right mind would use a desktop processor from Intel for anything critical. Why? All non-Xeon processors have been crippled to not support ECC memory. If money really is on the line, there is just no way that is acceptable.
AMD, on the other hand, does not cripple their CPUs at all. The whole Vishera lineup supports ECC memory, as did Bulldozer.
The Xeon equivalent of the 3820 is in a completely different price league.
So please, when you compare prices and use cases, make sure you fully understand which processors are the actual alternatives.
Re: (Score:2)
Sure Vishera theoretically supports ECC memory, but you need a motherboard that takes ECC memory to tango... and those are a rare beast in the consumer space, meaning you really are looking at Opteron socketed motherboards and Opteron chips (which are nowhere near as cheap as Vishera). So there is no free lunch.
Re:Here's the problem... (Score:5, Informative)
Re: (Score:2)
You are missing an important point when it comes to "money is on the line". No one in their right mind would use a desktop processor from Intel for anything critical. Why? All non-Xeon processors have been crippled to not support ECC memory. If money really is on the line, there is just no way that is acceptable.
Well, if it's critical then I wouldn't want to use a desktop processor or system in any case; you should get a proper Opteron/Xeon server with all the validation, redundancy, and managed environment. As for the general employee, I've yet to see anyone working on a "workstation"-class machine. Unless they need the horsepower for CAD or something like that, my impression is that 99.9% from the receptionist to the CEO use regular non-ECC desktops/laptops to do their work, and I'm pretty sure that means mon
Re: (Score:2)
When you talk about Intel being price competitive, it depends.
AMD clearly wins on budget gaming systems where the processors and motherboards are much cheaper, but Intel has the fastest high end systems out there right now.
Just a couple months ago I priced two builds with similar benchmark numbers on NewEgg, and the AMD budget gaming rig was around $800, and the Intel equivalent was around $1100.
Re: (Score:2)
Look at the Phoronix benchmarks. Vishera beats the 3770k in many benchmarks as long as you're running multithreaded code.
http://www.phoronix.com/scan.php?page=article&item=amd_fx8350_visherabdver2&num=1 [phoronix.com]
And do the math about power savings. Unless you're a folder, you'll need several years to recoup the additional cost of an Intel CPU.
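That payback claim can be sketched out; every number below is an illustrative assumption (list prices from the summary, a guessed load-power delta, rate, and duty cycle), not measured data:

```python
# Rough payback sketch: how long until an i7-3770K's price premium is
# repaid by its lower power draw? All inputs are assumptions.
price_delta = 330 - 195        # i7-3770K ($330) vs FX-8350 ($195), dollars
watt_delta = 75                # assumed extra watts the FX draws at load
rate = 0.12                    # assumed electricity rate, $/kWh
hours_per_day = 4              # assumed hours at load per day

kwh_saved = watt_delta / 1000 * hours_per_day * 365  # 109.5 kWh/yr
dollars_saved = kwh_saved * rate                     # ~$13/yr
years = price_delta / dollars_saved
print(f"{years:.1f} years to break even")            # ~10 years
```

Unless the machine runs loaded nearly 24/7 (the "folder" case), the break-even point lands far beyond a typical upgrade cycle, which is the parent's point.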
Re: (Score:2)
You have a very interesting definition of "many". I would say that the non-overclocked 8350 beats a 3770K in "a few" multi-threaded benchmarks, usually by a small margin, while still losing by much much larger margins in many other multi-threaded benchmarks (in fact the Phoronix article barely has any lightly-threaded benchmarks in the mix at all).
The Vishera OC'd to 4.6 GHz wins a few more benchmarks, but, as I said above, all you have to do is apply an easy OC to Ivy Bridge to get up to 4G
Re: (Score:2)
I would say that the non-overclocked 8350 beats a 3770K in "a few" multi-threaded benchmarks, usually by a small margin, while still losing by much much larger margins in many other multi-threaded benchmarks (in fact the Phoronix article barely has any lightly-threaded benchmarks in the mix at all).
Yes, that's what you would say, and the last thing you would do is mention that the 3770K costs a whopping 50% more than the FX-8350.
Obviously your fanboyism has clouded your ability to grasp the situation. The AMD chip which beats this Intel chip in "a few multi-threaded benchmarks" is doing it for significantly less cost.
If you really want to do a point-for-point comparison, and count up how many benchmark tests a CPU wins or loses, then you need to note the cost as well, or what's the point of "poi
Re: (Score:2)
Obviously your fanboyism has clouded your ability to grasp the situation. The AMD chip which beats this Intel chip in "a few multi-threaded benchmarks" is doing it for significantly less cost.
So Intel are presumably beating the snot out of AMD's margins, as the AMD chip is twice the size of the i7 yet has to sell for a fraction of the price because it's unable to compete.
And unless you're overclocking, I believe the i7 3770 will have the same performance as the 3770K at a lower price (about $30-40 less last I looked).
Re: (Score:2)
So Intel are presumably beating the snot out of AMD's margins, as the AMD chip is twice the size...
Typical fanboy knows a fact or two but doesn't know what he is talking about (transistor counts are about equal), and why would I care about AMD's margins anyway?
You seem to want to poke at AMD without considering what it means to you. I guess if you want to ignore the fact that you have to bend over and let Intel stick it right into you in order to "enjoy" an equally performing chip... that's fine by me. Have fun paying more... it makes you look really smart... really... it does... paying more for the same pe
Re: (Score:2)
I'm an engineer, not a fanboy. It takes a whole lot more engineering talent to design a laptop chip like the 3770K that is faster in the large majority of benchmarks, including the large majority of multithreaded benchmarks, than a chip which is twice the size, has a much, much higher transistor budget, uses much larger caches, has a 50-100% larger practical power envelope, and runs at 15% higher clock speeds.
I could hand a price gun to a homeless guy out on the street and have him slash the price of practica
tl;dr version (Score:5, Informative)
New AMD processor, higher clocks than the last one but no massive improvements performance-wise. Still rocks at multi-threaded, integer-only workloads, still sucks at single-threaded or floating-point performance, still uses a huge amount of power. AMD giving up on the high end, their top-end parts are priced against the i5 series, not the i7. Since Intel's overpricing stuff, they're still roughly competitive. Might be good for server stuff, maybe office desktops if they can get the power down, but not looking good for gaming. Overall mood seems to be "AMD isn't dead yet, but they've given up on first place".
There. Now you don't need to read TFAs.
Re: (Score:2)
The Phenom IIs were indeed pretty good processors. I was a particular fan of the X3s - they were quad-core dies that had one defective core, but were priced only a hair above their dual-cores. And they usually had all the cache of the quad-core they were based on. A very powerful low-cost processor.
I'm still sort of surprised AMD didn't revert back to the Phenom design, which seemed like it had a good amount of life left in it. Make turbo a universal option, throw some deeper cache into it, see if you can w
Re: (Score:2)
So AMD's fastest desktop CPU from 2010 beats an Intel laptop CPU from 2008? Call me impressed.
Re: (Score:2)
Check some of the benchmarks. In most, you're looking at a 10-15% drop in framerates - and that causes problems in at least a few cases. In Starcraft 2, with graphics set low enough that it's purely a CPU bottleneck, the top Vishera gets only 48fps; last year's much-loved i5-2500K gets 65, and every other Intel processor is higher.
I would have loved to see benchmarks from more CPU-bound games, though. GTA IV is well-known to be a heavy CPU user, but none of the reviews I read used it. Same for Minecraft - alth
Nice CPU, high power usage (Score:5, Informative)
And please take benchmark results with a pinch of salt - most of them are compiled with the Intel compiler, and will show lower results on AMD CPUs just because the Intel compiler disables a lot of optimizations on AMD CPUs.
I don't know of any site that has Java application server, MySQL/PostgreSQL, Python/Perl/Ruby, Apache/PHP, or GCC/LLVM benchmarks under Linux. Video transcoding or gaming on Windows is really skewed and nowhere near what I do with my machine.
--Coder
Re: (Score:3)
I think Phoronix took Vishera through some GCC tests. They have another article about GCC optimizations for the Bulldozer architecture in general (it seems to improve some workloads by quite a bit).
Phoronix is great but I want more (Score:5, Interesting)
However, their benchmarks are often flawed. For example, they did a Linux scheduler benchmark recently which measured throughput (average this or that) and not latency/interactivity (response times), which was totally useless. Well, OK, you can consider it a test checking for throughput regressions in interactivity-oriented schedulers, but it did not measure interactivity at all.
And regarding their Vishera benchmark, they measured most of their standard stuff, mostly scientific calculation, video/audio encoding, image processing, rendering. I very rarely do any of this.
The developer related benchmarks they had were Linux kernel compilation times (Vishera won), and you might count OpenSSL as well. They didn't do PostgreSQL, they didn't benchmark different programming languages, nor application servers, nor office applications, nor anything that would really interest me. I wish someone would measure Netbeans/Eclipse and other IDE performance.
And anyway, did you notice that AMD usually does much better in Phoronix reviews than on AnandTech/Tom's Hardware/whatever sites? That's because Phoronix doesn't use the Intel compiler or Windows, so the results are much less skewed.
--Coder
Re: (Score:2)
You're talking about the low-jitter Linux kernel testing where they didn't test jitter at all, right? I've seen that one; it was embarrassing. However, their often incomplete (and sometimes flawed) testing is still the best we can get for Linux.
Linux already has scheduler fixes for Bulldozer, too, unlike Windows 7. Not a game changer, but it does grant another 1-3% performance, if I remember correctly. Not that the Intel compiler doesn't cripple AMD quite a bit - just run Handbrake on Windows on both platfor
For linux... (Score:5, Insightful)
Perfect for the 99% (Score:2)
The new Trinity, and now these FX Procs, are perfect for "the 99%," that is to say for what 99% of people do with their machines: surf the web, check email, maybe do some photo editing or piecing together home movies.
They're cheap, reasonably fast, and support all the latest extensions and optimizations. Plus, even for enthusiast prosumers who want to screw around with things like virtualization, you can get into the IOMMU space cheaply with AMD, which is nice for a platform like ESXi that has a mature PCI
Re: (Score:2)
The new Trinity, and now these FX Procs, are perfect for "the 99%," that is to say for what 99% of people do with their machines: surf the web, check email, maybe do some photo editing or piecing together home movies.
I did all those things on a Pentium-4.
I agree about AMD's low-end CPUs, but why would you want an 8-core CPU to do any of those things? Does an email program really need eight cores these days?
Re:Perfect for the 99% (Score:5, Interesting)
I am typing this on a Phenom II 6-core system. It is quiet, 45 watts, and at the time (2010) it was only 10-15% slower than an icore5. What did I get that the intel icore5 didn't?
- My whole system including the graphics card was $599! Also an Asus motherboard by the way too and part of their extended warranty boards.
- Non crippled bios where I can run virtualization extensions (most intel mobos turn this off except on icore7s)
- 45 watts
- My ati 5750 works well with the chipset
- the AM3 socket can work with multiple cpus after bios updates.
What the icore5 has
- It is made by intel
- It is 15% faster
- The cost of the cpu alone is 2x the price and I can pretty much include a motherboard as well if you are talking up to icore7s.
An icore7 system costs $1200 at the store. An icore5 gaming system similiarly specced cost $850 and does not include virtualization support to run VMWare or Virtualbox.
The FX systems ... ble. I am not a fan. But for what I do, AMD offered a quieter, cheaper system that could run Linux VMs and is easier to upgrade. To me, my graphics card and hard drive are the bottlenecks; I would rather save money on the CPU. I was so hoping AMD would use this to deliver great graphics for tablets and notebooks :-(
Re: (Score:2)
Sure, when all I am doing is surfing the web it doesn't matter, but most of the time that I am surfing the web I've got other processes actively doing work. On top of the browsing I frequently have Netflix/Hulu running, and sometimes a big compile or a video encode is also taking place (in fact, when something really time-consuming begins, I am *always* browsing the web during it).
The system
Re: (Score:3)
I am typing this on a Phenom II 6-core system. It is quiet, 45 watts (...) - 45 watts
I guess if the facts don't support your argument, make something up. All the Phenom II 6-core CPUs have either 95W or 125W TDP [wikipedia.org]. But yes, the X6 was a quite competitive chip by offering you 50% more cores for about the same money as an Intel quad. Anandtech's conclusion [anandtech.com] did have a prelude to what was coming though:
You start running into problems when you look at lightly threaded applications or mixed workloads that aren't always stressing all six cores. In these situations Intel's quad-core Lynnfield processors (Core i5 700 series and Core i7 800 series) are better buys. They give you better performance in these light or mixed workload scenarios, not to mention lower overall power consumption.
Let's look at what has happened since 2010:
Cinebench R10 single-threaded: [anandtech.com]
Intel Core i5 750: 4238
Intel Core i5 3570K: 6557
AMD Phenom II X6 1090T: 3958
AMD FX-8350: 4319
Intel has improved 55%. AMD? 9%.
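The improvement percentages quoted above can be checked with a quick sketch from the Cinebench R10 single-threaded scores in the comment (labels are mine, not from the AnandTech tables):

```python
# Cinebench R10 single-threaded scores quoted in the parent comment.
scores = {
    "Core i5 750 (2009)": 4238,
    "Core i5 3570K (2012)": 6557,
    "Phenom II X6 1090T (2010)": 3958,
    "FX-8350 (2012)": 4319,
}

def improvement(old, new):
    """Percent gain going from the old score to the new one."""
    return (new - old) / old * 100

intel_gain = improvement(scores["Core i5 750 (2009)"], scores["Core i5 3570K (2012)"])
amd_gain = improvement(scores["Phenom II X6 1090T (2010)"], scores["FX-8350 (2012)"])
print(f"Intel: +{intel_gain:.0f}%  AMD: +{amd_gain:.0f}%")  # Intel: +55%  AMD: +9%
```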
Re: (Score:3)
The new Trinity, and now these FX Procs, are perfect for "the 99%," that is to say for what 99% of people do with their machines: surf the web, check email, maybe do some photo editing or piecing together home movies.
If you're building a "good enough" system for a non-technical user, why in the world would you even consider a Vishera FX CPU? It's expensive, power-hungry, and has a high TDP. And it doesn't even have integrated graphics, so you'd have to add the expense of a discrete graphics card.
For an in
Re: (Score:2)
What's the intended audience for Vishera
The audience is people who want the most performance they can get for their $140, and that's not the "I shopped around to find a great price on CPUs that have piled up in someone's inventory" price... that's the "I went to the place most people trust" price. The straight Newegg introductory price for the FX-6300 Vishera is $139.99.
SATSQ (Score:5, Informative)
AMD FX-8350 Review: Does Piledriver Fix Bulldozer's Flaws?
No. It still guzzles power like crazy compared to Sandy/Ivy Bridge, and its single-threaded performance still sucks royally. (And that's still very important since many, many programs cannot and will not ever support full multithreading.)
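To put that power gap in perspective, here is a back-of-envelope sketch using the ~50 W load delta from the summary. The electricity rate and hours-per-day figures are hypothetical assumptions, not measurements:

```python
# Rough yearly cost of the FX-8350's extra draw vs. a Core i7-3770K.
# EXTRA_WATTS comes from the review summary; the rate is an assumed ~US
# average, and the usage patterns are made up for illustration.
EXTRA_WATTS = 50
RATE_PER_KWH = 0.12  # assumed $/kWh

def yearly_cost(hours_per_day):
    """Dollars per year for the extra draw at the assumed rate."""
    kwh = EXTRA_WATTS / 1000 * hours_per_day * 365
    return kwh * RATE_PER_KWH

print(f"24/7 under load:   ${yearly_cost(24):.2f}/yr")  # $52.56/yr
print(f"4 h/day under load: ${yearly_cost(4):.2f}/yr")  # $8.76/yr
```

At typical desktop duty cycles the dollar cost is small; the bigger practical complaints are heat and fan noise.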
Ya I having trouble seeing where it is good (Score:2)
Low end would be fine but it would need to be low power there too. Ivy Bridge CPUs are great at sipping power, particularly the dual core variety. So if what I'm doing is just real light usage like web surfing and so on, I'm better off with that.
For heavier usage, well the Intel CPUs are better particularly at floating point calculations which is what most heavy performance is these days, at least on desktops. All the programs I can think of that I have which hit the CPU real heavy, are doing FP stuff, not
A space heater included (Score:3)
Man I really want AMD to win!
I am typing this on a Phenom II, which is a better chip in my opinion and was fast at the time (unfortunately, by 2010 standards). But these things run well over 130 watts, are loud with huge freaking fans at 4.4 GHz, and it seems AMD is trying to pump out as much speed as possible just to beat Intel's lowest-end chips.
Just call it Pentium 4 2.0 while we are at it? I am not a fan of Intel because I run VMware and hate that Intel cripples its chips and the BIOS to exclude virtualization on all but the most expensive units. I hate the cost of a high-end Core i7, which in 2010 was only 10-15% faster than a Phenom II but cost 400% more; I can buy a whole system for the cost of a single Intel Core i7 Extreme.
Well gentlemen. Expect dark days ahead and a return to $1000 desktops, $500 chips, and virtualization only available on xeon chips by next year. :-(
With AMD at junk status [arstechnica.com] it is bound to happen now, since these chips can't match Intel's offerings.
What did Intel do? (Score:2)
I have to laugh (Score:3)
When you call an 8 core 4GHz CPU low-end.
Re: (Score:3)
CPUs tend not to be the bottleneck these days; it's either the hard drive or a slow internet connection.
AMD fanboys have been saying that for years as AMD lagged behind Intel's CPU performance. But they then keep telling people they should buy AMD 8-core CPUs instead of Intel 4-core, when, if they really believed what they were saying, they would be telling people to buy the cheapest dual-core available, or an Atom.
Seriously, why would you possibly think an 8-core CPU was 'perfect' for your parents' PC, no matter who makes it? If they're not CPU-limited, buy the cheapest CPU you can get.
Re: (Score:3, Funny)
For non-technical parents and other users, I actually like lots of cores and lots of memory-- more so than 'power users.'
Have you ever seen some people boot up their machines? It will take 5 minutes because of the sheer amount of crap installed.
It will be shit like two different anti-virus suites (the first one's subscription expired, and they installed a new one without uninstalling the old). There will be an update checker for every conceivable thing that's been installed. There's the preloader software
Re: (Score:3)
Have you ever seen some people boot up their machines? It will take 5 minutes because of the sheer amount of crap installed.
So uninstall the crap or install an SSD. A faster CPU makes far less difference to boot time than an SSD, because boot time is mostly limited by the seek performance of the disk as it loads all that crap into RAM.
Re: (Score:2)
My Core 2 Duo with 2 GB RAM booted Vista in 15 seconds. What operating system do you need more cores or RAM to boot? For that matter, how do more cores and RAM really help booting that much?
It only had 2 GB because of a bug where it would not install with 4 GB on my motherboard. It was soon upgraded to 4 GB after a hotfix, and now has 8 GB because one of the original sticks died and there were two 4 GB sticks floating about. I'm not sure how long it takes to boot now, but it'll probably be a lot longer
Comment removed (Score:5, Informative)