AMD Hardware

AMD FX-8350 Review: Does Piledriver Fix Bulldozer's Flaws?

Posted by Soulskill
from the driving-piles-and-dozing-bulls dept.
An anonymous reader writes "AMD just officially took the wraps off Vishera, its next generation of FX processors. Vishera is Piledriver-based like the recently-released Trinity APUs, and the successor to last year's Bulldozer CPU architecture. The octo-core flagship FX-8350 runs at 4.0 GHz and is listed for just $195. The 8350 is followed by the 3.5 GHz FX-8320 at $169. Hexa-core and quad-core parts are also launching, at $132 and $122, respectively. So how does Vishera stack up to Intel's lineup? The answer to that isn't so simple. The FX-8350 can't even beat Intel's previous-generation Core i5-2550K in single-threaded applications, yet it comes very close to matching the much more expensive ($330), current-gen Core i7-3770K in multi-threaded workloads. Vishera's weak point, however, is power efficiency. On average, the FX-8350 uses about 50 W more than the i7-3770K. Intel aside, the Piledriver-based FX-8350 is a whole lot better than last year's Bulldozer-based FX-8150, which debuted at $235. While some of this has to do with performance improvements, the fact that AMD is asking $40 less this time around certainly doesn't hurt either. At under $200, AMD finally gives the enthusiast builder something to think about, albeit on the low end." Reviews are available at plenty of other hardware sites, too. Pick your favorite: PC Perspective, Tech Report, Extreme Tech, Hot Hardware, AnandTech, and [H]ard|OCP.
This discussion has been archived. No new comments can be posted.

  • by Animal Farm Pig (1600047) on Tuesday October 23, 2012 @03:01PM (#41743699)

    I agree about multithreaded performance being an important thing moving forward.

    Regarding power consumption, the AnandTech review [anandtech.com] puts total system power consumption for Vishera at 12-13 W more than Ivy Bridge. Scroll to the bottom of the page for the chart. The bar and line graphs at the top of the page are misleading: they start the axis at 50 W, not 0 W.

    If you are concerned about power consumption, find a 100 W light bulb in your house and replace it with a CFL. You will see greater energy savings.
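    A back-of-envelope sketch of that comparison. The 13 W figure is the system-level gap cited from the AnandTech chart above; the electricity price ($0.12/kWh) and 4 h/day of load are illustrative assumptions, not numbers from any review:

    ```python
    # Rough yearly cost of an extra constant power draw.
    # PRICE_PER_KWH and HOURS_PER_DAY are assumptions, not review figures.
    PRICE_PER_KWH = 0.12   # assumed electricity price, $/kWh
    HOURS_PER_DAY = 4      # assumed hours of load per day

    def annual_cost(delta_watts, hours_per_day=HOURS_PER_DAY, price=PRICE_PER_KWH):
        """Yearly cost ($) of drawing delta_watts extra during the load hours."""
        kwh_per_year = delta_watts / 1000 * hours_per_day * 365
        return kwh_per_year * price

    print(round(annual_cost(13), 2))        # ~13 W system-level CPU gap
    print(round(annual_cost(100 - 23), 2))  # 100 W incandescent -> 23 W CFL
    ```

    Under these assumptions the bulb swap saves several times more per year than the CPU choice, which is the commenter's point.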

  • by Anonymous Coward on Tuesday October 23, 2012 @03:27PM (#41744027)

    AMD has never been about pure performance; it's all bang for buck. You can buy an AMD system for much less than an Intel one, get a motherboard with a lot more connectivity than the equivalent Intel board for less, AND get true 2 x PCIe x16 (while Intel forces you onto a much costlier LGA2011 board). The tradeoff? You'll get a machine with a CPU that performs maybe 10-20% worse in benchmarks than the Intel equivalent. But seriously, who cares in 2012? Most games are GPU-bound, so you're much better off spending the $50-70 an Intel would have cost you on a better GPU. And most day-to-day operations can be sped up by a fast SSD and more RAM. Get used to it: the CPU is simply not the main bottleneck anymore.

    HOWEVER, if you're an overclocker looking to shatter world benchmark records, or a mad scientist who needs workstation-class horsepower AND has the cash to spend on it, I'd say go for Intel. It'll be worth every penny.

  • by Anonymous Coward on Tuesday October 23, 2012 @03:29PM (#41744049)

    You are missing an important point when it comes to "money is on the line". No one in their right mind would use a desktop processor from Intel for anything critical. Why? All non-Xeon processors have been crippled to not support ECC memory. If money really is on the line, there is just no way that is acceptable.

    AMD, on the other hand, does not cripple their CPUs at all. The whole Vishera lineup supports ECC memory, as did Bulldozer.
    The Xeon equivalent of the 3820 is in a completely different price league.

    So please, when you compare prices and use cases, make sure you fully understand which processors are the actual alternatives.

  • by war4peace (1628283) on Tuesday October 23, 2012 @03:30PM (#41744077)

    I play games maybe 1h 30m a day on average. My 5-year-old dual-core E6750, overclocked to 3.2 GHz, handles most of them gracefully, but there are some new releases which require more processing power. However, in choosing a new platform, I'm mostly looking at TDP, not from a consumption perspective but for heat dissipation. I hate having to use a noisy cooler.
    My current CPU has a TDP of 65 W and a Scythe Ninja 1 as its cooler, and the fan usually stays at 0% when the CPU is idling. While gaming, I can't tell whether it makes noise, because my GPU cooling system makes enough noise to cover it. And I'd like to keep it that way when I pick my new CPU.

    You're saying the graphs are misleading. No, they're not, if one has half a brain. I'm looking at the hard numbers, and the power consumption difference is about 100 W. The i5-3570K draws about 98 W, and Zambezi and Vishera (who the fuck names these things?) draw around 200 W. If you stack TWO i5s on top of each other, they barely reach ONE AMD CPU's power consumption. Thanks, but things DO look bad for AMD. I'll just have to pass.

  • by Lumpy (12016) on Tuesday October 23, 2012 @03:32PM (#41744093) Homepage

    "Surely you already did that 3 years ago, so now you're preparing for the switch to LED?"

    Why? So I can get less light output for the same watts used, but instead of spending $8.95 per bulb I get to spend $39.99?

    LED is a joke for home lighting; only fools are buying it right now. CFL is still way more efficient.

  • by Kjella (173770) on Tuesday October 23, 2012 @03:33PM (#41744097) Homepage

    I think the biggest factor for your home desktop is noise: it takes a lot more airflow to remove 125 W of heat than 77 W of heat. In AnandTech's tests, he actually measures 195 W versus 120 W total system power consumption. Sure, it might not matter much if you plan to put a noisy 200 W+ graphics card or two in it, but for non-gamer use I'd say that's pretty significant.
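    The airflow-vs-heat point can be made concrete with a standard first-order estimate using air's density (~1.2 kg/m^3) and specific heat (~1005 J/(kg*K)); the 15 K exhaust temperature rise is an illustrative assumption, not a measured figure:

    ```python
    # First-order estimate of the airflow needed to carry away a heat load
    # at a given air temperature rise. Constants are standard air properties;
    # the 15 K rise is assumed for illustration.
    RHO_AIR = 1.2      # air density, kg/m^3
    CP_AIR = 1005.0    # specific heat of air, J/(kg*K)

    def airflow_m3_per_h(power_w, delta_t_k=15.0):
        """Volumetric airflow (m^3/h) needed to remove power_w watts."""
        mass_flow_kg_s = power_w / (CP_AIR * delta_t_k)
        return mass_flow_kg_s / RHO_AIR * 3600.0

    for tdp in (77, 125):
        print(tdp, round(airflow_m3_per_h(tdp), 1))
    ```

    Airflow scales linearly with the heat load, so a 125 W part needs about 60% more airflow than a 77 W part at the same temperature rise, and fan noise climbs steeply with airflow.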

  • For linux... (Score:5, Insightful)

    by ak3ldama (554026) <james_akeldamaNO@SPAMyahoo.com> on Tuesday October 23, 2012 @03:34PM (#41744105) Homepage Journal
    Here is a set of benchmarks more centered on the Linux world, from Phoronix [phoronix.com], and thus a little less prone to Intel compiler discrimination. The results seem more realistic: better, worse, and similar to an i7 on different workloads; still hard on power usage; low purchase price.
  • by LordLimecat (1103839) on Tuesday October 23, 2012 @04:15PM (#41744569)

    You live in an apartment and don't plan to be there for 20 years?

    I imagine for a lot of people, dumping $40 into each light socket is a losing proposition for the tenant and a winner for the landlord (who I am sure would greatly appreciate the gift).

  • by afidel (530433) on Tuesday October 23, 2012 @04:26PM (#41744721)

    LEDs running on AC will fail for the same reason most CFLs fail: the ballast.

  • by tftp (111690) on Tuesday October 23, 2012 @04:31PM (#41744801) Homepage
    Light bulbs are easily removable, and there can't be that many of them in a rented apartment - or even in a rented home. Take them with you, since you will need them at the new place anyway, and leave the old ones (that you saved) in their place.
  • by guises (2423402) on Tuesday October 23, 2012 @05:03PM (#41745219)
    Well, if that's no good, how about this: http://www.silentpcreview.com/Ninja2 [silentpcreview.com]

    This is what I use on my Athlon 2; it works perfectly, is very quiet, and it's rather old now, so you could probably pick up a used one pretty cheap.
  • by Lumpy (12016) on Tuesday October 23, 2012 @05:13PM (#41745341) Homepage

    Because the incandescent ban is for the old, out-of-date crap that sucked and worked better as a heater than a light bulb. Halogen bulbs are the replacement. Where did you get your education on the ban? Whoever it was did not know what they were talking about; you should stop listening to the uneducated news network.
