The Loyalty To AMD's GPU Product Among AMD CPU Buyers Is Decreasing (parsec.tv) 157
An anonymous reader shares a report: Data from the builds on PCPartPicker show an interesting trend among buyers of AMD CPUs. Of the 25,780 builds on PCPartPicker from the last 31 months with a price point between $450 and $5,000, 19% included an AMD CPU. This is in line with the Steam Hardware Surveys, but things have changed recently. Builds with AMD CPUs tend to be much less expensive than those with Intel CPUs: the builds with an AMD CPU averaged $967, versus $1,570 for the Intel CPU builds. Over the last 31 months, brand loyalty to AMD seemed to push AMD CPU builders to choose AMD graphics cards at a much higher rate than Intel CPU builders did. 55% of machines with an AMD CPU also had an AMD GPU, whereas only 19% of builds with an Intel CPU included an AMD GPU. In the last six months, AMD has started to lose even more ground to Intel and to Nvidia. On the CPU side, only 10% of gamers building on PCPartPicker opted for an AMD CPU. Among these, the percentage that decided to pair their AMD CPU with an AMD GPU dropped to 51%. The challenges that AMD is seeing in the overall GPU market are being felt even amongst their loyal supporters.
No shit Sherlock (Score:5, Insightful)
Comment removed (Score:4, Interesting)
Re: (Score:3)
I'm in the same boat, a happy AMD customer not needing to upgrade yet. (A8-6600K APU)
I selected the chip I have because it uses less power than the faster chips. This is a whole new situation. Companies can't expect the same level of hardware thrash in desktops as existed in the past.
The companies that can be happy with more stable sales will survive, the ones addicted to growth will immolate themselves.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2, Insightful)
Which is where they're extremely good, but does anybody remember Voodoo? 3dfx? [wikipedia.org] Anybody?
Re: (Score:2)
Me, I had the Monster 3D (first version), and it was the first 3D card to enable silky smooth FPS and 3D simulators. It was night and day and blew everything else out of the water.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The x86 line is old and tired. Now there's Zen on the horizon, but I'm not holding my breath; it's like waiting for the dragons in a well-known TV drama. Their A10 line isn't that great unless you want a really cheap x86 setup; i5s beat them in almost every category. Getting sued because the core count you advertised isn't what buyers actually get doesn't bode well for your roadmap or architecture plans either. It also doesn't help when everybody looks at your flagship Bulldozer architecture and says "Meh."
Haven't the new ones been out a while? (Score:2)
Re: (Score:3)
Of course people haven't been buying AMD CPUs in the last six months; we've been waiting for the new ones to come out.
And yet the article is about AMD CPU users preferring GeForce over Radeon, not buyers switching over to Intel CPUs, so you might wanna read the summary. Anyway, to me, in 2016 the only "new" AMD GPU worth your USD is the RX 480 8GB, which sits right in the middle of the chart. By contrast, Nvidia releases a full spectrum of GPUs for every budget class, from the quite affordable 1060 to the new Titan X. AMD CPU users wanting to spend more (or less, in the case of the 1060) than what the RX 480 can offer
because it's too damn hot! (Score:2, Funny)
My GTX 780 blew a resistor or something. Since it was last minute, I bought an R9 390X because it was cheap.
I no longer bother turning the heat on in my office. What's the point?
Re: (Score:2)
Yeah, too hot in the heat waves. :(
Not rocket science (Score:3)
AMD GPUs, not so much.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
an new
You're the first person I've ever encountered who considers the letter n in "new" to be silent. It must get a little confusing for listeners when you're complaining that the 'ew computers aren't really very 'ew and you want something better.
Re: (Score:2)
Thanks Captain Obvious.
It's called a joke. A JOKE. You know, someone makes a mistake with potentially funny consequences, and you point it out to tease them a bit? A Joke.
Way to ruin the joke.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
AMD's GPUs are the successors of ATi, which was never a leader in the graphics market when it was a separate company.
What? Yes it was. It absolutely was. The two market leaders in graphics have been ATI (and now AMD) and nVidia for as long as anyone needs to remember.
That's me.... (Score:2)
I am looking hard at Zen for my desktop, as I have run out of SATA ports and the CPU is starting to show its age when running 2-3 VMs. Then I will consider the next-gen AMD GPU... maybe in another year or so.
Re: (Score:2)
Don't you have a free slot (PCIe x1 or x16) for a dirt-cheap PCIe x1 controller board to add two more SATA ports?
Also, oddly enough, perhaps you can add more virtual CPUs to your VMs: overallocate and let the schedulers sort it out (rough sketch below).
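To make that concrete, here's a minimal sketch of bumping a guest's vCPU count with the libvirt Python bindings, assuming a KVM/libvirt setup; the domain name "devbox" and the count of 8 are purely hypothetical. KVM will happily let you define more vCPUs than you have physical cores and simply time-slice them.

import libvirt

# Connect to the local QEMU/KVM hypervisor and look up a guest by name.
conn = libvirt.open("qemu:///system")
dom = conn.lookupByName("devbox")  # hypothetical VM name

# Raise the maximum vCPU count in the persistent config, then the
# current count; the host scheduler sorts out the overallocation.
dom.setVcpusFlags(8, libvirt.VIR_DOMAIN_VCPU_MAXIMUM | libvirt.VIR_DOMAIN_AFFECT_CONFIG)
dom.setVcpusFlags(8, libvirt.VIR_DOMAIN_AFFECT_CONFIG)

conn.close()

The change takes effect on the guest's next boot; you could pass VIR_DOMAIN_AFFECT_LIVE instead if the domain supports vCPU hot-plug.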
Re: (Score:2)
Of course AMD desktops are cheaper (Score:2)
Re: (Score:2)
Rubbish. If you want a cluster compute node with 64 cores and 1TB of RAM, you can get an AMD solution for a vast amount less than an Intel one. Your "high end" is not as high as you think it is.
Shit (Score:2, Interesting)
I guess this means I will have to buy a complete AMD system on my next desktop gaming PC upgrade.
I have this terrible fear of a reality in which AMD has shut down and the world is at the mercy of the one and only Intel GPU monopoly.
Re: (Score:2)
And of course I totally meant "NVidia GPU monopoly" in my post above. :/
I'm switching to nVidia this year... (Score:2)
NVIDIA has cornered the market (Score:2)
Re: (Score:2)
Dude, the 1070 isn't high end.
Wake me up when ATI comes up with anything that can outperform my Pascal Titan X, has at least equally stable drivers, doesn't need multiple cards/slots, doesn't sound like a jet taking off, and doesn't heat my whole house up.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Dear AMD: (Score:5, Insightful)
Yes, we are a pretty small slice of the gaming (or general computing) pie. But we are influential. We're the ones people turn to when they ask what they should buy. Some of us (not me) will start submitting useful GPU driver patches to you, for free.
What have you got to lose? Do you really think your current drivers are so goddamn awesome that NVIDIA is going to use them for inspiration?
Re: (Score:2)
Re: (Score:2)
As far as GPUs go, admittedly these days I'm out of the loop when it comes to cutting-edge gaming, but I thought ATI/AMD has basically always managed to stay relatively close to NVIDIA on the hardware front while always lagging behind on drivers.
Re: (Score:2)
Re: (Score:1)
The drop in AMD graphics card use is because the world is switching to Linux, not because of any branding nonsense. Nvidia rules Linux, so that's the reason.
In the current post-truth era, facts just get in the way. Making logical inferences is so 2016. If you want to prevail, just lie through your teeth and get a bunch of know-nothings to become your fanatical followers. In this case, Linux fanboys. Then have them repeat your nonsense and attack anyone who disagrees. Once you get the
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
1. Yes yes, they'v
Re: (Score:2)
Re: (Score:2)
The goodwill isn't nothing, either. AMD needs to *stand* for something. Even when they were beating Intel in performance with the early Athlons (as well as price), Intel still destroyed them in the marketplace. They need to st
Re: (Score:2)
Re: (Score:2)
I haven't gamed seriously in quite a few years, and it's been even more years since I've owned an ATI/AMD GPU, so I'm probably behind the times. I do recall someone recently bitching about the fact that the open-source Radeon driver had taken over as default in their favorite distro, re
Re: (Score:2)
There was also a competing semi-open s
Re: (Score:2)
Re: (Score:2)
AMD has previously made the vast majority of their profits selling off their technologies and research
Their driver technology? They've been making money off of "selling" their drivers? To whom?
Who is buying up their CPU designs, for that matter? I know I'd drop Intel in a heartbeat if AMD had a fully open, audited design that supported all the instruction sets I care about.
You don't even know their history on this and how it's not helped them.
That's their fault for marketing it so shittily then, if that's true. But given your other comments, I suspect they didn't do what I was saying. I'm not saying whitewash it with OSS; I'm saying actually embrace it.
This isn't like soft
Re: (Score:2)
Re: (Score:2)
Considering it's been brought up on Slashdot previously, that's your own willful ignorance in my opinion.
It's willful ignorance that I don't follow the driver status for a company I gave up on like 8 years ago? The last post I saw on Slashdot regarding drivers was someone bitching that the OSS Radeon driver was being included by default in their favorite distro instead of the proprietary one.
That little tidbit, apocryphal as it may be, did not imply to me that AMD had open-sourced the good stuff. And maybe they can't for legal reasons, which sucks for them, but seeing as how ATI/AMD has had driver issu
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
If I could ditch AMT and other worrying out-of-band stuff, I'd gladly sacrifice a bit of performance. My broader point here is about AMD having an identity beyond "the cheap one." For a few years they were both cheaper and more powerful, but even with that killer combo they couldn't beat In
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The open-source X.org initiatives, while many of the issues above are now resolved, will never be performant in the way nVidia is, and led to the creation of alternative graphical backends like Wayland, which are trying to resolve these issues for 'good' open-source citizens
Wayland is not in any meaningful way an alternative to the nVidia driver. It basically says: you do the rendering, I'll do the compositing. For, say, full-screen games, that means the output is dumped straight to the display buffer while Wayland does practically nothing. Right now the nVidia driver doesn't know how to play nice with Wayland, but patches are coming [gnome.org]; when it does, the driver will still do 99% of the heavy lifting.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
2. They should advertise what they *are* doing better. Make a brand identity out of it. I'm sorry I don't follow every AMD story out there reli
Re: (Score:2)
Re: (Score:2)
But I'm sorry but the fact that you as a FOSS user doesn't know about it? Just shows how shitty the community is
That's bullshit. Do you claim to know even 1/10 of the FOSS contributions in the world today? Were you aware, for instance, that in ~2012 Intel released Cilk++ as open source when previously it was proprietary? Is every FOSS user in the world supposed to automatically know this the moment it happens?
It's not a clubhouse. It's AMD's responsibility to say something if they want people to sit up and take notice. Your previous post implied AMD did it less than two years ago, and I haven't bought any AMD
As it should be (Score:1)
Re: (Score:2)
Maybe because they aren't the best right now. (Score:3)
Re: (Score:2)
That's a lot of money for a video card. If that floats your boat, then more power to you. In my opinion, though, games move too fast these days for that level of detail to even be noticeable. I might lay down $250 for an AMD 480 soon, though. $600+ is just way too much for me to justify when you can buy a whole computer for that.
Re: (Score:2)
Ha! (Score:1)
Ironically, I still use AMD graphics cards, but I switched to Intel CPUs a while back.
Loyal support of a giant company is just dumb (Score:2)
Unless the company has personally treated you amazingly, being the white knight for a giant corporation is an incredibly dumb thing to do, and something you only do because we're just hairless tribal chimps. But you can overcome that.
Be a whore. Buy whatever's best at the time. When the AMD M1s were out I used nothing but them. Then they slowly fell behind and I switched to Intel, and have been there ever since - I had hopes for Piledriver, but no. But if AMD's Zen is as good as it looks then I'll be all over
AMD experience has been poor of late (Score:2)
Sorry, but the last straw for me was when I upgraded the Radeon drivers on my W10 machine (which I use for gaming). It took an hour to remove all the crapware AMD installed in addition to the drivers. Particularly onerous was their new video recording technology deciding that it would record a game session without telling me, so it could pop up a 'see how great this was' window later on.
My answer - spend an hour removing it all from the machine, then go out and replace my Radeon card with a low-end GTX 10
AMD ZEN needs to come intel pci-e lanes suck (Score:2)
AMD Zen needs to come; Intel's PCIe lane counts suck, and with PCIe storage and USB 3.x, more lanes are needed.
Stability (Score:2)
AMD GPU is a viable option, CPU is not (Score:2)
The GPUs are in some cases (often in my country) about 65 to 75% of the price of the Nvidia options but 75 to 95% as fast... they are often a no-brainer product.
The CPUs, however, are atrocious. AMD offers nothing and has offered nothing in a heck of a long time; their new stuff had better be significantly better, or at the very least quite a bit better and MUCH cheaper than Intel
I've always split my *PUs. (Score:2)
Drivers (Score:2)
I love the AMD GPU hardware, but I regret buying my RX 470. I am stuck on Windows 8.1 because of drivers crashing SWTOR and Hyper-V virtual machines on 10.
Re:Former AMD User (Score:5, Funny)
fukin hell man, use the enter key once in a while
ps: no one cares about your entire cpu history
Re: (Score:2)
When I quote the GP, I see new paragraphs at "As far as AMD goes, I had a ATI/AMD Rage 128" and "They have my business with video cards,". He probably has Posting options set to "HTML Formatted" which requires <br> or <p> tags for new lines. I prefer "Plain Old Text" because I can use enter.
AMD had the market when Intel was marketing the flop that was the Pentium 4 (Spaceheater 4). Leading the way with 64-bit as well. My understanding is that AMD management started going out of control at that point. Intel t
Re: (Score:2)
This. I just switched to Plain Text. Thanks for the help! I read articles a lot via RSS but hardly comment or visit the site directly anymore.
Ah, I remember those Pentium 4 Mobiles as well. I worked on a few of those at my old job, cleaning up dried-out thermal paste and the gunked-up giant fans that were in those laptops.
The K8s also weren't bad. I almost switched then but kept a Socket 478 1.6GHz P4 for a while. Wish I had gone K8 then, though; I would have had DDR memory instead of the more expensive PC133 at the time.
Re: (Score:2)
Here's mine but more than CPU. At least there are bullets and paragraphs. Enjoy! :P
Re: (Score:2)
Re: (Score:2)
(YMMV)
Re: (Score:2)
I can't even tell what their newest CPUs are.
The easy answer regarding what to look for and what to ignore: buy an FX-8300 for normal work, an X4 845 for games, or one of the new Bristol Ridges (once they hit the market) for anything that can be GPU-accelerated; in any other case (including high end), wait for Zen.
Intel does not "make it easy" (Score:5, Informative)
Intel makes it easy: i3 basic, i5 mid-range, i7 high end.
First off, this information is useless without knowing the generation (Sandy Vagina or whatever), and even knowing the generation isn't nearly enough information. U (low-power) variants are slower across the board, K variants mean overclockability or something, and if you actually care about specific features like AMT, VT-d, VT-x, AES-NI, etc., you pretty much *have* to head on over to Ark [intel.com] because there's no consistency whatsoever. I've seen i7s that didn't support VT-d and goddamn 1.5GHz Celerons that did.
Their market segmentation strategy is chaos, and the i3/5/7 thing is pretty much worthless, though admittedly Ark is a nice saving grace that I really wish AMD would copy.
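For what it's worth, if you're on Linux you can at least check some of those feature flags locally instead of digging through Ark. A quick sketch, assuming only that /proc/cpuinfo is available; note that VT-d/IOMMU support is a chipset/firmware matter and won't show up in these CPU flags:

def cpu_flags():
    # Parse the "flags" line from /proc/cpuinfo into a set of feature names.
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
# "vmx" = Intel VT-x, "svm" = AMD-V, "aes" = AES-NI
for feature in ("vmx", "svm", "aes"):
    print(feature, "yes" if feature in flags else "no")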
Re:Intel does not "make it easy" (Score:5, Insightful)
I've seen i7s that didn't support VT-d and goddamn 1.5GHz Celerons that did.
That is one of the reasons why Intel can generally go fuck itself. Whatever AMD has implemented, you're much more likely to find it in all of their CPUs. ECC on AM1? Why not?(TM)
Re: (Score:2)
Re: (Score:2)
That's how great my AMD APU is; it acts so much like everything else, nobody can even tell what is different. I mean, unless they're looking at the power usage or MB part count, or the lack of video "card" ;)
Re: (Score:2)
Plus they throw in a low end "Pentium" and lower end "Celeron". How low end? Don't look at clock speed, head over to benchmarks.
And dual core vs. quad core vs. hyperthreading changes between desktop and mobile.
E.g., last I checked: desktop i5: quad core, no hyperthreading; mobile i5: dual core with hyperthreading.
Desktop i7: quad core with hyperthreading; mobile i7: quad core, no hyperthreading. Or is it some dual core, some quad core, some with, some without hyperthreading?
Re: (Score:2)
Plus they throw in a low end "Pentium" and lower end "Celeron". How low end? Don't look at clock speed, head over to benchmarks.
You forgot Atom, too. Bay Trail was surprisingly good and supported VT-x, not that you're likely to find it in a machine with the RAM and cooling required to do a ton of virtualization.
But yeah, I forgot about the hyperthreading madness.
Re: (Score:2)
Atom at least tries to have a purpose: very low power consumption at the expense of performance, for use in devices like tablets. Modern "Pentium" and "Celeron" chips are just low-end desktop/laptop chips.
Re: (Score:2)
Well, you don't need to know the generation name; it's the first number of the model number. I never refer to them as Sandy Bridge, Ivy Bridge, etc., just "2nd gen," "3rd gen," etc. And it's pretty easy: U is low power, K is unlocked multiplier, then m3, m5 and m7 are the new lower-power variants for the 7th-gen CPUs instead of using the U designation. And like you said, if you need more info, it's a quick Google away with Ark. AMD has nothing like it. I have to look up reviews to see that their current high end CP
Re: (Score:2)
Well, you don't need to know the generation name; it's the first number of the model number.
Yeah but the existence of the name means that, if the number isn't given, you don't instantly know what they're talking about unless you have these things memorized. At least Android and Ubuntu had the good sense to go alphabetically.
then m3, m5 and m7 are the new lower-power variants for the 7th-gen CPUs instead of using the U designation.
Lovely.
a quick Google away with Ark
This is admittedly nice, but on the flip side, as several others have noted, AMD doesn't play nearly as many of these "let's disable random features with no rhyme or reason" market segmentation games. I wish they had an Ark equivalent, but their products themselves are, at l
Re: (Score:2)
Re: (Score:2)
It really does seem as though they have a genetic algorithm trying to optimize their market segmentation strategy for them.
Re: (Score:2)
Yeah, but it's too bad that Nvidia doesn't feel the same way.
Re: (Score:2)
Just out of interest, other than a random religious war, why pick on GPU drivers?
I mean for example do you have the source code to your motherboard bios?
Can you even get the binary blob for the embedded CPU in your microwave or dishwasher?
Even if nvidia gave you the source code, could you/why would you change it?
Re: (Score:2)
You don't need the source to your BIOS to boot an OS; it's an entirely separate thing. (Unless your BIOS will only boot a cryptographically signed kernel.)
You don't need the source code to your microwave to run an OS on your computer. And while it'd be nice to be able to modify your microwave to play DOOM or whatever, most people just don't care much about that; they just want it to cook their food. Microwaves and dishwashers just aren't something many people care about modifying; they're simple applianc
Re: (Score:2)
>> You don't need the source to your BIOS to boot an OS; it's an entirely separate thing.
No it really isn't. You also don't need the source to the loadable firmware of your GPU to be able to use it either.
>> the problem is that their driver just doesn't work that well in Linux because of the way it's packaged,
Nvidia GPUs have always worked MUCH better/more reliably than any ATI GPU/driver I've ever (tried to) use under Linux, at least for me. Most Linux distros already come with nouveau installe
Re: (Score:2)
Stop being loyal to brands because they certainly aren't loyal to you.
Define loyalty? Brands position themselves as being certain things, and as long as they maintain that, they are "loyal to you."
Nvidia is a good example of a computer hardware company that has maintained its brand values: if you have one of their cards, for whatever reason, you are going to get good performance from it. ATI/AMD, not so much. They have had crap drivers for Windows going all the way back to Win 95. That's another feature of understanding a brand.
Re: (Score:2)
$1,570 average for an Intel game machine. Meanwhile, all the AAA games are designed with a console in mind, and those sell for $200-$250.
You can build a console-murdering PC for around six to eight hundred bucks.
There is a price penalty for Intel, but it is superior. Notably, minimum frame rates are higher. I went AMD anyway because the CPU was a hundred bucks cheaper than an Intel processor that was about 15% faster. But I went with an nVidia video card because I've been down that other road enough times already.