AMD's Project Quantum Gaming PC Contains Intel CPU
nateman1352 links to an article at Tom's Hardware which makes the interesting point that chip-maker AMD will offer Intel -- rather than AMD -- CPUs in their upcoming high-end gaming PC. (High-end, at least, for a system built from integrated components.) From the article:
Recently, AMD showed off its plans for its Fiji based graphics products, among which was Project Quantum – a small form factor PC that packs not one, but two Fiji graphics processors. Since the announcement, KitGuru picked up on something, noticing that the system packs an Intel Core i7-4790K "Devil's Canyon" CPU. We hardly need to point out that it is rather intriguing to see AMD use its largest competitor's CPU in its own product, when AMD is a CPU maker itself.
The answer's simple... (Score:5, Insightful)
The i7 4790K is faster than any CPU AMD makes, by quite a wide margin. They're trying to sell this as the ultimate graphics-crunching box... That needs a faster CPU than they can produce.
Yes (Score:5, Interesting)
Also gives a pretty good internal target for AMD - v2 of this box WILL have an AMD CPU in it (or else we're getting out of the CPU market).
Re:Yes (Score:4, Insightful)
Re: (Score:2, Informative)
No, that's not true.
AMD pioneered the modern architectural shift toward CPU/memory/IO/cache complexes being switched networks. For a brief period of time they totally outshone Intel, which was still culturally crippled by the concept of a parallel 'front-side bus'.
The truth is that in the 'free markets', players tend to snowball and take the whole pie. That just happens. Cisco, Microsoft, Google, Intel, Apple.
Sorry, legitimate competition.
Re: (Score:1, Flamebait)
for a brief period of time they totally outshone Intel, which was still culturally crippled by the concept of a parallel 'front-side bus'
Um, that's utter crap.
AMD beat Intel when it was crippled by the concept of 'long pipelines with high clock frequencies' and 'pushing 64-bit users onto Itanium'. The early Core chips took back the crown from AMD, and they had an FSB.
'AMD rulez because no FSB' was just AMD fanboy claptrap, like 'AMD rulez because Intel put two chips in one package for a quad-core, but AMD puts four on one chip.' None of those things made any significant difference to performance in that era.
Re:Yes (Score:5, Insightful)
Um, that's utter crap.
No it ain't. AMD at that point had an actually scalable architecture using HyperTransport and could scale in multi-socket boxes way, way better than Intel could. It made a huge difference.
Re:Yes (Score:5, Insightful)
It's not crap. The Athlons crushed the Pentium 4s. I remember that very clearly. Slashdot people should know this unless you were born yesterday.
Re: (Score:2)
unless you were born yesterday.
Rational explanation for phenomenon has been found. News at 10.
Re: (Score:1)
It's not crap. The Athlons crushed the Pentium 4s. I remember that very clearly. Slashdot people should know this unless you were born yesterday.
1. Comparably priced 32-bit Athlons only beat the P4 on floating-point-intensive tasks that didn't use SSE. Which typically meant games, but not pro-level 3D applications like the ones I was running.
2. The performance difference had nothing to do with the FSB.
Re: (Score:2)
Yes. AMD had a brief window of a few years of total dominance. I recall the Athlon 64 (Athlon XP?) crushed the Intel counterparts. They were faster, cooler, and better supported. Then Intel ditched the P4, went to the Core 2 Duos, and the rest is history.
I am kind of shocked by the decision, however. Usually corporate types like to stick their heads in the sand and make believe hard enough until they get their bonuses. I wonder how far up the chain this decision went, and if not far enough, whether heads will roll…
Re: (Score:3)
'AMD rulez because no FSB' was just AMD fanboy claptrap, like 'AMD rulez because Intel put two chips in one package for a quad-core, but AMD puts four on one chip.' None of those things made any significant difference to performance in that era.
What? Yes they did. The AMD chips had substantially more memory bandwidth at the time, which was readily discoverable, and they had a far superior NUMA interconnect technology which permitted them to put many more cores in a single machine without nearly as much contention as the crossbar architecture Intel was using. Intel has since come around to a more modern bus and has a faster on-board memory controller than AMD, which just proves how wrong you are; if it were the inferior approach, Intel wouldn't have adopted it.
Re: (Score:1)
Like I said, AMD fanboy claptrap. Memory bandwidth has never had much impact on the majority of computing tasks, which are typically limited by CPU performance and memory latency.
Re: (Score:2)
Like I said, AMD fanboy claptrap.
You said it, but you were a big idiot at the time, so you talked shit.
Memory bandwidth has never had much impact on the majority of computing tasks
There is only one typical computing task which stresses the home PC, and that is gaming — where memory bandwidth is absolutely relevant. And guess what this conversation is about? Yeah, if you look at the page title there, you might see that it's about gaming, you fucking ignoranus.
Re: (Score:2)
you fucking ignoranus
Dear sir, may I ask you to keep this conversation civil, on-topic, and orderly, as is the custom when having friendly conversation over the Internet.
Re: (Score:2)
Um, that's utter crap.
I beg to differ. Clock for clock, AMD blows Intel out of the water. Cycles per instruction (CPI) is a useful metric when it comes to comparing CPU horsepower across differing architectures, and AMD is the clear winner when you use it. In an understandable decision, Intel listened to their marketers, who told them, "we need higher clocks than our competitors, because that's what people want to buy." So Intel chose long instruction pipes so they could get higher clock frequencies, and then focused on minimizing…
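The clock-versus-IPC argument above reduces to simple arithmetic: performance is roughly IPC times clock frequency. A quick sketch of that point; the IPC and clock figures here are invented for illustration, not measured values for any real Athlon or Pentium 4:

```python
# The poster's claim in numbers: performance = IPC x clock frequency.
# Both sets of figures are invented for illustration only.
def perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

athlon = perf(ipc=1.5, clock_ghz=2.0)  # hypothetical short-pipeline, high-IPC part
p4 = perf(ipc=0.9, clock_ghz=3.2)      # hypothetical long-pipeline, high-clock part
print(round(athlon, 2), round(p4, 2))  # 3.0 2.88 -- the lower-clocked chip wins
```

With these made-up numbers, the chip that loses the clock race by 60% still wins on delivered performance, which is the shape of the Athlon-vs-P4 comparison being described.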
Re: (Score:2)
Re: (Score:2)
AMD used to kick ass (Score:2)
Re: (Score:1)
I am pretty sure that there was no P4 in 2000. That was around the time I discovered the joy of an AMD K6-II (350 MHz, but stable overclocked to a bit over 500 MHz) on a system that came with Windows ME installed. Oddly, it ran ME well. I owned no other system that ran ME well. It was then that I developed an affinity for AMD which I still have today. I find them stable and fast enough for anything I do, but I have not done any serious number crunching since I retired about 7 years ago.
Re: (Score:1)
Wow. No, make that almost 9 years ago.
Re: (Score:2)
Re: (Score:1)
I had thought it was still in development then. I do not recall seeing them for sale until '02 or so. The K6-II was a fun little chip for its day. It compared well enough with the PIIIs at 500 MHz when overclocked to their clock speed. I ran it overclocked for a while, actually. It eventually became an OpenNap hub and stayed that way for a couple of years. Then I got a visit from some guys with bad fashion sense and ID badges that had important-sounding letters on them, so, for the sake of ease, I retired the box…
Re: (Score:2)
I had thought it was still in development then.
That's just what I said. The OP was saying that AMD has always played second fiddle, including (his words) 10 and 15 years ago. The developments I outlined (Athlon/Opteron/Duron beating the crap out of the P4/Pentium D/Xeon/Celeron) took place precisely during that era. I guess ~10 years of domination is a long time in the computing world, but it's sad that Slashdot's collective memory seems to be that short, particularly since it appears (as I've said) that we are fast approaching another fork in the road with fab…
Re: (Score:2)
Re: (Score:1)
Ah! Neat. I had thought they arrived in about 2002 or so. We used to joke about not having to heat the office because we had a bunch of workstations with PIIIs in them. I suspect my memory of 2002 being the year is because we probably migrated to a bunch of them in 2002 or something like that. I do not have access to the records to check, but that seems likely. We were never brand-loyal, but all workstations had a two-year life cycle, so we would go with whatever was adequate and priced well. Employees got first…
Re: (Score:2)
The P4 really didn't take off until about 2002 or so. The first ones ran on Socket 423 which required Rambus memory. They were expensive, not really any faster than the P3, were hobbled by small L2 caches, and ultimately Socket 423 ended up being a very short-lived dead end socket. It wasn't until about 2002 when Socket 478 came out, chipsets that supported SDRAM and later DDR memory, and the Northwood P4 that had 512k of L2 cache came out that the P4 started to catch on.
Re: (Score:1)
That makes sense. We crammed the office full of P4s at 1.2 GHz, as I recall. I also seem to recall they were miniature space heaters and not very efficient. The times were changing quickly at that point, and we did not stay on them for long but moved back to an AMD offering not much later, and then stayed with AMD until I sold the company a number of years ago. I have no idea what they have now.
The reason we went with AMD is that they were faster for some of what w…
Re: (Score:2)
Re: (Score:1)
Re: (Score:3, Insightful)
What's refreshing is that they've recognized this. I'm reasonably sure this choice was the output of some rather heated meetings
I guess nobody here at /. took the Nokia lesson. No matter how badly your product sucks, you never, ever admit that to the market. It doesn't matter if you got less credibility than the Iraqi information minister, it's still better than the alternative. Do you know how much ridicule they're going to get for this with funny fake ads with the "Intel inside" logo and jingle? It's brand suicide. The only plausible explanation is that AMD is in "screw tomorrow, we need sales NOW" mode. It's not a shocker if the market pairs an Intel CPU with an AMD dGPU if that makes sense, but if I was head of marketing at AMD I'd rather resign than have this to my name.
Re: (Score:1)
I guess nobody here at /. took the Nokia lesson. No matter how badly your product sucks, you never, ever admit that to the market. It doesn't matter if you got less credibility than the Iraqi information minister, it's still better than the alternative. Do you know how much ridicule they're going to get for this with funny fake ads with the "Intel inside" logo and jingle? It's brand suicide. The only plausible explanation is that AMD is in "screw tomorrow, we need sales NOW" mode. It's not a shocker if the market pairs an Intel CPU with an AMD dGPU if that makes sense, but if I was head of marketing at AMD I'd rather resign than have this to my name.
Maybe.
Their Piledriver processors were mostly released in 2012-2014; the design is three years old by now.
Zen won't be here until 2016.
I have no idea whether they intend to do SMT ("hyper-threading") with the same number of cores or not, but the IPC per clock is supposed to be 40% higher.
If you take one of their 8-core chips, make it 40% quicker, and then add SMT on top of that, maybe it would be somewhat competitive.
Skylake, which Intel releases real soon, is supposed to be 15% faster per clock than the current Haswell.
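The percentages being traded here reduce to back-of-envelope arithmetic. In this sketch, the 40% (Zen IPC) and 15% (Skylake vs. Haswell) figures are the rumors quoted in the comment; the ~25% SMT throughput uplift is my own assumed, typical value, not anything AMD has announced:

```python
# Back-of-envelope for the rumored figures above. The 40% and 15%
# numbers come from the post; the ~25% SMT uplift is a guessed,
# typical value -- not an AMD claim.
piledriver = 1.0            # normalized per-core throughput
zen = piledriver * 1.40     # "IPC supposed to be 40% higher"
zen_smt = zen * 1.25        # assumed SMT uplift stacked on top
haswell = 1.0
skylake = haswell * 1.15    # "15% faster per clock"

print(f"Zen+SMT vs Piledriver: {zen_smt:.2f}x")  # 1.75x
print(f"Skylake vs Haswell:    {skylake:.2f}x")  # 1.15x
```

Under those (optimistic) assumptions, a per-core gain of roughly 1.75x is what "somewhat competitive" would require, which is why the commenter hedges.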
Re: (Score:3)
The problem with "SMT on top" of their current design is that their current design already is SMT. They're just marketing it as a true 8 cores, not SMT.
The current Piledriver design doesn't have 8 separate floating-point units, or 8 separate instruction-decode units. It has 4 of each. They just have 8 ALU clusters - 2 to each decode unit. It's ALU/ALU SMT, whereas Intel has ALU/FP SMT.
Re: (Score:2)
It doesn't have 8 separate 256-bit FPUs. It can, however, split the 256-bit ones into separate 128-bit ones. So unless you're hammering on AVX instructions, it effectively has one FPU per core.
Re: (Score:3)
Perhaps they don't want to sell an ultimate gaming machine powered by the best GPUs AMD has to offer that is crippled by the CPU, and risk getting beaten by Nvidia.
That would make both their CPUs and GPUs look bad.
Their CPUs are already in the "best bang per buck" category, and I'm guessing they have thin margins because of that as well.
It is their fault (Score:1)
They should have let ATI keep their brand; ATI did nothing to be ashamed of.
They immediately gave up the ATI brand. Now ATI finally has a top-of-the-line graphics processor, but AMD doesn't have a top-of-the-line CPU. Gamers and the press aren't stupid; they will eventually make the comparison by pulling out the AMD CPU and dropping in an i7.
It is AMD's brand-manager suits who created this awkward situation. Still, it shows AMD is a mature company that dares to take decisions like putting Intel inside. Imagine Oracle suggesting IBM servers for…
Re: (Score:2)
Re: (Score:2)
Re:The answer's simple... (Score:5, Insightful)
The answer is even simpler than that: they are offering both, because they know the customer base is fickle and brand-loyal.
You'll probably see a lower-priced version with an AMD CPU and a much higher-priced Intel-based model for the kids who want bragging rights. They win either way.
They designed the product to actually compete in the market, not to show off their CPUs.
Comment removed (Score:4, Insightful)
Re: (Score:3)
I interpret this differently.
AMD's CPU division has been losing money hand over fist since the first Core i series. It doesn't matter if it fits your needs. It only matters whether they can compete with Intel. They can't.
ATI, however, brings in some money, I guess. So this is a test to see if the Intel version sells 4x more. If it does, perhaps it is time for AMD to leave x86 to Intel in order to remain solvent? It pains me to type this, as I went AMD from the Athlon XP days to the Phenom II I just retired.
Re: (Score:3)
I don't think he's an Intel fanboi as much as an Intel hater, which is fine, because they're pretty despicable. They're crooks, but the legal system seems to love large companies, so, for example, when they dealt an illegal yet crippling blow, they got away with a $1bn payoff, which is certainly less than they've made from their illegal activities.
When fines for bad behaviour have a strongly positive ROI, there's something deeply broken with the system.
It's also funny that on Linux, with fully open benchmarks on Phoronix, the AMD chips trade blows with the Intel ones and the top-end ones of each are actually pretty close, with AMD being a bit slower on average than the top Intel ones, but not far off.
Re: (Score:2)
It's also funny that on Linux, with fully open benchmarks on phoronix, the AMD chips trade blows with the Intel ones and the top end ones of each are actually pretty close, with AMD being a bit slower on average than the top intel ones, but not far off.
For liberal amounts of "pretty close", sure. One thing to remember is that AMD's CPUs are now several process shrinks behind Intel's latest, so it's no surprise that they could be significantly behind in performance. What is surprising is that they are not, and this tells us exactly what Intel is doing. Intel is not throwing most of the advantage of the process shrinks into performance; they are throwing those advantages into efficiency (power/heat), because the gorilla in the room…
Re: (Score:3)
You obviously haven't played any system-stressing games... Most games are not multithreaded, so they do not benefit from AMD's main competitive edge. Not to mention AMD chips run hotter and use more power than a comparable Intel. Never a sign of good design.
Then you imply that Intel rigged ALL the benchmarks they are in, because there is a conspiracy and the US DOJ should get involved....
Like this, right? This is rigged by Intel?
http://www.cpubenchmark.net/hi.. [cpubenchmark.net]
Re: (Score:3)
You obviously haven't played any system-stressing games... Most games are not multithreaded,
Who told you that? Maybe you should try firing up some thread monitors before you talk this bullshit. Most games of any complexity, and even many games of virtually none, are multithreaded. This was mostly true even before the advent of the Xbox 360, but after that just about every cross-platform game became multithreaded, with at least three threads. Since Microsoft and Sony have both gone to consoles with eight cores, multithreaded games are even more ubiquitous.
So no, you're full of crap right there.
Intel has been dominating since core 2 duo / core 2 quad and they continue to do so.
No.
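The "fire up a thread monitor" suggestion above is easy to simulate. Here's a minimal sketch of a game-like thread layout; the thread names are purely illustrative and not taken from any real engine:

```python
import threading

# Toy model of a modern game's thread layout. The names are
# illustrative assumptions, not quoted from any real engine.
def worker(stop: threading.Event) -> None:
    while not stop.wait(0.01):  # idle until asked to exit
        pass

stop = threading.Event()
workers = [threading.Thread(target=worker, args=(stop,), name=name)
           for name in ("render", "audio", "ai")]
for t in workers:
    t.start()

# Main thread + three workers: the "at least three threads" claim above.
n_threads = threading.active_count()
print(n_threads)

stop.set()
for t in workers:
    t.join()
```

On a quiet interpreter this reports four threads (main plus the three workers), mirroring the "at least three threads" pattern described for post-Xbox-360 cross-platform titles.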
Re: (Score:2)
Who told you that? Maybe you should try firing up some thread monitors before you talk this bullshit. Most games of any complexity, and even many games of virtually none, are multithreaded. This was mostly true even before the advent of the Xbox 360, but after that just about every cross-platform game became multithreaded, with at least three threads. Since Microsoft and Sony have both gone to consoles with eight cores, multithreaded games are even more ubiquitous.
So no, you're full of crap right there.
Well, there certainly are more threads in use, but the real question is what they are doing. The biggest bottleneck in games - as far as the CPU is concerned - is setting up your command lists and buffers. In current GPU APIs this is a sequential process and is inherently single-threaded. You can do multithreading to some degree (see DX11 multithreading), but the actual access to the immediate context is single-threaded, so you suffer the synchronization penalty anyway, which is why DX11 single-threaded vs. multithreaded…
Re: (Score:2)
since games haven't been CPU bound in years
Actually, that's not true. Where exactly do you think the performance advantage of AMD's Mantle comes from? In fact, the very reason for the creation of DirectX 12, Vulkan (built on Mantle) and Metal is the dependence on the CPU (mostly on one core) due to high-overhead sequential drivers. Once drivers support these new APIs and games are written with them, we will see a decrease in games being so CPU-bound, as this load can be spread over more cores but also done more efficiently…
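The effect described above is essentially Amdahl's law applied to the graphics driver: as long as command submission stays serial on one core, extra cores buy little. A small illustration; the 40%/10% serial fractions are invented assumptions, not profiled data for any real API:

```python
# Amdahl's-law sketch of the argument above: if driver/command submission
# is a serial fraction of the CPU frame time, adding cores helps less.
# The serial fractions below are invented for illustration.
def speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

dx11_serial = 0.40  # hypothetical: 40% of frame time serialized in the driver
dx12_serial = 0.10  # hypothetical: a thinner API serializes only 10%

for cores in (2, 4, 8):
    print(f"{cores} cores: DX11-style {speedup(dx11_serial, cores):.2f}x, "
          f"DX12-style {speedup(dx12_serial, cores):.2f}x")
```

With these made-up fractions, eight cores yield only about a 2.1x speedup under the heavier serial load but about 4.7x under the lighter one, which is the shape of the DX11-to-DX12/Vulkan argument.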
Re: (Score:2, Troll)
They're trying to sell this as the ultimate graphics crunching box...
So does it have the GTX Titan, or at least a 980?
Re:The answer's simple... (Score:5, Funny)
No, but my sources tell me they're planning an Intel+Nvidia second generation product that will totally blow this rig out the water!
Re: (Score:2)
I was reading that the 980 Ti is 80-90% as good as the Titan at a much better price, so maybe the second generation will have a 980 Ti, which would offer almost-Titan performance at a less-than-titanic price.
Re: (Score:2)
Re: (Score:2)
I can't recall which site I read that review but I like Anandtech so I'd trust their review over others
Re: (Score:2, Insightful)
So AMD goes away and intel prices the new CPUs even higher in the sky? Are you a retard? Besides, i'm buying AMD any day.
Re: (Score:2)
So AMD goes away and intel prices the new CPUs even higher in the sky? Are you a retard? Besides, i'm buying AMD any day.
CPU prices were higher back when AMD was competitive with Intel.
Their competition in most markets today is ARM, not AMD.
Also lower power for performance (Score:2)
Intel's chips have been really good in terms of performance per watt these days. AMD has had real problems in that regard; their high-end chips are massive power sinks. In some uses, maybe that isn't important, but in a small system it matters. You are going to have to jump through hoops to make sure your thermal solution fits, is sufficient, and isn't loud as it is; trying to put a ton more power in there isn't a winning idea.
Thus when you have the 4790K on the one hand, which is rated at 88 watts TDP, and the AM…
Re: (Score:1)
It depends.
According to Anand, AMD's Jaguar was best in class for perf/watt (and, you bet, perf/buck too), so no wonder it ended up in both the Xbox and the PlayStation.
AMD has great notebook chips too (look at Carrizo), but nobody offers them, as in the old "@HP we will give you our processors for free! No, thanks" times. I couldn't care less about an i3 being faster in single-core tasks if its integrated GPU is so pathetic compared to AMD's, when gaming is the only stressful task my notebook ever has.
Re: (Score:2)
AMD has great notebook chips (look at Carrizo) too, but nobody offers them, as in old "@HP we will give you our processors for free! No, thanks" times.
That's true, but you have to remember the OEMs then suffer on volume discounts, so it's a case of switching everything to AMD. Now of course they could just have a small subset of systems with AMD chips, but it's not a matter of just swapping CPUs: you also have to redesign and manufacture the tooling, since you need to design a whole new motherboard (usually one per chassis too), which is often not economically viable for a small run of systems, especially if you're running on thin margins…
They actually support both. (Score:1)
The FX-8350 isn't that bad compared to the i7 4790K, but not at gaming.
Anyway, the Intel CPU inside it will be ONE OF THE OPTIONS. AMD CPU configurations will also be available:
We have Quantum designs that feature both AMD and Intel processors, so we can fully address the entire market.
http://www.tomshardware.com/ne... [tomshardware.com]
Strange (Score:3)
AMD knows their CPU dissipates too much heat for the SFF PC?
What a fucking stupid submission. (Score:4, Insightful)
What a goddamn stupid submission.
Yes, companies that make one product do use products from competitors in some situations. Microsoft is a great example of this. Yes, they provide Windows, but you can also use Linux with Azure. There's nothing wrong with that. They're using a product that competes with Windows because that's what the Azure users want and need. It's the smart thing to do, for crying out loud.
A much bigger problem is when an open source project like, say, Debian, ends up having to support systemd thanks to political skullduggery, even though systemd is not what Debian's users want, it is not good for the Debian project's quality, it causes many problems, and causes many Debian users to lose trust in the project and its software. That's a real problem. This AMD-using-Intel-CPU shit is totally a non-issue.
Re: (Score:2)
You know that Linux isn't an actual company which is competing with Microsoft, right? And putting your own software on a competing platform is very different from actually BUYING something from a competitor and using it as part of your platform. Of course in reality it's not so simple; it's possible that the AMD people figured this would be to their net advantage. And this fits in with the recent pattern of AMD conceding sectors of the market to Intel and focusing more on its core businesses.
Re: (Score:1)
Where does the GP's comment say that Linux is a company? It says "products from competitors". It does not say that the competitors are companies. Clearly in the case of Linux they are multiple open source projects, even if some of those projects have some corporate backing.
Re: (Score:1)
No but Oracle is and they offer a variety of their products too.
Re: (Score:2)
Yes, companies that make one product do use products from competitors in some situations. Microsoft is a great example of this. Yes, they provide Windows, but you can also use Linux with Azure. There's nothing wrong with that. They're using a product that competes with Windows because that's what the Azure users want and need. It's the smart thing to do, for crying out loud.
Well, Windows doesn't run Linux applications, but AMD CPUs do run the same software as Intel CPUs. That sort of thing matters. To use a car analogy, it's one thing to use a competitor's trucks because you don't make trucks, even though they also have cars that compete with yours. It's another thing if your sales reps show up in a competitor's car. I'd wager the people at Samsung use Windows and Office, but I don't expect to see many Lumia phones there. Last I heard, AMD is still making desktop CPUs. Now they're maki…
Not just a CPU company anymore (Score:2)
Re: (Score:2)
Indeed. AMD make good graphics chips, and mediocre CPUs. They should sell off the CPU side and start a new company that just makes graphics. They could call it... I don't know... ATI?
Re:Not just a CPU company anymore (Score:4, Insightful)
Re: (Score:1)
You're right. I spend 0 hours tweaking my init system when I'm using systemd, because my computer didn't even boot properly, so I just ended up installing Windows instead.
It's smart. People want component choice. (Score:1)
That's the best-value chip for high performance available right now, so it's WISE to offer it to consumers with AMD's graphics platform: more market and more gross. They need both.
AMD takes care of its customers? (Score:5, Insightful)
Precisely. (Score:5, Interesting)
Been with Nvidia for my last batch of GPUs, and away from AMD for my last few CPUs - but I'm still rooting for them: the plucky, power-guzzling underdog.
Maybe my next upgrade will switch me back to them, maybe it won't - but this decision at least shows me they've not lost their minds, and should still be considered.
Re: (Score:1)
Now if only they would get a Carrizo laptop on the market with a 1920x1080 screen, an SSD, and a proper dual-channel memory config. Look at the recent HP 15z offering: all of that is available with an Intel CPU, but as soon as you select an AMD CPU, all the goodies are no longer options.
Re: (Score:3)
Re: (Score:1)
The problem is, following this logic they should have used Nvidia GPU parts as well. This showcases AMD's weaknesses more than anything else. It's confirmation of what everyone already knows: AMD can't make low-heat parts.
The Fury X is quicker than the GTX 980, and in half of the games it seems to be quicker than the Titan X:
http://www.tomshardware.com/re... [tomshardware.com]
So why the fuck would they use an Nvidia card when they've got as quick a card themselves?
I know it may not support DirectX feature level 12_1, but that's it. One advantage is that it will let you get a cheaper FreeSync monitor.
Re: (Score:1)
I really like this. I've always thought well of AMD as a company. I really wish they were a solid competitor to Intel in the processor market... but they just don't have the efficiency. Maybe things will swing back toward them some day.
In the GPU market they seem to be barely hanging on. Nvidia is focusing heavily on efficiency and lower power consumption; AMD is competing by using a lot more power in their equivalent GPUs and by having slightly better prices.
It's not too late for a single CPU/GPU package to completely change the playing field.
Re: (Score:1)
It's not too late for a single CPU/GPU package to completely change the playing field.
I think Intel says something like their integrated graphics is 75 times more capable than their first one, or whatever.
(I'm not comparing Intel and AMD here, just stating how things have moved. There's of course the fact that Nvidia invests in Nvidia Grid, cloud rendering, and even streaming games to consumers instead.)
It's all about what you need, though. Integrated graphics are enough for many, but not for everyone. And streaming games will likely be the same.
But yeah. Who knows how many purchase graphics c…
Not surprising (Score:3, Insightful)
The only reason to buy AMD these days is if you're on a budget, and you're OK with middling performance.
Re: (Score:2)
Wow, that's me! More computer for the buck, anything else is a fool's game.
Re: (Score:3)
The only reason to buy AMD these days is if you're on a budget, and you're OK with middling performance.
Or you want ECC memory...
Re: (Score:2)
Currently the premium to get ECC on an Intel system versus a comparable AMD system is about $250.
Re: (Score:2)
Middling performance, high performance, low performance - that is all silly twaddle. Appropriate performance for the application is what's correct. 50% more performance than you need is simply wasted and idle. Gaming performance is tricky: the game itself needs to target the majority of the market segment for that style of game, it needs to run well and look good at middling performance, and in reality games often run far more stably at those levels than at higher graphics settings, which tend to crash m…
Not the point (Score:2)
They are showcasing their GPU. I've had better experiences with ATI/AMD GPUs than with Nvidia.
I'd buy an AMD GPU over an Intel one any day.
But yes, until AMD makes something significantly better, I'll buy an Intel CPU. My current system is an Intel CPU and an AMD GPU. The system I built before that was an Intel CPU and an ATI GPU, which is pretty much the same thing. Were I to build one tomorrow, probably also.
At least this shows that AMD doesn't have their head so far up their own ass that they can't see that their customers regularly p…
Re: Not surprising (Score:1)
The i5 is still a better buy than anything AMD offers right now. Let's hope Zen improves the situation.
Re: (Score:1)
Well, I think there is "gaming" and gaming: people who buy the best hardware, and those who are surprised how powerful the "low-end" hardware can be in practice. A 50-buck processor (two cores, 4 GHz, 65 W TDP) paired with a decent graphics card, "enough" memory, and an SSD gives OK performance. After this it's just a diminishing-returns curve, with each FPS costing proportionally more (plus the overclocking and amped-up watts). The same is true for any hobby there is - hi-fi, for example.
Re: (Score:1)
The "gamers" are probably the only group driving traditional PC desktop part costs down. They need memory: at least 32-64 GB should be enough? They need multi-core CPUs because they game while doing other productive tasks at the same time (multiple screens, SLI, etc.). And first of all, they need powerful graphics cards and 500-1000 watt PSUs. Most PC desktop parts seem to be "branded" for gamers, from Intel and from AMD. The point is to differ as much as possible from the integrated solutions, other…
not worthy of praise (Score:1)
I think the only admirable part of this, if it can be called that, is that AMD is not overtly suicidal.
You can't claim to offer the ultimate driving experience and slap in a lawnmower engine.
Wait, AMD is selling a Computer? (Score:3, Insightful)
I am still waiting to see the part where this was anything more than a promotional and inspirational design from AMD. Nowhere has AMD said they are going to sell this, or any, boxed PC.
Re: (Score:2)
I just wish AMD gpu drivers weren't so shitty.
I'd love to have a solution to compare against Nvidia.
What do the editors do exactly? (Score:2)
Duh (Score:1)
Re: (Score:2)
A couple details here (Score:1)
Intel's CPU will be an option, but you can surely get it with AMD as well:
http://www.tomshardware.com/ne... [tomshardware.com]
The Fury X beats the Titan X in many games at 4K resolution, and in even more at 5K.
The Fury X beats the 980 Ti (a pre-emptive release by Nvidia that anticipated the Fury X) at 4K, with the same recommended price.
Now, these boxes will have two of the Furys.
The Fury X also has a nice "FPS cap" feature, which allows it to drop frequency to save power when you are beyond reasonable FPS (i.e. 90+; the actual number depends on your taste).
Had th…
Quantum Gaming (Score:2)