AMD Intel Games Hardware

AMD's Project Quantum Gaming PC Contains Intel CPU 138

nateman1352 links to an article at Tom's Hardware which makes the interesting point that chip-maker AMD will offer Intel -- rather than AMD -- CPUs in their upcoming high-end gaming PC. (High-end for being based on integrated components, at least.) From the article: Recently, AMD showed off its plans for its Fiji based graphics products, among which was Project Quantum – a small form factor PC that packs not one, but two Fiji graphics processors. Since the announcement, KitGuru picked up on something, noticing that the system packs an Intel Core i7-4790K "Devil's Canyon" CPU. We hardly need to point out that it is rather intriguing to see AMD use its largest competitor's CPU in its own product, when AMD is a CPU maker itself.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by beelsebob ( 529313 ) on Saturday June 27, 2015 @05:06PM (#50003745)

    The i7 4790k is faster than any CPU AMD makes, by quite a wide margin. They're trying to sell this as the ultimate graphics-crunching box... That needs a faster CPU than they can produce.

    • Yes (Score:5, Interesting)

      by goldcd ( 587052 ) on Saturday June 27, 2015 @05:26PM (#50003817) Homepage
      What's refreshing is that they've recognized this. I'm reasonably sure this choice was the output of some rather heated meetings - but so.. 'refreshing' to see that the correct decision was made, for those people wanting to purchase the product.
      Also gives a pretty good internal target for AMD - v2 of this box WILL have an AMD CPU in it (or else we're getting out of the CPU market).
      • Re:Yes (Score:4, Insightful)

        by AK Marc ( 707885 ) on Saturday June 27, 2015 @08:06PM (#50004371)
        They've always recognized this. If you want 80% of the best for the lowest price, you want AMD. If you want the top 20% at any price, go Intel. I remember the marketing from 5, 10 and 15 years ago, and it seems to me that AMD always knew that. They may not like it, but they knew it.
        • Re: (Score:2, Informative)

          by Anonymous Coward

          No, that's not true.

          AMD pioneered the modern architectural shift towards CPU/memory/IO/cache complexes being switched networks.

          For a brief period of time they totally outshone Intel, which was still culturally crippled by the concept of
          a parallel 'front-side bus'.

          The truth is that in 'free markets', players tend to snowball and take the whole pie. That's just what happens.
          Cisco, Microsoft, Google, Intel, Apple.

          Sorry, legitimate competition.

          • Re: (Score:1, Flamebait)

            by 0123456 ( 636235 )

            For a brief period of time they totally outshone Intel, which was still culturally crippled by the concept of
            a parallel 'front-side bus'.

            Um, that's utter crap.

            AMD beat Intel when it was crippled by the concept of 'long pipelines with high clock frequencies' and 'pushing 64-bit users onto Itanium'. The early Core chips took back the crown from AMD, and they had an FSB.

            'AMD rulez because no FSB' was just AMD fanboy claptrap, like 'AMD rulez because Intel put two chips in one package for a quad-core, but AMD puts four on one chip.' None of those things made any significant difference to performance in that era.

            • Re:Yes (Score:5, Insightful)

              by serviscope_minor ( 664417 ) on Sunday June 28, 2015 @03:57AM (#50005445) Journal

              Um, that's utter crap.

              No it ain't. AMD at that point had an actually scalable architecture using HyperTransport and could scale in multi-socket boxes way, way better than Intel. It made a huge difference.

            • Re:Yes (Score:5, Insightful)

              by Mike Frett ( 2811077 ) on Sunday June 28, 2015 @04:17AM (#50005469)

              It's not crap. The Athlons crushed the Pentium 4's. I remember that very clearly. Slashdot people should know this unless you were born yesterday.

              • unless you were born yesterday.

                Rational explanation for phenomenon has been found. News at 10.

              • by 0123456 ( 636235 )

                It's not crap. The Athlons crushed the Pentium 4's. I remember that very clearly. Slashdot people should know this unless you were born yesterday.

                1. Comparably priced 32-bit Athlons only beat the P4 on floating-point-intensive tasks that didn't use SSE. Which typically meant games, but not pro-level 3D applications like the ones I was running.
                2. The performance difference was nothing to do with the FSB.

              • Yes. AMD had a brief window of a few years of total dominance. I recall the Athlon64 (AthlonXP?) crushed the Intel counterparts. They were faster, cooler, and better supported. Then Intel ditched the P4, went to the Core 2 Duos, and the rest is history.

                I am kind of shocked by the decision, however. Usually corporate types like to stick their heads in the sand and make believe hard enough until they get their bonuses. I wonder how far up the chain this decision went, and if not far enough, whether heads will roll. If this w

            • 'AMD rulez because no FSB' was just AMD fanboy claptrap, like 'AMD rulez because Intel put two chips in one package for a quad-core, but AMD puts four on one chip.' None of those things made any significant difference to performance in that era.

              What? Yes they did. The AMD chips had substantially more memory bandwidth at the time, which was readily discoverable, and they had a far superior NUMA interconnect technology which permitted them to scale to many more cores in a single machine without nearly as much contention as the crossbar architecture Intel was using. Intel has since come around to a more modern bus and has a faster on-board memory controller than AMD, which just proves how wrong you are; if it's the inferior approach, Intel wouldn't have ad

              • by 0123456 ( 636235 )

                Like I said, AMD fanboy claptrap. Memory bandwidth has never had much impact on the majority of computing tasks, which are typically limited by CPU performance and memory latency.

                • Like I said, AMD fanboy claptrap.

                  You said it, but you were a big idiot at the time, so you talked shit.

                  Memory bandwidth has never had much impact on the majority of computing tasks

                  There is only one typical computing task which stresses the home PC, and that is gaming — where memory bandwidth is absolutely relevant. And guess what this conversation is about? Yeah, if you look at the page title there, you might see that it's about gaming, you fucking ignoranus.
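Whether memory bandwidth actually limits a given workload is measurable rather than a matter of fanboyism. A minimal sketch (Python, illustrative only — the function name and the 2x read-plus-write accounting are my assumptions, and the absolute numbers are entirely machine-dependent):

```python
import time

def copy_bandwidth_gbs(size_mb=256, trials=3):
    """Rough sequential-copy bandwidth estimate in GB/s.

    Copies a large buffer a few times and reports the best rate.
    This only illustrates how one might check whether a workload
    is bandwidth-limited; it is not a rigorous benchmark.
    """
    src = bytearray(size_mb * 1024 * 1024)
    best = 0.0
    for _ in range(trials):
        start = time.perf_counter()
        dst = bytes(src)          # one full read of src + write of dst
        elapsed = time.perf_counter() - start
        # count both the bytes read and the bytes written
        rate = (2 * len(dst)) / elapsed / 1e9
        best = max(best, rate)
    return best

if __name__ == "__main__":
    print(f"~{copy_bandwidth_gbs():.1f} GB/s sequential copy")
```

Comparing that figure against what a game actually streams per frame is one crude way to settle this kind of argument with numbers.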

                  • you fucking ignoranus

                    Dear sir, may I ask you to keep this conversation civil, on-topic and orderly, as is custom when having friendly conversation over the Internet.

            • Um, that's utter crap.

              I beg to differ. Clock for clock, AMD blows Intel out of the water. Cycles per instruction (CPI) is a useful metric when it comes to comparing CPU horsepower across differing architectures, and AMD is the clear winner when you use it. In an understandable decision, Intel listened to their marketers, who told them, "we need higher clocks than our competitors, because that's what people want to buy." So Intel chose long instruction pipes so they could get higher clock frequencies, and then focussed on minimiz
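The metric invoked here can be made concrete: sustained throughput is roughly clock frequency times IPC (instructions per cycle, the inverse of CPI), so a lower-clocked design with higher IPC can outrun a deep-pipeline, high-clock one. A sketch with made-up numbers — these are illustrations, not measured values of any real chip:

```python
def throughput_gips(freq_ghz, ipc):
    """Rough sustained throughput in billions of instructions per
    second: clock rate (cycles/sec) times instructions retired per
    cycle. Ignores memory stalls, vector width, etc."""
    return freq_ghz * ipc

# Hypothetical numbers for illustration only -- not benchmarks.
deep_pipeline = throughput_gips(freq_ghz=3.8, ipc=0.8)   # high clock, long pipe
short_pipeline = throughput_gips(freq_ghz=2.4, ipc=1.5)  # lower clock, higher IPC

# Despite a much lower clock, the higher-IPC design wins here.
assert short_pipeline > deep_pipeline
```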

        • Um, your "15 years" bit is off the mark. 15 years ago Intel was so busy chasing the MHz marketing dollar with the Pentium 4 (and the derived Pentium D) that AMD was able to dethrone them at the top end, and for an encore they then turned around and demolished Intel's 64-bit Itanium server architecture with the backwards-compatible AMD64 (which Intel quickly licensed from them. And renamed.) For the hat trick, they absolutely destroyed the low-end Celeron with their AMD Duron. For several years there, Intel was
          • by KGIII ( 973947 )

            I am pretty sure that there was no P4 in 2000. That was around the time I discovered the joy of an AMD K6-II (350 MHz, but stable OCed to a bit over 500 MHz) on a system that came with Windows ME installed. Oddly, it ran ME well. I owned no other system that ran ME well. It was then that I developed an affinity for AMD which I still have today. I find them stable and fast enough for anything I do, but I have not done any serious number crunching since I retired about 7 years ago.

            • by KGIII ( 973947 )

              Wow. No, make that almost 9 years ago.

            • The P4 was in development 15 years ago, and the OP said "10 years" as well, so I think my point stands. Incidentally I had a K6-II as well and it was fairly awful... then again, we never upgraded from Windows 98 SE.
              • by KGIII ( 973947 )

                I had thought it was still in development then. I do not recall seeing them for sale until '02 or so. The K6-II was a fun little chip for its day. It compared well enough with the PIIIs at 500 MHz when OCed to their clock speed. I ran it OCed for a while, actually. It eventually became an OpenNap hub and stayed that way for a couple of years. Then I got a visit from some guys with bad fashion sense and ID badges that had important-sounding letters on them so, for the sake of ease, I retired the box. There w

                • I had thought it was still in development then.

                  That's just what I said. The op was saying that AMD has always played second fiddle, including (his words) 10 and 15 years ago. The developments I outlined (Athlon/Opteron/Duron beating the crap out of P4&PD/Xeon/Celeron) took place precisely during that era. I guess ~10 years of domination is a long time in the computing world but it's sad that slashdot's collective memory seems to be that short, particularly since it appears (as I've said) that we are fast approaching another fork in the road with fab

            • There was a P4 in 2000, barely. "Pentium 4 is a line of single-core desktop, laptop and entry level server central processing units (CPUs) introduced by Intel on November 20, 2000" (quote from Wikipedia).
              • by KGIII ( 973947 )

                Ah! Neat. I had thought they arrived in about 2002 or so. We used to joke about not having to heat the office because we had a bunch of workstations with PIIIs in them. I suspect that my memory of 2002 being the year is because we probably migrated to a bunch in 2002 or something like that. I do not have access to the records to check but that seems likely. We were never brand loyal but all workstations had a two year life cycle so we would go with whatever was adequate and priced well. Employees got first

                • The P4 really didn't take off until about 2002 or so. The first ones ran on Socket 423 which required Rambus memory. They were expensive, not really any faster than the P3, were hobbled by small L2 caches, and ultimately Socket 423 ended up being a very short-lived dead end socket. It wasn't until about 2002 when Socket 478 came out, chipsets that supported SDRAM and later DDR memory, and the Northwood P4 that had 512k of L2 cache came out that the P4 started to catch on.

                  • by KGIII ( 973947 )

                    That makes sense. We crammed the office full of P4s at 1.2 GHz as I recall. I also seem to recall they were miniature space heaters and not very efficient. I also seem to recall that the times were changing with a quickness at that point, and we did not stay on them for long but moved back to an AMD offering not much later, and then stayed with AMD until I sold the company a number of years ago. I have no idea what they have now.

                    The reason we went with AMD is that they were faster for some of what w

          • Intel was basically forced to license AMD64. Microsoft told Intel that they were only going to support one 64-bit version of the x86 architecture, and that it was going to be AMD64 because it was already established in the market.
        • In the 90's and early 2000's it flip-flopped who was on top almost every time one of the companies released a new processor. It wasn't until the i-series that Intel firmly grasped the market.
      • Re: (Score:3, Insightful)

        by Kjella ( 173770 )

        What's refreshing is that they've recognized this. I'm reasonably sure this choice was the output of some rather heated meetings

        I guess nobody here at /. took the Nokia lesson. No matter how badly your product sucks, you never, ever admit that to the market. It doesn't matter if you got less credibility than the Iraqi information minister, it's still better than the alternative. Do you know how much ridicule they're going to get for this with funny fake ads with the "Intel inside" logo and jingle? It's brand suicide. The only plausible explanation is that AMD is in "screw tomorrow, we need sales NOW" mode. It's not a shocker if the

        • by aliquis ( 678370 )

          I guess nobody here at /. took the Nokia lesson. No matter how badly your product sucks, you never, ever admit that to the market. It doesn't matter if you got less credibility than the Iraqi information minister, it's still better than the alternative. Do you know how much ridicule they're going to get for this with funny fake ads with the "Intel inside" logo and jingle? It's brand suicide. The only plausible explanation is that AMD is in "screw tomorrow, we need sales NOW" mode. It's not a shocker if the market pairs an Intel CPU with an AMD dGPU if that makes sense, but if I was head of marketing at AMD I'd rather resign than have this to my name.

          Maybe.

          Their Piledriver processors were mostly released in 2012-2014. The design is three years old by now.

          Zen won't be here until 2016.

          I have no idea whether they intend to do SMT ("hyper-threading") with the same number of cores or not, but per-clock performance (IPC) is supposed to be 40% higher.

          If you take one of their 8-core chips, make it 40% quicker, and then add SMT on top of that, maybe it would be somewhat competitive.

          Skylake which Intel releases real soon is supposed to be 15% faster / clock than current Haswel
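That back-of-the-envelope can be written out. The 40% IPC figure is the rumored Zen uplift mentioned above; the SMT yield is an assumed rough number (real SMT gains vary widely by workload):

```python
def relative_throughput(cores, ipc_uplift=0.0, smt_yield=0.0):
    """Crude per-socket throughput relative to one baseline core.

    ipc_uplift: fractional single-thread gain (0.40 means +40%)
    smt_yield:  extra fraction a second hardware thread adds per core
                (0.2-0.3 is a common rough figure; assumed, not measured)
    """
    return cores * (1 + ipc_uplift) * (1 + smt_yield)

baseline = relative_throughput(8)                        # today's 8-core chip
zen_guess = relative_throughput(8, ipc_uplift=0.40,
                                smt_yield=0.25)          # +40% IPC, SMT on top

print(f"{zen_guess / baseline:.2f}x the baseline")  # 1.75x
```

Even granting both assumptions, that is a per-socket multiplier, not a single-thread one — games that hammer one core would only see the 1.4x.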

          • The problem with "SMT on top" of their current design is that their current design is SMT. They're just marketing it as true 8 cores, not SMT.

            The current piledriver design doesn't have 8 separate floating point units, or 8 separate instruction decode units. It has 4 of each. They just have 8 ALUs - 2 to each decode unit. It's ALU/ALU SMT, when Intel has ALU/FP SMT.

            • It doesn't have 8 separate 256-bit FPUs. It can, however, split the 256-bit ones into separate 128-bit ones. So unless you're hammering on AVX instructions, it effectively has one FPU per core.

          • Perhaps they don't want to sell an ultimate gaming machine powered by the best GPUs AMD has to offer being crippled by the CPU, and risk getting beaten by nVidia.
            That would make both their CPUs and GPUs look bad.

          Their CPUs are already in the "best bang per buck" category, and I'm guessing they have thin margins because of that as well.

        • They should have let ATI keep their brand; ATI did nothing to be ashamed of.

          They immediately gave up the ATI brand. Now ATI finally has a top-of-the-line graphics processor, but AMD doesn't have a top-of-the-line CPU. Gamers and the press aren't stupid; they would eventually have compared against an i7 by taking out the AMD CPU.

          It is AMD's brand-manager suits who created this awkward situation. Still, it shows AMD is a mature company that dares to make decisions like putting Intel inside. Imagine Oracle suggesting IBM servers for

    • It's also an oddball small (by the standards of what's inside) form factor case. That likely takes the upper end of what they do make out of consideration on thermal grounds. The i7-4790K is an 88-watt part, and AMD's thermally equivalent options compare even less favorably.
    • by Anonymous Coward on Saturday June 27, 2015 @05:49PM (#50003905)

      The answer is even simpler than that: they are offering both, because they know the customer base is fickle and brand-loyal.

      You'll probably see a lower priced version with an AMD CPU and a much higher priced Intel based model for the kids who want bragging rights. They win either way.

      They designed the product to actually compete in the market, not to show off their CPUs.

      • Comment removed (Score:4, Insightful)

        by account_deleted ( 4530225 ) on Saturday June 27, 2015 @08:20PM (#50004411)
        Comment removed based on user account deletion
        • I interpret this differently.

          AMD's CPU division has been losing money hand over fist since the first i-series. It doesn't matter if it fits your needs. It only matters whether they can compete with Intel. They can't.

          ATI however brings in some money, I guess. So this is a test to see if the Intel version sells 4x more. If it does, perhaps it is time for AMD to leave x86 to Intel to remain solvent? It pains me to type this as I went AMD from the AthlonXP days to the Phenom II I just retired

          • I don't think he's an intel fanboi as much as an Intel hater, which is fine, because they're pretty despicable. They're crooks but the legal system seems to love large companies so, for example when they dealt an illegal yet crippling blow, they got away with a $1bn payoff which is certainly less than they've made from their illegal activities.

            When fines for bad behaviour have a strongly positive ROI, then there's something deeply broken with the system.

            It's also funny that on Linux, with fully open benchm

            • It's also funny that on Linux, with fully open benchmarks on phoronix, the AMD chips trade blows with the Intel ones and the top end ones of each are actually pretty close, with AMD being a bit slower on average than the top intel ones, but not far off.

              For liberal amounts of "pretty close", sure. One of the things to remember is that AMD's CPUs are now several process shrinks behind Intel's latest, so it's not a surprise that they could be significantly behind in performance. What is surprising is that they are not, and this tells us exactly what Intel is doing. Intel is not throwing most of the advantage of the process shrinks into performance. What they are doing is throwing those advantages into efficiency (power/heat) because the gorilla in the room

        • "(since games haven't been CPU bound in years)"

          You obviously haven't played any system-stressing games... Most games are not multithreaded, so they do not benefit from AMD's main competitive edge. Not to mention AMD chips run hotter and use more power than a comparable Intel. Never a sign of good design.

          Then you imply that Intel rigged ALL the benchmarks they appear in because there is a conspiracy and the US DOJ should get involved....

          Like this right, this is rigged by intel?
          http://www.cpubenchmark.net/hi.. [cpubenchmark.net]

            You obviously haven't played any system-stressing games... Most games are not multithreaded,

            Who told you that? Maybe you should try firing up some thread monitors before you talk this bullshit. Most games of any complexity, and even many games of virtually none, are multithreaded. This was mostly true even before the advent of the Xbox 360, but after that just about every cross-platform game became multithreaded, with at least three threads. Since Microsoft and Sony have both gone to consoles with eight cores, multithreaded games are even more ubiquitous.

            So no, you're full of crap right there.

            Intel has been dominating since core 2 duo / core 2 quad and they continue to do so.

            No.
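The "fire up some thread monitors" suggestion is easy to act on. A minimal sketch that counts OS threads of the current process (Linux /proc, falling back to Python-level counting elsewhere); inspecting a running game would mean reading that game's /proc/&lt;pid&gt;/task instead — the helper name is my own:

```python
import os
import threading
import time

def thread_count():
    """Count OS threads of this process via Linux's per-thread
    /proc/self/task entries; fall back to the Python-level count
    (which only sees Python-created threads) on other platforms."""
    try:
        return len(os.listdir("/proc/self/task"))
    except FileNotFoundError:          # non-Linux
        return threading.active_count()

before = thread_count()
workers = [threading.Thread(target=time.sleep, args=(0.5,)) for _ in range(3)]
for w in workers:
    w.start()
print(f"{before} thread(s) before, {thread_count()} after starting 3 workers")
for w in workers:
    w.join()
```

Run against a modern game's PID, the task count is typically well into double digits, which is the point being argued here.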

            • Who told you that? Maybe you should try firing up some thread monitors before you talk this bullshit. Most games of any complexity, and even many games of virtually none, are multithreaded. This was mostly true even before the advent of the Xbox 360, but after that just about every cross-platform game became multithreaded, with at least three threads. Since Microsoft and Sony have both gone to consoles with eight cores, multithreaded games are even more ubiquitous.

              So no, you're full of crap right there.

              Well there certainly are more threads in use, but the real question is what are they doing? The biggest bottleneck in games - as far as the CPU is concerned - is setting up your command lists and buffers. In current GPU APIs this is a sequential process and is inherently single-threaded. You can do multithreading to some degree (see DX11 multi-threading) but the actual access to the immediate context is single-threaded so you suffer the synchronization penalty anyway which is why DX11 single-threaded vs mul
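The recording-versus-submission split described above can be sketched: worker threads record into private command lists (the deferred-context analogue), while a lock-guarded stand-in for the immediate context serializes submission — the lock is where the synchronization penalty lives. All names here are invented for illustration, not any real graphics API:

```python
import threading
from queue import Queue

class ImmediateContext:
    """Stand-in for the driver's single-threaded submission point."""
    def __init__(self):
        self._lock = threading.Lock()   # the serialization bottleneck
        self.executed = []

    def execute(self, command_list):
        with self._lock:                # only one thread submits at a time
            self.executed.extend(command_list)

def record_draws(first, count, out):
    """Deferred-context analogue: record commands into a private list,
    fully in parallel with the other workers."""
    out.put([f"draw_{i}" for i in range(first, first + count)])

ctx = ImmediateContext()
lists = Queue()
workers = [threading.Thread(target=record_draws, args=(i * 100, 100, lists))
           for i in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()

while not lists.empty():                # submission still happens serially
    ctx.execute(lists.get())

print(len(ctx.executed))  # 400 commands recorded in parallel, submitted serially
```

DX12/Vulkan-style APIs attack exactly that serial tail by letting pre-built command buffers be submitted with far less per-call driver work.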

        • since games haven't been CPU bound in years

          Actually that's not true. Where exactly do you think the performance advantage of AMD's Mantle comes from? In fact the very reason for the creation of DirectX 12, Vulkan (built on Mantle) and Metal is because of the dependence on the CPU resource (mostly on one core) due to high-overhead sequential drivers. Once drivers support these new APIs and games are written with them we will see a decrease in games being so CPU-bound as this load can be spread over more cores but also can be done more efficiently as

    • Re: (Score:2, Troll)

      by TeknoHog ( 164938 )

      They're trying to sell this as the ultimate graphics crunching box...

      So does it have the GTX Titan, or at least a 980?

      • by binarylarry ( 1338699 ) on Saturday June 27, 2015 @06:25PM (#50004023)

        No, but my sources tell me they're planning an Intel+Nvidia second generation product that will totally blow this rig out the water!

      • I was reading that the 980 Ti is 80%-90% as good as the Titan at a much better price, so maybe the 2nd generation will have a 980 Ti, which would offer almost-Titan performance at a less-than-titanic price.

        • by sd4f ( 1891894 )
          Anandtech claimed that the 980 Ti is 97% of the Titan X. Apart from the difference in VRAM (which is hardly important at the moment, and by the time it is important, the graphics core will be inadequate), there really is no reason to look at the Titan anymore.
          • I can't recall which site I read that review but I like Anandtech so I'd trust their review over others

    • Intel's chips have been really good in terms of performance/watt these days. AMD has had real problems in that regard. Their high-end chips are massive power sinks. Now in some uses maybe that isn't important, but in a small system it matters. You are going to have to jump through hoops to make sure your thermal system fits, is sufficient, and isn't loud anyhow; trying to put a ton more power in there isn't a winning idea.

      Thus when you have the 4790k on the one hand, which is rated at 88 watts TDP, and the AM

      • by Kartu ( 1490911 )

        It depends.
        According to AnandTech, AMD's Jaguar was best-in-class perf/watt (and you can bet perf/buck too), so no wonder it ended up in both the Xbox and PlayStation.

        AMD has great notebook chips too (look at Carrizo), but nobody offers them, as in the old "@HP: we will give you our processors for free! No, thanks" times. I couldn't care less about an i3 being faster in single-core tasks if its integrated GPU is so pathetic compared to AMD's, and gaming is the only stressful task my notebook ever has.

          AMD has great notebook chips too (look at Carrizo), but nobody offers them, as in the old "@HP: we will give you our processors for free! No, thanks" times.

          That's true but you have to remember the OEMs then suffer on volume discounts so it's a case of switching everything to AMD. Now of course they could just have a small subset of systems with AMD chips but it's not a matter of just switching CPUs, you also have to redesign and manufacture the tooling since you need to design a whole new motherboard (usually one of each different chassis too) which is often not economically viable if it is for a small run of systems especially if you're running on thin margin

    • FX-8350 isn't that bad compared to i7 4790k, but not at gaming.

      Anyway, an Intel CPU inside it will be ONE OF THE OPTIONS; AMD CPU configurations will also be available:

      We have Quantum designs that feature both AMD and Intel processors, so we can fully address the entire market.

      http://www.tomshardware.com/ne... [tomshardware.com]

  • by ArcadeMan ( 2766669 ) on Saturday June 27, 2015 @05:07PM (#50003751)

    AMD knows their CPU dissipates too much heat for the SFF PC?

  • by Anonymous Coward on Saturday June 27, 2015 @05:13PM (#50003763)

    What a goddamn stupid submission.

    Yes, companies that make one product do use products from competitors in some situations. Microsoft is a great example of this. Yes, they provide Windows, but you can also use Linux with Azure. There's nothing wrong with that. They're using a product that competes with Windows because that's what the Azure users want and need. It's the smart thing to do, for crying out loud.

    A much bigger problem is when an open source project like, say, Debian, ends up having to support systemd thanks to political skullduggery, even though systemd is not what Debian's users want, it is not good for the Debian project's quality, it causes many problems, and causes many Debian users to lose trust in the project and its software. That's a real problem. This AMD-using-Intel-CPU shit is totally a non-issue.

    • You know that Linux isn't an actual company which is competing with Microsoft, right? And putting your own software on a competing platform is very different from actually BUYING something from a competitor and using it as part of your platform. Of course in reality it's not so simple; it's possible that the AMD people figured this would be to their net advantage. And this fits in with the recent pattern of AMD conceding sectors of the market to Intel and focusing more on its core businesses.

      • by Anonymous Coward

        You know that Linux isn't an actual company which is competing with Microsoft, right?

        Where does the GP's comment say that Linux is a company? It says "products from competitors". It does not say that the competitors are companies. Clearly in the case of Linux they are multiple open source projects, even if some of those projects have some corporate backing.

      • by bspus ( 3656995 )

        No but Oracle is and they offer a variety of their products too.

    • by Kjella ( 173770 )

      Yes, companies that make one product do use products from competitors in some situations. Microsoft is a great example of this. Yes, they provide Windows, but you can also use Linux with Azure. There's nothing wrong with that. They're using a product that competes with Windows because that's what the Azure users want and need. It's the smart thing to do, for crying out loud.

      Well, Windows doesn't run Linux applications but AMD CPUs do run the same software as Intel CPUs. That sort of thing matters. To use a car analogy, it's one thing to use a competitor's trucks because you don't make trucks even though they also have cars that compete with yours. It's another thing if your sales reps show up in a competitor's car. I'd wager the people at Samsung use Windows and Office, but I don't expect to see many Lumia phones. Last I heard AMD is still making desktop CPUs. Now they're maki

  • by Anonymous Coward

    That's the best value chip for high performance available right now, so it's WISE to offer it to consumers with AMD's graphics platform = more market and more gross. They need both.

  • by Iamthecheese ( 1264298 ) on Saturday June 27, 2015 @05:16PM (#50003779)
    That's what I'm reading. That AMD is willing to go the extra mile to offer what its customers are looking for.
    • Precisely. (Score:5, Interesting)

      by goldcd ( 587052 ) on Saturday June 27, 2015 @05:30PM (#50003831) Homepage
      This decision underlines that AMD wanted to make something great, that people would want and would buy - rather than being a vanity project for the company.
      Been with nVidia for the last batch of GPUs and for last few CPUs - but I'm still rooting for them. The plucky, power-guzzling underdog :)
      Maybe my next upgrade will switch me back to them, maybe it won't - but this decision at least shows me they've not lost their minds, and should still be considered.
      • by Anonymous Coward

        Now if only they would get a Carrizo laptop on the market with a 1920x1080 screen, an SSD, and a proper dual-channel memory config. Look at the recent HP 15z offering. All of that is available with an Intel CPU. As soon as you select an AMD CPU, all the goodies are no longer options.

      • The problem is, following this logic they should have used Nvidia GPU parts as well. This showcases AMD's weaknesses more than anything else. It's confirmation of what everyone already knows: AMD can't make low-heat parts.
        • by aliquis ( 678370 )

          The problem is, following this logic they should have used Nvidia GPU parts as well. This showcases AMD's weaknesses more than anything else. It's confirmation of what everyone already knows: AMD can't make low-heat parts.

          The Fury X is quicker than the GTX 980, and in half of the games it seems to be quicker than the Titan X:
          http://www.tomshardware.com/re... [tomshardware.com]

          So why the fuck would they use an Nvidia card if they've got as quick a card themselves?

          I know it may not support feature level 12.1 of DirectX, but that's it. One advantage is that it will allow you to get a cheaper FreeSync monitor.

    • I really like this. I've always thought good of AMD as a company. I really wish they were a solid competitor to intel in the processor market... but they just don't have the efficiency. Maybe things will swing back towards them some day.

      GPU market they seem to be hanging on barely. Nvidia is focusing heavily on efficiency and lower power consumption. AMD is competing by using a lot more power in their equivalent GPUs and by having their prices be a bit better.

      It's not too late for a single CPU/GPU package t

      • by aliquis ( 678370 )

        It's not too late for a single CPU/GPU package to completely change the playing field.

        I think Intel say something like their integrated graphics is like 75 times more capable than their first one or whatever.

        (I'm not comparing Intel and AMD here. Just stating how things have moved. There's of course the fact that Nvidia invest into Nvidia Grid, cloud rendering and streaming even games to consumers instead.)

        It's all about what you need though. Integrated stuff is enough for many. But not for everyone. And streaming games will likely be the same.

        But yeah. Who knows how many purchase graphics c

  • Not surprising (Score:3, Insightful)

    by kuzb ( 724081 ) on Saturday June 27, 2015 @05:23PM (#50003803)

    The only reason to buy AMD these days is if you're on a budget, and you're OK with middling performance.

    • Wow, that's me! More computer for the buck, anything else is a fool's game.

    • The only reason to buy AMD these days is if you're on a budget, and you're OK with middling performance.

      Or you want ECC memory...

      • by Agripa ( 139780 )

        Currently the premium to get ECC on an Intel system versus a comparable AMD system is about $250.

    • by rtb61 ( 674572 )

      Middling performance, high performance, low performance: that is all silly twaddle. Appropriate performance for the application is what's correct. 50% higher performance than you need is simply wasted and idle. Gaming performance is tricky; the game itself needs to target the majority of the market segment for that style of game. It needs to run well and look good at middling performance, and the reality is that games often run far more stably at those levels than at higher graphics settings, which tend to crash m

    • They are showcasing their GPU. I've had better experiences with ATI/AMD GPUs than with nVidia.

      I'd buy an AMD GPU over an Intel one any day.

      But yes, until AMD makes something significantly better, I'll buy an Intel CPU. My current system is Intel CPU, AMD GPU. The system I built before that was an Intel CPU, and ATI GPU, which is pretty much the same thing. Were I to build one tomorrow, probably also.

      At least this shows that AMD doesn't have their own head stuck up their ass to know that their customers regularly p

  • by Anonymous Coward

    I think the only admirable part of this, if it can be called that, is that AMD is not overtly suicidal.

    You can't claim to offer the ultimate driving experience and slap in a lawnmower engine.

  • by WilCompute ( 1155437 ) on Saturday June 27, 2015 @06:04PM (#50003959) Homepage

    I am still waiting to see the part where this was anything more than a promotional and aspirational design from AMD. Nowhere has AMD said they are going to sell this, or any, boxed PC.

  • Do the editors actually review submissions? I submitted this more than a week ago http://slashdot.org/submission... [slashdot.org]
  • Intel's CPU will be an option, but surely you can get it with AMD as well:
    http://www.tomshardware.com/ne... [tomshardware.com]

    Fury X beats Titan X at many games at 4k resolution and even more at 5k.
    Fury X beats the 980 Ti (a pre-emptive release by Nvidia anticipating the Fury X) at 4k, with the same recommended price.

    Now, these boxes will have two of the Furys.

    FuryX also has a nice "FPS cap" feature, which allows it to drop frequency to save power when you are beyond reasonable FPS (i.e. 90+, actual number depends on your taste).

    Had th
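The "FPS cap" described above is, at heart, a frame limiter. The real driver feature drops GPU clocks; this sketch only shows the pacing idea — idle out the rest of each frame's time budget instead of rendering more. Function and parameter names are my own, not any real API:

```python
import time

def run_capped(render_frame, max_fps=90, duration_s=0.5):
    """Sleep-based frame limiter: if a frame finishes early, sleep off
    the remainder of its 1/max_fps budget instead of rendering again.
    The hardware sits idle (or downclocked) during the sleep, which is
    where the power saving comes from."""
    budget = 1.0 / max_fps
    frames = 0
    end = time.perf_counter() + duration_s
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)   # idle instead of rendering more
        frames += 1
    return frames

frames = run_capped(lambda: None, max_fps=90, duration_s=0.5)
print(f"{frames} frames rendered in 0.5 s with a 90 FPS cap")
```

With a trivial render function the cap, not the workload, bounds the frame count — which is exactly the regime where capping saves power.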

  • Boy was I disappointed when I got past the title.
