AMD's Radeon R9 290X Review

Billly Gates writes "AMD may be struggling in the CPU department against Intel's superior fabrication plants, but in the GPU market it is doing well. AMD earned a rare Elite award from Tom's Hardware for the fastest GPU available, with its top-end R9 290X selling for as little as $550, while Nvidia's top-end cards go for $1,000 since it has had little competition to worry about. Maximum PC also ran benchmarks and crowned it the fastest and best-value card available. AMD has also introduced the Mantle API, which offers lower-level access than DirectX and is cross-platform. Mantle may turn into a very important API, since AMD GPUs are in the next-generation Sony and Microsoft consoles, giving game developers a large market share to target."
  • ATI drivers (Score:5, Informative)

    by CockMonster ( 886033 ) on Sunday October 27, 2013 @06:53PM (#45254203)
    I installed fresh ATI graphics drivers today. 90MB for a driver. .NET 4.5 needed to be installed. GTFO.
    • Re:ATI drivers (Score:5, Informative)

      by stox ( 131684 ) on Sunday October 27, 2013 @06:58PM (#45254223) Homepage

      148MB for the latest Nvidia driver.

      • 148MB for the latest Nvidia driver.

        *sigh*

      • Re:ATI drivers (Score:5, Insightful)

        by bored ( 40072 ) on Sunday October 27, 2013 @08:50PM (#45254867)

        It's more than that by the time the package decompresses.

        Just some data points from a single machine.

        C:\NVIDIA folder
        V197 (~2010) 85M
        V320 (~2013) 182M

        The vast majority of it appears to be the control panel and the PhysX package.

        The display driver is just a few megs by comparison. If you skim off the HD audio/NV stereo/CUDA/OpenCL/GL libraries, you could probably get the whole shebang in under 10MB, and you could still play DirectX games.

        I've been killing nView and the extra services for years. Never had a problem with the machine, but it always bothers me that they have a bunch of crap running that doesn't actually appear to do anything.

        After all, the once or twice a year I actually change my monitor settings, I am fully capable of finding my way into the Windows display control panel and adjusting things there, or opening the NVIDIA control panel from the actual Control Panel.

        It's quite possible I'm not getting the absolute best performance playing games, but frankly I would much rather adjust settings from within the games than have NVIDIA overriding the game settings.

        • Re: (Score:2, Informative)

          by Anonymous Coward

          For Nvidia drivers, don't forget to remove the two "AppInit_DLLs => nvinitx.dll" entries inside the registry. Preloading this DLL inserts nasty hooks for Optimus support. I hate that kind of trick.
          Also remove the "updatus" user and its account/files.

          In the services, after the parameters of the card are configured as desired inside the control panel, you can permanently turn off the NVIDIA 3D profile update service and the driver support service.
          Once this is done the computer is more stable and less bloated.

          • I'm not interested in defending Nvidia (if they want to pull stunts like Optimus, it's their job to make them work; nobody said life was fair), but 'Optimus' is woven throughout the system like an inoperable late-stage cancer for a reason: detecting arbitrary 3D acceleration load and (theoretically) transparently grabbing the work from the Intel GPU, handing it off to the Nvidia GPU, and using the now-lobotomized Intel part purely as a place to dump the finished frames for display (this arrangement, where
            • by bored ( 40072 )

              BTW: The original PowerVR cards from the late 1990's worked the same way. They formed the frames and dumped them to the system video card.

              The whole thing worked better than the VGA pass-through on the Voodoo boards I also own (cause they are still in a PC in the attic). Most of the time I simply disconnected the Voodoo pass-through and plugged my monitor directly into either the system video card or the Voodoo. That is because the pass-through interface totally screwed up high resolution (1600x1200 at the time).

              • Interesting. I'm surprised that they managed to pull that off over a PCI bus. Especially before people got serious about security-through-randomization of the address space layout, it's always been conceptually fairly simple for anything with memory access to dump stuff into the framebuffer (they seem to have bitrotted, but there used to be some amusing examples designed for use against classic Macs over FireWire, since that was both external and had DMA, dumping flying toasters directly on top of your victim's screen).
                • The original PowerVR cards did not really take over your bus; everything in the system would still work fine. But you have to remember that we're talking about a maximum resolution of 1024x768 for those cards, too. They weren't having to handle today's resolutions. Also, you need to remember that they were literally the slowest GPU of their day. I had 'em all, more or less: TNT, TNT2, Voodoo, Voodoo 2, PowerVR, Permedia 2. The PowerVR was significantly slower than any of these, and it had lower visual quality.

                  • Even 1024x768 is 786432 pixels. 8bpp at 30FPS is 22.5MB/s, 16bpp obviously twice that. I suppose, on consideration, that 50MB/s (purely of framebuffer transferring) would probably be an acceptable load for a nominally 133MB/s basic-desktop PCI bus. A shared, graphics-only one would be better; but at that level of fiddling, you might as well just dump the 2D features on the 3D board (why, hello there, exactly what in fact happened, we were just talking about you...)
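                    To put numbers on that back-of-the-envelope estimate, here is a minimal sketch of the same arithmetic (the 133MB/s figure is the nominal 33MHz x 32-bit PCI bandwidth mentioned above):

                        #include <cstdio>

                        int main() {
                            const double pixels = 1024.0 * 768.0;   // 786,432 pixels per frame
                            const double fps = 30.0;
                            const double pci_mb_s = 133.0;          // nominal basic-desktop PCI bus
                            const int bytes_per_px[] = {1, 2};      // 8bpp and 16bpp
                            for (int b : bytes_per_px) {
                                // Pure framebuffer traffic: one write per displayed pixel.
                                double mb_s = pixels * b * fps / (1024.0 * 1024.0);
                                std::printf("%2dbpp: %4.1f MB/s (%4.1f%% of the bus)\n",
                                            b * 8, mb_s, 100.0 * mb_s / pci_mb_s);
                            }
                            return 0;
                        }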
                    • A pretty fast hard disk back then (say, an original ultra-scsi barracuda) would stream around 20MB/sec peak, most of us were still using modems or had moved up to ISDN... There was plenty of free bus bandwidth, even accounting for overhead. And even then there were machines with multiple PCI buses, though they remained rare throughout the dominance of that bus. Now it's common to have a PCI bus and a PCI-E bus, but that's not really the same thing. Then again, it's not the shared graphics-only bus, either.

                    • I don't know about you, but I almost always had two cards on VLB in my 486's

                      I don't know about you, but I never saw that work without a whole lot of hassle, and I seldom saw it work properly. In theory, you could have three VLB cards. In practice, you could only be sure that they would work if you only had one of them. Since few people had a need for them (you could get pretty good throughput from one of those Adaptec ISA SCSI cards with its own Z80 running the show, basically any of the ones that work without drivers) few people found it worth the trouble.

                      So, no, VLB wasn't just a graphics bus.

                      In theory, no. In practice, it effectively was.

                • Interesting. I'm surprised that they managed to pull that off over a PCI bus.

                  They were able to do it because the PowerVR chip processed a tile at a time and only wrote the final rendered frame over PCI. Since you never have to read the frame buffer back for more complex multi-pass effects (unlike all other cards at the time), you could get by with PCI throughput. The card had local memory to store textures and the scene draw-ordering buffer.

                  Out of curiosity, did the PowerVR cards manage to behave well in that regard?
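                  For what it's worth, the bandwidth saving of that tile-at-a-time design is easy to estimate. A rough sketch in the same spirit as the PCI arithmetic above; the overdraw and pass counts are illustrative assumptions, not measured PowerVR figures:

                      #include <cstdio>

                      int main() {
                          const double pixels   = 1024.0 * 768.0; // frame size
                          const double bytes_pp = 2.0;            // 16bpp
                          const double overdraw = 3.0;            // assumed average overdraw
                          const double passes   = 2.0;            // assumed multi-pass effects

                          // Immediate-mode renderer whose framebuffer sits across the bus:
                          // every overdrawn pixel in every pass is a read-modify-write.
                          double immediate = pixels * bytes_pp * overdraw * passes * 2.0;

                          // Tile-based deferred renderer: overdraw and blending are resolved
                          // in on-chip tile memory; only the final frame crosses the bus once.
                          double tiled = pixels * bytes_pp;

                          std::printf("immediate: %.1f MB/frame, tiled: %.1f MB/frame (%.0fx less)\n",
                                      immediate / (1024 * 1024), tiled / (1024 * 1024),
                                      immediate / tiled);
                          return 0;
                      }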

      • Re: (Score:2, Insightful)

        by Anonymous Coward
        At least the Nvidia driver works reliably. That's worth a few megs.
    • 241MB for the latest beta driver.
      • Does that include the tool you have to download and run in order to download the installer?
        • No, but it's a universal installer. One driver package for all supported operating systems, 32 and 64 bit variants, and all supported graphics cards. It's pretty impressive really.

    • Remember that the package contains drivers for multiple chips, and 3D graphics drivers inevitably contain a lot of code. 90MB does not sound unreasonable at all. Think about all the stuff needed to translate DirectX/OpenGL calls to the specific hardware processing units. The .NET requirement is indeed a bit silly and comes from the Catalyst Control Center, an app that is slow junk on Windows. The Linux version uses Qt and works OK.
    • by Anonymous Coward on Sunday October 27, 2013 @07:30PM (#45254393)

      The drivers do NOT need .NET, or 90MB. The extremely crappy control panel, which has NOTHING to do with the drivers, uses the dreadful .NET API, and thus needs loads of HDD space. People in the know install third-party front-ends like 'Tray Tools' or the like.

      Sadly, ATI loves to take significant pay-offs from companies like MS, acting as if THEY are the customer, not the person who purchased the graphics card. This we can truly describe as ATI/AMD endlessly shooting themselves in the foot. Using .NET for the official control panel was a disgusting and despicable act, and a great example of the contempt the older version of ATI had for its users.

      AMD/ATI is a much better company today - it was either improve or die, and after the longest possible time, AMD finally made the right choice. However, we get glimpses of the bad old ATI with issues like the fiasco over the recent release of 'new' GPU cards that are almost all just re-brands of older cards, with the free games removed (AND higher prices). This kick in the teeth for customers was done simply so AMD can make a song and dance about free games with all their cards AFTER they finish releasing the new 290 family (the 290X is just the first of three 290 cards; the 'free' games won't be announced until after AMD launches all of them).

      In truth, ATI/AMD customers need to be smarter than customers of Nvidia products. Nvidia prides itself on cards that 'just work'. With AMD, you frequently need to know what you are doing, at which point AMD rivals Nvidia- but 'out of the box' the AMD experience is usually worse. Nvidia supports its older graphics cards MUCH better than AMD, but older graphics cards from AMD tend to get faster with time as newer games exploit the more forward looking architecture of ATI designs.

      People have more problems with ATI cards in games, but this happens because uncommon settings in ATI's control panel (like the number of frames being rendered ahead) can cause terrible game problems if not adjusted per game on the desktop. Again, informed ATI owners KNOW which settings to tweak, but for the average user the ATI experience can be frustrating. This is entirely ATI's fault, because a PC game can programmatically set the correct options with a tiny amount of code (see the sketch after this comment), but many game developers do not know how to do this. Nvidia does a much better job helping developers set up their game code correctly for all usable generations of Nvidia graphics cards.

      ATI has a nasty habit, as well, of disowning very recent cards that, on paper, have the features to support current games. ATI likes its shills to say ('jeez, your 4-year-old card is out-of-date junk'), whereas Nvidia happily ensures every generation of its cards that supports DX9 works as well as the hardware allows. In reality, ATI cards from the 2000, 3000 and 4000 series are effectively the same as everything up to the 6000 series (excluding the orphan architecture of the 6900 VLIW4 oddities). However, ATI pays technical sites to state that cards from the 5000 series and earlier are obsolete (technically this is completely untrue). In contrast, Nvidia is proud to support cards from the 8000 series onwards, which is a similar timeframe to the 2000 series from ATI.

      While it is true that 'cheap' current gen cards destroy premium cards from that far back, it is the principle that matters.
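      On the "frames rendered ahead" point above: on Windows there is at least one documented, vendor-neutral knob a game can set for itself instead of leaving it to the driver control panel. A minimal Direct3D 11 sketch; it assumes the game has already created an ID3D11Device, and the driver-level "flip queue size" setting described above is the vendor-specific analogue of this:

          #include <d3d11.h>
          #include <dxgi.h>

          // Cap how many frames the driver may queue ahead of the GPU.
          // 'device' is assumed to be an already-created ID3D11Device*.
          void SetFrameLatency(ID3D11Device* device, UINT maxFrames)
          {
              IDXGIDevice1* dxgiDevice = nullptr;
              if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                                   reinterpret_cast<void**>(&dxgiDevice)))) {
                  // e.g. 1 for lower input latency; the DXGI default is 3.
                  dxgiDevice->SetMaximumFrameLatency(maxFrames);
                  dxgiDevice->Release();
              }
          }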

      • by Smauler ( 915644 )

        ATI has a nasty habit, as well, of disowning very recent cards that, on paper, had the features to support current games. ATI likes its shills to say ('jeez, your 4 year old card is out-of-date junk') whereas Nvidia happily ensures every generation of its cards that support DX9 work as well as their hardware allows.

        This... I've gone from a Ti 4200 to an 8800 GT to a GTX 460, and that is all. I've had decent graphics throughout... not great, I've always been behind the times (save for my 8800 for a while).

      • by daern ( 526012 )

        In reality, ATI cards from the 2000, 3000 and 4000 series are effectively the same as everything up to the 6000 series (excluding the orphan architecture of the 6900 VLIW4 oddities). However, ATI pays technical sites to state the cards from the 5000 series and earlier are obsolete (technically this is completely untrue). In contrast, Nvidia is proud to support cards from the 8000 series and onwards, which is a similar timeframe to the 2000 series from ATI.

        While it is true that 'cheap' current gen cards destroy premium cards from that far back, it is the principle that matters.

        Fair comment, except that in Windows 8.1 you /cannot/ install any AMD-supplied driver for my HD 3870. It's a perfectly serviceable card, but it has now been rendered obsolete by the manufacturer abandoning it. The reason is that they won't supply WDDM 1.3 or 1.2 drivers for this card, and they won't supply updated WDDM 1.1 drivers for 8.1.

        Certainly makes me think twice about buying another AMD card...

    • Re:ATI drivers (Score:4, Insightful)

      by fuzzyfuzzyfungus ( 1223518 ) on Sunday October 27, 2013 @07:48PM (#45254475) Journal

      I installed fresh ATI graphics drivers today. 90MB for a driver. .NET 4.5 needed to be installed. GTFO.

      As much as I find 'Catalyst Control Center' to be totally fucking useless, and would be pleased by a 'just the damn driver, the OS already has interfaces for changing monitor resolution and whatnot' edition, isn't using relevant vendor APIs for your application, rather than rolling your own or using real antiques, sort of what you are encouraged to do?

      Its existence is obnoxious, but it would hardly be better for depending on an older .NET version, or Qt, or some braindead AMD custom nonsense, would it?

    • by thegarbz ( 1787294 ) on Sunday October 27, 2013 @10:39PM (#45255481)

      I installed fresh ATI graphics drivers today. 90MB for a driver. .NET 4.5 needed to be installed. GTFO.

      You didn't download a 90MB driver. You downloaded a 90MB package which includes all drivers for all versions of Windows, for all architectures, for all ATI cards, and it came with a utility that automatically installs the correct thing for your situation.

      I wish more companies did this. Take the guesswork out of the download screen. NVIDIA does it too.

      Also what's wrong with .NET 4.5? Do you regularly judge applications solely by the framework their developers chose?

      • Re: (Score:3, Informative)

        by edxwelch ( 600979 )

        > Also what's wrong with .NET 4.5?
        It's slow... so slow to open what is basically a dialog box.
        Also, it's not cross-platform, so they can't use it for Mac and Linux.

        • Re: (Score:3, Informative)

          by thegarbz ( 1787294 )

          Nothing about drivers is cross-platform. I highly doubt that even made it into a list of considerations.

          As for .NET being slow, yes, it's slow for the end user. But how often do you use it? I don't think I've opened the NVIDIA control panel since I installed Windows a year ago. You know what .NET is fast at? Developing. Your complicated dialog box you likely never use was also likely very quick to throw together.

          • A great deal about drivers _is_ cross-platform. Many of the same libraries, used by programs to manage the actual behavior of the card, are OpenGL-based, which is indeed cross-platform. The binary drivers do require _compilation_ for a particular graphical environment, and that does take thoughtful development to manage OS-specific function calls.

            My experience of .NET is that it's very fast for developing programs that are unusably slow because they are bloated. This is not a good trade-off.

  • by brxndxn ( 461473 ) on Sunday October 27, 2013 @06:54PM (#45254209)

    After seeing AMD bet the farm on Athlon and beat a company with 10x the R&D budget, I cannot help but be a fan. The biggest reason AMD is behind in CPUs today is a lack of R&D budget, the result of unfair competition from Intel during the years when AMD was superior. Hopefully, AMD can make up for this missing R&D money by being superior at graphics for a while.

    I do not believe there is a tech company pushing more innovation with fewer resources.

    • I like them simply because they are significantly cheaper for the same performance.
      I prefer saving more money and getting 'pretty damn good' rather than 'ultimate system which is out of date a week after it arrives'.

    • Has AMD only cut the budget for CPU R&D and not GPUs? That might be a strategic decision, betting that x86 is a dead-end technology and GPUs are the future. It's plausible that more and more computing (not just graphics) will shift to the GPU in the future, and a (standardized?) compute-capable GPU will become a required part of the PC platform.
      • by Kjella ( 173770 )

        Has AMD only cut the budget for CPU R&D and not GPUs?

        They don't report R&D per division, only overall, so you'd need pretty detailed internal knowledge to say.

    • by Kjella ( 173770 ) on Sunday October 27, 2013 @08:50PM (#45254869) Homepage

      Partly, but they could never match Intel on process technology, which meant Intel always had a cost advantage, even when its CPU designs were inferior. As for more recent events, AMD looks safe for a while: the division that includes consoles more than doubled last quarter and gave them an overall profit, so at least for the next year or so, with big console sales, they should be good. Still, with all their diversifying, I'm worried that they simply don't want to step back into the ring with Intel, but will instead focus on graphics cards, graphics-heavy APUs, heterogeneous computing, semi-custom designs, ARM microservers and so on.

      The reason I say that is that their CPU sales are way down, still going down, and losing money - they have to either really step it up or step out, and their roadmaps don't exactly indicate going on the offensive, just moderate revisions that might keep them from losing more ground. They have CPUs good enough to be "console-quality" for this generation of consoles; that'll sell for a good while, since many PC games will be console ports and so will play well on that level of hardware, even if they give up competing with future Intel CPUs. It's not like they're competing very well on high performance or performance per watt today, just performance per dollar, and it's showing in AMD's financials.

    • I've been looking at getting a card that will run Star Citizen decently during the dogfighting module. Not looking to spend more than $100, as I'll buy a new desktop computer next year as originally scheduled. The biggest problem was the weak power supply in the existing PC. Well, the R7 240 only draws 30 watts of power, for about $90. I know it's not the highest-performance card. Existing 7750s and 7770s beat its performance, but with the R7 240 I didn't have to worry about spending $50 and replacing the power supply.

      • That's good, if you pay attention to GDDR5 vs. DDR3; the former is much better even on the lowest-end card. I.e., look only at 1GB cards with explicit GDDR5, even for the Radeon R7 240, 250, 7730, or 7750.
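        The GDDR5-vs-DDR3 gap is mostly memory bandwidth, which is simple to estimate: bus width in bytes times the effective per-pin data rate. A quick sketch using typical figures for these low-end cards (the 128-bit bus and the data rates are assumed round numbers, not spec-sheet values):

            #include <cstdio>

            int main() {
                const double bus_bytes  = 128.0 / 8.0; // typical 128-bit low-end bus
                const double ddr3_gbps  = 1.8;         // assumed effective rate per pin
                const double gddr5_gbps = 4.6;         // assumed effective rate per pin
                std::printf("DDR3:  %.1f GB/s\n", bus_bytes * ddr3_gbps);
                std::printf("GDDR5: %.1f GB/s\n", bus_bytes * gddr5_gbps);
                return 0;
            }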

    • I'm actually just as impressed with their business wins of late. They've gone from posting massive losses with no signs of anything on the horizon to getting all the processors for the next-gen consoles except the Wii U's CPU, plus a featured role in the new Mac Pro, plus a growing tablet side. And while Bulldozer still seems to be an overall failure, GCN is very competitive and Jaguar seems to be pretty powerful.

      If they can fix the IPC problems with Bulldozer, or otherwise get a decently competitive desktop CPU out, they'll be in good shape.

      • by 0123456 ( 636235 )

        I'm guessing the only reason they got the CPU wins for the consoles is that they got the GPU wins for those consoles. And while volumes may be high, the margins will be minimal compared to a desktop CPU sale.

        • Yes, but at this point AMD needed the publicity win. I'm on a few gaming-oriented sites, and opinion on AMD has pretty much pulled a 180 from a year ago. The standard responses have gone from "they're doomed" and "they suck" to "they're winning in some areas" and genuine interest in their future plans. Bulldozer CPUs aren't popular, except with some video editors, but their GPUs are on the upswing (they're fast going from "OK but cheap" to "good and cheap") and there's growing interest in Jaguar chips in low-power systems.

      • by Noishe ( 829350 )

        Accidentally moderated redundant, so posting to clear moderation.

  • "fastest GPU available with its fastest r9 for as little as $550 each" Well, I am glad that they are available for as little as that.
  • by Anonymous Coward on Sunday October 27, 2013 @07:09PM (#45254303)

    We've also come to learn that AMD changed the double-precision rate from 1/4 to 1/8 on the R9 290X, yielding a maximum of 0.7 TFLOPS. The FirePro version of this configuration will support full-speed (1/2 rate) DP compute, giving professional users an incentive to spring for Hawaii's professional implementation.

    Lots of folks use GPGPU but don't have a "professional" budget to pay the extortion fee to have artificial limits lifted from the hardware they purchase.
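    For reference, those rates are straightforward to reproduce from the card's single-precision figure (the 5.6 TFLOPS SP number quoted elsewhere in this discussion):

        #include <cstdio>

        int main() {
            const double sp_tflops = 5.6;  // R9 290X single-precision peak
            std::printf("1/4 rate (older Tahiti-style): %.2f DP TFLOPS\n", sp_tflops / 4.0);
            std::printf("1/8 rate (R9 290X):            %.2f DP TFLOPS\n", sp_tflops / 8.0);
            std::printf("1/2 rate (FirePro version):    %.2f DP TFLOPS\n", sp_tflops / 2.0);
            return 0;
        }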

  • > AMD has also introduced the Mantle API, which offers lower-level access than DirectX and is cross-platform. Mantle may turn into a very important API, since AMD GPUs are in the next-generation Sony and Microsoft consoles, giving game developers a large market share to target.

    I read somewhere that that's unfortunately not true; Mantle will not be available for the new Xbox or PlayStation. My speculation is that Microsoft and Sony don't actually want to be THAT compatible, as it would make porting too easy.

    • I read somewhere that that's unfortunately not true; Mantle will not be available for the new Xbox or PlayStation. My speculation is that Microsoft and Sony don't actually want to be THAT compatible, as it would make porting too easy...

      It's the other way around. Mantle on PC is the equivalent of the console APIs, with all the lovely low-level access you get on consoles.
      You will be able to take a console game and throw a little shim between GFX calls to make it a PC Mantle game.

  • by Dahamma ( 304068 ) on Sunday October 27, 2013 @07:44PM (#45254459)

    ...but "for as little as $550 each" just blows my mind.

    I thought I was crazy when I spent $400 on a graphics card once, but I (and I understand it's subjective) was perfectly happy with the performance on any game I played for the next 2 years. $500-$1000 (x2) Crossfire/SLI setups just seem to me to be about people with too much money and not enough creativity as to how to spend it...

    • by pspahn ( 1175617 ) on Sunday October 27, 2013 @07:57PM (#45254531)

      I've always had the notion that if you just wait a year, you can get yesterday's models for a great price and instead play the games that now have been out long enough to be properly patched. This has the bonus effect of weeding out a lot of crap games.

      • by thegarbz ( 1787294 ) on Sunday October 27, 2013 @10:41PM (#45255493)

        I've always had the notion that if you just wait a year, you can get yesterday's models for a great price and instead play the games that now have been out long enough to be properly patched. This has the bonus effect of weeding out a lot of crap games.

        Which of course comes with some downsides. [xkcd.com]

        • I resemble this remark, but I don't think I'm the target audience of these graphics card vendors. The only game I currently play is DotA on the Warcraft III engine, under Wine on Linux Mint 14, on an AMD64 system I assembled around 2005 that includes a Radeon 9500 or 9600.
          I don't meet the system requirements for Dota 2 on Steam, so I guess I need to upgrade some time in the next couple of years.

    • You have to consider performance against the current flagship: the ATI card is as fast as a Titan in most cases and significantly less costly. So if the highest-performing part (the Titan) is $200-400 more than the ATI card, then the ATI card (for that performance level) is in fact the lowest-cost option.

      You always pay a premium for the bleeding edge. When the 8800 GTX/GTS came out it was the same: it was the top performer, so they could charge premium prices. At the time the 8800 was so good they could charge whatever they wanted.

    • by symbolset ( 646467 ) * on Monday October 28, 2013 @12:20AM (#45255961) Journal

      This stuff is just amazing to me. The bottom-end R7 260X card clocks 1.97 TFLOPS for $139. For that $550 you get up to 5.6 TFLOPS. It wasn't so long ago you would expect to pay $2,000 for a desktop PC. In fact, you still can.

      In June 1997, ASCI Red at Sandia Labs was the first supercomputer in the TOP500 to breach the 1 TFLOPS barrier. It had 7,264 cores in 104 cabinets, consuming a total of 1,600 square feet of datacenter space. It required 850 kW of power, not including cooling. With upgrades it remained at the #1 spot on the supercomputer charts until 2000, and it wasn't decommissioned until early 2006, when it still ranked #276 on the TOP500 list with 2.4 TFLOPS.
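      Putting those numbers side by side makes the point vivid. Everything below comes from the comment above except the 290X board power, which is an assumed round figure of 300W:

          #include <cstdio>

          int main() {
              const double r7_tflops  = 1.97, r7_price = 139.0;  // R7 260X
              const double r9_tflops  = 5.6,  r9_price = 550.0;  // R9 290X
              const double red_tflops = 2.4,  red_kw   = 850.0;  // ASCI Red, late life
              const double r9_kw      = 0.3;                     // assumed ~300W board power

              std::printf("R7 260X:  %6.1f GFLOPS/$\n",  r7_tflops * 1000.0 / r7_price);
              std::printf("R9 290X:  %6.1f GFLOPS/$\n",  r9_tflops * 1000.0 / r9_price);
              std::printf("ASCI Red: %6.1f GFLOPS/kW\n", red_tflops * 1000.0 / red_kw);
              std::printf("R9 290X:  %6.1f GFLOPS/kW\n", r9_tflops * 1000.0 / r9_kw);
              return 0;
          }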

  • Right. Cross-platform support would be important, especially if the API appeared on the next-gen consoles.

    However, I can't really see Microsoft implementing this API on their console. And I don't think Sony will do that either.

    And then there's the fact that a game developer now needs to implement two APIs - and if "Mantle" is actually closer to the hardware then there won't be much portability between the two. Which makes this somewhat dead in the water.

    • by Tr3vin ( 1220548 )
      The benefit of being a hardware manufacturer is that AMD can build and provide the low-level access. Guess what? They are providing the GPU/CPU combo in both the Xbox One and the PS4. So it doesn't matter what Microsoft or Sony do; that low-level support is there. High-end games were typically done using a low-level API like Mantle in previous generations. The big deal here is that AMD is bringing that support to the PC. Since their console GPUs and the new desktop GPUs are built around the same architecture, any optimizations made for the consoles should carry over.
      • Those are locked-down consoles. Might be a trifle difficult to install a direct competitor on such a device. You're pretty naive if you think that AMD can simply provide such tools against the wishes of the creators of the console.

        Not to mention that I'd be wary because even if you got it to run, you as a developer might simply be hit by the banhammer for using an unofficial API.

  • rich people problems (Score:4, Informative)

    by Gothmolly ( 148874 ) on Sunday October 27, 2013 @08:44PM (#45254839)

    "only" 550 dollars. Most people spend less than that on a whole computer, or don't HAVE 550 dollars.

    • by Kjella ( 173770 )

      Most people also don't have any use for a $550 graphics card; if you buy one you're almost certainly a serious gamer who'll get many hours of use out of that card. Of course you're not doing that on minimum wage, but having a $1k gaming PC is hardly just for the excessively wealthy. Honestly, the cash investment is very low compared to many other hobbies; it's mostly time and effort. Just like WoW addiction is probably the cheapest addiction you can get, well, if you don't count losing your job over it.

  • by bertok ( 226922 ) on Sunday October 27, 2013 @09:04PM (#45254941)

    Has anyone else noticed that despite the endless 4K-resolution marketing being put out by AMD, there is not a peep about the specific type of HDMI port the card has?

    There is a HUGE difference between HDMI 2.0 and 1.4, but it's always specified as just "HDMI" with no version number. No review mentions the HDMI version, even though one would think that a real journalist would put in some effort to research this and find out.

    I suppose it's easier to run the card through a bunch of automated benchmarks, cut & paste 15 pages of results to maximise ad impressions, and call it a "review".
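    The difference is easy to quantify: HDMI 1.4 tops out around 10.2 Gbps of TMDS bandwidth versus roughly 18 Gbps for HDMI 2.0, and 4K at 60Hz with 24-bit color simply doesn't fit in the former once the 8b/10b encoding overhead is counted. A rough sketch (blanking-interval overhead is ignored for simplicity):

        #include <cstdio>

        int main() {
            const double w = 3840, h = 2160, bpp = 24;  // 4K, 24-bit color
            const double encode = 10.0 / 8.0;           // TMDS 8b/10b overhead
            const double hdmi14 = 10.2, hdmi20 = 18.0;  // approx. max TMDS Gbps
            const double rates[] = {30.0, 60.0};
            for (double hz : rates) {
                double gbps = w * h * bpp * hz * encode / 1e9;
                std::printf("4K @ %2.0fHz needs ~%4.1f Gbps -> HDMI 1.4: %s, HDMI 2.0: %s\n",
                            hz, gbps, gbps <= hdmi14 ? "ok" : "no",
                            gbps <= hdmi20 ? "ok" : "no");
            }
            return 0;
        }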

    • by batkiwi ( 137781 )

      It will have DisplayPort, as will any 4K monitor.

      • by bertok ( 226922 )

        That's great, but my 4K TV only has HDMI inputs, just like every other new TV out there.

        • by dabadab ( 126782 )

          That's great, but my 4K TV only has HDMI inputs

          That most probably means that those HDMI inputs are NOT HDMI 2.0 ports (especially because the HDMI 2.0 spec was only released this September).

  • AMD has also introduced the Mantle API, which offers lower-level access than DirectX and is cross-platform. Mantle may turn into a very important API, since AMD GPUs are in the next-generation Sony and Microsoft consoles, giving game developers a large market share to target.

    Mantle is not on any of the consoles. [tomshardware.com] This article mentions only the lack of Mantle on the Xbox One, but since the PS4 GPU is the same architecture with bigger numbers, it's safe to say it's not on the PS4 either.

    Anyway, the problem with Mantle is not Mantle itself, but the lack of games that will actually make good and innovative use of the tech. Sure, Frostbite 3 games will support Mantle, but for what? So that you can play console games with better graphics? Sorry, good graphics are great, but after a point they matter less than the gameplay.
