
AMD Radeon HD 7990 Released: Dual GPUs and 6G of Memory for $1000 189

Posted by Unknown Lamer
from the faster-than-a-voodoo-4 dept.
An anonymous reader writes "Today AMD has officially unveiled its long-awaited dual-GPU Tahiti-based card. Codenamed Malta, the $1,000 Radeon HD 7990 is positioned directly against Nvidia's dual-GPU GeForce GTX 690. Tom's Hardware posted the performance data. Because Fraps measures data at a stage in the pipeline before what is actually seen on-screen, they employed Nvidia's FCAT (Frame Capture Analysis Tools). ... The 690 is beating AMD's new flagship in six out of eight titles. ... AMD is bundling eight titles with every 7990: BioShock Infinite, Tomb Raider, Crysis 3, Far Cry 3, Far Cry 3: Blood Dragon, Hitman: Absolution, Sleeping Dogs, and Deus Ex: Human Revolution." OpenGL performance doesn't seem too far off from the competing Nvidia card, but the 7990 dominates when using OpenCL. Power management looks decent: ~375W at full load, but a nice 20W at idle (it can turn the second chip off entirely when unneeded). PC Perspective claims there are issues with CrossFire and an unsynchronized rendering pipeline that lead to a slight decrease in the actual frame rate, but that should be fixed by an updated Catalyst driver this summer.
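The Fraps-vs-FCAT distinction comes down to where frame delivery is measured. As a rough sketch of the kind of per-frame statistics such tools compute (the timestamps and the stutter heuristic below are illustrative, not FCAT's actual output format or algorithm):

```python
# Sketch of FCAT-style frame-time analysis: given display timestamps
# (in seconds), compute per-frame intervals and flag stutter frames.
# The timestamps below are made-up illustration data, not FCAT output.
import statistics

def frame_times_ms(timestamps):
    """Per-frame intervals in milliseconds between display timestamps."""
    return [(b - a) * 1000.0 for a, b in zip(timestamps, timestamps[1:])]

def stutter_frames(times_ms, threshold=2.0):
    """Indices of frames longer than `threshold` times the median
    frame time -- a crude stutter heuristic, not FCAT's algorithm."""
    med = statistics.median(times_ms)
    return [i for i, t in enumerate(times_ms) if t > threshold * med]

# Mostly ~16.7 ms (60 FPS) frames with one 50 ms hitch at index 3.
ts = [0.0, 0.0167, 0.0334, 0.0501, 0.1001, 0.1168]
ft = frame_times_ms(ts)
print([round(t, 1) for t in ft])  # -> [16.7, 16.7, 16.7, 50.0, 16.7]
print(stutter_frames(ft))         # -> [3]
```

The point of capture-side analysis is exactly this: an average FPS number can look fine while individual frame times (and runt frames in multi-GPU setups) tell a worse story.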
  • by Jedi Holocron (225191) on Wednesday April 24, 2013 @10:18AM (#43536253) Homepage Journal

    ...how fast can it mine Bitcoins.

    • Not as fast as the ASIC cards...
    • Sounds silly but it probably wouldn't have been that hard to throw in some bitcoin optimizations, resulting in some easy sales.
    • by dubdays (410710)

      ...how fast can it mine Bitcoins.

      For that price, it had better hash at 1 GH/s! Money, money, money, money! ....MONEY! (at least until the next crash in what will probably be--at the rate that shit is happening--the very near future :-/ )
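For context on the mining question: Bitcoin's proof of work is a double SHA-256 over a block header plus nonce, repeated for billions of nonces, and GH/s measures how many of those a card grinds through per second. A toy sketch (the header bytes and one-byte target are illustrative, not real Bitcoin consensus rules):

```python
# Toy illustration of why GPU hash rate matters for Bitcoin mining:
# proof of work is SHA-256 applied twice to a block header + nonce.
# The header bytes and one-byte target here are illustrative only.
import hashlib
import struct

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 of SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_base: bytes, target_prefix: bytes, max_nonce=1_000_000):
    """Brute-force a nonce until the hash starts with target_prefix.
    Real mining compares against a 256-bit difficulty target instead."""
    for nonce in range(max_nonce):
        h = double_sha256(header_base + struct.pack('<I', nonce))
        if h.startswith(target_prefix):
            return nonce, h
    return None, None

# Roughly 1 in 256 nonces hashes to a leading zero byte; real difficulty
# demands vastly rarer hashes, which is what the GH/s race is about.
nonce, digest = mine(b'toy-block-header', b'\x00')
print(nonce is not None)  # -> True
```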

  • I'm going to rush out and buy four of them right away.

  • Is it worth it? (Score:5, Insightful)

    by onyxruby (118189) <onyxrubyNO@SPAMcomcast.net> on Wednesday April 24, 2013 @10:27AM (#43536339)

That card has quite impressive specs and frankly has as much horsepower as a fair number of computers that were being produced as recently as - yesterday. Trickle-down tech works wonders, and we will see something like this that is affordable for the masses within a few years. For that reason alone I can't knock the card and its feature set.

    The price on this is through the roof and it makes me think that this is a waste of money for 99.9999% of gamers. If you were put in a blind test with this card and a 'mere' $500 card how many people would even be able to notice the difference? This isn't a CAD card meant for workstations and it makes me wonder what the real world benefits of the card are other than bragging rights?

    • by tompaulco (629533)
      This isn't a CAD card meant for workstations and it makes me wonder what the real world benefits of the card are other than bragging rights?
      Well, you are correct there. A CAD card would cost five times what this little glorified VGA adapter does.
    • Re: (Score:3, Informative)

      by Anonymous Coward

As they mentioned, it can run all top games at 4K+ resolutions (either a dedicated 4K display or an Eyefinity configuration), something that your $500 card can't do. But yes, you don't need it to run Crysis 3 at 1920x1080.

      • Thank you... I would love to get a dedicated 4K display, but the cost is still a bit prohibitive.
True, true. These kinds of deluxe GPUs (and the Intel Extreme CPU line, or the discontinued Nokia Vertu phones, for example) are always a bad value. They are a "stupid money" purchase: when you have a lot of money to burn and want to just get the best bling and be done with it. Hey, at least it keeps the economy running. ;)
    • by FreonTrip (694097)
      Short of using it for OpenCL at lower rates than FirePro cards or gaming across multiple high-res monitors or a numbingly high-res projector, I've got to agree. But if its generous margins help subsidize ongoing research into cards mortal humans can use and appreciate, I can't complain either.
    • by Nidi62 (1525137)

      The price on this is through the roof and it makes me think that this is a waste of money for 99.9999% of gamers. If you were put in a blind test with this card and a 'mere' $500 card how many people would even be able to notice the difference? This isn't a CAD card meant for workstations and it makes me wonder what the real world benefits of the card are other than bragging rights?

      It essentially IS a $500 card, considering it comes with almost $500 worth of games.

      • by chispito (1870390)

        It essentially IS a $500 card, considering it comes with almost $500 worth of games.

        I can't find where TFA lists the bundled games, but I imagine if I were the target market for this card, I'd already own them. And since I'm not the target market, I'll probably buy those $500 worth of games for $50 during the next Steam blowout (and then suffer through the crappy framerate).

    • Is it worth it?

Honestly? No. My personal experience has shown that dual GPUs don't pay off in the end. There are many problems (microstutter, the need for compatible games), and the performance gains are questionable relative to cost. It's better, at least to me, to buy a single-GPU card
Essentially all 'dual GPUs on a single card' products are mostly for e-peen (because of their relatively low production volume, it's pretty common to find two of the single-GPU equivalents for rather less than the price within a short time, so they only really make sense if you are doing some absolutely nutty 4+ GPU configuration).

However, given the ability ('Eyefinity', I think they call it) to aggregate multiple monitors into one virtual monitor (to allow you to paint applications that don't understand multi

I would agree a $1K GPU is largely a waste for the majority; however, you are missing WHY someone would even spend $1,000 on a GPU in the first place.

      Namely, I recently picked up a GTX Titan for a couple of reasons:

      * I run all* my games at 120 Hz** so I can use LightBoost*** on my Asus VG248QE monitor. (I don't care about triple monitor 5760x1080)
* Since it is a single GPU chip I don't have to worry about the microstuttering**** issues that plague ALL SLI / XFIRE cards.
      * It only uses 250W***** under full load
      * It

      • by onyxruby (118189)

        So before your smug comments you might actually want to TALK to a gamer and find out their _reasons_ instead of just dissing everything as some epeen -- those losers are the posers / fanbois.

Who's being smug about anything? I never criticized gamers or spending money on it as a hobby. I've certainly spent a decent chunk over the years on my own gaming. I have no issue with that, and you're effectively putting words in my mouth. As for talking to gamers, I think it's a safe bet that Slashdot has many gamers wh

Titan is a good choice, but way too expensive in my country. Mind you, here it costs almost $2,000!
  • GPGPU (Score:5, Interesting)

    by godrik (1287354) on Wednesday April 24, 2013 @10:41AM (#43536487)

Many slashdotters point to the extreme price of this graphics card for gamers. I am more interested in the GPGPU performance. The comparison uses OpenCL to be able to compare against Nvidia's hardware. But I feel like OpenCL is a second-class citizen for Nvidia. How much would the performance difference be using a carefully crafted CUDA implementation on the Nvidia hardware?

    • by Shinobi (19308)

      AMD is decent if you fit into some very specific memory access patterns... If you don't, they slow down to a crawl.

Nvidia with CUDA is far more versatile, not to mention has MUCH more solid drivers (and doesn't need X under Linux, unlike AMD....)
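The "memory access patterns" point is about coalescing: when consecutive threads in a wavefront read consecutive addresses, the hardware services them with a few wide memory transactions, while scattered reads cost one transaction each. A CPU-side sketch with illustrative sizes (not any vendor's exact specification):

```python
# CPU-side sketch of GPU memory coalescing. Consecutive "threads" in a
# wavefront touching consecutive addresses can be serviced with a few
# wide memory transactions; scattered accesses need one each.
# Sizes below are illustrative, not any vendor's exact specification.

WAVEFRONT = 64   # threads per AMD GCN wavefront
SEGMENT = 64     # bytes per memory transaction (illustrative)
ELEM = 4         # 4-byte float elements

def transactions(addresses):
    """Distinct SEGMENT-sized transactions needed for one wavefront."""
    return len({addr // SEGMENT for addr in addresses})

coalesced = [tid * ELEM for tid in range(WAVEFRONT)]      # stride 1
strided = [tid * ELEM * 32 for tid in range(WAVEFRONT)]   # stride 32

print(transactions(coalesced))  # -> 4 (few wide transactions)
print(transactions(strided))    # -> 64 (one per thread)
```

A 16x difference in memory transactions for the same amount of useful data is the kind of cliff that makes a kernel "slow to a crawl" when its access pattern doesn't fit.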

The OpenCL performance really speaks for the direction AMD is going with their APUs in the future. The design of this chip will be directly embedded into AMD's next-generation Kaveri APU, where the GPU and CPU share the same cache, which should allow all sorts of crazy performance optimizations in everything from programming languages http://news.softpedia.com/news/AMD-and-Oracle-Team-Up-for-Heterogeneous-Computing-on-Java-295882.shtml [softpedia.com] to databases: http://pbbakkum.com/db/ [pbbakkum.com] I know the database link deals wi
  • by DarthVain (724186) on Wednesday April 24, 2013 @11:04AM (#43536703)

    LOL, so at full load you will need pretty much a secondary PSU to run the damn thing... really decent power management I guess! Though I guess considering the stupid 1000$ price tag, you probably don't care about buying a 200$ 1200-1500W PSU I suppose.

Then again, if you want to run a CrossFire configuration, that's 750W under load minimum. Add a few HDs and a high-end processor, and you are hitting some PSU limits!

    That said if I had unlimited money I might buy it, though even then probably not as it is such a waste.

    Also htf did they pick the name "Malta"? I mean at least nVidia had the good sense to call their penis "Titan" for gods sake!
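The PSU sizing above can be sanity-checked with simple arithmetic; the wattages below are ballpark figures from the thread, not measurements:

```python
# Rough PSU budget for the dual-7990 CrossFire build discussed above.
# All wattages are ballpark figures from the thread, not measurements.

load_watts = {
    'HD 7990 x2': 375 * 2,  # two dual-GPU cards at full load
    'CPU': 130,             # high-end desktop CPU, rough TDP
    'drives/fans': 60,      # a few HDDs, fans, motherboard misc.
}
total = sum(load_watts.values())
psu_rating = total / 0.8    # keep the PSU at ~80% of its rating
print(total)                # -> 940
print(round(psu_rating))    # -> 1175, hence a 1200W+ supply
```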

    • LOL, so at full load you will need pretty much a secondary PSU to run the damn thing... really decent power management I guess! Though I guess considering the stupid 1000$ price tag, you probably don't care about buying a 200$ 1200-1500W PSU I suppose.

      For $1000 they should throw in a new power supply as part of the package.

      But seriously, there are stories all over the place about PC sales declining and AMD is losing money --- and this is what they do? A $1000 video card? Even if there's a huge profit margin on this thing, how many of them are they really going to sell?

      • by armanox (826486)

        While they won't sell many, they won't produce too many either.

      • by DarthVain (724186)

        Someone mentioned to me that these are already crossfire, so you are really buying two 500$ cards, which isn't as crazy really.

        That said, these things were never meant to generate sales, only media and "prestige".

I liken it to car companies making racing cars, even though they only sell a few and lose money on every one. However, for the rest of us schmucks that buy the $30,000 car that has that race car pedigree... i.e. AMD or nVidia want to prove themselves against each other at the very highest end, hoping that

      • But seriously, there are stories all over the place about PC sales declining and AMD is losing money --- and this is what they do? A $1000 video card? Even if there's a huge profit margin on this thing, how many of them are they really going to sell?

That's actually what will sell best for them.

PC sales aren't declining because the public has suddenly decided to run away from PCs en masse.
PC sales are declining simply because what people have is "good enough". They have an old PC or laptop, and they aren't going to buy a new one anytime soon because it still does the job. They prefer to concentrate their money on other stuff (buying portable devices like phones and tablets, which become completely obsolete very quickly, in fact sometimes even befor

    • by Khyber (864651)

      Ummm.... the 7990 is already a crossfire GPU. In case you weren't paying attention, this is a DUAL-GPU card, which means while its single-GPU counterparts might use ~270w (7970 GPU) this one ends up being more efficient since everything is on the same board, whereas you could just CrossFire two 7970 cards and end up with over 500w max load power consumption.

      • by DarthVain (724186)

Ah, I guess that would make sense, sorta, insomuch as a $1000 video card makes sense. Though at the same time, if what you say is true (and I didn't RTFA), then really this is more like bulk-buying two $500 video cards, and $500 has realistically been a "reasonable" price point for a long time. I think the old nVidia Geforce 500TI started at 500$ and that was like 13 years ago or something.

        Anyway there will be fools who want to "quad" these things out, so 750W would still apply. Regardless 375W is stil

        • by Khyber (864651)

          "I think the old nVidia Geforce 500TI started at 500$ and that was like 13 years ago or something."

          Back in 2000, nVidia was on GeForce 2 (right after the GeForce 256.)

  • When I was looking for a new computer, I was between the NVidia 680 and the AMD 7950. People were having horrendous problems with 7950 performance on some popular games. It was bad enough that the choice was a no-brainer for me. I'd be interested to see what any current owners might say.

  • by Control-Z (321144) on Wednesday April 24, 2013 @01:04PM (#43538207)

    Cutting edge is cool but I always go for the best $150 card I can buy. That gets you a good last-gen card that will still be good for years of service.

Since I already have 6 of those games, and don't want the other 2, I wouldn't pay $1K for this card.
By my math (~$55 per game), that's $440 of games I don't need; $352 if I assume a 25% markup.
Thus, I would buy an OEM bundle of the card for between $550 and $650.
    Makes sense to me.

    (Ya, I know they get paid to include the games in the bundle...that's not my point)

  • Skip this one (Score:4, Interesting)

    by Fackamato (913248) on Wednesday April 24, 2013 @02:47PM (#43539127)

Massive coil whine issues. No amount of HSF replacement or chassis soundproofing/dampening will get rid of it; the coil whine will be the loudest part of your computer when you're playing a game.

How can AMD release a $1000 card that has such a massive issue? The dual-GPU ASUS card did not have this problem.
