This discussion has been archived. No new comments can be posted.

AMD Releases FirePro V5900 and V7900 Workstation GPUs

  • by fuzzyfuzzyfungus (1223518) on Tuesday May 24, 2011 @10:26AM (#36227710) Journal
    Typically, the "workstation" card makes you pay through the nose per unit of silicon (though, at the same time, the top of the "workstation" range is going to be the only place to find the maximum RAM available for that generation, along with genlock and similar); the "gamer" card, on the other hand, will probably skimp on things like double-precision math and drivers that don't suck for anything other than playing Metal of Duty Crysis Evolved.
  • An Actual Summary. (Score:5, Informative)

    by YojimboJango (978350) on Tuesday May 24, 2011 @10:32AM (#36227782)

    A summary since we don't seem to have a good one here:
    AMD releases two new video cards targeted at the CAD audience, competing with Nvidia's Quadro line. The hardware itself isn't anything you couldn't find in your average high-end gaming card, but they've done a stupid amount of driver optimisation for design work, which is why these cards cost more. More interesting, though, is how (comparatively) low AMD has priced these models ($599 and $999).

    From the Article:
    "We’ll do a follow-up article with the charts and graphs that the more pedantic among you expect, along with some interesting comparisons to other products, but in the meantime, I will summarize it with this: In SpecViewperf 11, the V7900 is about neck-and-neck with the $4000 NVIDIA Quadro 6000, and in some tests exceeded the legendary Q6000."

  • by gman003 (1693318) on Tuesday May 24, 2011 @10:38AM (#36227852)
    The consumer cards actually do make sense. For nVidia, it's "first number is the generation, second number is the part within that generation" - a 580 is better than a 570, but worse than a 590. Likewise, a 480 is newer than a 280, but not as new as a 580. You can also generally make the assertion that cards with the same ending numbers, but different generations, will fill the same role (and same rough price), but the newer one will be slightly better. AMD/ATI uses four numbers, but the last is always a 0 and can be ignored. They essentially follow a similar pattern - first number is generation, middle two are part within that generation, and last one is a zero (to make it look cooler). So a 5870 is better than a 5770, but not as good as a 5970. And a 5970 is older than a 6990, but newer than a 4870. AMD recently changed how their within-generation numbers go, so you can't just assume that, say, a 6970 will outperform a 5970 (it won't, actually), but comparisons within a generation are still good. And these are hardly new - ATI/AMD has used that pattern since 2006, while nVidia has been using theirs since 2008 (prior to that, they had a 4-digit number (really two digits with a 00 at the end) and a few letters).

    The workstation cards, though, are an absolute mess. About the only claim you can even generally defend is that "bigger numbers are better". And even that is rather iffy. And trying to figure out which consumer card a workstation card was based on requires an encyclopedia of them.

    While I imagine workstation cards can get away with having non-linear names like that (since anyone buying a $3,500 graphics card will do their research), I imagine even professionals get confused by it all easily.
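    The decoding rules described above can be sketched as a toy parser (a hypothetical helper, names mine; it only covers the bare 3- and 4-digit numbers, not suffixes like GTX or Ultra):

    ```python
    def parse_model(model: int) -> tuple[int, int]:
        """Split a GPU model number into (generation, tier).

        Illustrative only: nVidia uses three digits (generation, tier,
        trailing 0); AMD/ATI uses four (generation, two-digit tier,
        trailing 0).
        """
        digits = str(model)
        if len(digits) == 3:   # nVidia: 580 -> generation 5, tier 8
            return int(digits[0]), int(digits[1])
        if len(digits) == 4:   # AMD: 5870 -> generation 5, tier 87
            return int(digits[0]), int(digits[1:3])
        raise ValueError(f"unrecognised model number: {model}")


    def newer(a: int, b: int) -> bool:
        """True if model a is from a newer generation than model b.

        Same vendor only - and per the comment above, cross-generation
        tier comparisons are unsafe (a 6970 does not beat a 5970).
        """
        return parse_model(a)[0] > parse_model(b)[0]
    ```

    So `parse_model(580)` gives `(5, 8)` and `parse_model(5870)` gives `(5, 87)`, matching the "580 beats 570, loses to 590" ordering within a generation.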
  • by Skynyrd (25155) on Tuesday May 24, 2011 @01:04PM (#36229722) Homepage

    between a 300W $500 high-end gaming video card and a $500 "workstation" card that consumes half the power? What is missing from the workstation card?

    What's missing from the card? Certification for SolidWorks, Inventor, etc. is missing from the consumer card.

  • by sexconker (1179573) on Tuesday May 24, 2011 @01:37PM (#36230062)

    You're just oh so wrong.
    Nvidia used to use 4 digits (FX 5xxx, 6xxx, 7xxx, 8xxx, 9xxx), then they went to 2xx.

    After 2xx they went to 4xx. Along the way they peppered in a few 1xx and 3xx parts that nobody bought (they were all rebadges of the defective G92 chips). The 9xxx and early 2xx parts were also defective, as were the revamps in the 8xxx line (the 8800 GT and the second revision of the 8800 GTS). Then they went to 5xx.

    The last number hasn't always been 0, either. There's the GTX 285, for example. And of course, OEMs can add whatever bullshit they want at the end of it, such as OC, SE, SSE. And of course they have to include the Nvidia shitfest of GT, GTX, Ultra, M, whatever.

    In order of performance (best to worst) it goes Ultra, GTX, GT, GTS, (nothing), GS, then M, LE, and other shit. You can compare within a single model number, but not across generations. These monikers didn't exist until the 6800 family came out. We had the 6800, the 6800 GT, and the 6800 Ultra. The 7000 series was just a rehash of the 6000 series. The 8000 series was indeed a new GPU, and introduced the shitfuck of GTS (which seemed to be tacked on to the cards that would otherwise NOT have a GT/Ultra/whatever shit added to them). Then they started adding GX2 to shit to indicate a dual-GPU card. The next family of chips came with the 2xx series. Not the first few out the door, mind you, but the GT200-based 280s.

    The only thing consistent about Nvidia for the last decade is that if the second number is an 8, you have the flagship part. You always want the flagship part, because it is the only one that actually receives proper engineering and testing. You can choose whatever binning (ultra gtx gt gts) or overclocked horseshit you want from msi/asus/whoever.
    6800 Ultra/GT/vanilla.
    8800 Ultra/GTX/GTS.
    280, 480, 580, etc into the future maybe.
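    The "second number is an 8" rule above boils down to a one-digit check; a throwaway sketch (function name is mine):

    ```python
    def is_flagship(model: int) -> bool:
        """Per the rule above: for nVidia parts of the last decade, the
        flagship has 8 as the second digit of its model number
        (6800, 8800, 280, 480, 580...). Suffixes like Ultra/GTX/GT are
        binning on top of the same chip, so they're ignored here."""
        return str(model)[1] == "8"
    ```

    E.g. `is_flagship(8800)` and `is_flagship(580)` hold, while `is_flagship(8600)` and `is_flagship(7950)` do not.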

    When they start futzing around with 8600s or 7950s, or GX2s, or the 8800 GT or 8800 GTS v2 (aka 8800 GTS 512), what's happening is they're hastily tweaking shit to adapt to the market sectors and reduce cost (or slap as much shit as they can fit in an ATX case and go for the stupid performance crown for bragging rights). These are always sloppy jobs. With Nvidia, we had bumpgate. But in general, you get less reliable shit, at a later date, for a bit less money.

    With ATi/AMD, you've got a whole different can of worms.
    They were doing 8xxx and 9xxx a decade ago, then went to Xxxx and X1xxx (the first X is a literal X).
    Then they went to HD 2xxx, HD 3xxx, HD 4xxx, HD 5xxx, and HD 6xxx.

    They've used monikers such as XT, Pro, and LE.
    They've stopped using the 8 as the flagship indicator (from the old 9800 pro to the 5800 series) and now use 9 for the flagship.
    As such, a 5850 is better than/the same as a 6850, and a 5870 is better than/the same as a 6870.
    The generational bump this round added a 1 to the second number. The 6970 is the big brother of the 5870.
    And if you want to compare the 5970 to something, you'll be looking at the 6990.

    So no. In short, it makes zero sense. You can't look at the name and discern anything when they add changing and intentionally confusing XT Pro GT GTX GTS GTS v2 GS LE M GX GX2 etc, along with model numbers that can't be directly compared unless the first digit is the same. Add in the OC tweaks and branding, ePeen gun-style cases, and CG girls and orcs, and no one can tell you the difference between the Gigabyte Radeon HD 5870 SOC and the MSI Radeon HD 6870 HAWK without looking at benchmarks.
