AMD Graphics Hardware Technology

AMD Unveils Radeon RX Vega Series Consumer Graphics Cards Starting At $399 (hothardware.com) 91

MojoKid writes: AMD has officially lifted the veil on its new Radeon RX consumer graphics line-up, featuring the company's next-generation Vega GPU architecture. Initially, there are four cards in the Radeon RX Vega line-up, the standard air-cooled Radeon RX Vega 64, a Radeon RX Vega 64 Limited Edition with stylized metal fan shroud, the liquid-cooled Radeon RX Vega 64 Liquid, and the lower-cost Radeon RX Vega 56. At the heart of all Radeon RX Vega series cards is the Vega 10 GPU which is comprised of roughly 12.5 billion transistors and is manufactured using a 14nm FinFET LPP process. Vega 10 can reliably reach the 1.7GHz range, whereas AMD's previous gen Fiji hovered around 1GHz. The base GPU clock speed of the air-cooled Vega 64 is 1,247MHz with a boost clock of 1,546MHz. There is 8GB of HBM2 memory on-board that offers up peak bandwidth of 484GB/s. All told, the Radeon RX Vega 64 is capable of 25.3 TFLOPs (half-precision) of compute performance. The Radeon RX Vega 64 Liquid-Cooled Edition has the same GPU configuration, but with higher base and boost clocks -- 1,406MHz and 1,677MHz, respectively. The lower cost Radeon RX Vega 56 features the same Vega 10 GPU, but 8 of its CUs have been disabled and its clocks are somewhat lower. Although AMD touts a number of efficiency improvements, the Vega RX series requires some serious power. Vega 56 board power is in the 210 Watt range, while the top-end liquid-cooled card hits 345 Watts. AMD claims top-end Vega cards will be competitive with NVIDIA's GeForce GTX 1080 series of cards. AMD Radeon RX Vega graphics cards are expected to ship on August 14th.
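To sanity-check the headline figures, here is a minimal back-of-the-envelope sketch in Python. Only the clocks and the quoted totals come from the summary; the shader configuration (64 CUs x 64 stream processors), the 2 FLOPs per ALU per clock (doubled for packed half-precision), and the 2048-bit HBM2 interface are assumptions based on the typical Vega 64 configuration.

    # Rough theoretical figures for the air-cooled Radeon RX Vega 64.
    # Shader count, FLOPs-per-clock, and HBM2 bus width are assumptions
    # (typical Vega 64 configuration), not taken from the summary.

    stream_processors = 64 * 64          # 64 CUs x 64 ALUs per CU (assumed)
    boost_clock_hz = 1_546e6             # 1,546 MHz boost clock (from the summary)

    fp32_tflops = 2 * stream_processors * boost_clock_hz / 1e12   # FMA = 2 FLOPs/clock
    fp16_tflops = 2 * fp32_tflops                                  # packed half-precision
    print(f"FP32 ~{fp32_tflops:.1f} TFLOPS, FP16 ~{fp16_tflops:.1f} TFLOPS")
    # -> FP32 ~12.7 TFLOPS, FP16 ~25.3 TFLOPS (matches the quoted 25.3 TFLOPs)

    # Memory bandwidth: 2048-bit HBM2 interface at ~1.89 Gbps per pin (assumed)
    bandwidth_gbs = 2048 * 1.89 / 8
    print(f"~{bandwidth_gbs:.0f} GB/s")  # -> ~484 GB/s, matching the summary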
This discussion has been archived. No new comments can be posted.

  • 210 Watts?! (Score:3, Insightful)

    by DontBeAMoran ( 4843879 ) on Monday July 31, 2017 @06:03PM (#54916597)

    When their "low-end" graphics card requires low-end gamers to buy a bigger power supply as the first step, something is wrong.

    • by oic0 ( 1864384 )
      Never had any power limitations except on prebuilt PCs. A good power supply costs so little extra and adds headroom for age, meaning I can keep it for multiple generations.
    • Re:210 Watts?! (Score:4, Insightful)

      by wangmaster ( 760932 ) on Monday July 31, 2017 @06:15PM (#54916673)

      Keep in mind this is a $399 "low end" graphics card. We're not talking about a mainstream card here, but a card targeted toward enthusiasts and gamers. Big, big difference compared to a truly "low end" mainstream card.

      • Ah, a lot of articles never list the damn prices so it's hard to understand who the products are for.

        • by Kjella ( 173770 )

          Ah, a lot of articles never list the damn prices so it's hard to understand who the products are for.

          You could RTFH... not article, not summary, but headline would suffice.

    • Re: 210 Watts?! (Score:4, Informative)

      by K. S. Kyosuke ( 729550 ) on Monday July 31, 2017 @06:43PM (#54916811)
      Their "low end" graphics is at around 60 watts currently.
    • That's just for the GPU core. The total power draw of the card approaches 300 W for the normal full (64 CU) version, and 350 W for the water-cooled full version.

    • I might be misremembering, but I recall some older graphics cards were designed to be plugged straight into a wall outlet.
      • by Kjella ( 173770 )

        I might be misremembering, but I recall some older graphics cards were designed to be plugged straight into a wall outlet.

        A graphics card that could make direct use of 110/220V AC? I think not. Perhaps some strange specialty card with its own wall wart/on-board converter, but that would only be used because the PSU was impossible to replace / upgrade. Not that I've ever heard of it, but stranger things have happened.

        • Re: (Score:2, Informative)

          by Anonymous Coward

          What about a 110VAC sarcasm detector?

        • The infamous quad GPU 3dfx Voodoo 5 6000 required so much power that it needed an external power adapter called the "Voodoo Volts". This was back when we only had AGP graphics card slots, so the idea of having dual PCI-E power connections on a single card was unheard of at the time.

          Nowadays, one of these watercooled Radeon Vega cards can draw up to 400 watts when it's overclocked. And there will be people who will run 2 of them in Crossfire. Crazy.

          • by Kjella ( 173770 )

            The infamous quad GPU 3dfx Voodoo 5 6000 required so much power that it needed an external power adapter called the "Voodoo Volts". This was back when we only had AGP graphics card slots, so the idea of having dual PCI-E power connections on a single card was unheard of at the time.

            Didn't know that one. From some quick googling they got 25W over AGP and could have used a standard molex connector for the rest - in fact the leads were there on the board - but they didn't trust the wimpy PSU in most PCs to take another 50-60W. I'm guessing lesser cards drawing more than 25W did, but rather than risk the magic smoke coming out they made an external connector and included a 12V adapter. I guess that makes sense if people built stuff from parts and you didn't trust them, if you were an OEM

      • They used to sell power supplies just for video cards too, ones that fit in a CD bay in your PC and had their own power cable!
    • That's actually not all that surprising. The GTX 1080 is rated [geforce.com] at 180 Watts.
  • I for one will wait for the discount prices to be expected at the end of the current mining hype.

    Lots of AMD GPUs will then be sold on the second-hand market, also putting pressure on the prices of new GPUs.
  • Remember when... (Score:4, Insightful)

    by green1 ( 322787 ) on Monday July 31, 2017 @06:12PM (#54916653)

    Remember when advertisements for graphics cards talked about what the card could show you rather than how many transistors it had and the processor speed?

    What I want to know about a new card is what picture it can put out and to how many monitors of what connection type.

    This sounds more like it's advertising a CPU than a graphics card.

    • by adolf ( 21054 )

      Even my current "low-end" card (an RX 460) can drive six normal-ish displays: One HDMI, one DVI-D, and four HDMI on its singular DisplayPort output using an adapter. And then the motherboard itself sports three more outputs (DVI-D, VGA, HDMI), which (in many OSs) also all get rendered by the discrete GPU.

      That's nine fucking independent, concurrent screens on a low-end budget-built PC from last year. How many do you want?

      I have no idea how it behaves at 4k-ish resolutions, or with high refresh rates, but

      • by Kjella ( 173770 )

        Even my current "low-end" card (an RX 460) can drive six normal-ish displays: One HDMI, one DVI-D, and four HDMI on its singular DisplayPort output using an adapter.

        I guess you should let AMD know, since they say [amd.com]:

        Up to 5 displays with DisplayPort MST hub

        Not sure why that's still a subject though... for non-intensive applications I think it's been solved for a while, and for games I'd rather go for a single ultra-wide; if games have trouble you can presumably set it to a normal 16:9 resolution and get black bars. There was a time you couldn't get monitors to match, but with the current 34/38" monsters the only advantage to multi-monitor is if you get them cheap/free. And for a big video wall there's probably cheaper

    • by ncc74656 ( 45571 ) *

      Remember when advertisements for graphics cards talked about what the card could show you rather than how many transistors it had and the processor speed?

      What I want to know about a new card is what picture it can put out and to how many monitors of what connection type.

      Lots of "graphics" cards will never have a monitor plugged in. GPU computing (whether for cryptocurrency mining or other purposes) is very much a thing now. The cards I use for mining run rings around the cards I use to drive monitors.

  • Uh, no... (Score:5, Insightful)

    by sconeu ( 64226 ) on Monday July 31, 2017 @06:16PM (#54916681) Homepage Journal

    That's not a "Consumer Graphics Card". That's a gaming enthusiast card. Consumer cards top out at $150 or so, and do not draw 210W. Hell, most "Consumer Grade" PCs don't even have 8GB of RAM.

    • So gamers aren't consumers?
      The difference is that these cards are targeted toward end users. Sure, they're targeted toward a specific subset of end users, but they're still meant to be sold in individual quantities to end users.

      The Vega Frontier Edition was meant to be targeted towards science and research, and maybe crypto-miners(?). Those are generally not consumers. I don't disagree that this is a gaming enthusiast line right now, but it's still meant for consumers and not institutions/professionals.

    • by Kjella ( 173770 )

      A cheap bottle of vodka and a fine champagne are both consumer products. You're drawing an arbitrary line in the sand where there is none.

    • Comment removed based on user account deletion
    • by Lennie ( 16154 )

      Let's be very clear about this: most consumers don't buy a PC anymore.

      They buy a laptop, tablet or phone. Or game-console. Or a smart-TV.

  • I've always wondered: why spend as much if not more on a PC graphics card as a complete game system costs (Xbox, PlayStation, etc.)? I can understand people who do a lot of video rendering and such, but it seems silly to spend that much for gaming when the dedicated systems are cheaper.
    • by Anonymous Coward

      Consoles are grossly inferior for gaming. Most consoles can barely do 1080P at more than 30fps and 4K? Fat fucking chance. Just upscaled bullshit.

      Not to mention, unlike consoles, your average PC doesn't shit the bed every three years or less. Fuck, unless PCI-E dies an unexpected death you can just buy a new $400 graphics card in 5 years and still have a superior gaming experience to whatever new crappy console is being peddled.

      Also... mouse and keyboard are vastly superior inputs for a large amount of games.

    • why spend as much if not more on a PC graphics card as a complete game system costs (Xbox, PlayStation, etc.)?

      Because it's far more powerful, and therefore it will perform better: more detailed models, better-quality visual effects, higher resolution, and, most importantly, a smoother framerate! This makes a huge difference in enjoyment.

    • by Anonymous Coward

      'advanced' games on even the best consoles have lousy framerates and refresh rates when the going gets hard. They lack mouse and keyboard. Their graphics settings are what we call 'low' or 'low-medium' on a PC and look noticeably worse in many places. And consoles have much lower resolutions.

      On the PC you can experience the best games as nature intended. You can 'mod' games like Skyrim and Fallout to remarkable degrees (and no, the lame limited modding on the consoles doesn't start to compare). And you can r

      • Thanks. I had always assumed that because they were built just for gaming, consoles would run them best. I guess it's fairly obvious that I am no gamer.
  • Also: $550 Threadripper, quad-channel RAM, and 64 PCI-E lanes. Intel can't touch that.

    • I'm waiting for 32 core Threadripper parts with 8 channels for RAM and 128 lanes of PCIe. The server parts (Epyc) have this, and the Threadripper parts are nearly identical. They even have 4 dies under the heatspreader just like the Epyc parts. (Each die has 2 "CCX" modules which each have 4 cores.)
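      A trivial sketch of how those numbers multiply out, using the per-die layout described above (treat it as the commonly reported Zen topology rather than anything confirmed here):

        # Zen package topology as described in the parent comment.
        dies_per_package = 4
        ccx_per_die = 2
        cores_per_ccx = 4

        print(dies_per_package * ccx_per_die * cores_per_ccx)  # 32 cores with all dies active
        # Shipping Threadripper parts enable only 2 of the 4 dies: 2 * 2 * 4 = 16 cores.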

      • 2 of the dies are dummies/dead dies. They're there for structural integrity and possibly Infinity Fabric routing for the half of the PCI-E lanes that would normally be used by the other chips.

        • Yes, in current samples that is the case. There's nothing stopping them from releasing an SKU with 32 cores. The socket can handle it.

          • by Kjella ( 173770 )

            Yes, in current samples that is the case. There's nothing stopping them from releasing an SKU with 32 cores. The socket can handle it.

            Well obviously since it appears to be the same socket as EPYC. It's quite possible you'll be able to use one of those in a Threadripper motherboard at EPYC prices of course. I don't see much reason for AMD to release a separate 32C consumer chip for less though.

            • The socket is keyed slightly differently, I believe. I'm not exactly sure.
              AMD could do it because Intel trotted out an 18 core part for bragging rights. It'd be a huge FU.

  • by Artem S. Tashkinov ( 764309 ) on Monday July 31, 2017 @07:16PM (#54916945) Homepage

    Mind that "unveils" in this case means a paper launch and the actual video cards will be released after August, 14, 2017. Or even later considering the number of delays to this point.

    Given everything that we already know about this AMD GPU generation, one can only wonder why they are releasing these GPUs at all. Underpowered, consuming twice as much power as the nearest competition (~350W vs 180W), costing too much to produce (HBM2), and most likely resulting in a huge write-off when the company desperately needs successful, competitive products to stay afloat. Consumer Vega is anything but.

    I still want to believe that Vega is to AMD what Fermi was to NVIDIA, and that AMD's new generation of GPUs will actually be competitive.

  • by Anonymous Coward

    So these cards are near 1080 speeds? Not the 1080 Ti, but the slower, older 1080. When can we expect flagship cards from AMD that compete with Nvidia's flagship devices?

    • Navi, so Q3/Q4 2018 at the earliest; it might slip to Q1/Q2 2019 depending on the 7nm process. That being said, for gamers who don't play twitch FPSes competitively or rock high-refresh-rate 4K monitors, Vega looks to be a very good solution, as the framerate band is narrower than the GTX 1080's.

      • Those gamers can do very well with a GTX 1070.
        Sadly Vega doesn't fit anywhere - in theory. In practice, if it's bad at mining, gamers might stand a chance of buying a gaming card, because GTX 1070 is rarer than hair on my wife's pussy.

        • Vega 56 is in the same spot relative to the 1070, if not a bit ahead, with the same narrow framerate band. It's true that the MSRPs for the base 1070s are lower by 20 bucks, but the memory bandwidth makes it good for currency mining, and thus actual prices are 100 dollars higher.

          • Vega 56 is in the same spot relative to the 1070, if not a bit ahead, with the same narrow framerate band. It's true that the MSRPs for the base 1070s are lower by 20 bucks, but the memory bandwidth makes it good for currency mining, and thus actual prices are 100 dollars higher.

            That's not good enough. It has to be the best or gamers will shun the brand. Why do you think the 1050 Ti sells like hotcakes over the 200% faster RX 470 for just $40 more?

            It is because it is an Nvidia. That is why. Besides, the last month before prices skyrocketed I saw a guy pick up the shitty 1050 Ti, and I mentioned that the RX 470 was double the performance. He got mad and said games are meant to be played on Nvidia, it can't possibly match this, etc.

  • by Anonymous Coward on Monday July 31, 2017 @07:53PM (#54917117)

    A year late, and AMD has a part with key specs identical to its two-year-old Fury chip. The new chip isn't more power efficient than Fury, nor does it do more work per clock. And the two-year-old Fury chip itself was a disaster compared to the earlier 290, considering die size, power draw, and the HBM memory stack.

    AMD's new Zen CPU, on the other hand, literally slaughters the current Intel competition in all key metrics.

    Vega reminds us of 'bulldozer', AMD's horrible pre-Zen CPU architecture that cloned Intel's horrible CPU architecture, Netburst. After AMD made the world's first x64 (64-bit x86) CPU and the world's first true x64 dual core, AMD's management became very corrupt and chose to follow Intel's Netburst as the simplest management decision that would maximise management bonuses and pensions. Intel, meanwhile, cancelled the putrid Netburst and copied the AMD x64 design, creating the highly successful Core 2 design.

    When AMD's bulldozer CPU (very, very late) finally appeared, its performance bore no relationship to the apparently good specs of the CPU. Later it transpired that all the key memory blocks of the chip were so terrible, it didn't matter how many pipes the core had or how powerful the ALUs were.

    I think Vega's memory sub-systems are totally broken as well. On paper, Vega is a 'maths monster' (shaders: the units used to give rendered triangles their advanced lighting and material properties). On paper the triangle rate matches Nvidia's best, the memory bandwidth is as good, and the ROP system (finished pixels) likewise. But in practice the massive die runs slower than Nvidia's much smaller 1080, and uses much more power when doing so. Synthetic benchmarks show the maths power is as advertised. So Vega has to be a horrible STALL monster like bulldozer (a stall is when your work units are constantly starved of any work to do). A rough paper-spec comparison is sketched at the end of this comment.

    The saddest fact is that AMD's 480/580 Polaris chip is really very good, and AMD could easily have added 50% more performance by building a Polaris part with 50% more of everything. This chip would have cost next to nothing to design, could have been ready in 6 months, would cost little to build a card around, would use ordinary cheap memory, and would have been a little better than the Nvidia 1070 card. But the head honcho at AMD's graphics division knew such a project would make his personal Vega chip look like a terrible joke by comparison, so he cancelled competing 'big' Polaris designs.

    AMD's recent GPU history has seen the pointless 285/380 chip, the terrible Fury chip, and the terrible Vega chip. In the same time frame AMD delivered just one good chip: the above-mentioned 480/580. That's a metric ton of wasted R&D from a company with little money to spend. Meanwhile Nvidia is on a killer streak, most recently with the 1070/1080 and 1080 Ti. While AMD goes for hopeless, unrequired, exotic new designs, Nvidia just keeps refining a successful old one.

    Until AMD sacks the engineers responsible for the broken blocks in Fury and Vega, these engineers will continue to screw up future designs.
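    To put rough numbers on the "on paper" comparison above, here is a minimal sketch in Python. The Vega 64 figures come from the story summary; the GTX 1080 figures are commonly quoted launch specs and should be treated as approximate assumptions.

    # Paper-spec comparison: peak FP32 throughput and memory bandwidth.
    # GTX 1080 numbers are approximate launch specs (assumption, not from this thread).
    cards = {
        # name: (shader ALUs, boost clock in GHz, memory bandwidth in GB/s)
        "RX Vega 64": (4096, 1.546, 484),
        "GTX 1080": (2560, 1.733, 320),
    }

    for name, (alus, clock_ghz, bw_gbs) in cards.items():
        tflops = 2 * alus * clock_ghz / 1000   # FMA = 2 FLOPs per ALU per clock
        ratio = tflops * 1000 / bw_gbs         # crude compute-to-bandwidth ratio
        print(f"{name}: ~{tflops:.1f} TFLOPS FP32, {bw_gbs} GB/s, "
              f"~{ratio:.0f} FLOPs per byte")

    # -> Vega 64 ~12.7 TFLOPS vs GTX 1080 ~8.9 TFLOPS, with comparable FLOPs-per-byte.
    # On raw paper specs Vega should be well ahead, which is exactly why the parent
    # concludes the shipping cards must be stalling rather than short on resources.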

    • please mod this up... I already replied to this thread.

    • This is a node size transition.

      All the words... missing the key.
      • This is a node size transition.

        It is not. Polaris was the node size transition from 28nm to 14nm. Vega is supposedly a totally new and better chip architecture.

    • by Kjella ( 173770 )

      While AMD goes for hopeless, unrequired, exotic new designs, Nvidia just keeps refining a successful old one.

      On memory architecture that's true: the bet on HBM was premature, but nVidia is now also doing HBM2 in the V100 data center GPU, so it's not a low-performance choice, though it might be a cost driver. For the rest of the GPU, though, nVidia's Maxwell architecture brought a tile-based rasterizer, which was a huge new trick. Vega was supposed to bring the same functionality to AMD, but so far it's disabled on the Frontier Edition and probably the gaming edition too, either because the drivers aren't ready or they co

    • Comment removed based on user account deletion
    • Re: (Score:3, Interesting)

      by Anonymous Coward

      Your analysis of Bulldozer and Core 2 is made almost entirely of "alternative facts". Yes, Bulldozer was a bad idea, but it wasn't because AMD tried to copy Netburst. Yes, Core 2 was a good platform, but Intel didn't implement most of the AMD64 platform until Nehalem, and as such their memory throughput was much worse until then. Instead, Core 2 was a bigger-cache, dual-core version of the Pentium M (which was mostly a reworked P3). Intel didn't even have to match the architecture efficiency of Athlon64 to co

    • AMD's new Zen CPU, on the other hand, literally slaughters the current Intel competition in all key metrics.

      Except single-core performance, which I'd say is a pretty big key metric (especially for gamers). Honestly, I don't understand why people are being so blinded by Ryzen. Competition is great, but let's not start a cult here. Ryzen does have great multicore performance. It "slaughters" Intel in low-end workstation builds, which I'm sure a lot of professionals trying to save a buck can appreciate.

    • I agree that the performance is hugely underwhelming, especially considering the power consumption and the release date. I have no regrets at all about my GeForce purchase now.

      If this is a typical AMD/ATI driver clusterfuck, we can expect to see the performance ramp up to more reasonable levels over the next few months. This wouldn't be their first card to launch with subpar drivers. Not by a long shot.

      But, yeah, right now there is no reason to recommend these as gaming cards at all. Maybe their compute per

  • No thank you [gamersnexus.net]. Hell, even the 1060 is about as fast in For Honor!!

    Face it, AMD is done. They killed what was mediocre of ATI at the turn of the decade and never recovered. Drivers are shitty and it reeks of a cheap-quality knock-off. Not saying this as a troll, but realistically, just ask any ATI/AMD user; Nvidia drivers, by contrast, are uncrashable and just work at launch.

    • Given the impressive hardware specs, I suspect this is largely driver issues.

      But there is no way I would pay those asking prices until the performance nudged up.

      Maybe these will be decent cards in 2-3 months, but I wouldn't pay for a "maybe".

      • Given the impressive hardware specs, I suspect this is largely driver issues.

        But there is no way I would pay those asking prices until the performance nudged up.

        Maybe these will be decent cards in 2-3 months, but I wouldn't pay for a "maybe".

        That's the issue with AMD cards. Maybe it's async compute slowing things down? Or, like another poster said, maybe it is just a bad chip, similar to Bulldozer? An RX 470 has the same specs as a 1070 in terms of math! Look it up. However, gaming performance is drastically different. With async-optimized titles like Deus Ex: Mankind Divided it can get kind of close to a GTX 1070. Other than that, no; just like Bulldozer, if you maxed out 100% of all cores you would get close to an i5/i7 in performance, but that is rare out
