AMD Unveils Radeon RX Vega Series Consumer Graphics Cards Starting At $399 (hothardware.com)
MojoKid writes: AMD has officially lifted the veil on its new Radeon RX consumer graphics line-up, featuring the company's next-generation Vega GPU architecture. Initially, there are four cards in the Radeon RX Vega line-up: the standard air-cooled Radeon RX Vega 64, a Radeon RX Vega 64 Limited Edition with a stylized metal fan shroud, the liquid-cooled Radeon RX Vega 64 Liquid, and the lower-cost Radeon RX Vega 56. At the heart of all Radeon RX Vega series cards is the Vega 10 GPU, which comprises roughly 12.5 billion transistors and is manufactured on a 14nm FinFET LPP process. Vega 10 can reliably reach the 1.7GHz range, whereas AMD's previous-gen Fiji hovered around 1GHz. The base GPU clock speed of the air-cooled Vega 64 is 1,247MHz with a boost clock of 1,546MHz. There is 8GB of HBM2 memory on board, offering peak bandwidth of 484GB/s. All told, the Radeon RX Vega 64 is capable of 25.3 TFLOPs (half-precision) of compute performance. The Radeon RX Vega 64 Liquid-Cooled Edition has the same GPU configuration, but with higher base and boost clocks -- 1,406MHz and 1,677MHz, respectively. The lower-cost Radeon RX Vega 56 features the same Vega 10 GPU, but 8 of its CUs have been disabled and its clocks are somewhat lower. Although AMD touts a number of efficiency improvements, the RX Vega series requires some serious power: Vega 56 board power is in the 210 Watt range, while the top-end liquid-cooled card hits 345 Watts. AMD claims top-end Vega cards will be competitive with NVIDIA's GeForce GTX 1080 series of cards. AMD Radeon RX Vega graphics cards are expected to ship on August 14th.
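A quick back-of-the-envelope check of the quoted compute figure (a minimal sketch; the 4096-shader count -- 64 CUs x 64 stream processors -- and the 2x packed-FP16 rate are commonly reported Vega 10 details assumed here, not stated in the summary itself):

    # Rough sanity check of the quoted 25.3 TFLOPs half-precision figure (illustrative only).
    # Assumes the full Vega 10 die has 64 CUs x 64 stream processors = 4096 shaders,
    # each doing 2 FP32 ops (one FMA) per clock, with packed math doubling the FP16 rate.
    def peak_tflops(shaders, boost_ghz, ops_per_clock=2):
        return shaders * ops_per_clock * boost_ghz / 1000.0

    shaders = 64 * 64          # 4096 stream processors (assumed)
    boost = 1.546              # GHz, air-cooled RX Vega 64 boost clock from the summary

    fp32 = peak_tflops(shaders, boost)   # ~12.7 TFLOPs single precision
    fp16 = fp32 * 2                      # ~25.3 TFLOPs half precision, matching the summary
    print(f"FP32: {fp32:.1f} TFLOPs, FP16: {fp16:.1f} TFLOPs")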
210 Watts?! (Score:3, Insightful)
When their "low-end" graphics card requires low-end gamers to buy a bigger power supply as the first step, something is wrong.
Re: (Score:2)
Re:210 Watts?! (Score:4, Insightful)
Keep in mind this is a $399 "low end" graphics card. We're not talking a mainstream card here, but still a card targeted toward enthusiasts and gamers. Big, big difference compared to a truly "low end" mainstream card.
Re: (Score:2)
Ah, a lot of articles never list the damn prices so it's hard to understand who the products are for.
Re: (Score:2)
Ah, a lot of articles never list the damn prices so it's hard to understand who the products are for.
You could RTFH... not article, not summary, but headline would suffice.
Re: (Score:2)
The keywords here are "consumer graphics cards". That's usually the low-end and mid-range cards.
Re: (Score:2)
We're supposed to read even the headlines now? What has this place turned into?!
Re: 210 Watts?! (Score:4, Informative)
Re: (Score:2)
That's just for the GPU core. The total power draw of the card approaches 300 W for the normal full (64 CU) version, and 350 W for the water-cooled full version.
Re: (Score:2)
Re: (Score:2)
It also had no heatsink, and it ran hot enough to boil water. And it did not give a fuck, pumping out frames like nothing a PC had ever seen before.
(That said... neither my Voodoo3 2000 nor the 3500TV I scored somewhat later had any auxiliary power connectors.)
Re: (Score:2)
a three-phase, 240V outlet
Seriously? That sounds more like an SGI box rather than a PC card.
Re: (Score:2)
I might be misremembering, but I recall some older graphics cards were designed to be plugged straight into a wall outlet.
A graphics card that could make direct use of 110/220V AC? I think not. Perhaps some strange specialty card with its own wall wart/on-board converter, but that would only be used because the PSU was impossible to replace / upgrade. Not that I've ever heard of it, but stranger things have happened.
Re: (Score:2, Informative)
What about a 110VAC sarcasm detector?
Re: (Score:2)
The infamous quad GPU 3dfx Voodoo 5 6000 required so much power that it needed an external power adapter called the "Voodoo Volts". This was back when we only had AGP graphics card slots, so the idea of having dual PCI-E power connections on a single card was unheard of at the time.
Nowadays, one of these watercooled Radeon Vega cards can draw up to 400 watts when it's overclocked. And there will be people who will run 2 of them in Crossfire. Crazy.
Re: (Score:2)
The infamous quad GPU 3dfx Voodoo 5 6000 required so much power that it needed an external power adapter called the "Voodoo Volts". This was back when we only had AGP graphics card slots, so the idea of having dual PCI-E power connections on a single card was unheard of at the time.
Didn't know that one. From some quick googling, they got 25W over AGP and could have used a standard Molex connector for the rest - in fact, the leads were there on the board - but they didn't trust the wimpy PSU in most PCs to handle another 50-60W. I'm guessing lesser cards drawing more than 25W did use Molex, but rather than risk the magic smoke coming out, they made an external connector and included a 12V adapter. I guess that makes sense if people built stuff from parts and you didn't trust them; if you were an OEM
Re: (Score:2)
Re: (Score:2)
Discount prices at the end of the mining hype (Score:2)
Lots of AMD GPUs will then be sold on the second-hand market, also putting pressure on the prices of new GPUs.
Re: Discount prices at the end of the mining hype (Score:2)
Re: (Score:2)
Ethereum can't be mined with ASICs.
Sure it can. It's just that no one has bothered to make ASICs with memory sizes suited to mining Ethereum.
Why?
1) The difficulty spikes are such that an ASIC that is profitable one day can be worse than useless the next day. With Bitcoin, you get a more gradual and more predictable difficulty change, and ASICs lose profitability pretty much in lock step along with it. Ethereum ASIC profitability would drop off a cliff once you run out of memory.
2) Because that same development money is better (and more sa
Re: (Score:2)
I don't think it is that simple. You can build an ASIC for Ethereum, but unlike Bitcoin, it requires a good external memory interface. The memory requirements for Ethereum are just too high to use only internal memory. Your ASIC architecture would likely look a lot like a GPU, but with many of the things not required for ETH mining removed.
Re: (Score:2)
Re: (Score:2)
I believe someone announced a card specifically for that: basically a stripped-down and optimized video card with no video output.
Also, ETH is heading for a change from PoW (proof of work) to PoS (proof of stake), which I believe removes much of the processing requirement.
Remember when... (Score:4, Insightful)
Remember when advertisements for graphics cards talked about what the card could show you rather than how many transistors it had and the processor speed?
What I want to know about a new card is what picture it can put out and to how many monitors of what connection type.
This sounds more like it's advertising a CPU than a graphics card.
Re: (Score:2)
Even my current "low-end" card (an RX 460) can drive six normal-ish displays: One HDMI, one DVI-D, and four HDMI on its singular DisplayPort output using an adapter. And then the motherboard itself sports three more outputs (DVI-D, VGA, HDMI), which (in many OSs) also all get rendered by the discrete GPU.
That's nine fucking independent, concurrent screens on a low-end budget-built PC from last year. How many do you want?
I have no idea how it behaves at 4k-ish resolutions, or with high refresh rates, but
Re: (Score:2)
Even my current "low-end" card (an RX 460) can drive six normal-ish displays: One HDMI, one DVI-D, and four HDMI on its singular DisplayPort output using an adapter.
I guess you should let AMD know, since they say [amd.com]:
Up to 5 displays with DisplayPort MST hub
Not sure why that's still a subject though... for non-intensive applications I think it's been solved a while and for games I'd rather go for a single ultra-wide, if games have trouble you can presumably set it to a normal 16:9 resolution and get black bars. There was a time you couldn't get monitors to match but with the current 34/38" monsters the only advantage to multi-monitor is if you get them cheap/free. And for a big video wall there's probably cheaper
Re: (Score:3)
Lots of "graphics" cards will never have a monitor plugged in. GPU computing (whether for cryptocurrency mining or other purposes) is very much a thing now. The cards I use for mining run rings around the cards I use to drive monitors.
Uh, no... (Score:5, Insightful)
That's not a "Consumer Graphics Card". That's a gaming enthusiast card. Consumer cards top out at $150 or so, and do not draw 210W. Hell, most "Consumer Grade" PCs don't even have 8GB of RAM.
Re: (Score:2)
So gamers aren't consumers?
The difference is that these cards are targeted toward end users. Sure, they're targeted toward a specific subset of end users, but they're still meant to be sold in individual quantities to end users.
The Vega Frontier Edition was targeted toward science and research, and maybe crypto-miners(?). Those are generally not consumer markets. I don't disagree that this is a gaming-enthusiast line right now, but it's still meant for consumers and not institutions/professionals.
Re: (Score:2)
A cheap bottle of vodka and a fine champagne are both consumer products. You're drawing an arbitrary line in the sand where there is none.
Re: (Score:2)
Re: (Score:2)
Let's be very clear about this: most consumers don't buy a PC anymore.
They buy a laptop, tablet or phone. Or game-console. Or a smart-TV.
Re: (Score:2)
Most enthusiasts don't buy PCs either; they buy the parts and build their own.
Why? (Score:1)
Re: Why? (Score:1)
Consoles are grossly inferior for gaming. Most consoles can barely do 1080p at more than 30fps, and 4K? Fat fucking chance. Just upscaled bullshit.
Not to mention, unlike consoles, your average PC doesn't shit the bed every three years or less. Fuck, unless PCI-E dies an unexpected death, you can just buy a new $400 graphics card in 5 years and still have a superior gaming experience to whatever new crappy console is being peddled.
Also... mouse and keyboard are vastly superior inputs for a large amount of games.
Re: (Score:2)
why spend as much if not more on a PC graphics card as a complete game system will cost (Xbox, PlayStation, etc.)?
Because it's far more powerful, therefore it will perform better: more detailed models, better quality visual effects, higher resolution, and most importantly, smoother framerate! This makes a huge difference in enjoyment.
if gaming is a major hobby... (Score:3, Insightful)
'advanced' games on even the best consoles have lousy framerates and refresh rates when the going gets hard. They lack mouse and keyboard. Their graphics settings are what we call 'low' or 'low-medium' on a PC and look noticeably worse in many places. And consoles have much lower resolutions.
On the PC you can experience the best games as nature intended. You can 'mod' games like Skyrim and Fallout to remarkable degrees (and no, the lame limited modding on the consoles doesn't start to compare). And you can r
Re: (Score:1)
> Wow, you sound like a fat autist who never gets laid!
You must be new here.
Re: (Score:1)
also $550 THREADRIPPER quad ram and 64 pci-e (Score:2)
Also, $550 Threadripper with quad-channel RAM and 64 PCI-E lanes; Intel can't touch that.
Re: (Score:2)
I'm waiting for 32-core Threadripper parts with 8 channels of RAM and 128 lanes of PCIe. The server parts (Epyc) have this, and the Threadripper parts are nearly identical. They even have 4 dies under the heatspreader, just like the Epyc parts. (Each die has 2 "CCX" modules, which each have 4 cores.)
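For reference, a small arithmetic sketch of how those core counts add up (a sketch based on the die/CCX layout described above; the two-active-die configuration for shipping Threadripper parts matches the reply below noting that two dies are dummies):

    # Core-count arithmetic for the Zen parts described above (illustrative).
    CORES_PER_CCX = 4
    CCX_PER_DIE = 2
    cores_per_die = CORES_PER_CCX * CCX_PER_DIE    # 8 cores per die

    epyc_cores = 4 * cores_per_die                 # 4 active dies -> 32 cores (Epyc)
    threadripper_cores = 2 * cores_per_die         # 2 active dies -> 16 cores (current Threadripper)
    print(epyc_cores, threadripper_cores)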
Re: (Score:2)
2 of the dies are dummies/dead dies. They're there for structural integrity and possibly Infinity Fabric routing for the half of the PCI-E lanes that would normally be used by the other chips.
Re: (Score:2)
Yes, in current samples that is the case. There's nothing stopping them from releasing an SKU with 32 cores. The socket can handle it.
Re: (Score:2)
Yes, in current samples that is the case. There's nothing stopping them from releasing an SKU with 32 cores. The socket can handle it.
Well, obviously, since it appears to be the same socket as EPYC. It's quite possible you'll be able to use one of those in a Threadripper motherboard, at EPYC prices of course. I don't see much reason for AMD to release a separate 32C consumer chip for less, though.
Re: (Score:2)
The socket is keyed slightly differently, I believe. I'm not exactly sure.
AMD could do it because Intel trotted out an 18 core part for bragging rights. It'd be a huge FU.
Re: (Score:2)
To be fair, the driver doesn't contain the telemetry, the "GeForce Experience" application does.
Unless you like tweaking settings or want to use Shadowplay for recording, you don't need that application. You can use the latest WHQL drivers from Windows Update or go to somewhere like Guru3D and get the extracted driver files without all the bullshit.
Some clarifications (Score:4, Insightful)
Mind that "unveils" in this case means a paper launch and the actual video cards will be released after August, 14, 2017. Or even later considering the number of delays to this point.
Given everything that we already know about this AMD GPU generation, one can only wonder why they are releasing these GPUs at all. Underpowered, consuming twice as much power as the nearest competition (~350W vs 180W), costing too much to produce (HBM2), and most likely resulting in a huge write-off when the company desperately needs successful, competitive products to stay afloat. Consumer Vega is anything but.
I still want to believe that Vega to AMD is like Fermi to NVIDIA and AMD's new generation of GPUs will be actually competitive.
Re: (Score:2)
Nvidia's GTX 1080 TI uses closer to 250W than 180W.
Re: (Score:2)
They are almost as fast as... (Score:1)
So these cards are near 1080 speeds? Not the 1080 Ti's, but the slower, older 1080. When can we expect flagship cards from AMD that compete with Nvidia's flagship devices?
Re: (Score:3)
Navi, so Q3/Q4 2018 at the earliest, and it might slip to Q1/Q2 2019 depending on the 7nm process. That being said, for gamers who don't play twitch FPSes competitively or aren't rocking higher-refresh-rate 4K monitors, Vega looks to be a very good solution, as the framerate band is narrower than the GTX 1080's.
Re: (Score:2)
Those gamers can do very well with a GTX 1070.
Sadly Vega doesn't fit anywhere - in theory. In practice, if it's bad at mining, gamers might stand a chance of buying a gaming card, because GTX 1070 is rarer than hair on my wife's pussy.
Re: (Score:2)
Vega 56 is in the same spot regarding the 1070 if not a bit ahead, with the same narrow frameband. It's true that the MSRPs for the base 1070s are lower by 20 bucks, but the memory bandwidth makes it good for currency mining and thus actual prices are 100 dollars higher.
Re: (Score:2)
Vega 56 is in the same spot regarding the 1070 if not a bit ahead, with the same narrow frameband. It's true that the MSRPs for the base 1070s are lower by 20 bucks, but the memory bandwidth makes it good for currency mining and thus actual prices are 100 dollars higher.
That's not good enough. It has to be the best or gamers will shun the brand. Why do you think the 1050 Ti sells like hotcakes over the RX 470, which is roughly double the performance, for just $40 more?
It is because it is an Nvidia. That is why. Besides, in the last month before prices skyrocketed, I saw a guy pick up the shitty 1050 Ti, and I mentioned that the RX 470 was double the performance. He got mad and said games are meant to be played on Nvidia, it can't possibly match this, etc.
something is clearly faulty with the Vega chip (Score:5, Informative)
A year late, and AMD has a part with key specs identical to its two-year-old Fury chip. The new chip isn't more power efficient than Fury, nor does it do more work per clock. And the two-year-old Fury chip itself was a disaster compared to the earlier 290, considering die size, power draw, and the HBM memory stack.
AMD's new Zen CPU, on the other hand, literally slaughters the current Intel competition in all key metrics.
Vega reminds us of 'bulldozer', AMD's horrible pre-Zen CPU architecture that cloned Intel's horrible CPU architecture, Netburst. After AMD made the world's first x64 (64-bit x86) CPU and the world's first true x64 dual core, AMD's management became very corrupt and chose to follow Intel's netburst as the simplest management decision that would maximise management bonuses and pensions. Intel, meanwhile, cancelled the putrid netburst, and copied the AMD x64 design- creating the highly successful Core 2 design.
When AMD's Bulldozer CPU finally appeared (very, very late), its performance bore no relationship to the apparently good specs of the CPU. Later it transpired that all the key memory blocks of the chip were so terrible that it didn't matter how many pipes the core had or how powerful the ALUs were.
I think Vega's memory sub-systems are totally broken as well. On paper Vega is a 'maths monster' (shaders - the units used to give rendered triangles their advanced lighting and material properties). On paper the triangle rate matches Nvidia's best, memory bandwidth is as good, and the ROP system (finished pixels) likewise. But in practice the massive die runs slower than Nvidia's much smaller 1080, and uses much more power when doing so. Synthetic benchmarks show the maths power is as advertised. So Vega has to be a horrible STALL monster like Bulldozer (a stall is when your work units are constantly starved of any work to do).
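To put the stall argument in rough numbers, here is an illustrative comparison of paper FP32 throughput (a sketch only; the shader counts and boost clocks are the commonly cited reference specs for Vega 64 and the GTX 1080, and the "equal game performance" ratio is a hypothetical, not a benchmark result):

    # Illustrative only: paper FP32 throughput vs. implied effective utilization.
    def peak_tflops(shaders, boost_ghz):
        return shaders * 2 * boost_ghz / 1000.0    # 2 FMA ops per shader per clock

    vega64  = peak_tflops(4096, 1.546)   # ~12.7 TFLOPs on paper
    gtx1080 = peak_tflops(2560, 1.733)   # ~8.9 TFLOPs on paper

    # If the much smaller 1080 roughly matches Vega 64 in games (hypothetical ratio of 1.0),
    # Vega's effective shader utilization in those workloads is implied to be proportionally lower.
    relative_game_perf = 1.0
    implied_utilization = gtx1080 / vega64 * relative_game_perf
    print(f"Paper advantage: {vega64 / gtx1080:.2f}x, implied relative utilization: {implied_utilization:.0%}")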
The saddest fact is that AMD's 480/580 Polaris chip is really very good - and AMD could easily have added 50% more performance by building a Polaris part with 50% more of everything. This chip would have cost next to nothing to design, could have been ready in 6 months, would cost little to build a card around, would use ordinary cheap memory, and would have been a little better than the Nvidia 1070 card. But the head honcho at AMD's graphics division knew such a project would make his personal Vega chip look like a terrible joke by comparison - so he cancelled competing 'big' Polaris designs.
AMD's recent GPU history has seen the pointless 285/380 chip, the terrible Fury chip and the terrible Vega chip. In the same time frame AMD delivered just one good chip - the above-mentioned 480/580. That's a metric ton of wasted R&D from a company with little money to spend. Meanwhile Nvidia is on a killer streak - most recently with the 1070/1080 and 1080 Ti. While AMD goes for hopeless, unrequired exotic new designs, Nvidia just keeps refining a successful old one.
Until AMD sacks the engineers responsible for the broken blocks in Fury and Vega, these engineers will continue to screw up future designs.
Re: (Score:2)
please mod this up... I already replied to this thread.
Re: (Score:2)
All the words... missing the key.
Re: (Score:2)
This is a node size transition.
It is not. Polaris was the node-size transition from 28nm to 14nm. Vega is supposedly a totally new and better chip architecture.
Re: (Score:3)
While AMD goes for hopeless, unrequired exotic new designs, Nvidia just keeps refining a successful old one.
On memory architecture that's true; the bet on HBM was premature, but nVidia is now also doing HBM2 in the V100 data center GPU, so it's not a low-performance choice, though it might be a cost driver. For the rest of the GPU, though, nVidia's Maxwell architecture brought a tile-based rasterizer, which was a huge new trick. Vega was supposed to bring the same functionality to AMD, but so far it's disabled on the Frontier Edition and probably the gaming edition too, either because the drivers aren't ready or they co
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Interesting)
Your analysis of Bulldozer and Core 2 is made almost entirely of "alternative facts". Yes, Bulldozer was a bad idea, but it wasn't because AMD tried to copy Netburst. Yes, Core 2 was a good platform, but Intel didn't implement most of the AMD64 platform until Nehalem, and as such their memory throughput was much worse until then. Instead, Core 2 was a bigger-cache, dual-core version of the Pentium M (which was mostly a reworked P3). Intel didn't even have to match the architecture efficiency of the Athlon 64 to co
Re: (Score:2)
AMD's new Zen CPU, on the other hnad, literally slaughters the current Intel competition in all key metrics.
Except single-core performance, which I'd say is a pretty big key metric (especially for gamers). Honestly, I don't understand why people are being so blinded by Ryzen. Competition is great, but let's not start a cult here. Ryzen does have great multicore performance. It "slaughters" Intel in low-end workstation builds, which I'm sure a lot of professionals trying to save a buck can appreciate.
Re: (Score:2)
I agree that the performance is hugely underwhelming, especially considering the power consumption and the release date. I have no regrets at all about my GeForce purchase now.
If this is a typical AMD/ATI driver clusterfuck, we can expect to see the performance ramp up to more reasonable levels over the next few months. This wouldn't be their first card to launch with subpar drivers. Not by a long shot.
But, yeah, right now there is no reason to recommend these as gaming cards at all. Maybe their compute per
GTX 1070 performance for the cost of a 1080TI (Score:2)
No thank you [gamersnexus.net]. Hell, even the 1060 is about as fast in For Honor!!
Face it, AMD is done. They killed what was mediocre of ATI at the turn of the decade and never recovered. The drivers are shitty and it reeks of a cheap, low-quality knock-off. Not saying this as a troll, but realistically, ask any ATI/AMD user; Nvidia drivers, by comparison, are uncrashable and just work at launch.
Re: (Score:2)
Given the impressive hardware specs, I suspect this is largely driver issues.
But there is no way I would pay those asking prices until the performance nudged up.
Maybe these will be decent cards in 2-3 months, but I wouldn't pay for a "maybe".
Re: (Score:2)
Given the impressive hardware specs, I suspect this is largely driver issues.
But there is no way I would pay those asking prices until the performance nudged up.
Maybe these will be decent cards in 2-3 months, but I wouldn't pay for a "maybe".
That's the issue with AMD cards. Maybe it's async compute slowing things down? Or, like another poster said, maybe it is just a bad chip, similar to Bulldozer? An RX 470 has the same specs as a 1070 in terms of math! Look it up. However, gaming performance is drastically different. With async-optimized titles like Deus Ex: Mankind Divided it can get kind of close to a GTX 1070. Other than that, no; just like Bulldozer, if you maxed out 100% of all cores it would get close to an i5/i7 in performance, but that is rare out
Re: (Score:2)