AMD Launches Higher Performance Radeon RX 580 and RX 570 Polaris Graphics Cards (hothardware.com) 93
Reader MojoKid writes: In preparation for the impending launch of AMD's next-generation Vega GPU architecture, which will eventually reside at the top of the company's graphics product stack, the company unveiled a refresh of its mainstream graphics card line-up with more powerful Polaris-based GPUs. The new AMD Radeon RX 580 and RX 570 are built around AMD's Polaris 20 GPU, an updated revision of Polaris 10. The Radeon RX 580 features 36 Compute Units, for a total of 2,304 shader processors, with boost/base GPU clocks of 1340MHz and 1257MHz, respectively, along with 8GB of GDDR5 over a 256-bit interface. The Radeon RX 580 offers up a total of 6.17 TFLOPs of compute performance with up to 256GB/s of peak memory bandwidth. Though based on the same chip, the Radeon RX 570 has only 32 active CUs and 2,048 shader processors. Boost and base reference clocks are 1244MHz and 1168MHz, respectively, with 4GB of GDDR5 memory, also connected over a 256-bit interface. At reference clocks, the peak compute performance of the Radeon RX 570 is 5.1 TFLOPs with 224GB/s of memory bandwidth. In the benchmarks, the AMD Radeon RX 580 clearly outpaced AMD's previous-gen Radeon RX 480, and was faster than an NVIDIA GeForce GTX 1060 Founders Edition card more often than not. It was more evenly matched with factory-overclocked OEM GeForce GTX 1060 cards, however. Expected retail price points are around $245 and $175 for 8GB Radeon RX 580 and 4GB RX 570 cards, though more affordable options will also be available.
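The TFLOPs and bandwidth figures in the summary follow directly from the published specs. A quick sketch of the arithmetic (peak FP32 = shaders x 2 FLOPs per clock for a fused multiply-add x boost clock; bandwidth = bus width x effective data rate / 8); note the 8 Gbps and 7 Gbps GDDR5 data rates are inferred here from the quoted bandwidth numbers, not stated in the summary:

```python
# Peak FP32 throughput: each shader retires one FMA (2 FLOPs) per clock.
def peak_tflops(shaders, boost_mhz):
    return shaders * 2 * boost_mhz * 1e6 / 1e12

# Memory bandwidth: bus width (bits) * effective data rate (Gbps) / 8 bits per byte.
def bandwidth_gb_s(bus_bits, data_rate_gbps):
    return bus_bits * data_rate_gbps / 8

rx580 = peak_tflops(2304, 1340)     # ~6.17 TFLOPs, matching the summary
rx570 = peak_tflops(2048, 1244)     # ~5.10 TFLOPs
bw580 = bandwidth_gb_s(256, 8.0)    # 256 GB/s (implies 8 Gbps GDDR5)
bw570 = bandwidth_gb_s(256, 7.0)    # 224 GB/s (implies 7 Gbps GDDR5)
print(rx580, rx570, bw580, bw570)
```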
I'll end up buying several because fuck Nvidia (Score:4, Insightful)
I can't in good conscience support a major brand-power that spends so much money crippling games and gouging customers as often as possible. Fuck Nvidia.
Re: (Score:3, Funny)
Re: (Score:2)
I'll gladly take a performance hit up front so as not to appease Hitler, sorry. Maybe my ethics are just worth more money?
Re: (Score:2)
They aren't, trust me.
Re: (Score:1)
Your wife agrees ;)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Funny, I have the opposite experience...
Nvidia drivers break on kernel updates, and sometimes simply will never work again (nvidia-legacy, but still), and last I checked required a /tmp that wasn't mounted noexec...
AMD in-tree OSS drivers are simply hassle free.
I buy nVidia (Score:2)
I can't complain too much about gouging either. I got a Gigabyte 6GB GTX 1060 for $215 shipped using deals from Jet.com. Those aren't even that common. That's not bad for a card that runs everything at 60+ FPS.
Re: (Score:1)
Forget the graphic cards... (Score:1)
Re: (Score:1)
Youtube.. so many tech tubers have benchmarked this.
Re: (Score:2)
Youtube.. so many tech tubers have benchmarked this.
For AMD vs. Intel. I've yet to see one for AMD vs. AMD.
Re: (Score:2)
Really?
https://www.cpubenchmark.net/c... [cpubenchmark.net]
And the date on that is... today! The Ryzen 1700 has better performance but the FX-8300 still has a better price/performance ratio.
Re: (Score:2)
Re: (Score:2)
They consume like twice the power that zen does for like half the performance.
Only if all eight cores are maxed out. I got a 25W quad-core AM1 system that draws 27W when it's not doing anything and 35W when the cores are maxed out.
Re: (Score:2)
Re: (Score:2)
That's also a glorified netbook chip that has jack-all real performance. Compare it to the newest 35w i3
I built the AM1 system when the processor and motherboard were $25 each. The ECS KAM1-I motherboard [amzn.to] had two built-in serial ports and a header for two additional serial ports. This is a nice little system for running Red Hat Linux and connecting to four console ports on my Cisco certification rack.
Re: (Score:3)
How is that ratio modified by the electrical costs of running an 8 core FX chip? They consume like twice the power that zen does for like half the performance.
The Ryzen 1700 is a 65W chip, the FX-8300 is a 95W chip. Assuming you are running the machine at 100% load 24/7, and these ratings are accurate, the FX-8300 will cost you $1.75 extra per month at $0.08/kW-hr. If you want to consider the added AC load, you can round up to $2.50 per month. Given the price difference between the chips (currently ~$310+ vs ~$120), it would take more than 6 years to get to the electricity cost breakeven point.
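The parent's arithmetic checks out. A back-of-envelope sketch, assuming (generously) that the 30W TDP gap reflects the actual draw difference at a constant 100% load, and using the parent's own prices:

```python
# Extra electricity cost of the FX-8300 (95W TDP) over the Ryzen 1700 (65W TDP),
# assuming both run at 100% load 24/7 and that TDP equals actual wall draw.
delta_w = 95 - 65                                   # 30 W difference
hours_per_month = 24 * 30                           # 720 h
kwh_per_month = delta_w * hours_per_month / 1000    # 21.6 kWh/month
cost_per_month = kwh_per_month * 0.08               # ~$1.73/month at $0.08/kWh

price_gap = 310 - 120                               # ~$190 between the chips
months_to_breakeven = price_gap / 2.50              # 76 months (~6.3 years),
                                                    # using the rounded-up $2.50/mo
                                                    # that includes AC overhead
print(cost_per_month, months_to_breakeven)
```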
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: Forget the graphic cards... (Score:2)
Re: (Score:1)
40% faster.
Not at current prices. Paying $500 for a 40% performance increase isn't that great of a deal. Especially since I updated my components last year because Ryzen didn't come out soon enough.
Re: (Score:1)
This. You ask a question as bait to try to get your chance to be a prick. This is why everybody thinks you're a worthless moron. Go ahead tell us how they would never use AMD at Microsoft or any of the other tech giants you claim to have worked at intermittently.
Re: (Score:1)
This. You ask a question as bait to try to get your chance to be a prick. This is why everybody thinks you're a worthless moron.
Hello, troll. Can't find anyone else to play with?
Go ahead tell us how they would never use AMD [...]
Why?
[...] at Microsoft or any of the other tech giants you claim to have worked at intermittently.
I only interviewed at Microsoft.
Re: (Score:2)
So are you lying now, or in the post where you said you worked at Microsoft for 3 or 6 months, whatever it was?
Re: (Score:2)
So are you lying now, or in the post where you said you worked at Microsoft for 3 or 6 months, whatever it was?
Nope. Never worked for Microsoft. Five recruiters led me around by the nose for Microsoft positions in 2005. I interviewed on the Silicon Valley Microsoft campus for a technology spinoff in 2014.
Feel free to post the URL to where I said I worked for Microsoft.
Re: (Score:2)
So are you lying now, or in the post where you said you worked at Microsoft for 3 or 6 months, whatever it was?
I believe you were referring to this post when I was at Fujitsu for six months.
When I worked at Fujitsu in 1997, our virtual world division got a temporary vice president from Japan for three months. This VP was in charge of mainframes. He always asked the same question in broken English, "Are you a mainframe programmer?" He was disappointed that no one in our division was a mainframe programmer. For years since then I've heard that mainframe programmers are in high demand. The catch, of course, is having previously worked on mainframes. Seems like no one wants to hire an inexperienced person as a mainframe programmer.
http://slashdot.org/comments.pl?sid=10471307&cid=54207521 [slashdot.org]
Re: (Score:2)
I honestly can't remember the exact details; the name will explain that. But I do recall you speaking on several occasions about working at Microsoft (maybe I'm wrong and you said you had just applied, but I'm pretty sure I'm not wrong), along with several short stints at companies. I have only worked for 4 companies in my 16 years as an electrician, only one for 3 months, so when I hear stories like yours it makes me think you're not any good at what you do. Because those of us who are good at what we do stay at companies long term.
Re: (Score:2)
Because those of us who are good at what we do stay at companies long term.
As an IT support contractor, I take whatever work comes my way. Sometimes the assignment is four hours on short notice; sometimes it's days, weeks, months or years. I'm currently halfway through a five-year contract. My LinkedIn profile has connections to 800+ recruiters from a 20+ year career. Tech workers who stay at one company often make less money than those who move around. It's easy to say that I'm no good. But I do the jobs that other people won't do, especially if they need a miracle worker.
Re: (Score:2)
Don't you work for peanuts? Something ridiculously low for where and what you do?
I make $50K+ per year in Silicon Valley. That makes me the highest wage earner in my blue-collar family. For the people who drop $3K per night on wine, I make peanuts.
Re: (Score:2)
Wow, I don't make much less than you, and I'm just an electrician/low-voltage tech. I will probably catch and surpass your pay in the next 2 years or so. Glad I decided not to go into tech.
Re: (Score:1)
(Re: FX-8300)
> I updated my components last year because Ryzen didn't come out soon enough
(Re: Hillary)
> Hillary started off 268 electoral votes
(Re: Trump)
> Trump had to perform better than McCain in 2008 and Romney in 2012 to have a chance at winning. Like everything else in this election, he blew that off too and will lose in a landslide.
Good grief, your 2016 was a trainwreck of terrible calls. Here's to 2017; hopefully you can at least swing a good processor later in the year. I have less faith in your political prescience improving any.
Re: (Score:2)
(Re: FX-8300)
> I updated my components last year because Ryzen didn't come out soon enough
My Vista-compatible motherboard was nine years old. Ryzen was originally slated for Q3/Q4 2016. When I updated the motherboard and memory, I got the FX-8300 to replace the seven-year-old quad-core processor.
(Re: Hillary)
> Hillary started off 268 electoral votes
Based on available polling data.
(Re: Trump)
> Trump had to perform better than McCain in 2008 and Romney in 2012 to have a chance at winning. Like everything else in this election, he blew that off too and will lose in a landslide.
Based on available polling data, Hillary had a 95% chance and Trump had 5%. Now obviously something was wrong with the polling data. Hillary won the popular vote, Trump won the electoral college.
Good grief your 2016 was a trainwreck of terrible calls. Here's to 2017, hopefully you can at least swing a good processor later in the year- I have less faith in your political prescience improving any.
I based my decisions on available data, as all good decisions should be.
Re: (Score:2)
Re: (Score:2)
Anybody who could read the numbers knew she would lose a good year before the election. When she announced her candidacy her approval rating? 15%.
Her approval rating was...14%
I've never seen polls with those numbers. Got a link?
Re: (Score:2)
Re: (Score:2)
Because why? Because popular opinion has guaranteed monotonic convergence? At a guaranteed quadratic convergence rate?
Just what part of "moving target" is so hard for people to understand?
Candidate A shits her pants at the front of the boat. Everybody rushes to the back of the boat.
Candidate B begins barfing up a taco bowl. Everybody rushes back to the front of the boat.
Blather. Foam. Repeat.
The only reason Trump won is because on the day of the
Re: (Score:2)
No, you based decisions on the data that fit your filter.
My filter is... I'm not a purist. If an alternative solution presents itself because my preferred solution wasn't available at the time, I would go with that and evaluate alternatives later.
It does not mean you should now use (Score:2)
Cyrix CPU's.
Re: (Score:2)
Cyrix CPU's.
I loved my Cyrix 6x86 CPU back in the late 1990's. It ran Linux flawlessly for my file server.
https://en.wikipedia.org/wiki/Cyrix_6x86 [wikipedia.org]
Re: (Score:2)
I remember those. Owned several. And then one day I bought a K6 just to test it; after all, it was a Socket 7 CPU as well, so no new motherboard or memory was needed. After that test I swore that I won't ever buy Cyrix crap again.
Re: (Score:2)
After that test I swore that I won't ever buy Cyrix crap again.
That's not a fair comparison. The Cyrix was a contemporary of the K5. I had both. Cyrix had better Linux support (IIRC, you had to compile the kernel with the feature enabled) and the K5 ran DOS/Windows just fine. The K6-2 400MHz (overclocked to 500MHz) was my favorite processor from the Socket 7 days.
Re: (Score:2)
It is a fair comparison because Cyrix 6x86MX was released two months after K6.
My last Socket 7 CPU was a K6-3 450 and it was so good that I had it for almost two years instead of buying a new CPU every three months like I did previously.
Re: (Score:2)
It is a fair comparison because Cyrix 6x86MX was released two months after K6.
That was probably the Mark II. The Mark I came out in 1996, a year after the K5 and a year before the K6. Mine was a Mark I.
Re: (Score:2)
Nope, not MII, that one was released in 1998 and ran against K6-2.
MX was the same as the first 6x86, but with the added MMX instruction set.
Re: (Score:2)
True that! But you were better off with an Intel non-MMX, since it had a faster x87 and that was useful.
The low-clocked original 6x86 was good at integer code; the S3 Virge was cheap and had very high 2D performance. So, for everything at 320x200 under MS-DOS it was a monster that chewed through everything. It ran your old 3D games at a solid 70 fps (after all these years, I learned that 320x200 mode ran at 70Hz), stuff like Dark Forces, all of which used integer / fixed point. Tomb Raider ran well and was a 320x200 DOS game.
Re: (Score:2)
*late DOS games like that looked much like what the Playstation did
Re: (Score:2)
look it up you lazy chunk of shit
Xeon (Score:2)
On the other hand, I find that a good notebook does everything I need most of the time. Who really cares about the high end of consumer processors? They cost too much and the motherboards
Competition is a beautiful thing (Score:4, Interesting)
...for the consumer. Pumping 60+ FPS @ 1080p, graphics cranked to the max, on a $175 card, even with a 2013-era CPU, is gonna make 'em consoles up their game next gen... Or at least lower their prices.
Re: (Score:2)
Re: Competition is a beautiful thing (Score:5, Insightful)
Project Scorpio is pretty much built for VR.
So you're not that informed about Scorpio...
Re: (Score:2)
Re: Competition is a beautiful thing (Score:2)
Hardware-wise, it is oh so VR capable!
And... incorrect.
Re: Competition is a beautiful thing (Score:2)
Re: (Score:2)
Re: (Score:2)
What's VR?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Well, there's nothing VR announced for the XBOX market that I know of, and even the PS4 Pro makes use of the extra CPU add-on from the PSVR for actually processing the VR experience. Scorpio and the Pro are mostly 1080p60 or 4k30 machines, and even then they struggle. But for VR, it's not only the TFlops; there's a lot of optimization required software-side for a consistent 60fps experience. The real problem of VR is the "dual monitors" combined with the need for 60FPS on both of them while current ge
Re: (Score:2)
Re: (Score:2)
I didn't say it couldn't run them. I said there don't seem to be plans for it, since I know of no marketing associations, other than the fact that an XBOX controller can be used with the Rift, and that you can stream Xbox One games to the Rift, both of these scenarios using a PC. I don't see MS partnering up with a now Facebook-owned company that serves absolutely no commercial purpose, especially when Microsoft itself has its own AR/VR stuff coming (e.g. Hololens).
In any case, my personal opinion is it won't ha
Re: (Score:2)
How about another use-case. I want a silent/quiet card for 4k desktop. My old 7870 ramps the fan up when scrolling web pages and the scrolling isn't that smooth.
Re: (Score:2)
Ahahaha! Funnily enough I have the same card and I have no idea what the hell you're talking about, but I don't do 4k "desktoping". I'm guessing you're on Linux - likely Chrome under Ubuntu. You probably got something running on software rendering, either because of Chrome defaults or lacking drivers. But this is speculation, I have yet to test 4k on a daily basis.
Even so, that card, especially in the 2GB or more variant should be OK as long as it stays out of 4k gaming or recent games' higher settings. I d
Re: (Score:2)
Re: (Score:2)
Maybe I need to change the stock cooler, or at least replace the thermal paste and clean it.