AMD Continues To Pressure NVIDIA With Lower Cost Radeon R9 270 and BF4 Bundle
MojoKid writes "The seemingly never-ending onslaught of new graphics cards continues today with the official release of the AMD Radeon R9 270. This mainstream graphics card actually leverages the same GPU that powered last year's Radeon HD 7870 GHz Edition. AMD, however, has tweaked the clocks and, through software and board-level revisions, updated the card to allow for more flexible use of its display outputs (using Eyefinity no longer requires a DisplayPort connection). Versus the 1GHz (GPU) and 4.8Gbps (memory) speeds of the Radeon HD 7870 GHz Edition, the Radeon R9 270 offers slightly lower compute performance (2.37 TFLOPS vs. 2.56 TFLOPS) but much more memory bandwidth--179.2GB/s vs. 153.6GB/s, to be exact. AMD and its add-in board partners are launching the Radeon R9 270 today, with prices starting at $179. That starting price is somewhat aggressive and once again puts pressure on NVIDIA: GeForce GTX 660 cards, which typically perform a bit lower than the Radeon R9 270, are priced right around the $190 mark. Along with this card, AMD is also announcing an update to its game bundle; beginning November 13, Radeon R9 270 through R9 290X cards will include a free copy of Battlefield 4. NVIDIA, on the other hand, is offering Splinter Cell: Blacklist and Assassin's Creed IV: Black Flag, plus $50 off a SHIELD portable gaming device, with GTX 660 and 760 cards."
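For the curious, the summary's numbers fall out of the usual back-of-envelope formulas. A rough sketch, assuming the standard Pitcairn configuration of 1280 stream processors and a 256-bit memory bus (the clocks below are the advertised boost and effective memory speeds, not vendor-verified measurements):

```python
def tflops(shaders, clock_mhz):
    # Peak single-precision throughput: 2 FLOPs (one fused multiply-add)
    # per shader per clock cycle.
    return shaders * 2 * clock_mhz / 1e6

def bandwidth_gb_s(effective_gbps, bus_width_bits):
    # Peak memory bandwidth: effective data rate per pin times bus width in bytes.
    return effective_gbps * bus_width_bits / 8

# Radeon R9 270: 925 MHz boost clock, 5.6 Gbps effective GDDR5
print(f"R9 270:  {tflops(1280, 925):.2f} TFLOPS, {bandwidth_gb_s(5.6, 256):.1f} GB/s")
# Radeon HD 7870 GHz Edition: 1000 MHz, 4.8 Gbps effective GDDR5
print(f"HD 7870: {tflops(1280, 1000):.2f} TFLOPS, {bandwidth_gb_s(4.8, 256):.1f} GB/s")
```

This reproduces the 2.37 vs. 2.56 TFLOPS and 179.2 vs. 153.6 GB/s figures quoted above: the 270 trades a lower core clock for much faster memory.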
Re: (Score:2)
AMD drivers are shitty, and before that ATI drivers were shitty, even before ATI made 3d cards. I've been watching ATI drivers cause Windows to crash since Windows 3.1.
It's broadly believed that ATI's hardware is as good as or better than nVidia, but their drivers hold them back.
I'm still glad ATI is around, just to keep nVidia scared
Re:Nvidia feeling the heat? XD (Score:5, Informative)
From what I have experienced, that perception is a bit outdated. AMD's drivers have been as good as NVIDIA's for about one or two years. On Windows. On Linux, NVIDIA is still way, way better. For newer cards. If you can use the proprietary driver at all.
Re: (Score:3)
The problem is when do you give them the benefit of the doubt?
I've used AMD cards on and off over the years for well over a decade and the problem has always been that each time I've heard someone say what you just did and tried them it's simply not been true.
Maybe you're right this time, but given how many times I've been bitten it's hard to have any faith in such statements.
My friend uses AMD and is always whining about problems with his cards, especially when it comes to Eyefinity stuff, so I'm loath to
Re: (Score:2)
I was just getting ready to say that AMD's drivers are terrible.
But I'm primarily a Linux user. Nvidia binary blob drivers are on par with (better than?) the Windows drivers, while AMD binary and open drivers are both 3-10 times slower in games.
As for Windows, all I know is that AMD seems to release game-breaking updates from time to time. Remember when Diablo III came out? Blizzard was warning everyone not to update their drivers.
Finally, while I think OpenCL is the future, CUDA is the only game in town for de
Proprietary vs Opensource (Score:2)
On Linux, NVIDIA is still way, way better. For newer cards. If you can use the proprietary driver at all.
Although Catalyst is subpar compared to nvidia's proprietary driver, AMD's opensource driver is quite good and its performance has nicely caught up for all the previous-generation hardware. Of course, that comes from the fact that AMD has been actively helping the development almost since the days of the ATI acquisition, releasing docs and source code, and even having developers on its own payroll. Meanwhile Nvidia only recently started announcing that they will be open to answering specific questions to help
Re: (Score:2)
Yeah. Legacy support is also a huge plus for AMD. Try to make either nouveau or NV's blob work on a GeForce FX or 6 series. Unity, Gnome and Cinnamon are terribly broken and NVIDIA said it's a WONTFIX. The computer I'm typing this on has a Radeon HD 5570 and an integrated GeForce 7025. The latter is useless due to faulty legacy drivers under Linux, failing to render the desktop. On an integrated card from 2006. AMD's open driver, meanwhile, properly supports even the 9000 series, from 2003. Given the curren
1650 Rocked (Score:3)
That said, the 4350 I tried to replace it with was junk. Nice card, good performance, dirt cheap, but the drivers crashed on everything but Call of Duty. I've heard that if you spend the big bucks ($400) you do all right, but I'm pretty sure I wouldn't trust even $190-range cards. Which is sad, because $190 for what the R9 270 does is ridiculous...
Re: (Score:1, Troll)
Re: (Score:2, Troll)
That's complete bullshit. Nvidia does indeed make their drivers very flexible, but it's trivial to force AMD's products into software mode (or black screen, blue screen of death/kernel panic, system freeze, trippy display noise, etc.) with totally valid state configuration.
Google it, you fuckup.
Re: Nvidia feeling the heat? XD (Score:1)
Seriously? I've had to fix two "bugs" on NVidia hardware in the middleware I work on as a job. Both were caused by AMD not enforcing the proverbial "letter of the law" vis-a-vis Direct3D, and NVidia drivers correctly rejecting the relevant calls.
Re: (Score:1)
Nvidia drivers are shittier than AMD's. End of story. AMD drivers implement the graphics APIs to the letter.
Yeah, my personal experience is that that's complete bullshit you're spewing there. AMD drivers fail to implement graphics APIs properly, and thus are more fragile when something unexpected happens, like a call to a deprecated OpenGL function, and when you point out to AMD that their driver is breaking, they point out that the function is deprecated, failing utterly to grasp the meaning of the term -- yes, new software is not supposed to call it anymore, but it's supposed to continue working anyway unti
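The disagreement here boils down to strict versus lenient validation. A toy model of the point about deprecation (purely illustrative; this is not either vendor's actual driver logic, and the function names are just familiar OpenGL examples):

```python
# Per the OpenGL spec, functions deprecated in 3.0 must keep working in a
# *compatibility* profile context; only a *core* profile context removes them.
# A driver that rejects them in compatibility mode is the broken party.

DEPRECATED = {"glBegin", "glEnd", "glVertex3f"}  # classic fixed-function calls

class ToyDriver:
    def __init__(self, core_profile: bool):
        self.core_profile = core_profile

    def call(self, fn: str) -> str:
        if fn in DEPRECATED and self.core_profile:
            return "INVALID_OPERATION"  # correct: core profile removed these
        return "OK"                     # compatibility profile: must still work

compat = ToyDriver(core_profile=False)
core = ToyDriver(core_profile=True)
print(compat.call("glBegin"))  # OK -- deprecated, but still required to work
print(core.call("glBegin"))    # INVALID_OPERATION -- correctly rejected
```

"Deprecated" means "don't use in new code", not "may stop working in existing contexts", which is exactly the distinction the post above is making.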
Re: (Score:1)
Yep. Those Nvidia drivers are really nice, since they caused a Low Profile GT210 to overheat, forcing a warranty replacement. Sure, I'll continue using the replacement where it is, but I've restricted the drivers to the 296 series, as this isn't a gaming build so there's no need for all the goddamn updates that only benefit games.
What I'd like to see both AMD and Nvidia tell the game devs is "Fuck off and use the goddamn Windows DX as you're supposed to, and bitch to Microsoft about crap performance fr
Re:Nvidia feeling the heat? XD (Score:4, Insightful)
Oh really? I assume you've never had to write software for these electronic Pintos.
Re: (Score:2, Interesting)
Re: (Score:1)
I write a lot of OpenCL code at my job. I also mess around with OpenGL stuff in my free time, usually just screwing around with procedurally generated scenes and shaders (so nowhere near a full game engine, but hitting a lot of aspects of the modern pipeline). I haven't had any problems programming for the AMD card in my home desktop, whether for OpenGL, or for OpenCL when I was too lazy to log into a workstation in my office to run tests on small datasets. I've just gone off the specs as far as documen
Re: (Score:2)
Nope.. unless of course you define 'shitty user' as someone who chooses to run something besides the latest 3 games the installed drivers were 'optimized' for, at the expense of compatibility with everything else.
Re: (Score:1)
AMD doesn't support BSD. Regardless of how much they beat NVIDIA, I'm not going to buy one of their cards.
Don't worry, BSD is dead. No one gives a flying fuck, mate.
Re: (Score:1)
1. The Catalyst control panel's dependency on .Net that made it bloated and slow to load.
The dependency on .Net doesn't make it "bloated"; if anything, it offloads more functionality to the installed .Net libraries, making it less "bloated". Also, there is nothing inherent in .Net (or Java, for that matter) that would make it take significantly longer to load; if it is taking a long time to load, writing it in a native language won't change that.
Re: (Score:1)
The dependency on .Net doesn't make it "bloated"
It might make the installation package larger if it comes with an offline installation of the .Net runtime, but most use a web install for the few cases where the system doesn't already have it installed anyway.
if it is taking a long time to load then writing it in a native language won't change that.
The problem is more likely to be loading external resources than loading the runtime, and yeah, writing a native version won't help you.
Christmas is coming earlier and earlier it seems (Score:3)
Time to get the shopping done.. These bundles are getting sweeter and sweeter.
Re: (Score:2)
Yeah and some poor bastards probably had to work 90 hour weeks to make the AMD GPU work enough to ship.
Re: (Score:2)
Hmm, let's see...
1. demoscene - breakage everywhere
2. game titles a few years old
3. anything that's not a game, even things like game map editors, tends to break, especially in OpenGL land. Nvidia handles them fine, on GeForce or Quadro.
4. video playback causing BSODs...
5. shitty Linux support, though it's nice they're opening up the specs.
Fresh as always... (Score:2)
Along with this card, AMD is also announcing an update to its game bundle, and beginning November 13 Radeon R9 270 – R9 290X cards will include a free copy of Battlefield 4.
Beginning November 13th, you say....
Unfortunate Card Naming (Score:4, Interesting)
Maybe someone else has a decoder ring, but it's alphabet soup trying to figure out what video card one should get.
http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#Comparison_tables:_Desktop_GPUs [wikipedia.org]
http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units#Comparison_table:_Desktop_GPUs [wikipedia.org]
If you stare at the article above, it's a blob of numbers worthy of A Beautiful Mind.
I left the PC gaming rat-race a while back, and I've never been happier -- the only real downside is that I can't possibly suggest to people what video card to buy beyond saying, "meh. Go spend $200 on Newegg."
Re: (Score:2)
You're right... after looking at the numbers long enough, you start to see a pattern!
Oh my God... the Russians! :)
Re: (Score:2)
If you have no idea what the numbers mean, then perhaps you should leave analysis of the numbers to someone with the requisite basic computer hardware knowledge. This isn't challenging stuff. Perhaps you'd be best just leaving the computer alone completely. Really, what are you doing on the internet?
Did someone forget to take their meds?
Unless you follow these like some sort of religion, there's no way for the average bear to know if they should get an R7 260X, an HD 8730, an HD 8760, or an HD 7730, or what the difference between the -30 and the -60 is.
What do those numbers mean?
Re:Unfortunate Card Naming (Score:5, Informative)
Nothing. They are completely random marketing labels, often designed to be misleading. So the commenter above has no clue. This is not the T-34 to T-54 "new tank is better than old tank" series here.
If the graphics chip marketing guys had had their say in naming Soviet tanks, the T-34 would have been something like CrossFire DUAL 54200 Extreme Edition and the T-54 something like 6700 HDD RXX 2GB Edition.
Re: (Score:2)
Re: (Score:2)
Which is probably why the average user has long since moved to consoles. The last 15 years of crappy naming conventions and standards changes across the computer (PC-133, DDR1, GDDR, ATX, BTX, USB 1.1, Firewire, CD-R, CD-RAM, AGP, PCI, PCIe, SATA, ATA, PATA, etc., etc.) have acted as a form of natural selection for consumers, splitting them into those that were interested in and capable of keeping up with the various options, and those that went with Apple because "it just works."
I almost think hardware manufacturers should move t
Re: (Score:2)
maybe 3 categories per year (mid, high, and CAD).
That's more marketing garbage, though. Why no "low"? Will the average Joe know what "CAD" means? Will they assume it's "low" because otherwise there is no low and the salesweasel can use the confusion to sell them the most expensive card for solitaire? Your solution involves more of the crappy naming conventions that you just complained about!
"We don't have a 'small' order of fries. We have Extra-large, Extreme, and Super-mega sizes."
Re: (Score:2)
I'd say no to a low end label because the niche is sort of filled with integrated graphics these days. Even still, a new "last years" mid range card would fit the low end price point just fine. Having said that, those example names were just that: examples. Call them whatever else you want if it makes you feel better. All I'm trying to do is point out that for most people there are only mid range ($200-399) and high end ($400+) cards to consider when it comes to playing games, and the "Pro" line of card
Re: (Score:2)
But tbh, I built an A10 APU machine for a friend recently, and you can get a GPU matching the performance of that machine for $100, and that will basically run any game out today at decent quality. Add another $50 or so to that and you'll be up at the performance of next-gen console hardware, and beyond that you're leaving them well behind.
Generally speaking though, as you said, people will do more than well with a $200 G
Re:Unfortunate Card Naming (Score:5, Informative)
Chart for the lazy. [tomshardware.com]
Re: (Score:2)
Or this...
Chart to sort by Passmark rank [videocardbenchmark.net]
and check the CPU [cpubenchmark.net] one as well.
Re: (Score:2)
Model numbers, unfortunately, are garbage through and through, but once you get into the realm of "Well, this one has 1GB of RAM; but it's DDR3 on a 64-bit interface, while the other one only has 512; but it's
Re: (Score:2)
For the enthusiast, the multitude of options are welcome, but for everyone else... ...not so much.
Re: (Score:2)
For the enthusiast, the multitude of options are welcome, but for everyone else... ...not so much.
Choice is good, what I dislike is the fragmentation-into-meaninglessness (and sometimes outright intent to deceive, like the cards that take a bottom-of-barrel GPU and throw in an impressive-sounding amount of RAM, albeit pitifully slow DDR on a narrow bus, then slap a big model number and a picture of a CGI chick riding a dragon or something on the box). Right now, the main contenders appear to be HD5450s on the AMD side and GT610s on the Nvidia side, with 2GB of DDR3. On the plus side, the whole damn car
Re: (Score:2)
I happen to like being able to choose a video card based on specs. I can find what I want at the price I want.
The difficulty is in understanding what you want. If I sometimes get choppy performance in a game, does that mean I want faster memory or more memory? If I want good rendering performance in Blender using OpenCL, what is the break-even ratio of core clock speed/core number?
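As a rough first cut for an embarrassingly parallel OpenCL workload, throughput scales with compute units times clock, so you can at least compare cards on that product. A sketch only: real kernels also hinge on memory bandwidth, occupancy, and divergence, and the two card specs below are hypothetical, not real products:

```python
# First-order throughput model for a compute-bound, embarrassingly parallel
# OpenCL kernel: relative speed ~ compute_units * clock. Use it to answer the
# "more cores or more clock?" question to first order only.

def relative_throughput(compute_units, clock_mhz):
    return compute_units * clock_mhz

card_a = relative_throughput(compute_units=20, clock_mhz=1050)  # fewer, faster cores
card_b = relative_throughput(compute_units=28, clock_mhz=800)   # more, slower cores
print("card B faster" if card_b > card_a else "card A faster")
```

By this model the wider, lower-clocked card wins (22400 vs. 21000); but if the workload is memory-bound, neither number matters and memory bandwidth decides, which is exactly why "what do I want?" is the hard question.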
Re:Unfortunate Card Naming (Score:5, Informative)
AMD's scheme right now is actually pretty easy.
The first number is the generation. We're on "2", even though they just started this new numbering scheme this year, but that's fine.
The next number is the "category". Best way to think of it is monitor resolution: _70 is for 1080p - you'll get 60FPS+ on most games at max settings, real killers may need a settings drop but you'll generally be fine. _80 is for 120Hz or 1440p monitors, and the _90 is for tri-monitor 1080p, 4K, or obscene multi-GPU rigs. And an _60 part is a lower-quality 1080p - think "high" or "medium", not "max".
An X suffix means it's the "full" part, the lack of an X means it's been binned in some way (reduced clockspeeds and/or some cores disabled). For example, the 290X has 44 "compute units", while the 290 has 40 at a slightly lower clockspeed. On the 270s, they're both 20 compute units, but the 270X is clocked about 10% higher.
Since both new consoles use AMD chips, it's worthwhile to compare to them. The PS4's GPU is a slightly weaker version of the 270, and the Xbox One's is a slightly underclocked 260.
Nvidia's scheme is similar (add another 0 on the end for no reason, swap "Ti" for "X", and be generation 7 instead of 2), but they've complicated it right now by not rebadging old chips as new names. AMD's recent launches were basically "launch a new 9-tier chip, take all the old ones, up the clockspeeds, bump them down a tier and cut their prices accordingly". The 270s that just launched are essentially overclocked 7870s (think "180X").
Right now, Nvidia's lineup starts at the 650 and 650 Ti Boost (medium-end 1080p), 660 and 760 (high-end 1080p), 770 and 780 (1440p), and the 780 Ti (4K). Nobody's really sure whether they're going to launch more low-end 700-series parts. They're also looser with which ones are low bins of what - the 780 is a binned 780 Ti, but the 760 is a binned 770.
PS: Ignore the Titan. It's no longer a gaming card - the 780 Ti outperforms it (the Titan is a binned 780 Ti), at $300 less.
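The scheme described above is mechanical enough to put in code. Here's a toy decoder following the parent's rules (the tier descriptions are the parent's characterizations, not official AMD definitions, and the helper itself is made up for illustration):

```python
# Tier labels per the parent post's reading of AMD's R-series naming.
TIERS = {
    "60": "1080p at medium/high settings",
    "70": "1080p at max settings, ~60 FPS",
    "80": "1440p or 120 Hz 1080p",
    "90": "4K, triple-1080p, or multi-GPU rigs",
}

def decode_amd_name(name: str) -> str:
    # e.g. "R9 270X" -> prefix "R9", model "270X"
    prefix, model = name.split()
    full_part = model.endswith("X")          # X suffix = full (unbinned) part
    digits = model.rstrip("X")
    generation, tier = digits[0], digits[1:]  # first digit = generation
    binned = "" if full_part else " (binned: lower clocks and/or disabled cores)"
    return f"generation {generation}, {TIERS[tier]}{binned}"

print(decode_amd_name("R9 270X"))  # generation 2, 1080p at max settings, ~60 FPS
print(decode_amd_name("R9 290"))
```

That the whole scheme fits in a dozen lines is a point in AMD's favor; whether a shopper should need a decoder at all is the thread's actual argument.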
Re: (Score:3)
I hope you realize that you just wrote EIGHT paragraphs to describe two naming conventions. It's kind of absurd that such a thing is required.
Re: (Score:2)
Try coming up with a better one. Invent a new naming convention that still hits all the same price points (let's say $150, $170, $200, $250, $300, $350, $450, and $600). And also accounts for releasing a new batch every year, maybe every other year.
Seriously, I tried once. I ended up in about the same place AMD is right now. Nvidia's naming is a bit wonky, partially because they've never been the clearest, partially because AMD just forced them to drop prices VERY abruptly, and partially because they're in
Re: (Score:2)
I agree that the model number apparently does convey a lot of information. The guy still did spend 8 paragraphs explaining it, and lost me somewhat along the way (how can something be said to be universally 60 FPS on "max settings" when there are so many games out there?). The model numbers can be well constructed and yet completely arcane to a once-every-few-years purchaser. And that's to someone who has been gaming since the Voodoo2 (many video card generations of knowledge). I can only imagine what the la
Re: (Score:2)
Seems a bit information-inefficient. Four characters for the year, but only one for the model specifier? That's the inverse of what's needed - there's a 60-100% improvement in performance per year, but the range between the lowest-end GPU and highest-end GPU is closer to 10,000%.
Re: (Score:2)
If you have to use more than a paragraph to explain it, it's certainly not simple. So the first number is generation but how do we know what generation we're on? You say we're on generation 2 and that's fine, but there's more than 2 generations of graphics card. Is this 2nd generation card better than the 5th or whatever generation card I bought last year before the new naming scheme?
What relation does _70 have to 1080p, 60fps exactly? I get that
Why would you have full part and half part cards, what exactly
Re: (Score:2)
Does anyone use dual video cards anymore? Are SLI and Crossfire still in use?
I don't build systems, but I haven't heard anything about them for a while and am just wondering if NVidia and ATI/AMD still run those lines.
Re: (Score:2)
Cards featuring two GPUs are just SLI / CrossfireX parts skipping the PCIe bus for communication.
There's now "Hybrid" CrossfireX which allows use of integrated graphics (AMD Fusion chips) as well as discrete video cards to increase
Re: (Score:2)
Thanks for the info.
Re: (Score:3)
It's real easy ... pick your budget and your tier will follow.
http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html [tomshardware.com]
Re: (Score:2)
Maybe someone else has a decoder ring, but it's alphabet soup trying to figure out what video card one should get.
Drink your Ovaltine. :)
No 4k numbers? (Score:2)
How many 4k monitors can it simultaneously drive? If I want a 2x or 3x 4k setup, will it drive it? I see a Dual DVI, an HDMI and a DP (so max of 2 for Seiki 39" 4k unless the D-DVI and HDMI share a channel...but it doesn't say).
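Rough link math helps here. Ignoring blanking intervals and encoding overhead, and assuming 24-bit color, the raw pixel data rate is just width x height x refresh x bits per pixel (a sketch; real link budgets are somewhat higher than this):

```python
def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    # Raw pixel data rate in Gbps; actual links also carry blanking
    # and 8b/10b-style encoding overhead on top of this.
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"3840x2160@30: {pixel_rate_gbps(3840, 2160, 30):.1f} Gbps")
print(f"3840x2160@60: {pixel_rate_gbps(3840, 2160, 60):.1f} Gbps")
```

So a 30 Hz 4K panel like the Seiki needs roughly 6 Gbps of pixel data, which fits in single-link HDMI of this era, while 60 Hz needs roughly 12 Gbps, which at the time only DisplayPort 1.2 carries over a single cable. That's why the exact mix of outputs (and whether the DVI and HDMI share a channel) matters for a multi-4K setup.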
Cost doesn't matter when... (Score:2)
Cost is only unimportant when you're spending someone else's money.
Re: (Score:2)
Re: (Score:2)
Why would I spend more than I have to on a rig that will never see gaming? I need (okay, want) pixels for photo editing and 2D CAD work / large format PDF review. No amount of money will speed up any of those operations as there are no 2D accelerators or GPU-bound functions in the programs I use. I just want a large canvas, which means pushing lots of pixels.
And, let's face it, if I'm considering $500 Seiki's, it's not exactly an enormous amount of money.
Re: (Score:2)
No amount of money will speed up any of those operations as there are no 2D accelerators or GPU-bound functions in the programs I use.
Welcome to the 1990s, I guess?
Re: (Score:2)
I suspect he means that if his graphics-drawing functions were sped up massively, it would have no noticeable benefit to him since they are already effectively instantaneous.
That's how I read it anyway.
Unless he has 3D functions that are CPU-bound. In which case, what you said.
Re: (Score:2)
You have a GPU solution to speed up Photoshop and Lightroom? How about PDF rendering? More than 99% of all construction projects in the world are designed and printed in 2D format -most are too small to justify the cost of the least expensive 3D modeling option. I do more than 200 small construction projects a year - which means I have, on average, 8-10 billable hours from the time the client calls me to say they need drawings to the time I finish designing, drawing, reviewing, printing, and shipping out t
Re: (Score:2)
You have a GPU solution to speed up Photoshop and Lightroom? How about PDF rendering?
I know an AC has already addressed these points, but I feel like addressing them again, and I have time.
Not only is at least Photoshop already GPU-accelerated, but PDF rendering is also 2D-accelerated. Things like drawing lines have been accelerated by video cards since Windows 3.1 or thereabouts. That's when the first consumer-level PC 2D accelerators started to come out, from names like ATI and Radius. They had bigger, more specialized video drivers than did earlier video cards, because they performed 2d acce
Re: (Score:3)
Video cards even used to be designed to accelerate AutoCAD for DOS, and had special drivers for this purpose.
Oh God, I think I remember writing some of those :).
Re: (Score:2)
I run $3000 in monitors in a 4960x1600, three-head display. I use a 3+ year old Radeon 5750 card to drive them and it works exceptionally well. Which card would you recommend to accelerate 2D photo display and 2D bitmapped and vector PDF files? There aren't any because none of the software supports it so no graphics card companies make them. To get a multi-thousand dollar "pro" card would simply be throwing money away.
Final Fantasy XIV (Score:2)
In Soviet Russia, FreeBSD serves YOU! (Score:3)
Sorry, ladies and gentlemen. I was a longtime fan of Radeons, and I bought myself a shiny new Radeon+Phenom notebook - just to find that the Radeon X-Windows drivers don't support FreeBSD anymore. They need kernel mode setting (KMS), which is obviously absent. Now the FreeBSD team is implementing it while my notebook collects dust. The Nvidia drivers are closed-source and glitchy - but at least they exist and they work.
No problems here (Score:5, Insightful)
Despite the massive amount of bashing going back and forth here, I feel compelled to point out that I've swapped back and forth between both AMD/ATI and NVidia over the years and I've run into problems with brand new games having glitches with one or the other on both sides. Even having said that, I'm talking two or three times in over a decade. Aside from that I've had fans go out on one card, and it still lasted long enough after that that I didn't feel bad when it came time to buy a new one.
For most people it really doesn't matter what card you get as long as it isn't ancient. For enthusiasts, compare specs and get what you need. If the specs look like they're in Klingon to you, take the time to learn what's what. If you can't be arsed to do that, then you aren't an enthusiast in the first place.
This isn't like rooting for your home sports team. There is no justifiable reason to give complete loyalty to any company when weighing your purchases.
hardly news, still not open source (Score:2)
when they release a fully enabled GPL driver, i'll be happy to rip out my Nvidia card and buy AMD's card. in the meantime, i'll stick with my Nvidia card and the ever improving Nouveau driver.
Re: (Score:1)
AMD's Open Source driver is miles better than Nvidia's. Heck, Nvidia doesn't even have one! Some guys had to reverse engineer one. The Open Source Radeon driver is almost on par with fglrx. Try doing some research.
Re: (Score:3, Interesting)
Re: (Score:2)
At least AMD didn't have several years when reflowing the solder on cards and laptops was standard troubleshooting practice.
Re: (Score:2)
There is a reason that hardware-accelerated audio died 15 years ago. It's because CPUs from that era were so powerful that many-channel realtime software mixing wouldn't use even a fraction of a percent of CPU time.
The best way to keep this in perspective is to realize that the number of samples per second for audio i
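The parent's arithmetic can be sketched out directly (a rough model; it assumes plain additive mixing with one multiply for volume and one add per channel per sample, and the channel count and CPU speed are illustrative):

```python
SAMPLE_RATE = 44_100   # CD-quality samples per second
CHANNELS = 64          # a generous many-channel game mix
OPS_PER_SAMPLE = 2     # one multiply (volume scaling) + one add per channel

# Total mixing work per second of audio.
ops_per_second = SAMPLE_RATE * CHANNELS * OPS_PER_SAMPLE

cpu_hz = 2e9           # a modest ~2 GHz core of the mid-2000s
print(f"mixing load: {ops_per_second / cpu_hz:.4%} of one core")
```

Even with generous assumptions the mix costs a few million operations per second, well under one percent of a single core, which is why dedicated mixing hardware stopped paying its way.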
Re: (Score:1)
If you compare the quality of 3D audio before Vista with current games, it's obvious something went wrong. Sure, we have a lot more CPU power, but games don't use it for sound.
Re: (Score:2)
While I don't disagree with your post, and you made me more than a little nostalgic at the mention of Scream Tracker, I feel I should point out that CPUs in that era were running ONLY your code, and in a deterministic manner. It was comparatively easy to ensure your mixed sound made it to the DMA buffer before it emptied.
In this era of pre-emptive multi-tasking operating systems, unless you're running a realtime kernel there is no longer a guarantee that your multi-channel rendered audio will be ready in t
Re: (Score:1)
TrueAudio is an interesting technology, IMO. While graphics improved every year, sound is actually worse now than in the XP era. I'm kinda optimistic that games will use this technology, since it's pretty much the same chip as in the PS4. If this were an AMD-only technology, I would expect it to be used in about as many titles as GPU PhysX is. Pretty much useless. But since it's also in the PS4, I'm looking forward to games with better sound.