AMD Hardware

AMD Continues To Pressure NVIDIA With Lower Cost Radeon R9 270 and BF4 Bundle

Posted by samzenpus
from the hot-off-the-presses dept.
MojoKid writes "The seemingly never-ending onslaught of new graphics cards as of late continues today with the official release of the AMD Radeon R9 270. This mainstream graphics card actually leverages the same GPU that powered last year's Radeon HD 7870 GHz Edition. AMD, however, has tweaked the clocks and, through software and board-level revisions, updated the card to allow more flexible use of its display outputs (using Eyefinity no longer requires a DisplayPort). Versus the 1GHz (GPU) and 4.8Gbps (memory) of the Radeon HD 7870 GHz Edition, the Radeon R9 270 offers slightly lower compute performance (2.37 TFLOPS vs. 2.56 TFLOPS), but much more memory bandwidth--179.2GB/s vs. 153.6GB/s, to be exact. AMD and its add-in board partners are launching the Radeon R9 270 today, with prices starting at $179. The Radeon R9 270's starting price is somewhat aggressive and once again puts pressure on NVIDIA: GeForce GTX 660 cards, which typically perform below the Radeon R9 270, are priced right around the $190 mark. Along with this card, AMD is also announcing an update to its game bundle; beginning November 13, Radeon R9 270 through R9 290X cards will include a free copy of Battlefield 4. NVIDIA, on the other hand, is offering Splinter Cell: Blacklist and Assassin's Creed IV: Black Flag, plus $50 off a SHIELD portable gaming device, with GTX 660 and 760 cards."
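The bandwidth and TFLOPS figures in the summary follow from simple arithmetic; here is a rough sketch in Python (the 1280-shader count and 925 MHz boost clock are the R9 270's published specs, assumed here rather than taken from the summary):

```python
# Memory bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps per pin)
def mem_bandwidth_gbps(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

# Radeon HD 7870 GHz Edition: 256-bit bus, 4.8 Gbps GDDR5
print(round(mem_bandwidth_gbps(256, 4.8), 1))   # 153.6 GB/s
# Radeon R9 270: 256-bit bus, 5.6 Gbps GDDR5
print(round(mem_bandwidth_gbps(256, 5.6), 1))   # 179.2 GB/s

# Single-precision compute: shaders * 2 FLOPs per clock * clock (GHz) -> TFLOPS
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

print(round(tflops(1280, 0.925), 2))            # 2.37 TFLOPS (R9 270 at 925 MHz)
```

The same two formulas reproduce the 7870 GHz Edition's 2.56 TFLOPS (the same 1280 shaders at 1 GHz), which is why the new card trades a little compute for a lot of bandwidth.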

  • by Virtucon (127420) on Wednesday November 13, 2013 @08:09PM (#45418605)

    Time to get the shopping done.. These bundles are getting sweeter and sweeter.

  • Along with this card, AMD is also announcing an update to its game bundle; beginning November 13, Radeon R9 270 through R9 290X cards will include a free copy of Battlefield 4.

    Beginning November 13th, you say....

  • by mythosaz (572040) on Wednesday November 13, 2013 @08:16PM (#45418653)

    Maybe someone else has a decoder ring, but it's alphabet soup trying to figure out what video card one should get.

    http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#Comparison_tables:_Desktop_GPUs [wikipedia.org]
    http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units#Comparison_table:_Desktop_GPUs [wikipedia.org]

    If you stare at the article above, it's a blob of numbers worthy of A Beautiful Mind.

    I left the PC gaming rat-race a while back, and I've never been happier -- the only real downside is that I can't possibly suggest to people what video card to buy beyond saying, "meh. Go spend $200 on Newegg."

    • You're right... after looking at the numbers long enough, you start to see a pattern!

      Oh my God... the Russians! :)

    • Well, if your computer isn't complete crap, that's about all the information they'll need.

      But tbh, I built an A10 APU machine for a friend recently, and you can get a GPU matching that machine's performance for $100, which will basically run any game out today at decent quality. Add another $50 or so and you'll be up at the performance level of next-gen console hardware; beyond that, you're leaving the consoles well behind.

      Generally speaking though, as you said, people will do more than well with a $200 G
    • by OverlordQ (264228) on Wednesday November 13, 2013 @08:34PM (#45418793) Journal

      Chart for the lazy. [tomshardware.com]

    • The thing that gets really annoying is what crops up (usually a bit later in a given chipset's lifetime) when the various models become simply impossible to rank in order (without thorough benchmarking), rather than merely needing a lookup table to translate between model numbers and actual specs.

      Model numbers, unfortunately, are garbage through and through, but once you get into the realm of "Well, this one has 1GB of RAM; but it's DDR3 on a 64-bit interface, while the other one only has 512; but it's
      • by mythosaz (572040)

        For the enthusiast, the multitude of options are welcome, but for everyone else... ...not so much.

        • For the enthusiast, the multitude of options are welcome, but for everyone else... ...not so much.

          Choice is good, what I dislike is the fragmentation-into-meaninglessness (and sometimes outright intent to deceive, like the cards that take a bottom-of-barrel GPU and throw in an impressive-sounding amount of RAM, albeit pitifully slow DDR on a narrow bus, then slap a big model number and a picture of a CGI chick riding a dragon or something on the box). Right now, the main contenders appear to be HD5450s on the AMD side and GT610s on the Nvidia side, with 2GB of DDR3. On the plus side, the whole damn car

    • by gman003 (1693318) on Wednesday November 13, 2013 @08:49PM (#45418891)

      AMD's scheme right now is actually pretty easy.

      The first number is the generation. We're on "2", even though they just started this new numbering scheme this year, but that's fine.

      The next number is the "category". The best way to think of it is monitor resolution: _70 is for 1080p - you'll get 60FPS+ on most games at max settings; real killers may need a settings drop, but you'll generally be fine. _80 is for 120Hz or 1440p monitors, and _90 is for tri-monitor 1080p, 4K, or obscene multi-GPU rigs. A _60 part is lower-quality 1080p - think "high" or "medium", not "max".

      An X suffix means it's the "full" part; the lack of an X means it's been binned in some way (reduced clockspeeds and/or some cores disabled). For example, the 290X has 44 "compute units", while the 290 has 40 at a slightly lower clockspeed. The 270s both have 20 compute units, but the 270X is clocked about 10% higher.

      Since both new consoles use AMD chips, it's worthwhile to compare to them. The PS4 is a bit weaker version of the 270, and the Xbox One is a slightly underclocked 260.

      Nvidia's scheme is similar (add another 0 on the end for no reason, swap "Ti" for "X", and be generation 7 instead of 2), but they've complicated it right now by not rebadging old chips under new names. AMD's recent launches were basically "launch a new 9-tier chip, take all the old ones, up the clockspeeds, bump them down a tier, and cut their prices accordingly". The 270s that just launched are essentially overclocked 7870s (think "180X").

      Right now, Nvidia's lineup runs from the 650 and 650 Ti Boost (medium-end 1080p) through the 660 and 760 (high-end 1080p) and the 770 and 780 (1440p) up to the 780 Ti (4K). Nobody's really sure whether they're going to launch more low-end 700-series parts. They're also looser about which ones are low bins of what - the 780 is a binned 780 Ti, but the 760 is a binned 770.

      PS: Ignore the Titan. It's no longer a gaming card - the 780 Ti outperforms it (the Titan is a binned 780 Ti), at $300 less.
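If it helps, the naming scheme described above can be sketched as a small decoder (a toy illustration; the tier-to-resolution targets below are this comment's rules of thumb, not anything AMD publishes):

```python
# Decode an AMD "R9 2xx"-era model number per the rules of thumb above.
# NOTE: the tier descriptions are the parent comment's heuristics, not AMD's spec.
TIER_TARGETS = {
    60: "1080p at high/medium settings",
    70: "1080p at max settings, ~60 FPS in most games",
    80: "120 Hz or 1440p monitors",
    90: "triple-1080p, 4K, or multi-GPU rigs",
}

def decode_amd_model(name):
    # e.g. "R9 270X" -> generation 2, tier 70, full (unbinned) part
    number = name.split()[-1]
    full_part = number.endswith("X")
    digits = number.rstrip("X")
    return {
        "generation": int(digits[0]),
        "tier": int(digits[1:]),
        "target": TIER_TARGETS.get(int(digits[1:]), "unknown"),
        "binned": not full_part,  # no X suffix = lower clocks and/or disabled cores
    }

print(decode_amd_model("R9 270X"))
print(decode_amd_model("R9 290"))
```

Eight paragraphs of prose, but only about four fields of actual information per model number.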

      • I hope you realize that you just wrote EIGHT paragraphs to describe two naming conventions. It's kind of absurd that such a thing is required.

        • by gman003 (1693318)

          Try coming up with a better one. Invent a new naming convention that still hits all the same price points (let's say $150, $170, $200, $250, $300, $350, $450, and $600). And also accounts for releasing a new batch every year, maybe every other year.

          Seriously, I tried once. I ended up in about the same place AMD is right now. Nvidia's naming is a bit wonky, partially because they've never been the clearest, partially because AMD just forced them to drop prices VERY abruptly, and partially because they're in

          • I agree that the model number apparently does convey a lot of information. The guy still did spend 8 paragraphs explaining it, and lost me somewhat along the way (how can something be said to be universally 60fps on "max settings" when there are so many games out there??). The model numbers can be well constructed and yet completely arcane to a once-every-few-years purchaser. And that's to someone who has been gaming since the Voodoo2 (many video card generations of knowledge). I can only imagine what the la

      • by Xest (935314)

        If you have to use more than a paragraph to explain it, it's certainly not simple. So the first number is generation but how do we know what generation we're on? You say we're on generation 2 and that's fine, but there's more than 2 generations of graphics card. Is this 2nd generation card better than the 5th or whatever generation card I bought last year before the new naming scheme?

        What relation does _70 have to 1080p, 60fps exactly? I get that

        Why would you have full part and half part cards, what exactly

    • Does anyone still use dual video cards anymore? Are SLI or Crossfire still in use?

      I don't build systems, but I haven't heard anything about them for a while and am just wondering if NVidia and ATI/AMD still run those lines.

      • Yes, though there are no specific cards; all are "SLI" or "CrossFireX" parts. You used to need a special "Master" card and a Y-connector for the DVI ports for CrossFire; that was abandoned. CrossFireX, as it is now known, is all internal connections. SLI is as it's always been.

        Cards featuring two GPUs are just SLI / CrossfireX parts skipping the PCIe bus for communication.

        There's now "Hybrid" CrossfireX which allows use of integrated graphics (AMD Fusion chips) as well as discrete video cards to increase
    • It's real easy ... pick your budget and your tier will follow.

      http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html [tomshardware.com]

    • Maybe someone else has a decoder ring, but it's alphabet soup trying to figure out what video card one should get.

      Drink your Ovaltine. :)

  • How many 4k monitors can it simultaneously drive? If I want a 2x or 3x 4k setup, will it drive it? I see a Dual DVI, an HDMI and a DP (so max of 2 for Seiki 39" 4k unless the D-DVI and HDMI share a channel...but it doesn't say).
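    Back-of-the-envelope: how many 4K panels the outputs can drive is mostly pixel-clock arithmetic (a rough sketch; the ~20% blanking overhead is an approximation, and 30 Hz is what the HDMI 1.4-era Seiki panels accept):

```python
# Rough pixel-clock estimate for one panel (ignores exact blanking timings).
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.2):
    # ~20% overhead approximates horizontal/vertical blanking intervals
    return width * height * refresh_hz * blanking_overhead / 1e6

print(round(pixel_clock_mhz(3840, 2160, 30)))  # ~299 MHz per panel at 30 Hz
print(round(pixel_clock_mhz(3840, 2160, 60)))  # ~597 MHz per panel at 60 Hz
```

    At 30 Hz each panel needs roughly 300 MHz of pixel clock, which dual-link DVI and HDMI 1.4 can each just about carry on separate channels; 60 Hz per panel pushes into DisplayPort territory.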

    • If you are springing for multiple 4k monitors, I don't see why you would be going for a value video card.
      • Why would I spend more than I have to on a rig that will never see gaming? I need (okay, want) pixels for photo editing and 2D CAD work / large format PDF review. No amount of money will speed up any of those operations as there are no 2D accelerators or GPU-bound functions in the programs I use. I just want a large canvas, which means pushing lots of pixels.

        And, let's face it, if I'm considering $500 Seikis, it's not exactly an enormous amount of money.

        • by drinkypoo (153816)

          No amount of money will speed up any of those operations as there are no 2D accelerators or GPU-bound functions in the programs I use.

          Welcome to the 1990s, I guess?

          • by Trogre (513942)

            I suspect he means that if his graphics-drawing functions were sped up massively, it would have no noticeable benefit to him since they are already effectively instantaneous.

            That's how I read it anyway.

            Unless he has 3D functions that are CPU-bound. In which case, what you said.

          • You have a GPU solution to speed up Photoshop and Lightroom? How about PDF rendering? More than 99% of all construction projects in the world are designed and printed in 2D format -most are too small to justify the cost of the least expensive 3D modeling option. I do more than 200 small construction projects a year - which means I have, on average, 8-10 billable hours from the time the client calls me to say they need drawings to the time I finish designing, drawing, reviewing, printing, and shipping out t

            • by drinkypoo (153816)

              You have a GPU solution to speed up Photoshop and Lightroom? How about PDF rendering?

              I know an AC has already addressed these points, but I feel like addressing them again, and I have time.

              Not only is Photoshop, at least, already GPU-accelerated, but PDF rendering is also 2D-accelerated. Things like drawing lines have been accelerated by video cards since Windows 3.1 or thereabouts. That's when the first consumer-level PC 2D accelerators started to come out, from names like ATI and Radius. They had bigger, more specialized video drivers than earlier cards did, because they performed 2d acce

              • by 0123456 (636235)

                Video cards even used to be designed to accelerate AutoCAD for DOS, and had special drivers for this purpose.

                Oh God, I think I remember writing some of those :).

  • Best card for Final Fantasy XIV at 1920x1080 with maximum settings, taking into account they announced a DX11 update in 2014?
  • by Thor Ablestar (321949) on Wednesday November 13, 2013 @09:23PM (#45419063)

    Sorry, ladies and gentlemen. I was a longtime fan of Radeons, and I bought myself a shiny new Radeon+Phenom notebook - just to find that the Radeon X.Org drivers no longer support FreeBSD: they require kernel mode-setting (KMS), which is absent from the kernel. Now the FreeBSD team is implementing it while my notebook collects dust. The Nvidia drivers are closed-source and glitchy - but at least they exist and they work.

  • No problems here (Score:5, Insightful)

    by Rakhar (2731433) on Thursday November 14, 2013 @01:46AM (#45420209)

    Despite the massive amount of bashing going back and forth here, I feel compelled to point out that I've swapped back and forth between both AMD/ATI and NVidia over the years and I've run into problems with brand new games having glitches with one or the other on both sides. Even having said that, I'm talking two or three times in over a decade. Aside from that I've had fans go out on one card, and it still lasted long enough after that that I didn't feel bad when it came time to buy a new one.

    For most people it really doesn't matter what card you get as long as it isn't ancient. For enthusiasts, compare specs and get what you need. If the specs look like they're in Klingon to you, take the time to learn what's what. If you can't be arsed to do that, then you aren't an enthusiast in the first place.

    This isn't like rooting for your home sports team. There is no justifiable reason to give complete loyalty to any company when weighing your purchases.

  • When they release a fully enabled GPL driver, I'll be happy to rip out my Nvidia card and buy AMD's card. In the meantime, I'll stick with my Nvidia card and the ever-improving Nouveau driver.

    • by Anonymous Coward

      AMD's open-source driver is miles better than Nvidia's. Heck, Nvidia doesn't even have one! Some guys had to reverse-engineer one. The open-source Radeon driver is almost on par with fglrx. Try doing some research.
