AMD's New Radeon HD 6870 and 6850 Cards Debut

MojoKid writes "AMD has officially launched their new Radeon HD 6800 series of graphics cards and the company has managed to drive cost and power consumption out of the product, while increasing performance efficiencies in the architecture. The Radeon HD 6870 and Radeon HD 6850 are new midrange cards that offer similar performance to previous generation high-end offerings, but at significantly lower price points and with an enhanced tessellation engine for better support of next generation DX11 game engines. The cards compete well with NVIDIA's GeForce GTX 470 and 460 products, besting them in some scenarios but trailing in others. Word is AMD is readying their flagship high-end Radeon 6900 family for release in Q4 as well."
This discussion has been archived. No new comments can be posted.


  • by BadAnalogyGuy ( 945258 ) on Friday October 22, 2010 @01:55AM (#33983058)

    I can't wait to see these things in the store! Graphics cards are so cool. You can of course play graphics on them, but you can also do cool stuff like encryption and supercomputer type of stuff.

    Man, I can't get enough of these graphics cards stories! Oh yeah!

    • Re: (Score:3, Insightful)

      by erroneus ( 253617 )

      Sarcasm appreciated. (Really, you should get yourself a sarcasm sign)

      A co-worker has stated on numerous occasions that it is time for hardware makers to go on vacation for at least a year. Software is not catching up with the advances in hardware. Further, these advances serve no real need: nothing runs slowly on yesterday's hottest new thing.

      Microsoft has already updated beyond any need as can be demonstrated by nearly everyone's reluctance to go beyond Windows XP. MS Office demonstrates the same po

    • I'm honestly not sure whether you're serious or not - it could be both.

      Especially as I was thinking "why aren't these stories in the games section?". I mean, who uses dedicated graphics cards other than hardcore gamers, when nowadays all motherboards come with integrated graphics that provides you with more than enough rendering power for normal office work, web browsing, and watching videos?

      The only applications I can think of are games (and only the latest and most demanding ones that are not handled well

      • Try a decent dedicated card and you'll be amazed at how much better windows 7 runs...

        Certainly the high end cards are not fully utilized unless you are playing games, but a decent mid range card runs circles around integrated graphics in normal everyday applications in my experience.

        • I'm running Linux, not Windows (except XP in VirtualBox for e-banking).

          Makes me wonder why one would need a dedicated graphics card just for an OS. If it needs one, the OS is taking up too many resources.

          In line with that: computer hardware has become hundreds if not thousands of times more powerful over the last decade or so. I still have the feeling that the software I'm running is working slower than 10 years ago.

          • Re: (Score:3, Informative)

            by Dr_Barnowl ( 709838 )

            It's because Aero uses composited textures to draw the screen, so it's reliant on GPU performance. Compiz does much the same thing, so Linux can do a similar resource-eating trick.

            Turn off the pretty and Win7 will look a little plainer but run a little snappier. I still do this with WinXP, just because the Fisher-Price theme has really chunky title bars that take up extra screen real estate.

            I remember when graphics cards sold on their ability to accelerate 2D drawing operations to make Windows go faster...

            • Re: (Score:2, Interesting)

              by Anonymous Coward

              A little snappier!? Have you ever logged the number of WM_PAINT messages sent to client windows when using different compositors? Aero doesn't need to send as many repaint messages, since it knows the HWND hasn't been touched and can just blit whatever is cached in the texture. THAT is way more efficient than the application re-generating the image just because you exposed a few pixels of a window; applications rarely respect the RECT parameter of the WM_PAINT message.

              The "wasteful" part of Aero is that it doe
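The saving the AC describes can be sketched with a toy model (all names here are made up for illustration; this just counts repaints, it doesn't touch real Win32 messages):

```python
# Toy model: without a compositor, every expose event forces the app to
# redraw; with one, the window manager blits a cached texture and only
# asks the app to repaint when its contents actually change.

class Window:
    def __init__(self):
        self.repaints = 0      # times the app had to regenerate its image
        self.cached = False    # does the compositor hold a texture copy?

    def expose(self, composited):
        """Another window was dragged away, revealing this one."""
        if composited and self.cached:
            return "blit"      # cheap: copy the cached texture
        self.repaints += 1     # expensive: app redraws (often the whole
        return "repaint"       # window, ignoring the invalid rect)

    def content_changed(self, composited):
        """The app updated its own contents."""
        self.repaints += 1
        if composited:
            self.cached = True  # compositor re-captures the texture

def simulate(composited, exposes=10, updates=2):
    w = Window()
    w.content_changed(composited)       # initial draw
    for _ in range(updates):
        w.content_changed(composited)
    for _ in range(exposes):
        w.expose(composited)
    return w.repaints

print("repaints without compositor:", simulate(False))  # → 13
print("repaints with compositor:   ", simulate(True))   # → 3
```

Ten expose events cost ten full repaints without compositing, and zero with it; only genuine content updates force a redraw.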

            • And yet OpenSUSE seems to do just fine with an integrated graphics card and moderate graphic effects. Just because MS screws it up doesn't mean that you can't have something that's pretty and snappy.
              • If you really want a simple UI, go with Fluxbox and develop your own theme for it; it's not hard. If I want to simply get things done I use Fluxbox, as it stays the hell out of my way. Otherwise I'll stick with KDE 3.5.10, as it offers the features I actually use while staying out of my way as much as possible. Of course I also use Gentoo instead of any of the others - not because of the Ricing Effect but because I decide what dependencies are installed and how the system is configured i

            • Actually, if you have any kind of dedicated GPU (it doesn't have to be a good one; a cheap midrange card for $50 or less is fine), turning off Aero will actually harm performance, because it increases the CPU load. It will probably even increase power consumption; you idle the GPU but must increase the CPU clock rate (Windows, like most modern OSes, dynamically scales down the CPU at less-than-peak load).

              I can understand reverting to the classic theme on an aesthetic basis, especially on XP (I happen to like A

        • Try a decent dedicated card and you'll be amazed at how much better windows 7 runs...

          I run Windows 7 on both my Intel powered laptop and my ATI powered desktop. The dedicated graphics does nothing for Windows 7 that the Intel one can't do just as well.

          • For everyday desktop usage, that may be true. I'd wager that the Radeon is non-trivially better for DirectX Video Acceleration.
            • The Intel one plays 1080p H.264 with no issues. CPU usage is higher, but unless you're watching task manager while watching your movies you'd never know.

              It's an Intel GMA 4500MHD vs an ATI HD 5670, for those that are interested.

              • by cynyr ( 703126 )

                or doing an encode, a compile, or BOINC in the background.

                Some of us do that sort of thing you know.

                Yet my 8600GT will happily decode 1080p H.264 High profile level 4.1 @ ~2Mbps while ffmpeg is encoding a DVD to H.264 at the same time.

    • OK, I do some CUDA code, so watching ATI make NVIDIA up their pace rocks. I like Crysis for the exact same reason: I find the gameplay a bit dull, but it sure makes people buy primo video cards.
    • Man, I can't get enough of these graphics cards stories! Oh yeah!

      Just because you're not interested, it doesn't mean someone else isn't. It's like you were driving your car down a highway and you see an exit for a road that's not on your route. You might not care about that exit. But for someone who needs gas at the station on that intersection, that exit is very well timed. That doesn't mean you need to take the exit either. Unless you're just curious and you want to drive around a bit. Then you better fill up anyway because you never know where the next gas stati

    • by T.E.D. ( 34228 )

      Try playing the just-released Civilization V without a DX11 card, and you'll be singing this tune for real. I have a pretty good non-DX11 card, and it's painful (for a frigging turn-based strategy game!). Pretty much every PC gamer is going to need one of these cards the next time they buy a new game.

      Right now the cheapest decent DX11 cards are Nvidia's 460 series, which tend to go for $200 or so. Perhaps that's no big deal to some, but I have a family to support, so $200 is kinda painful. I've been eagerl

  • "The Radeon HD 6870 and Radeon HD 6850 drop in at $239 and $179 MSRP, respectively. "
  • up to six LCDs (Score:5, Insightful)

    by iamhassi ( 659463 ) on Friday October 22, 2010 @02:21AM (#33983144) Journal
    This is what I'm interested in: [] "....six display controllers offering six TMDS links. This lets users connect up to six displays as independent display heads, or span display heads across multiple physical displays using the Eyefinity technology. The new HDMI 1.4a connector standard is made use of, which gives you support for stereoscopic 3D standards such as Blu-ray 3D; the two mini-DisplayPort 1.2 connectors support Multi-Stream technology that lets you daisy-chain 3 physical displays per connector, letting you wrap up a 6-display Eyefinity array using just those two connectors."

    Sounds great! Tired of selling an old monitor to buy a new one that's 2" larger and a few hundred more pixels, much rather just get a second (or third, or fourth, etc) same-sized LCD and double the pixels.
    • No matter what that A4-sized page still doesn't fit in a readable manner on the monitor(s)...

      • Re: (Score:2, Insightful)

        by espiesp ( 1251084 )

        Actually, by the time you get into the 22+" size (non-widescreen) you can fit two A4 pages side by side at 1:1 ratio. However, this isn't accounting for toolbars and the like, so my preference is a 20-22" non-widescreen, or a 22+" widescreen rotated 90°. I've used this in Electronic Document Imaging applications, real world, with a lot of seat time, and it's a VERY workable solution. Gives plenty of room for a single A4 page with toolbars on top and side.

        The one catch is that some monitors have asymmetrical and/or narr

        • It would be great if my desk were that big... space comes at a premium in this part of the world.
          • Ok, I'm curious where you are located.

            Also, for about double the cost of the monitor, you can get a nice vesa mount stand that gives you the entire footprint of your desk back. It was one of those purchases that felt very silly and wasteful, to show off... and then ended up being practical and a great use of the money.

        • Another problem is with subpixel rendering of fonts - the rendering engines are optimised for the pixels in their horizontal configuration and it doesn't quite look right when aligned vertically.
          • Indeed. I tried keeping a monitor in pivot at work, but couldn't stand the rendering of text. And the viewing angles of the monitor weren't too good either, which didn't really help.

            Works somewhat well if used to examine scanned documents though.

      • by pz ( 113803 )

        No matter what that A4-sized page still doesn't fit in a readable manner on the monitor(s)...

        Take a bog-standard, old 1600x1200 LCD monitor (doesn't even have to be an LCD, but they tend to fare better with what I'm about to suggest) and rotate it 90 degrees clockwise. Some mounts allow this, some do not; you may need to purchase a new mount. Then, assuming you have a reasonably modern driver and video card combination, tick the box to accommodate the rotation. Reboot or restart your display manager as indicated.

        Voila, a very nice fit for a single-sheet of A4 or US Letter paper at essentially lif

    • I was a bit concerned by the large variety of ports: could they all work at the same time, or was I stuck buying a not-yet-available DisplayPort Multi-Stream Transport hub?

      Seems this image [] clears that question right up: two monitors are connected to DVI and 4 are connected through a hub, so I see no reason why I can't purchase two cheap DisplayPort to DVI adaptors [] and have up to four monitors connected by the very common DVI port.

      one $180 video card, one PCI-E 16x slot, 4+ LCDs. Sounds good.
    • I'm running three monitors now. A large central, plus two old-school 4:3 20" turned portrait, one on either side. In fact, I'm reading/posting on /. on the right portrait monitor right now. 1200 pixels is wide enough for practically every reading application, so the sides are for web, email, documents, calendar, note taking, task management, etc. My center monitor is for CAD, two page document editing, engineering analysis, and large format PDFs (architectural drawings).

      It _is_ awesome, though when it's wa

  • These cards don't have nearly the computational ability I'd hoped for. Even the 5800 series is faster! Fermis are definitely faster for my applications, especially for 32-bit integer multiplication.

    • Re: (Score:3, Informative)

      The replacement for the 5870 will be the 6900 series, not the 6870. This is confusing, as the x900 series used to be dual-GPU cards, but this time it isn't.
      • by Kjella ( 173770 )

        There has been exactly one "x900" card, and it's the 5970. Historically, the dual cards used to be called X2, as in 4870 X2, but the 5970 wasn't fully a 5870 X2 (that would break the ATX spec), so they gave it its own name and series. What is worse is that the 5870 performs better than the 6870, and the same goes for the 5850 and 6850. The price reduction is nice, but in all honesty these should have been named either as the 6700 series, or as 6850 and 6830 respectively.

    • Price-wise, at least in sterling, the 6850 is the replacement for the 5830 @ £150-ish, and the 6870 replaces the 5850 @ £200. Mid-range cards with a moderate performance bump (and cooler, and thus quieter).

      As already stated, the 6900 series will be the high-end performance cards, though single-GPU. Confusing, though not as bad as Intel, yet!

      • Though now I look at the price drops for Fermi-based NVIDIA cards: the 460 1GB is @ £150 and beats the 6850 handily, and the 470 @ £200 definitely outperforms the 6870 at the same price. So on bang for buck, the Fermis definitely win this round! I imagine the prices will drop for the 68xx series, or they're going to take a bit of a kicking this time round.

    • by AmiMoJo ( 196126 )

      It would be interesting to do a price/performance matrix for computational GPUs. At some point it has to be better to start buying more cheap GPUs instead of one or two expensive ones. The real limit might be the availability of motherboards with multiple PCI-E slots. Even though 16x cards can work electrically in 1x slots, most 1x slots physically will not accept a 16x card.
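The matrix the parent suggests is easy to rough out. A minimal sketch (the card names and throughput figures below are invented placeholders, not benchmarks; plug in measured numbers for your workload):

```python
# Rank hypothetical GPUs by compute throughput per dollar.
# All figures are illustrative placeholders.

cards = {
    # name: (price_usd, relative_compute_throughput)
    "budget":   (100, 1.0),
    "midrange": (180, 1.7),
    "high_end": (370, 2.6),
}

def throughput_per_dollar(price, perf):
    return perf / price

# Best value first.
ranked = sorted(cards.items(),
                key=lambda kv: throughput_per_dollar(*kv[1]),
                reverse=True)

for name, (price, perf) in ranked:
    print(f"{name:9s} ${price:3d}  perf {perf:.1f}  "
          f"perf/$: {throughput_per_dollar(price, perf):.4f}")
```

With these placeholder numbers the cheap card wins on perf/$, which is the parent's point: past some price, you are paying more per unit of compute, and the practical ceiling becomes how many slots (and how much power) one box can supply.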

  • I thought AMD was dropping the graphics card line?
    • Re:hmmmm (Score:4, Informative)

      by Surt ( 22457 ) on Friday October 22, 2010 @02:54AM (#33983266) Homepage Journal

      No, what you probably heard about was that they are dropping the ATI branding of the graphics cards. The cards themselves are alive and well, just AMD branded.

    • Nah, they just dropped the name "ATI" from their graphics cards. Previously, they released their graphics cards under the ATI brand, but I believe this release actually marks the retiring of the name.
    • Think about that one for a moment. AMD spent $5.4bn to acquire a graphics card manufacturer only to drop the graphics card line :-)
  • I've worked on some of the most cutting-edge GPU designs on the planet, from the low-level software stack down to the design changes needed to accommodate die shrinks. After looking at the HD 6870's design, one thing is clear. It needs more cowbell.

  • by mykos ( 1627575 ) on Friday October 22, 2010 @06:22AM (#33983904)
    1. This is a midrange; high end parts come next month
    2. $239 for the 6870, $180 for the 6850
    3. 5870 > 6870 > GTX 470 > 6850 > 5850 > GTX 460
    4. Crossfire scaling (for those who are dual-GPU inclined) is around 90%+ in most games
    5. Brand-new anti-aliasing filter: ATI has invented an edge-smoothing shader that looks incredible in most games, and it even works in games that don't have AA or where AA would give a huge performance hit. This "morphological AA" costs almost nothing in framerate.
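The post-process idea behind that "morphological AA" can be sketched very loosely: find high-contrast edges in the finished frame and blend across them. This toy grayscale version is not AMD's actual algorithm (real MLAA reconstructs edge shapes and computes pixel coverage); it just averages across detected edges:

```python
# Toy "post-process AA": detect neighboring pixels whose values differ
# sharply, then pull both toward their average. Operates on a grayscale
# image stored as a list of rows of floats in [0, 1].

def smooth_edges(img, threshold=0.5):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]           # copy; blend into the copy
    for y in range(h):
        for x in range(w):
            # check the right and bottom neighbor of each pixel
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and abs(img[y][x] - img[ny][nx]) > threshold:
                    avg = (img[y][x] + img[ny][nx]) / 2
                    out[y][x] = (out[y][x] + avg) / 2
                    out[ny][nx] = (out[ny][nx] + avg) / 2
    return out

# A hard vertical edge: black | white.
frame = [[0.0, 0.0, 1.0, 1.0] for _ in range(3)]
smoothed = smooth_edges(frame)
print(smoothed[0])  # → [0.0, 0.25, 0.75, 1.0]
```

The hard 0→1 step becomes a 0 → 0.25 → 0.75 → 1 ramp. Because it runs on the final image, it works regardless of whether the game supports AA, which is why a filter like this is so cheap compared to multisampling the whole scene.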
  • by TheSunborn ( 68004 ) on Friday October 22, 2010 @07:49AM (#33984142)

    Does anyone know what the status of the Linux drivers is (both open and closed)? Do I still have to buy an NVIDIA card if I want to use OpenGL with Linux, or did AMD finally release drivers with performance as good as the ones on Windows?

    • by armanox ( 826486 )

      Guess it's been a few years for you and ATi? Since about the middle of 2007 I haven't had any real issues with ATi's closed source driver. I don't recommend the open source driver for the most recent cards (FGLRX does a much better job).

      • That sounds about right. I don't switch graphics card that often :}

        But is the performance of ATI's closed-source drivers as good as the Windows version? (And can they be installed on Fedora 13 without wasting too many hours?)

        • by armanox ( 826486 )

          I haven't noticed too much difference in the Windows vs Linux versions, but I also haven't done a whole lot of gaming on Linux recently. I have a Radeon HD 4550 on an Ubuntu/Windows box and a Radeon HD 3200 on a Fedora/Windows box. I do think that my last round of updates broke FGLRX on Fedora 13 (I think X upgraded; haven't looked too much into it, will reply again later after checking). Installing the driver has been a simple matter of download, run the installer, startx. I guess I could try some

        • by rwa2 ( 4391 ) *

          Phoronix is the site that cares about that sort of thing. I can't seem to find their most recent article on Windows vs. Linux closed drivers, but I remember there being one within the last year or so.


          (There was still a big gap in 2007)

          All I know is that nVidia is really making a killing on all the Linux graphics clusters for government/defense sims that are replacing all the older SGI and SUN workstations. And it's still even difficu

    • I don't know what the performance is with the proprietary fglrx driver (ATI/AMD stopped supporting my X1600 Pro a while ago), but I think the current status of the open-source driver is "works, but don't expect anything amazing".
      Last I knew about the proprietary driver was that it performed better than the OSS ones, but still not as good as Windows.

      For reference: [] and the pages linked to from there.
  • Honestly, when I can't even keep my 4870s working reliably, why would I bother shelling out any money for these things? Call me when they hire folks who can create a driver set that works without having to purge and upgrade every month or so.


  • I know you shouldn't judge a book by its cover, but I was kinda interested when I read that these are now uber-efficient and such. Then I open the article, and both are massive structures that I'm not sure will even clear the hard drive bays in my case, and the 6870 requires not one but TWO dedicated graphics card power leads, with a 151W power draw under load.

    "Efficient" just don't mean what it used to.

  • But ... does it run Linux?

    I mean, I'm sure there will be a driver to use the card under Linux. But can you run Linux on the card?
