AMD's New Radeon HD 6870 and 6850 Cards Debut
MojoKid writes "AMD has officially launched their new Radeon HD 6800 series of graphics cards and the company has managed to drive cost and power consumption out of the product, while increasing performance efficiencies in the architecture. The Radeon HD 6870 and Radeon HD 6850 are new midrange cards that offer similar performance to previous generation high-end offerings, but at significantly lower price points and with an enhanced tessellation engine for better support of next generation DX11 game engines. The cards compete well with NVIDIA's GeForce GTX 470 and 460 products, besting them in some scenarios but trailing in others. Word is AMD is readying their flagship high-end Radeon 6900 family for release in Q4 as well."
Oh wow! New graphics cards! (Score:5, Funny)
I can't wait to see these things in the store! Graphics cards are so cool. You can of course play graphics on them, but you can also do cool stuff like encryption and supercomputer type of stuff.
Man, I can't get enough of these graphics cards stories! Oh yeah!
Re: (Score:3, Insightful)
Sarcasm appreciated. (Really, you should get yourself a sarcasm sign)
A co-worker has stated on numerous occasions that it is time for hardware makers to go on vacation for at least a year. Software is not catching up with the advances in hardware. Further, these advances are without any need. Nothing runs slowly on yesterday's hottest new thing.
Microsoft has already updated beyond any need as can be demonstrated by nearly everyone's reluctance to go beyond Windows XP. MS Office demonstrates the same po
Re:Oh wow! New graphics cards! (Score:4, Informative)
If you actually RTFA'd, you would realize this is actually an "efficiency" launch for AMD, with noticeably lower costs (and prices) for only slightly lower performance.
Nice rant, though.
Modest price/performance improvement in Germany... (Score:2)
Looking at the specs, it seems the 6870 might be about equal to the 5850 in performance. Also, power consumption under load is the same.
Looking at the prices at alternate.de, the 6870 is about 10-20% cheaper than the 5850. So we have a 10-20% improvement in performance/price. Better than nothing, but no spectacular improvement.
Re:Oh wow! New graphics cards! (Score:5, Insightful)
Pov-Ray runs slow on today's hottest new thing. So do various physics simulators. And just try to run a physics simulator and AI on the same machine (to do robot research without having to build actual robots)! In fact, even Dwarf Fortress [bay12games.com], an ASCII game, still slows down occasionally!
Simply because you use a modern computer as a glorified typewriter doesn't mean that we all do.
Re: (Score:2)
I'm honestly not sure whether you're serious or not - it could be both.
Especially as I was thinking "why aren't these stories in the games section?". I mean, who uses dedicated graphics cards, other than hardcore gamers, when nowadays all motherboards come with integrated graphics that provide you with more than enough rendering power for normal office work, web browsing and watching videos?
The only applications I can think of are games (and only the latest and most demanding ones that are not handled well
Re: (Score:2)
Try a decent dedicated card and you'll be amazed at how much better windows 7 runs...
Certainly the high end cards are not fully utilized unless you are playing games, but a decent mid range card runs circles around integrated graphics in normal everyday applications in my experience.
Re: (Score:2)
I'm running Linux, not Windows (except XP in VirtualBox for e-banking).
Makes me wonder why one would need a dedicated graphics card just for an OS. Then the OS is taking up too many resources.
In line with that: computer hardware has become hundreds if not thousands of times more powerful over the last decade or so. I still have the feeling that the software I'm running is working slower than 10 years ago.
Re: (Score:3, Informative)
It's because Aero uses composited textures to draw the screen, so it's reliant on GPU performance. Compiz does much the same thing, so Linux can do a similar resource-eating trick.
Turn off the pretty and Win7 will look a little plainer but run a little snappier. I still do this with WinXP, just because the Fisher-Price theme has really chunky title bars that take up extra screen real estate.
I remember when graphics cards sold on their ability to accelerate 2D drawing operations to make Windows go faster...
Re: (Score:2, Interesting)
A little snappier!? Have you ever logged the number of WM_PAINT messages sent to client windows when using different compositors? Aero doesn't need to send as many re-paint messages, since it knows the HWND hasn't been touched, so it can just blit whatever is cached in the texture. THAT is way more efficient than the application re-generating the image just because you expose a few pixels of a window; you see, applications rarely respect the RECT parameter of the WM_PAINT message.
The "wasteful" part of Aero is that it doe
Re: (Score:2)
Re: (Score:2)
If you really want to get into a simple UI go with fluxbox and develop your own theme for it as it's not hard. If I want to simply get things done I use fluxbox as it stays the hell out of my way. Otherwise I'll stick with KDE 3.5.10 as it offers me the features I actually use while staying out of my way as much as possible. Of course I also use Gentoo instead of any of the others - not because of the Ricing Effect but the fact that I decide what Dependencies are installed and how the system is configured i
Re: (Score:2)
Actually, if you have any kind of dedicated GPU (it doesn't have to be a good one; a cheap midrange card for $50 or less is fine), turning off Aero will actually harm performance, because it increases the CPU load. It will probably even increase power consumption; you turn off the GPU but must increase the CPU clock rate (Windows, like most modern OSes, dynamically scales down the CPU at less-than-peak load).
I can understand reverting to the classic theme on an aesthetic basis, especially on XP (I happen to like A
Re: (Score:2)
Try a decent dedicated card and you'll be amazed at how much better windows 7 runs...
I run Windows 7 on both my Intel powered laptop and my ATI powered desktop. The dedicated graphics does nothing for Windows 7 that the Intel one can't do just as well.
Re: (Score:2)
Re: (Score:2)
The Intel one plays 1080p H.264 with no issues. CPU usage is higher, but unless you're watching task manager while watching your movies you'd never know.
It's an Intel GMA 4500MHD vs an ATI HD 5670, for those that are interested.
Re: (Score:2)
or doing an encode, a compile, or BOINC in the background.
Some of us do that sort of thing, you know.
Yet my 8600GT will happily decode 1080P H264 level 4.1 High profile @ ~2Mbps while ffmpeg is encoding a dvd to h264 at the same time.
fantastic side effects. (Score:2)
Re: (Score:2)
Man, I can't get enough of these graphics cards stories! Oh yeah!
Just because you're not interested, it doesn't mean someone else isn't. It's like you were driving your car down a highway and you see an exit for a road that's not on your route. You might not care about that exit. But for someone who needs gas at the station on that intersection, that exit is very well timed. That doesn't mean you need to take the exit either. Unless you're just curious and you want to drive around a bit. Then you better fill up anyway because you never know where the next gas stati
Re: (Score:2)
Try playing the just-released Civilization V without a DX11 card, and you'll be singing this tune for real. I have a pretty good non-DX11 card, and it's painful (for a frigging turn-based strategy game!). Pretty much every PC gamer is going to need one of these cards the next time they buy a new game.
Right now the cheapest decent DX11 cards are Nvidia's 460 series, which tend to go for $200 or so. Perhaps that's no big deal to some, but I have a family to support, so $200 is kinda painful. I've been eagerl
Re: (Score:2)
Re: (Score:3, Insightful)
Re:Oh wow! New graphics cards! (Score:4, Interesting)
Re:Oh wow! New graphics cards! (Score:4, Informative)
According to Wikipedia, both use about 150W under full load. See http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units [wikipedia.org] for a comparison table.
But at the same time, the 6870 is of course faster. So if you don't need the extra performance, what about a 6850?
Should still be an upgrade in performance, have at least the same power advantage as the 6870 at idle, and uses only 127W under load.
Re: (Score:2)
The 6000 series cards offer little or no performance gain over the 5000s, and are in fact slower in some games. They are really just a refinement of the 5000s rather than an upgrade. They use less power and cost less to produce while giving similar performance in most games.
I am waiting for some computing benchmarks like Folding@Home and WPA cracking. The 6000s have fewer stream processors (hence lower cost, with the performance made up by architectural changes and improved algorithms) so I have a feeling t
Re: (Score:2)
Well, I was responding to Zuriel who considered one as replacement for his "too noisy in idle" 4870, but said he is happy with the performance.
So the 6850 would be suitable for him. Even the 5770 might work for reducing idle noise while keeping the performance level of the 4870 (but I'd prefer the 6850 over the 5770).
Re: (Score:2)
Also the 6850 is a ~9" card, and it is low wattage, making it great for those of us looking to cram some GPU power into 12L of computer case.
Re: (Score:2)
And yet the new 6xxx series includes more tessellation units. This means that as games using DX11 and tessellation become more common, the cards will suffer from fewer performance issues.
For example, I have a 5670 with 512M onboard; it's an oddball card as it's first gen and is in very limited production now. Running the Heaven 2.1 benchmark, the card got a higher score without AA enabled but frame rates went as low as 3.8, while the same test run with 2x AA got a lower score but showed minimum frame rates of 4.6
Re: (Score:2)
I'm actually quite disappointed in the newer power requirements.
I bought a 4670 not long ago because of power. With a max of 59W, compared to anything faster at 150-300W, they should be embarrassed.
Re: (Score:2)
"Anything faster at 150-300" is obsolete. According to a review I found at hartware.de, a 5670 with max. 64W will beat the 4670 clearly on performance.
According to the same site, a 5570 will beat the 4670 on power (max. 39W for the chip) at the expense of being a bit slower overall.
Sorry for picking a German review site, but they include the previous generation in their comparisons which helps.
BTW, according to Wikipedia the (future) 6670 is supposed to be a 60W part too. Might be interesting for PCs that a
Re: (Score:2)
Try to find a non-GPGPU daily use case to get your card to draw max power. The 4670 doesn't even have a PCIE power connector, nor does it support DX11 or OpenGL 3.0.
Also, the 5670 should be a performance upgrade over the 4670, and only uses 2 more watts at max load (but I hear the 5xxx series uses a lot less than the 4xxx series when idle).
All of this brought to you by an nvidia Linux user.
Re:Oh wow! New graphics cards! (Score:5, Informative)
From what I read at [H]ard|OCP today, the 6850 is an upgrade for a 5830 and below, while a 6870 is an upgrade for a 5850 and below.
source [hardocp.com].
Re: (Score:2)
Not everyone just gets their kicks out of drooling after graphics cards. Like e.g. I've got HD 4800 and still all the games I play run perfectly well at about 60 fps at max. details. There simply is no benefit in buying yet another card.
But alas, each to their own.
I have a HD 4850x2 and I don't get enough FPS.
But since the prices are dropping maybe I can buy myself another 4850x2 and CF them together.
GPUs are lagging behind games right now, unlike Intel CPUs; when you o/c those, there is no game yet that needs all that power.
Re: (Score:3, Insightful)
Re: (Score:2)
Erm... You do realise that games don't scale well past two GPUs, don't you?
Depends on the drivers; the 4850x2 is so exotic there are virtually no benchmarks in CF mode for modern games. But you can get a used one for 120E now, hopefully well under 100E once the 6xxx series comes out. The power consumption and case space might be a problem tho.
Re: (Score:2)
GPUs are lagging behind games right now, unlike Intel CPUs; when you o/c those, there is no game yet that needs all that power.
I don't see this. I have an older video card (somewhere in the ATI 4600 series) and haven't really noticed any issues with modern games. I haven't bought a game in a long time where I felt prompted to go out and buy a new card. Sure, I could probably squeeze an extra couple FPS out of games with a better card, but I find 40-50 FPS to be perfectly acceptable. Yes, new cards su
Re: (Score:2)
SC2 and WoW get CPU starved very fast. WoW is really bad, as addons are not given their own thread, and 4.x.x can only use 3 cores ATM. Simply running every addon/UI element in its own thread and letting the OS figure out CPU time would allow the game to scale with cores much better.
Re: (Score:2)
Re: (Score:2)
Re:Oh wow! New graphics cards! (Score:5, Interesting)
Meh a Ferrari
Well... yes.
I can appreciate the engineering, even be interested in test driving one, but OWNING one? Too much extra cost for too little extra value.
The same goes for graphics cards; I have an nVidia GTS8800, which is getting pretty long in the tooth, but it plays most of the games I own pretty reasonably (could be a bit faster on Crysis, I suppose ;-) ), largely because I haven't been buying new games with heavy 3D needs recently.
Why not? Well, partly because I'm less interested in playing games as I age; playing with ideas seems to be more interesting. Partly because the games that do appeal to me are increasingly indie titles that don't need much in the way of graphical grunt. And partly because most of the big titles that do need a powerful GPU are marred by being a total pile of arse, an MMO for which I don't have the time, or encumbered with such offensive DRM that I'd rather not let the box near my computer.
A platform only has value so long as it has a killer app - in the case of new GPUs, I don't have a game that I want to play, or a large set of numbers I need to crunch. I'm guessing that some time next year when Deus Ex: Human Revolution comes out, I might feel a small urge to upgrade.
I'm guessing that Slashdot attracts a substantial proportion of engineers who also see no practical reason for getting a new GPU beyond the "OOh, shiny!" factor, so I'm not surprised to see so many "Meh." responses to this article.
Re:Oh wow! New graphics cards! (Score:5, Funny)
So you don't buy games, don't use high-end graphic cards and don't particularly see the benefits of improved performance and lower power consumption (and cost), yet admire the engineering. Congratulations, you're now an adult.
Re: (Score:2)
Agreed on most points but my old GTS6800 didn't hold up to Dragon Age. So it's upgrade time on March 11.
Re: (Score:2)
Re: (Score:2)
8800GT or GTS8800? They are different.
Re: (Score:2)
And if that don't work, use more gun er GPU.
Re: (Score:2)
Real nerds don't ejaculate over incremental improvements to the status quo.
Which is what we're dealing with here.
If these were new cards that had some revolutionary new rendering algorithm, then maybe they would be "kicks worthy". As it is, news worthy maybe, but no more so than today's weather. Useful to know, but nothing to get excited over.
Re: (Score:2)
Yeah, it would be interesting to see what the dotslash crowd has to say. Wonder if it'd be anything similar to what the /. crowd has to say.
Re: (Score:3, Funny)
Where is your sense of adventure? (Score:2)
*start sarcastic attempt at humour*
Oh dude, you NEED one of these cards, like yesterday, man.
I pre-ordered one, stood in line last night, and today am the proud owner of one of these new shiny cards!
I'm just finishing the benchmarks now...wait a second...HAH!
Eleventy gajillion fps in Tuxracer! W00t!
And only for a $buck three-eighty!
*end sarcastic attempt at humour*
All joking aside...
I started my 'gaming' experience [semi-hardcore] around 1999-2000 with a PIII 800MHz w/ 512 MB RAM, and a nVidia TNT2-64 (32 MB
Re: (Score:2)
Think as far ahead with the motherboard/cpu socket/RAM slots and type/expansion slots as your budget allows.
Second priority... the hard drive. 500 GB minimum nowadays.
Fill in the blanks with lower to medium price components-these can be upgraded piecemeal as your finances allow.
ALWAYS look at 'bang for the buck' for all of the above. Here YMMV, depending on your own definition of best 'bang for buck'. Different needs/desires/goals change the definition.
Specifically for CPUs, I'd like to add that a medium-price CPU ("medium" defined as half the price of the fastest on the market) often offers more than 80% of the performance of said fastest part.
In GPU cards, the performance seems a bit more in proportion to the price. But even there, you tend to get more "bang for the buck" if you go for one step below the maximum performance parts. The current AMD release (yes, we get back on topic ;-) ) is a bit special as AMD did not release the high end parts first, de
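To put numbers on that rule of thumb, here's a minimal sketch. The prices and performance figures are hypothetical, chosen only to illustrate the ratio, not quotes for any real part:

```python
def perf_per_dollar(performance: float, price: float) -> float:
    """Performance units delivered per dollar spent."""
    return performance / price

# Hypothetical "fastest" part: 100% of peak performance at full price.
top = perf_per_dollar(performance=1.00, price=1000.0)

# Hypothetical medium part: 80% of the performance at half the price.
mid = perf_per_dollar(performance=0.80, price=500.0)

print(mid / top)  # -> 1.6
```

So under those assumptions, the half-price part delivers 60% more performance per dollar, which is the whole "bang for the buck" argument in one division.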
Re: (Score:2)
This is my general buying strategy on all things; always shoot for the high middle. The high end is generally overpriced, since you're paying some form of status tax. You are absolutely correct when it comes to CPUs; you could spend $2000 for the top of the top, or you could spend $500 for a chip that does around 80% of the same. The cost/quality ratio gets more and more skewed towards cost the higher you go.
At the bottom you generally have cheap crap, and get exactly what you pay for.
The only benefit o
Re: (Score:2)
This is my general buying strategy on all things; always shoot for the high middle. The high end is generally overpriced, since you're paying some form of status tax. You are absolutely correct when it comes to CPUs; you could spend $2000 for the top of the top, or you could spend $500 for a chip that does around 80% of the same. The cost/quality ratio gets more and more skewed towards cost the higher you go.
At the bottom you generally have cheap crap, and get exactly what you pay for.
Even more importantly, know what you need, where exactly the "cheap crap" is lacking and where it may actually do what you need.
Since we're on the topic of GPUs, an example from the graphics cards world:
I would in good conscience recommend an ATI 5450 for an office PC that only needs to display Word (and that only if no suitable mainboard graphics with digital output are available). Of course, this card sucks for gaming, but that does not matter for the office.
BTW, your ATI 4600 family looks like a good cho
Price (Score:2)
up to six LCDs (Score:5, Insightful)
Sounds great! Tired of selling an old monitor to buy a new one that's 2" larger and a few hundred more pixels, much rather just get a second (or third, or fourth, etc) same-sized LCD and double the pixels.
Re: (Score:2)
No matter what that A4-sized page still doesn't fit in a readable manner on the monitor(s)...
Re: (Score:2, Insightful)
Actually, by the time you get into the 22+" size (non-widescreen) you can fit two A4 pages side-by-side at 1:1 ratio. However, this isn't accounting for toolbars and the like, so my preference is a 20-22" non-widescreen or 22+" widescreen, rotated 90°. I've used this in Electronic Document Imaging applications, real world, with a lot of seat time, and it's a VERY workable solution. Gives plenty of room for a single A4 page with toolbars on top and side.
The one catch is that some monitors have asymmetrical and/or narr
Re: (Score:2)
Re: (Score:2)
Ok, I'm curious where you are located.
Also, for about double the cost of the monitor, you can get a nice vesa mount stand that gives you the entire footprint of your desk back. It was one of those purchases that felt very silly and wasteful, to show off... and then ended up being practical and a great use of the money.
Re: (Score:2)
Re: (Score:2)
Indeed. I tried keeping a monitor in pivot at work, but couldn't stand the rendering of text. And the viewing angles of the monitor weren't too good either, which didn't really help.
Works somewhat well if used to examine scanned documents though.
Re: (Score:2)
No matter what that A4-sized page still doesn't fit in a readable manner on the monitor(s)...
Take a bog-standard, old 1600x1200 LCD monitor (doesn't even have to be an LCD, but they tend to fare better with what I'm about to suggest) and rotate it 90 degrees clockwise. Some mounts allow this, some do not; you may need to purchase a new mount. Then, assuming you have a reasonably modern driver and video card combination, tick the box to accommodate the rotation. Reboot or restart your display manager as indicated.
Voila, a very nice fit for a single-sheet of A4 or US Letter paper at essentially lif
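On Linux the "tick the box" step can be done from a terminal with xrandr instead; a minimal sketch, assuming an X session and that your output is named HDMI-1 (check `xrandr -q` for the real name on your system):

```shell
# List connected outputs and their current modes/orientations
xrandr -q

# Rotate the output 90 degrees clockwise ("right" in xrandr's terms);
# replace HDMI-1 with the output name reported by xrandr -q
xrandr --output HDMI-1 --rotate right

# Undo the rotation
xrandr --output HDMI-1 --rotate normal
```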
Re: (Score:2)
Seems this image [pcper.com] clears that question right up: two monitors are connected to DVI and 4 are connected through a hub, so I see no reason why I can't purchase two cheap DisplayPort to DVI adaptors [ebay.com] and have up to four monitors connected by the very common DVI port.
one $180 video card, one PCI-E 16x slot, 4+ LCDs. Sounds good.
Re: (Score:2)
I'm running three monitors now. A large central, plus two old-school 4:3 20" turned portrait, one on either side. In fact, I'm reading/posting on /. on the right portrait monitor right now. 1200 pixels is wide enough for practically every reading application, so the sides are for web, email, documents, calendar, note taking, task management, etc. My center monitor is for CAD, two page document editing, engineering analysis, and large format PDFs (architectural drawings).
It _is_ awesome, though when it's wa
Does not compute. (Score:2)
These cards don't have nearly the computational ability I'd hoped for. Even the 5800 series is faster! Fermis are definitely faster for my applications, especially for 32-bit integer multiplication.
Re: (Score:3, Informative)
Re: (Score:2)
There has been exactly one "x900" card and it's the 5970. Historically, the dual cards used to be called X2, like the 4870 X2, but the 5970 wasn't fully a 5870 X2 (it would break the ATX spec) so they gave it its own name and series. What is worse is that the 5870 performs better than the 6870, and the same goes for the 5850 and 6850. The price reduction is nice, but in all honesty they should have been named either as the 6700 series or as 6850 and 6830 respectively.
Re: (Score:2)
Price wise, at least in sterling, the 6850 is the replacement for the 5830 @ £150 ish, and the 6870 replaces the 5850 @ £200. Mid-range cards with a moderate performance bump (and cooler and thus quieter).
As already stated, the 6900 series will be the high-end performance cards, though single GPU. Confusing, though not as bad as Intel, yet!
Re: (Score:2)
Though now I look at the price drops for Fermi-based nVidia cards: the 460 1GB is @ £150 and beats the 6850 handily, and the 470 @ £200 definitely outperforms the 6870 at the same price. So on bang for buck, the Fermis definitely win this round! I imagine the prices will drop for the 68xx series, or they're going to take a bit of a kicking this time round.
Re: (Score:2)
It would be interesting to do a price/performance matrix for computational GPUs. At some point it has to be better to start buying more cheaper GPUs instead of one or two expensive ones. The real limit might be the availability of motherboards with multiple PCI-E slots. Even though 16x cards can work electrically in 1x slots, most 1x slots physically will not accept 16x cards.
Re: (Score:2)
there are boards out there with 4 x16 lanes on them for quad SLI.
hmmmm (Score:2)
Re:hmmmm (Score:4, Informative)
No, what you probably heard about was that they are dropping the ATI branding of the graphics cards. The cards themselves are alive and well, just AMD branded.
Re: (Score:2)
Re: (Score:2)
I've worked for SGI, ATI, and nVidia (Score:2)
I've worked on some of the most cutting-edge GPU designs on the planet, from the low-level software stack down to the design changes needed to accommodate die shrinks. After looking at the HD 6870's design, one thing is clear. It needs more cowbell.
Re: (Score:2)
Well, the HDMI port carries audio, so that's pretty easy.
Quick highlights of this 6870 launch (Score:3, Informative)
2. $239 for the 6870, $180 for the 6850
3. 5870 > 6870 > GTX 470 > 6850 > 5850 > GTX 460
4. Crossfire scaling (for those who are dual-GPU inclined) is around 90%+ in most games
5. Brand-new anti-aliasing filter: ATI has invented an edge-smoothing shader that looks incredible in most games, and it even works in games that don't have AA or where AA would give a huge performance hit. This "morphological AA" costs almost nothing in framerate.
Status of linux drivers (Score:4, Interesting)
Does anyone know what the status of the Linux drivers is (both open and closed)? Do I still have to buy an nvidia card if I want to use OpenGL with Linux, or did AMD finally release drivers with performance as good as the ones on Windows?
Re: (Score:2)
Guess it's been a few years for you and ATi? Since about the middle of 2007 I haven't had any real issues with ATi's closed source driver. I don't recommend the open source driver for the most recent cards (FGLRX does a much better job).
Re: (Score:2)
That sounds about right. I don't switch graphics card that often :}
But is the performance of ATI's closed source drivers as good as the Windows version? (And can they be installed on Fedora 13 without wasting too many hours?)
Re: (Score:2)
I haven't noticed too much difference in the Windows vs Linux versions, but I also haven't done a whole lot of gaming on Linux recently. I have a Radeon HD 4550 on an Ubuntu/Windows box and a Radeon HD 3200 on a Fedora/Windows box. I do think that my last round of updates broke FGLRX on Fedora 13 (I think X upgraded; haven't looked too much into it, will reply again later after checking). Installing the driver has been a simple matter of download, run the installer, startx. I guess I could try some
Re: (Score:2)
Phoronix is the site that cares about that sort of thing. I can't seem to find their most recent article on Windows vs. Linux closed drivers, but I remember there being one within the last year or so.
http://www.phoronix.com/scan.php?page=category&item=Display%20Drivers [phoronix.com]
(There was still a big gap in 2007)
All I know is that nVidia is really making a killing on all the Linux graphics clusters for government/defense sims that are replacing all the older SGI and SUN workstations. And it's still even difficu
Re: (Score:2)
Last I knew about the proprietary driver was that it performed better than the OSS ones, but still not as good as Windows.
For reference: http://wiki.x.org/wiki/RadeonFeature [x.org] and the pages linked to from there.
Drivers stable yet? (Score:2)
Honestly, when I can't even keep my 4870's working reliably, why would I bother shelling out any money for these things? Call me when they hire folks who can create a driver set that works without having to purge and upgrade every month or so.
[John]
Re: (Score:2)
I upgraded to a version 9 set last year I guess and had to roll back to an earlier version 8 because rolling back to the previous version 9 still crashed the system.
[John]
Wow (Score:2)
I know you shouldn't judge a book by its cover, but I was kinda interested when I read that these are now uber-efficient and such. Then I open the article and both are massive structures that I'm not sure will even clear the hard drive bays in my case, and the 6870 requires not one but TWO dedicated graphics card power leads, with a 151W power draw under load.
"Efficient" just don't mean what it used to.
Linux (Score:2)
But ... does it run Linux?
I mean, I'm sure there will be a driver to use the card under Linux. But can you run Linux on the card?
Re: (Score:2)
Hardcore gamers aren't buying these.
Re: (Score:2)
Re: (Score:2)
What don't you get? If you have any 2 year old card or a low-end 1 year old card, these cards will give you a big improvement at a price that is considered "mid-range". In fact, thanks to this release you can now get for $180 (HD 6850) a bit better performance than $230 gave you up to 2 days ago (GTX 460 1GB). If you already have a good card, I guess you have to wait for next month's high-end release, but that comes with a price.
Since it is always about price/perf, all reviews correctly state that this laun
Re: (Score:2)
I've got a pair of Radeon 4670's right now. A 6850 would give me a performance boost over those even in CrossFire, and it will be quieter and use less electricity. If I already owned a 5870, there's no way. But not everyone buys everything in the latest generation. Most people skip a generation or three.
Re:Power consumption and Gbps vs GB/s (Score:5, Informative)
That's 4Gbps per bus line, apparently. The card has a 256bit bus, which works out at exactly 128GB/s.
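The arithmetic behind that figure, as a quick sketch (4 Gbps is the effective per-pin data rate stated above, not a number I'm adding):

```python
def bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Memory bandwidth in GB/s: per-pin data rate (Gbps) times bus width,
    divided by 8 bits per byte."""
    return gbps_per_pin * bus_width_bits / 8

# 4 Gbps effective per pin on a 256-bit bus:
print(bandwidth_gb_s(4.0, 256))  # -> 128.0
```

So the quoted 128 GB/s falls straight out of the 256-bit bus width; no rounding involved.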
Re: (Score:2)
I am at a total loss for understanding when a firm releases a new power efficient graphics card that draws 19 watts of energy continuously when doing absolutely nothing at all. Something is fundamentally wrong with the industry here, not just AMD.
That's actually not bad at all. The problem is that a transistor that's turned "off" is no longer off. As semiconductor manufacturing processes get smaller, transistors become worse and worse at being insulators when off, both through the gate (which, in an ideal MOSFET transistor, never conducts current) and through the junction (in an ideal transistor, it conducts when on, and insulates perfectly when off).
These days, about 50% or more of a chip's power is lost to leakage - these transistors that are burn
Re: (Score:2)
Here's the 5850 still pulling off 23fps at 1920x1200 4x AA 16x Aniso [hothardware.com] running the newest game possible (released Feb 2010), Aliens vs Predator [wikipedia.org]. I'm a mild gamer and I'm not even sure what the 4x AA and 16x Aniso are; they do something with making it look nicer, and turning them off improves the fps significantly. If you're like me and you're stuck playing a game released way bac
Re: (Score:2)
Crysis and Just Cause 2 with the graphics maxed out (at 1080p) can both bring my SLI'd twin GTX 260s down to 30fps at times...
Re:Nice card shame about the price. (Score:5, Insightful)
This seems to be one of the worst reviews out there; it looks like it comes directly from the nVidia PR department. The main reason, apart from the benchmark selection and lack of any methodology details, is that it only pits the new cards against an OC card that nVidia strategically priced yesterday and had EVGA send to the reviewers, asking for this to be the AMD competition. Also, I don't see the prices that the article uses, because even the sites that did try out the EVGA card (along with others of course, unlike this site here) stated it is competitive but did not notice a price/perf advantage.
The point is that while the OC cards vary in price and availability (since the good ones use hand-picked GPUs), at their introduced price points the AMD cards have the best price/performance, and absolute performance over the regular 460 versions. In fact, all other reviewers seem to say that even at yesterday's price cuts the regular GTX 460 is a bad buy, while interestingly, if you can go up to the GTX 470 price, that is the only point where nVidia now leads.
Unless in Great Britain there is some weird pricing going on hence the article...
Re: (Score:2)
Re: (Score:2)
IIRC the GTX 460 is the lowest-end nVidia card that supports FP64, and if that is the case it is definitely the cheapest CUDA development platform for those who are into that.
Then again, I am personally hoping OpenCL picks up and we get good dev tools and support since it is a much more exciting technology. Compare developing a program to run on a number of specific manufacturer's GPUs, to developing a program to run on a heterogeneous system taking advantage of the specific strengths of available GPUs, CPU
Re: (Score:2)
Re: (Score:2, Funny)