AMD Unveils New Family of GPUs: Radeon R5, R7, R9 With BF 4 Preorder Bundle 188
MojoKid writes "AMD has just announced a full suite of new GPUs based on its Graphics Core Next (GCN) architecture. The Radeon R5, R7, and R9 families are the new product lines aimed at mainstream, performance, and high-end gaming, respectively. Specs on the new cards are still limited, but we know that the highest-end R9 290X is a six-billion-transistor GPU with more than 300GB/s of memory bandwidth and prominent support for 4K gaming. The R5 series will start at $89 with 1GB of RAM. The R7 260X will hit $139 with 2GB of RAM, the R9 270X and 280X appear to replace the current Radeon 7950 and 7970 at price points of $199 and $299 with 2GB/3GB of RAM, and the R9 290X arrives at an unannounced price point with 4GB of RAM. AMD is also offering a limited preorder pack that bundles a Battlefield 4 license with the graphics cards, which should go on sale in the very near future. Finally, AMD is also debuting a new positional and 3D spatial audio engine, developed in conjunction with GenAudio and dubbed 'AstoundSound,' but they're only making it available on the R9 290X, R9 280X, and R9 270X."
Schlameel, Schlamazel (Score:2)
Re: (Score:2)
Or the R1 for that matter?
Re:Schlameel, Schlamazel (Score:5, Funny)
The same thing that happened to the Intel i2, i4 and i6 processors.
Maybe time for an upgrade? (Score:3)
Wow, that $89 R5 actually looks surprisingly attractive. If the benchmarks hold up, I might think about replacing the old power-hungry card I've got in my main desktop machine right now - I'd probably save energy and get better performance to boot.
Re: (Score:2)
Wow, that $89 R5 actually looks surprisingly attractive
I'm guessing that /. is the only website where this comment would find general agreement.
Will it fit? (Score:2)
Re: (Score:2)
Ah, thanks for the writeup. I'm actually using a 4850 too, and just kinda assumed that the architectural improvements & extra/faster RAM (mine has 512MB) would compensate for the lower shader counts. Thanks for correcting me!
without decent drivers (Score:2, Insightful)
that work reliably for more than the current crop of just released games, I don't care how much faster these chips are. I've had too many glitches with radeon drivers over the years to consider them again. Their opengl is horrible, and CCC is a bloated pos.
Re:without decent drivers (Score:4, Informative)
Yeah I feel the same way about their driver support, couldn't trust them with too much of my limited gaming hardware budget.
Also, would it be really really difficult for them to hire some decent programmers and produce a new version of Catalyst control center that doesn't have to run on .Net?
Whatever happened to C++ and fast reliable software?
Re: (Score:2, Informative)
What happened? Point-and-stick software 'development.' Visual basic on steroids (.NET), and huge interpreted runtimes (python/php/ruby/.NET/ad nauseam) being used to write programs that could be done in a few dozen lines of C or shellcode.
This disease is everywhere. Basic system software should have as few dependencies as possible. GNU land suffers from this too. Honestly if CCC was the only problem, I could live with it.
Re: (Score:2)
Re:without decent drivers (Score:5, Insightful)
- In case you meant to refer to C#, no part of this development process is "point-and-click". In this regard, it is no different to C++ (I develop in both).
- It is not interpreted. Nor has it ever been.
- I think you'll find that the simple programs of "a few dozen lines" that you mention would likely be smaller (perhaps a third of the lines) in C# than C++. But, again, this is a silly metric and shouldn't be used in any reasonable comparison. If things like this are a problem, you are just using the wrong libraries; in most cases it has little to do with the language directly.
Re: (Score:2)
- I think you'll find that the simple programs of "a few dozen lines" that you mention would likely be smaller (perhaps a third of the lines) in C# than C++. But, again, this is a silly metric and shouldn't be used in any reasonable comparison. If things like this are a problem, you are just using the wrong libraries; in most cases it has little to do with the language directly.
I'm also a professional software developer, but if you stop to think about it I think you'll find MSVCRT (the Microsoft Visual C++ Runtime) is significantly smaller than the .NET Framework that MSIL applications can't run without. I'm not sure about the size of the Mono runtime in comparison. The .NET Framework is pre-installed on Windows, but depending on your project and targeted runtime environment it could be a factor.
Most of the time this won't be important, especially given that
Re: (Score:2)
Frankly the runtimes are so lightweight and efficient that if you can't manage to write a GUI control panel in say .Net that performs as well as native then you are just a shit programmer.
Unless said controls are using WPF which then performs horribly on older machines. They might have improved it since .NET 3.5, but I haven't developed on Windows for quite a few years now so my knowledge could well be out of date. Or just plain wrong. This is the internet after all ;)
Re: (Score:2)
But has that ever been different?
New functionality runs shit on older hardware. It's not really news, it's just the way the world works.
The classic Windows-style GUI, toolbars, and most common controls haven't changed in decades, yet computing power has increased tremendously. WPF is the first attempt at doing something about that, because given the change in computing power we can definitely do better with those interfaces now.
If that means running shit on old hardware then well yeah, that's the price we p
Re: (Score:2)
The idea that these platforms and languages (python/php/ruby/.NET/Java) provide "Point-and-stick software development" is fucking retarded; anyone who knows even the slightest bit about them knows immediately that such a statement is objectively false. But that very thing is often said by the sort of people who have no understanding of them, through choosing to ignore them or a genuine inability to comprehend them.
Frankly the runtimes are so lightweight and efficient that if you can't manage to write a GUI control panel in say .Net that performs as well as native then you are just a shit programmer. But of course people who think the programming world begins and ends with C, and who have no understanding of anything else, will immediately blame the language or the tools rather than admit they need to actually learn more.
There's really only one argument for using high-level languages at all: more efficient development, easier support, and better cross-compatibility. But all that is on one side of the equation. Execution is always going to be slower, and it gets worse the higher up you go.
You can't really argue managed code isn't several orders of magnitude slower than native code. Boxing/unboxing, automatic garbage collection, all of its other benefits: none of it comes free.
The defending of managed code because the difference b
Re: (Score:2)
Re: (Score:2)
This is what happens when non-programmers try and converse about programming whilst having a certain arrogance that prevents them seeing that they're way out of their depth.
He also listed .NET as interpreted which is flat out wrong (unless you're talking about code executing on the DLR but given his level of understanding I doubt he even knows what that is).
The idea you can write more concise code in C or Shellcode than the others is also laughable. The fact you have to explicitly write code for dynamic mem
Re: (Score:2)
"Whatever happened to C++ and fast reliable software?"
That makes no sense; C# and .NET let you write fast, reliable software. In fact, the very nature of .NET means it's more reliable by default because it has better handling of common programming mistakes, and the JIT compiler means you can get equally good performance out of it. .NET even gives you a decent amount of control over the GC, so it's less plagued by GC behaviour than, say, Java, and even Java can perform equally well otherwise.
CCC's problems ha
Re: (Score:2)
... CCC's problems have nothing to do with the development environment, language, and framework used ...
Well I seem to remember repeated faults with mismatched .Net library dependencies, and somehow ending up with a CCC installation that would not load its user interface, could not be fixed by uninstall/reinstall, and wasted many days of effort. But I guess you are right; it takes a special kind of developer to make such a poor hash-up of a user interface.
Re: (Score:2)
"requiring little in the way of dependencies"
This has always been a myth. Even before .NET in a very C++ world people were using various libraries like MFC that had to be installed.
Installing .NET is no different to installing things like new Winsock versions, through to MFC updates, through to MSXML. Nothing's changed, it's all just in one big package called .NET now.
"But it also makes it much easier to write rubbish"
This hasn't changed either. People said the same about BASIC, and COBOL and assembly progr
Re: (Score:2)
Re: (Score:2)
NVidia drivers tend to be worse nowadays.
I hear them on here saying how crappy Windows 7 is because Aero brings their GTX 680s to a crawl. Funny, my parents' Intel GMA 950 integrated 2007-era graphics run Aero fine. Again, driver issues.
I only had one bug with my ATI drivers, and if you base your data on 10 years ago then it is obsolete.
Re: (Score:3)
I based it starting at 10 years ago... actually it starts with the ATI Rage 128, which came out in '98(?), through to the Radeon 5000 series. That Rage 128 used to BSOD Windows on a regular basis with OpenGL applications (e.g. Quake 2). Years later, a litany of broken scenes, kernel panics due to unhandled exceptions (HANDLE YOUR DAMNED EXCEPTIONS!), tearing in video playback, completely broken support in non-game accelerated applications, etc, have kept me far far away. There's a reason Adobe, Autodesk et al, (and
Re: (Score:3)
I do know some were having some issues with post 314.x drivers, but I didn't run into any.
Some? Anything post-290.x has been complete crap on 400-600 series cards. At best, they might be stable; at worst you're going to see amazing hardlocks which require a complete powerdown to fix. The last time I looked on the nvidia forums about that issue, there was a thread on it with nearly 140k views. Funnily enough it was the bad drivers that broke me; I dumped my 560 Ti for a 7950 and have no complaints about doing so.
Re: (Score:2)
You lack reading comprehension. Please reread.
Re: (Score:2)
Well, my experience with radeon cards is in windows, for the most part. Really, it doesn't matter because the fglrx drivers for X11 aren't any better. So unless Windows was developed by basement dwellers, your assumption is incorrect.
Re: (Score:2)
I think you are both saying the same thing.
Ballmer == Manchild.
Mantle API (Score:5, Interesting)
Personally I would've gone for a mention of Mantle, the proprietary API they are introducing that sidesteps OpenGL and DirectX. I don't really know what it does yet (I haven't found good coverage), but DICE's Battlefield 4 is mentioned as using it, and the description I've read said it enables a higher rate of draw calls.
http://www.xbitlabs.com/news/graphics/display/20130924210043_AMD_Unveils_Next_Generation_Radeon_R9_290X_Graphics_Card.html [xbitlabs.com]
Re: (Score:3)
Windows only (for now), blah. Still really exciting though! I remember Glide being pretty awesome back in the day. It's funny that NVIDIA bought 3dfx and got Glide but it is AMD that built a new low-level API. NVIDIA's NVAPI doesn't seem like an OpenGL or DirectX replacement but a helper of sorts for managing all kinds of stuff on the card.
Curious about stability (Score:5, Interesting)
The idea is that operating systems introduce a huge amount of overhead in the name of security. Being general purpose, they view their primary role as protecting all the other apps from your unstable app. And, let's face it, even AAA games these days are plagued with issues -- I'm really not sure I want games to have low-level access to my system. Going back to the days of Windows 98's frequent bluescreens isn't on my must-have list of features.
John Carmack has been complaining about this for years, saying this puts PCs at such a tremendous disadvantage that consoles were able to run circles around PCs when it came to raw draw calls until eventually they simply brute-forced their way past the problem.
Graphics APIs have largely gone a route that encourages keeping data and processing out of the OS. That's definitely the right call, but there are always things you'll need to touch the CPU for. I'm curious exactly how much of a benefit we'll see in modern games.
Re:Curious about stability (Score:5, Interesting)
1. Today's consoles also run protected-mode (or architecture-specific equivalent) operating systems. The userland/kernel/hardware latencies are still present.
2. You're complaining about games? Today's operating systems are hardly any better off. There is no way the vendors can vouch for the security of 10GB worth of libraries and executables in Windows 7 or OS X. The same is true for OSS. Best practice is to just assume every application and system you're using is compromised or compromisable and mitigate accordingly.
3. IIRC that particular Carmack commentary was done to hype up the new-gen systems. It's largely bogus. I'm sure the latencies between the Intel on-die HD 5000 GPU and CPU are lower, but that doesn't mean it's going to perform better overall. Same thing goes for the AMD Fusion chips used in the new consoles. They're powerful for their size and power draw, but they will not outperform current gaming PC rigs.
Re: (Score:2)
Re: (Score:2)
Did anyone say SteamOS?
Re: (Score:2)
"That is really what we don't want though - a return to the bad old days of game developers having to write code specifically for each vendor's cards"
It doesn't really matter since there are only two videocard vendors now, and the pace of graphics card innovation has slowed to a crawl. Not only that, a vendor API would not prevent anyone from shipping DirectX and OpenGL executables as well, the same way many games have DX9 and DX10 exes.
More importantly EA is big enough to afford to do native API + OGL + Direct X if
Re:Mantle API (Score:4, Insightful)
It doesn't really matter since there are only two videocard vendors now,...
There are only two operating systems in widespread use now, so I should go write my new software in .Net and Objective C/Cocoa? There is only one Office Productivity software suite in widespread use now, so I should release documents in .docx format? There is only one web browser, so I should only test sites in IE and ignore the standards?
The more we entrench the already-entrenched mono/duopolies, the harder it will be to get out of that mess.
Re: (Score:2)
There are THREE major videocard vendors. You missed the biggest of them all by far (bigger than the other two combined).
Yes, Intel counts - they are by far the largest graphics provider. Of course, Intel graphics is so-so compared to the latest and greatest, but it is surprisingly "good enough" and unless you want to target the enthusiast market exclusively, you'll have to deal with Intel.
Then there's the various little video providers l
How does it perform hash-wise? (Score:2)
'MANTLE' was the game-changing announcement (Score:2, Interesting)
AMD has totally ruined the future of Nvidia and Intel in the AAA/console-port gaming space. Working with partners at EA/DICE, AMD has created a 'to-the-metal' API for GPU programming on the PS4, Xbox One, and any PC (Linux or Windows) with AMD's GCN technology. GCN is the AMD architecture in 7000 series cards, 8000 series, and the coming new discrete GPU cards later this year and onwards into the foreseeable future. It is also the GPU design in all future AMD CPUs with integrated graphics.
GCN is *not* the a
Re:'MANTLE' was the game-changing announcement (Score:5, Insightful)
Are you saying that OpenGL and DirectX are the fastest? Because Fortran code sure is.
Re:'MANTLE' was the game-changing announcement (Score:5, Interesting)
So, you're convinced that the slight improvement in performance brought about by a reduction of software overhead is going to completely cripple nVidia? Yeah, sure.
Even if Mantle does produce faster performance (and there's no reason to doubt that it will), the advantages will be relatively small, and about all they might cause nVidia to do is adjust their pricing slightly. There won't be anything you'll be able to accomplish with Mantle that wasn't possible without it; such is the nature of fully programmable graphics processors.
Game publishers, for their part, will hesitate to ignore the 53% of nVidia owners in favour of the 34% AMD owners. It's highly unlikely that this will cause a repeat of the situation caused by the Radeon 9700, which scooped a big win by essentially having DirectX 9 standardized around it. In that case, ATI managed to capture significant marketshare, but more because nVidia had no competitive products on the market for a year or two after. This time around, both companies have very comparable performance, and minor differences in performance usually just result in price adjustments.
Re: (Score:2)
Game publishers have a 95%+ AAA market that is AMD GCN exclusive: it is called the next-gen console market (and yes, these figures won't be true until a few years' time, but this is the new unstoppable trend that all serious publishers must account for). Against the Xbox One and PS4, the PC market is a sad joke. It becomes a far less sad joke if AMD GCN gets properly established on EVERY gaming PC.
The new AMD flat memory model for GPU / CPU is very interesting. I assume this is what will be used in the upcoming consoles. But you have to realize that the console and PC markets are currently being crushed by the newly created portable market (Android / iOS). The larger market of iOS users combined with the App Store model that limits piracy has game developers making more money with their stupid little Apps than they do with their much more impressive console games. The Android market is also grow
Re:'MANTLE' was the game-changing announcement (Score:5, Interesting)
"The difference in performance will be MASSIVE when the rendering features made viable by Mantle are enabled."
I'm sorry but you are full of shit. Memory bandwidth has been the KEY factor in framerates. Not drawcalls. That drawcall bs is propaganda. Transistors > software (provided the software developer isn't braindead). Always. The same way CISC was 'slower' than RISC, and Itanium was supposed to be the death of x86, but we still have x86. They found ways around it and to make it faster. Same deal.
Re: (Score:3)
Re: (Score:2)
"I'm sorry, but you're the one who's clearly full of shit. GP clearly has some experience doing console/PC game programming."
I'm sorry but the OPPOSITE is true; perhaps you don't know the theory behind ITANIUM. Do a bit of research you dumb cunt. Right now in the hardware world ALL SOFTWARE, both GPU and CPU, is suffering a performance penalty from the memory bottleneck.
https://epic.hpi.uni-potsdam.de/pub/Home/TrendsAndConceptsII2010/HW_Trends_The_Processor-Memory_bottleneck___Problems_and_Solutions..pdf [uni-potsdam.de]
"The
Re: (Score:2)
Oh wow, you're vitriolic when you're proven wrong, aren't you?
1) Draw call overhead is a problem of CPU utilization and synchronization. On the CPU end, the drivers have to check a bunch of states before moving forward with the draw call (yes, like you said, this hits DRAM, but...!). During this check, the CPU may stall and stop handing instructions to the GPU until the GPU finishes with its current tasks (read: a sync), which stalls both the CPU and GPU. When you have 10000+ threads in flight, a pipeline f
Re: (Score:2)
I know all about this, you stupid fuck; the problem is -- 99% of games don't need the marginal increase in performance, and that's EXACTLY what it would be. When looking at the performance of games you have to look at the performance of the entire system and the ecosystem of games. Say you increase draw calls by 900%; it doesn't mean those draw calls are spent doing anything necessary to the game, in that there is no added value.
Your narrow-minded thinking regarding performance is the issue, out of all
Re: (Score:2)
And this is why there are tons of ways to reduce draw calls using various batching techniques. Instancing, for example, or vertex buffers. If modern software drew every single piece of geometry or sprite or primitive using individual draw calls, then yeah, you'd see some massive speedups. But that's not what graphics engines are doing.
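(Editor's illustration, not from the thread: a minimal C++/OpenGL sketch of the batching idea described above, assuming a GL context, shader, and vertex array are already set up. setModelMatrix, treeTransforms, treeVertexCount, and numTrees are made-up names.)

    // Naive: one draw call per object, so the driver pays its per-call
    // validation/submission overhead numTrees times.
    for (int i = 0; i < numTrees; ++i) {
        setModelMatrix(treeTransforms[i]);              // hypothetical uniform upload
        glDrawArrays(GL_TRIANGLES, 0, treeVertexCount);
    }

    // Instanced: per-instance transforms live in a buffer the vertex shader
    // indexes via gl_InstanceID, so the whole forest is a single draw call.
    glDrawArraysInstanced(GL_TRIANGLES, 0, treeVertexCount, numTrees);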
Re: (Score:2)
Re: (Score:3)
AMD isn't a certain enough company for game developers to drop nVidia and Intel, and they're not all going to do yet another port. Does AMD really think MANTLE will kill nVidia? They certainly won't kill Intel (yeah, the macho gamers all buy discrete hardware, but publishers won't give up all the buyers who run the games on integrated hardware).
If AMD is serious about OpenGL being a problem, then they need to take whatever they figured out in the lab about MANTLE and work to get it into OpenGL 5 (or wh
Re: (Score:2)
Here we are talking about performance factors that may be 10-100 times faster. A Mantle method could thus easily have a game running at unplayable frame rates if emulated under DirectX.
This statement is so mindbogglingly idiotic that it pretty much makes anything else you say meaningless by proving that you really have no idea what you're talking about. AMD claimed that MANTLE might enable up to nine times as many draw calls to be made per second under ideal circumstances, but draw calls are not that big a bottleneck. Thinking that somehow reduced overhead on draw calls could enable a one hundred fold performance increase on the same silicon that is today rather well utilized is akin to t
Re: (Score:2)
I have a completely different prediction: you don't know what the fuck you're talking about. Nothing can kill OpenGL, if DirectX couldn't do it, certainly not this proprietary shit.
Re: (Score:2)
Nothing can kill OpenGL, if DirectX couldn't do it, certainly not this proprietary shit.
Android based phones, tablets, consoles and even laptops and desktops (yes, they are coming and they are getting better very very fast) all use a simple version of OpenGL. You'd have to kill that fast-moving train to kill OpenGL, and that is not going to happen. Yes, AMD got a big win with it being the base of the new XBox and Playstation toys. That would have had major implications a few years back. I'm not convinced it will make such a huge difference today. As I said, there are Android consoles for sale right
Re: (Score:2)
Not if you care about your game running on the majority of hardware out there. You'd support a mantle path, but not exclusively.
Re: (Score:3)
" limiting developer"
Not OpenGL. Extension not there? Just add it in. Can't do that with Direct3D.
Re: (Score:2)
Isn't nVidia's Cg very close to the metal as well? Also, you seem to be implying that making a GPL driver for Mantle should be easy, since it will just send the compiled code directly to the card. Could Linux finally get "release day" open source drivers for AMD cards?
Re:'MANTLE' was the game-changing announcement (Score:4, Insightful)
And Nvidia has been crushing AMD/ATI in the PC market for a while (the Steam hardware survey shows 52.38% Nvidia to 33.08% AMD/ATI with 14% Intel).
Hopefully this will even things out some but I don't see it making OpenGL or DirectX obsolete.
OpenGL and DirectX have so much momentum and market share that game devs are going to have to target and support them for a while yet.
Also, until we get more solid details about Mantle we won't know how good it really is. I am cautiously optimistic but at most this will cause me to delay my next video card purchase until things shake out.
Re: (Score:2)
So, how much is AMD paying you? I'd like to supplement my income. OpenGL and D3D aren't going anywhere for the immediate future. We went down this vendor-API route with Glide, and while it did run well, it created support issues for the consumer that fragmented the market and made it difficult to make money selling GPUs. It would be nice, however, to see better support parity between the vendors' shader compilers.
Re: (Score:2)
Of course, ATI customers with 6000 series cards or earlier (or Zacate, Llano, or Richland APUs) are as out-of-luck as Intel and Nvidia GPU users
So existing ATI customers are being given the shaft
With the rise of Mantle, many console games developers are going to desire that the PC market rapidly changes to AMD only
because games producers are going to ditch 66% of the pc gaming marketplace
Any PC gamer interested in high-performance would be INSANE to buy any Nvidia product from now on
in favor of the minority of sane gamers
Nvidia, on the other hand, will be cursing themselves for ever starting this war
because the market leader in graphics cards is going to start crying into their huge wads of cash.
Next time you write a new API, try creating world peace. We need that more than faster fly by wire shooters.
Sincerely.
Re: (Score:2)
because games producers are going to ditch 66% of the pc gaming marketplace
I'd like to mention that many of them are currently ditching 100% of the pc gaming marketplace, and are doing fine.
New Family, My Ass (Score:5, Informative)
If these links are to be believed, this is just another rebadge:
http://news.softpedia.com/news/First-Sighting-of-Radeon-R9-280X-Really-a-Rebranded-HD-7970-GHz-Edition-386239.shtml [softpedia.com]
http://videocardz.com/45877/amd-radeon-r9-280x-rebranded-hd-7970-ghz-edition [videocardz.com]
http://www.techpowerup.com/191440/radeon-r9-280x-is-rebranded-hd-7970-ghz-edition.html [techpowerup.com]
http://www.guru3d.com/news_story/radeon_r9_280x_could_be_a_rebadged_hd_7970_ghz_edition.html [guru3d.com]
Re: (Score:2)
Low end models in each family are almost always a rebadge of the high end models from the previous family. This has been the case for a very long time. It allows the manufacturer to better move inventory that would otherwise be unsold.
Re: (Score:2)
When you say "a very long time" I assume you only mean a generation or two? Because this tacky shit wasn't done by any of them a while back. Each model number in the 3xxx series was based on 3xxx tech, or the 2xxx, or whatever. Now, as you state, the top-end part in a series is new while the mid-range parts in the new series are old.
It's deceptive.
Re:New Family, My Ass (Score:4, Insightful)
nVidia is famous for rebadging. I'll give an example: the Geforce 8800GTX became the 9800 GTX, and then the GTS 250.
ATI on the other hand, has followed a different pattern. All cards of a series (HD 2xxx, 3xxx, 4xxx, etc) are based on the same tech. The 6xxx series cards were tuned versions of the 5xxx cards, and I think what's happening is the new R-series cards are tuned versions of the 7xxx series. nVidia does this with their cards now too - the Fermi family (4xx and 5xx) and Kepler family (6xx and 7xx) introduce a chip in the first gen, and refine that chip in the second.
Re: (Score:2)
My experience with mobile GPU is greatly lacking, I'm afraid.
Re: (Score:2)
Both companies have done this for quite some time. They often introduce new products into emerging markets to sell off old silicon.
The Radeon 9100 (2003) was a rebadged Radeon 8500 (2001)
The HD 3410 (2009) was a rebadged HD 2400 (2007), and the HD 3610 (2009) was a rebadged and slightly retuned HD 2600 (2007). In fact, most of the HD 3000 series was a patch for the short-lived HD 2000 series.
The HD 4580 (2011) is a rebadged HD 3750 (2008).
With the exception of the HD 6900 series GPUs, the entire HD 6000 famil
Re: (Score:2)
"Radeon X1250" was also a chipset based around Radeon 9600/9700 tech, and its users were shafted with dropped driver support. :D
Current "Radeon 8660D" is kind of a cut-down Radeon 6970. So you have Radeon 7000 series (barring some OEM and laptop models) with a more recent and advanced architecture than Radeon 8000 series
As AMD now wants to boast about their GCN and HSA, they thus needed to introduce R5, R7, R9 as a naming clean up.
Re: (Score:2)
They also seem to be saying that the flagship R9 290X is going to be based on the new technology.
At the risk of being labeled troll (Score:2)
Re: (Score:2)
I never had a problem with my crap of the line HD4200. Sure, it's not going to run the latest and greatest at any respectable frame rate, but hey, it worked and didn't die.
Re: (Score:2)
I'm currently running a 7750 (my criteria were basically SC2 and CS:GO at a solid fps-capped 128FPS, so anything more would have been overkill) and haven't had any issues whatsoever. Rock solid cheap card...
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Same here. I do have a small problem with the driver though: I have to use a special version or the box might lock up. But it is a great card and has been rendering perfectly for years. Just ordered a 6970... very excited for the upgrade!
Ignore numbers (Score:4, Interesting)
128-bit is low end.
192-bit is your mid range card.
256-bit is your high end.
You don't need to pay attention to anything else until 256-bit. After that just sort by price on Newegg and check the release date. Newer is better.
Re: (Score:3, Interesting)
False. Perhaps this was true in the past, but currently memory bandwidth is tailored to the GPU's processing power - that is, it's the bandwidth the core needs, usually defined by the most bandwidth-hungry scenario.
Bandwidth is not determined by bus width alone, but by bus width and clock speed - a 128-bit interface at 2 GHz is just as good as a 256-bit interface at 1 GHz. Usually the wider bus is less power-hungry at the same bandwidth, and is therefore preferred.
Also, bus widths of 384 and 512 bits exist.
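(Editor's back-of-the-envelope sketch of that arithmetic, not from the thread: peak bandwidth is roughly the bus width in bytes times the effective transfer rate, so the two configurations above come out identical.)

    #include <cstdio>

    // Peak memory bandwidth in GB/s: (bus width in bits / 8) * effective
    // transfer rate in GT/s. Real GDDR5 cards quote an effective data rate
    // several times the base clock, but the ratio argument is the same.
    static double peakGBps(int busBits, double gtPerSec) {
        return (busBits / 8.0) * gtPerSec;
    }

    int main() {
        std::printf("128-bit @ 2 GT/s: %.0f GB/s\n", peakGBps(128, 2.0)); // 32 GB/s
        std::printf("256-bit @ 1 GT/s: %.0f GB/s\n", peakGBps(256, 1.0)); // 32 GB/s
        return 0;
    }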
Re: (Score:2)
But 3840x2160 isn't as cool as "4k"
Re: (Score:2, Funny)
Wonder how long until HD manufacturers adopt 1k == 960 to "avoid customer confusion"
Re: (Score:2)
Considering the whole "HD ready, full HD" crap, probably never. Ignorance sells products.
Re: (Score:2)
Especially since you can pronounce 4K to sound like 'fuck', which is fun.
Re: (Score:2)
If it sounds like 'fork' you're clearly not trying hard enough...
Re: (Score:2)
You mean speaking Australian is like speaking English but not trying hard enough? Well it didn't come from me!
Re: (Score:2)
It's easier to say.
Plus, it's fairly descriptive - it's almost 4000 pixels wide, and it's 4 x the resolution of HD.
I don't see the problem here, and I don't think it's "just marketing". People would come up with their own shorthand anyway if it was marketed at 3840x2160.
It's also completely backwards compared to the established conventions. Resolution for television has always been specified by the number of horizontal lines. This is a consequence of early CRT raster displays only caring how many times HSYNC and VSYNC are flickered. However many distinct analog values you can toss out on the display wires per line is completely open. This is why we talk about 1080p and not "2K" even though a 16:9 display with square pixels will have 1920 pixels per line.
For TV resolution to be
Re: (Score:3)
Re: (Score:2)
Wow, I was looking at picking up a Radeon HD 7870 for $150 at Microcenter....
Re: (Score:2)
Maybe one day we will be allowed to upgrade the RAM on these cards, so that going from 1GB to 2GB doesn't cost you another $60.
Re: (Score:2)
Like we could do with ISA, VLB, and PCI cards around 20 years ago (even onboard graphics and some soundcards).
Re: (Score:3)
Keep soldiering on, AC
Re: (Score:2, Insightful)
Yeah, good luck with that. Of course it is BGA; do they even make modern RAM in anything else these days? I think it is even part of the GDDR spec at this point, but there are so many pins, and the industry is all about density, so the chips are going to be *GA of some variety. So yeah, toaster oven: after you have disassembled the card to remove the melty plastic bits, then you get to put it back together and find out it doesn't work because it didn't reflow properly. I suppose with proper investment in yo
Re: (Score:2)
That would not be possible, considering no one sells graphics RAM for consumer installation.
Re: (Score:3)
Mining bitcoins with GPUs is no longer profitable for the most part. Most profitable miners today use hardware designed specifically for bitcoin mining.
Re: (Score:2)
I can play DOTA2 and L4D2 on Linux with a Radeon HD 7750 with about the same FPS the card would give me in Windows, if that would help. Only other ATI card I've had recently is a Radeon HD 5770, and that does about the same as the 7750 (slightly better or worse depending on the application - OpenCL performance seems better on the 7750, and the 7750 also supports DP FP).
Re: (Score:2)
My 6850 also "just works", CCC and all, in Debian 7 (amd64).
Haven't checked the supported-cards list lately to see which newer cards work.
Re: (Score:2)
They haven't needed to. The 800 series is out next year iirc. Internet rumor mills say it is a real architecture change.
Re: (Score:2)
Re:Today I learned (Score:4, Insightful)
You mean today you just crawled out of your hole, considering AMD has all three consoles and they're about to bring a brand new graphics architecture to the table.
Re: (Score:2)
Don't have a problem with it if it's anything like BF3.
At the end of the day BF3 still had way more content than most games, and the premium subscription was slightly less than a new game but also had more content.
I've got no problem paying for DLC if the original game has enough content to be classed as a perfectly reasonable retail buy and the DLC also has enough content relative to its cost.
That's been completely true of BF3 and BF3's Premium subscription.
It's one thing to complain about 6hr long games
Re: (Score:2)
What?
You people really are daft. Absolutely nothing in the 'premium' package wouldn't have been developed by the community at large if they were allowed to. These 'packs' work out to be little more than extra maps. Sure you got a few extra vehicles and weapons, but it still works out to be very little content for the money.
Lack of mod support, lack of custom map support. About the only thing BF3 had was you could actually still own a server for it if you wanted to, but I suspect that won't be true for lon
Re: (Score:2)
Dunno what you mean by "emulated 32bit". Intel had the 64-bit Itanium with emulated or hardware-assisted emulation of 32-bit x86. It was crap and sold in computers that cost the same as a house or a car. Then they had the 64-bit Pentium 4 (and Pentium 4 Xeons), not very great but a full 64-bit x86 CPU.
Re: (Score:2)
Then they had 64bit Pentium 4
No, the Pentium 4 was most definitely 32-bit. AMD got the first 64-bit x86 CPUs out in 2003 and Intel didn't have any for sale until at least 2005 IIRC.
The 1.4GHz 64-bit Opterons spanked the 2.8GHz Pentium 4s...
In those days, Intel was a joke.