AMD Delivers DX11 Graphics Solution For Under $100
Vigile points out yesterday's launch of "the new AMD Radeon HD 5670, the first graphics card to bring DirectX 11 support to the sub-$100 market and offer next-generation features to almost any budget. The Redwood part (as it was codenamed) is nearly 3.5x smaller in die size than the first DX11 GPUs from AMD while still offering support for DirectCompute 5.0, Eyefinity multi-monitor gaming and of course DX11 features (like tessellation) in upcoming Windows gaming titles. Unfortunately, performance on the card is not revolutionary even for the $99 graphics market, though power consumption has been noticeably lowered while keeping the card well cooled in a single-slot design."
Why? (Score:3, Insightful)
Seriously, good for AMD, but I just don't see the point. Say it's a good card, say it has very low power consumption, but hyping DX11 when it has no particular benefit - especially at this price point - is absolutely useless.
And before anyone says I'm just bashing AMD, my computer has a 5850.
Re:Why? (Score:5, Informative)
I don't get it. Yay, DX11. The biggest new features I could see about it were hardware tessellation and compute shaders.
Compute shaders, or more generally GPGPU (via OpenCL as well as DX11) will open up a huge new market for GPUs. One midrange GPU can replace a small cluster of computers at a fraction of the cost. For example, using 2-3 GPUs in one box, people doing architectural visualization can get their results in minutes instead of days.
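To make that concrete, here's a minimal sketch of what GPGPU code looks like through OpenCL: it just adds two arrays on the GPU. The kernel, names and sizes are purely illustrative and error checking is omitted, so treat it as a sketch rather than production code.

    // Minimal OpenCL vector-add sketch (illustrative only, no error checks).
    #include <CL/cl.h>
    #include <stdio.h>

    static const char *src =
        "__kernel void vadd(__global const float *a,\n"
        "                   __global const float *b,\n"
        "                   __global float *c) {\n"
        "    size_t i = get_global_id(0);\n"
        "    c[i] = a[i] + b[i];\n"
        "}\n";

    int main(void) {
        enum { N = 1024 };
        float a[N], b[N], c[N];
        for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "vadd", NULL);

        // Copy inputs to the card, run one work-item per element, read back.
        cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
        cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
        cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
        clSetKernelArg(k, 0, sizeof da, &da);
        clSetKernelArg(k, 1, sizeof db, &db);
        clSetKernelArg(k, 2, sizeof dc, &dc);

        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

        printf("c[10] = %f\n", c[10]); /* expect 30.0 */
        return 0;
    }

The real arch. viz. workloads are obviously far bigger kernels, but the structure (copy in, run thousands of threads, copy out) is the same.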
Re: (Score:2)
I should add that the arch. viz. example was for compute shading in general. But even "puny" cards like this one should give a nice boost to many GPGPU applications.
Re:Why? (Score:4, Insightful)
For example, using 2-3 GPUs in one box, people doing architectural visualization can get their results in minutes instead of days.
Yeah, and the point was that those people wouldn't be buying this card. Face it, a GPGPU isn't a general-purpose CPU; we have some companies that are already damn good at making those. This means you either need it or you don't, and if you do, you'll probably want a lot of it. Companies and research institutions certainly will have the money, and even if you are a poor hungry student you can probably afford to invest $200-300 in your education for an HD5850, which has a ton of shaders compared to this. The only real purpose of this card is to phase in a chip built on a smaller process that'll be cheaper to produce. All they could have gained in performance they've instead cut in size.
Re:Why? (Score:5, Insightful)
Not quite accurate. While GPGPU != CPU, there are things that GPGPUs can do far better than CPUs, and those things are more common than you'd think.
Even though I don't agree with you that that is the only reason, isn't making the same product, but cheaper, a worthy cause in and of itself?
I feel that you are being unduly dismissive.
Re: (Score:2)
Not quite accurate. While GPGPU != CPU, there are things that GPGPUs can do far better than CPUs, and those things are more common than you'd think.
I agree completely, for example, video encoding is pretty common these days and can be GPU accelerated for massive gains in speed.
Re: (Score:2)
Re: (Score:2)
Yeah, and the point was that those people wouldn't be buying this card.
True, I was thinking more generally about the GPGPU market. However, consider that if an HD5870 speeds up a given task 10-15x compared to a regular CPU, then this card could potentially give a 2-3x speed-up. For many it'll be easier and cheaper to get this card than a CPU that is 2-3x faster.
Re: (Score:2)
All they could have gained in performance they've instead cut in size.
Power consumption is down as well. Looking at some random HD 5670 cards in my preferred online shops, they are typically listed with a 61W maximum power consumption. That is about 10W less than the 4670. For those of us who want a card with passable performance that is still easy on the power supply, the 5670 looks like a good compromise.
Eventually we may even see a 5650 that is good for passive cooling (the limit for that seems to be around 50W, if you don't want ridiculously large coolers).
Re: (Score:2)
A huge new market? More like a small but significant niche.
Re: (Score:2)
Well nVidia released the GT 240 not long ago for $100. My guess is this is AMD's answer to that.
nVidia's card supports DirectX 10.1. If AMD can't make a card that outperforms it, they can at least have a bigger number on the box.
For developers, both of these cards are good news. It means anyone can afford a video card that can handle the latest features (even if it does them slowly). Devs can focus on making games instead of supporting legacy hardware or creating workarounds for people without feature x.
warning, tangential but off-topic post below (Score:2)
I won't buy an nVidia card because of those longer and longer introductory videos at the beginning of what seems like every video game now.
You know the ones: the big green logo and the breathily whispered "...ennvideeyah". On Borderlands, for chrissake, it seems to go on forever, with the little robot kicking the logo. So now, I either have to plan to go have lunch while I'm
Re: (Score:2)
Re: (Score:2)
Well, yes ... but the people who are most interested in GPGPU aren't generally all that interested in saving $50 to get fewer processors.
Re: (Score:1, Insightful)
I don't get it.
Of course you don't. This card is for people who have a lower resolution monitor (under 1440x900), since at lower resolutions it can run all modern games comfortably. About 50% of people still run at 1280x1024 or below, and for them this is a great graphics card. It gives good performance at a reasonable price, and has the latest features.
Re:Why? (Score:5, Insightful)
Well the things that may make DX11 interesting in general, not just to high end graphics:
1) Compute shaders. These actually work on any DX10-or-higher card using the DX11 APIs (just with lower versions of the shaders). The reason they are useful, even on lower end cards, is that some things run drastically faster on a GPU, so even a low end one is better than the CPU. I don't have any good examples specific to compute shaders, but an older non-compute-shader example would be HD video: you can do HD H.264 on a lower end CPU so long as you have a GPU that can handle acceleration, and it doesn't have to be a high end one either. (A rough sketch follows at the end of this post.)
2) 64-bit precision. Earlier versions of DX required at most 32-bit FP, since that is generally all you need for graphics (32-bit per channel, that is). However, there are other math functions that need higher precision, and DX11 mandates 64-bit FP support. In the case of the 5000 series it works well too: 64-bit FP runs at half the speed of 32-bit FP, so slower, but still plenty quick to be useful.
3) Multithreaded rendering/GPU multitasking. DX11 offers much, much better support for having multiple programs talk to the GPU at the same time. The idea is to have it fully preemptively multi-task, just like the CPU. Have the thing be a general purpose resource that can be addressed by multiple programs with no impact.
It's a worthwhile new API. Now I'm not saying "Oh everyone needs a DX11 card!" If you have an older card and it works fine for you, great, stick with it. However there is a point to wanting to have DX11 in all the segments of the market. Hopefully we can start having GPUs be used for more than just games on the average system.
Also, it makes sense from ATi's point of view. Rather than maintaining separate designs for separate lines, unify everything. Their low end DX11 parts are the same thing as their high end DX11 parts, just less of it: fewer shaders, fewer ROPs, smaller memory controllers, etc. It makes sense to do that for a low end part rather than a totally new design; it keeps your costs down, since most of the development cost was paid for by the high end parts.
In terms of hyping it? Well that's called marketing.
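For point 1, here's roughly what feeding a compute shader to the card looks like from C++. This is a hedged sketch: the shader source, thread count and names are made up for illustration, error handling is mostly skipped, and the buffer/UAV setup plus the actual Dispatch() call are omitted to keep it short.

    #include <d3d11.h>
    #include <d3dcompiler.h>
    #include <cstdio>
    #include <cstring>
    #pragma comment(lib, "d3d11.lib")
    #pragma comment(lib, "d3dcompiler.lib")

    // Trivial HLSL compute shader: doubles every element of a buffer.
    static const char *kCS =
        "RWStructuredBuffer<float> data : register(u0);\n"
        "[numthreads(64, 1, 1)]\n"
        "void main(uint3 id : SV_DispatchThreadID) {\n"
        "    data[id.x] = data[id.x] * 2.0f;\n"
        "}\n";

    int main() {
        ID3D11Device *dev = nullptr;
        ID3D11DeviceContext *ctx = nullptr;
        D3D_FEATURE_LEVEL level = D3D_FEATURE_LEVEL_9_1;
        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          nullptr, 0, D3D11_SDK_VERSION, &dev, &level, &ctx);

        // cs_5_0 needs DX11-class hardware (like this 5670); DX10-class cards
        // are limited to the reduced cs_4_0/cs_4_1 profiles.
        const char *profile =
            (level >= D3D_FEATURE_LEVEL_11_0) ? "cs_5_0" : "cs_4_0";

        ID3DBlob *code = nullptr, *errs = nullptr;
        if (dev && SUCCEEDED(D3DCompile(kCS, strlen(kCS), nullptr, nullptr, nullptr,
                                        "main", profile, 0, 0, &code, &errs))) {
            ID3D11ComputeShader *cs = nullptr;
            dev->CreateComputeShader(code->GetBufferPointer(),
                                     code->GetBufferSize(), nullptr, &cs);
            printf("compute shader compiled as %s (feature level 0x%x)\n",
                   profile, (unsigned)level);
            if (cs) cs->Release();
        }
        if (code) code->Release();
        if (errs) errs->Release();
        if (ctx) ctx->Release();
        if (dev) dev->Release();
        return 0;
    }

The other half (binding buffers as UAVs and calling ctx->Dispatch()) is where the data-parallel work actually runs; conceptually it's the same copy-in / run-many-threads / copy-out pattern as any GPGPU API.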
Re: (Score:2)
Re: (Score:2)
The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU so even a low end one is better than the CPU. I don't have any good examples specific to compute shaders but an older non-compute-shader example would be HD video.
Except that for all intents and purposes, that has nothing to do with the GPU. It could just as well have been on a separate chip, like the Broadcom chip for the new Intel Atoms. It could have been on the CPU too, for that matter. Right now there's an awful lot of hype; the question is how much is practical reality. Some things are better solved, in fact generally best solved, by dedicated hardware like an HD decoder. How much falls between general purpose and dedicated hardware? Very good question.
Re: (Score:2)
Any function a programmable chip can do can also be done by a custom chip. However, a programmable chip can do them all without needing to include multiple chips or change manufacturing processes as new uses are invented. Sure, you could make a custom chip for decoding HD video, but what happens when someone comes up with a new and super
Re: (Score:1, Informative)
Re: (Score:2)
This also allows them to pop a couple fuses and re-purpose marginal would-have-been high end parts by blocking out the broken parts. They did this back in the 9500/9700 days, I don't see why they wouldn't want to do it now.
Mal-2
Re: (Score:2)
That's OK, nobody does for home computing yet. This article was just a marketing press release to move some video cards that will be obsolete by Valentine's Day.
Re: (Score:2)
Same thing was said about DX10. And about HD4670.
Re:Why? (Score:5, Insightful)
And about DX9 before that. And DX8 before that. And on and on. I'm amazed by how many people here don't seem to "get" that advances in technology are precisely how technology moves forward. I mean, it's really a pretty simple concept.
Re: (Score:3, Informative)
Re: (Score:2)
Re: (Score:3, Insightful)
A new DirectX version is not technology moving forward. CUDA and PhysX are.
Re: (Score:2)
"requires CUDA" returns more than 1k hits in Google
"requires OpenCL" results in six
Re:Why? (Score:5, Insightful)
Google Earth across 6 monitors from a single $100 card? Seems like technology is heading in the right direction!
Re: (Score:2, Insightful)
I'm sorry, I've seen this news go all around tech sites and... I don't get it. Yay, DX11. The biggest new features I could see about it were hardware tessellation and compute shaders. What, this requires a powerful GPU in the first place to be of any use? Something much, much better than this card? Oh....
Sounds like AMD wants to pull an "NVidia GeForce FX 5200" on the market to see what happens. The FX 5200 was a huge failure, hyped for its DX9 Pixel Shader 2.0 features, which it runs at a grand 1-3 fps. Don't get me started on its unbelievably poor GLSL support either... But hey, it IS "The way it's meant to be played", so can YOU even complain!?
Re: (Score:2)
Sounds like AMD wants to pull an "NVidia GeForce FX 5200" on the market to see what happens. The FX 5200 was a huge failure, hyped for its DX9 Pixel Shader 2.0 features, which it runs at a grand 1-3 fps. Don't get me started on its unbelievably poor GLSL support either...
The 5200 wasn't a bad card as long as you kept away from those features. It was faster than the MX cards it replaced and cheaper too, only with the drawback that some games ran like crap in the default configs. If you want true turds you should look at low end laptop chipsets, where we're talking sub-Voodoo 2 performance with a DX10 feature set and an inability to run DX7 games.
Re: (Score:1)
Helps with standardisation? I might be writing a game/application that doesn't need tonnes of graphics processing power to run, but it's still easier if I can simply write one DirectX 11 renderer, instead of having to write multiple renderers for people with low end cards that only support older APIs.
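Roughly what that looks like with DX11's feature levels, if I understand them right. Hedged sketch: the function name and the exact fallback list are just my choices, and error handling/swap-chain setup are omitted.

    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    // Ask for one D3D11 device but accept older hardware via feature levels,
    // so a single renderer code path can scale down instead of shipping
    // separate DX9/DX10 back ends.
    bool CreateDeviceWithFallback(ID3D11Device **outDev,
                                  ID3D11DeviceContext **outCtx,
                                  D3D_FEATURE_LEVEL *outLevel) {
        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0,   // full DX11 (e.g. an HD 5670)
            D3D_FEATURE_LEVEL_10_1,   // e.g. a GT 240
            D3D_FEATURE_LEVEL_10_0,
            D3D_FEATURE_LEVEL_9_3,    // older DX9-class parts still get a device
        };
        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            wanted, sizeof(wanted) / sizeof(wanted[0]), D3D11_SDK_VERSION,
            outDev, outLevel, outCtx);
        // At draw time, check *outLevel and skip effects (tessellation,
        // SM5 shaders, etc.) that the card can't do.
        return SUCCEEDED(hr);
    }

    int main() {
        ID3D11Device *dev = nullptr;
        ID3D11DeviceContext *ctx = nullptr;
        D3D_FEATURE_LEVEL level;
        bool ok = CreateDeviceWithFallback(&dev, &ctx, &level);
        if (ctx) ctx->Release();
        if (dev) dev->Release();
        return ok ? 0 : 1;
    }

You still can't use DX11-only effects on the old cards, obviously, but the API plumbing and most of the renderer stays identical.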
Re: (Score:1)
You'd be better off writing for what everybody has on their machines currently. That's DirectX 10.
Don't be a victim of Microsoft's need for revenue from planned obsolescence. Code to DirectX 11 in a few years, if ever.
Re: (Score:1)
Indeed yes, for now (actually DirectX 9, judging by the number still on XP).
But the point is that releasing low end cards now that run the latest DirectX means that things will be easier in future, and will mean developers can sooner start focusing on only DirectX 11.
Re: (Score:2)
Re: (Score:2)
The real reason DX10 never took off is that nobody could tell the difference between DX9 and DX10 screenshots.
Re: (Score:2)
... hardware tessellation and compute shaders ...
Compute Shader for Shader Model 5.0, yes. However, starting with Catalyst 9.12 (December 2009), the HD48xx cards have driver support for CS on SM4.0. Regardless, AFAIK no one is using either presently. It would be interesting to see a new physics engine that ran on this; PhysX for Compute Shaders, I guess.
Re: (Score:2)
Not only that, but it's slower than the 8 month old $99 ATI Radeon HD 4770 [bit-tech.net]
so if I bought the $99 ATI Radeon HD 4770 8 months ago, why would I spend $99 on a slower card now?
Re: (Score:2)
I don't really keep up with games... (Score:2)
Re: (Score:1)
Re:I don't really keep up with games... (Score:4, Informative)
A lot of games will struggle on this card significantly. It's about as powerful as a 3870 from 2+ years ago.
Re: (Score:1, Insightful)
Which is still plenty powerful enough to run any game that also launches on the Xbox 360.
It also does it without having to buy a new PSU. The DX11 bits are just there to help cheap people (like myself) feel comfortable buying the card, knowing that it'll still play all the games (even if poorly) that come out next year since games that use DX11 are already starting to come out.
It's a good move from ATI, targeted at cheap gamers that are looking to breathe life into an older computer.
Re: (Score:2)
Re: (Score:2)
Yes, but you can get a faster 9800GT for the same price as the 5670.
Of course (Score:2)
How could hardware not outpace software? I mean it is really hard to develop a game for what does not yet exist. The hardware has to come out, and in particular the API has to come out, then developers can develop for it. They do get engineering samples a little early but still.
In terms of DX11 support: yes, there are a couple of games that will use it. No, it isn't really very useful; said games run and look fine in DX9 mode.
Really, you don't buy new cards because you need their features right away. There are
Re: (Score:2)
I mean it is really hard to develop a game for what does not yet exist. The hardware has to come out, and in particular the API has to come out, then developers can develop for it. They do get engineering samples a little early but still.
Do new video game consoles come onto the market with 0 games? No. Even the Nintendo 64 had three finished games at launch: Mario, Pilotwings, and Shogi. So at least some video game developers depend on engineering samples.
Re: (Score:2)
Consoles are different. They are given to developers for longer periods of time, precisely because there need to be games out at launch. Graphics cards reach the public much faster, since people will buy them without any special titles; they run older games better.
Also, console development these days can be done on specially modified PCs. You use PC hardware to simulate what'll be in the console, since the console chips come from the graphics companies' PC products.
Re: (Score:2)
I think it's related to consoles, because graphics card manufacturers won't create cards that go beyond the spec people are requesting, and people request performance that can play games. The majority of these games will be ported to consoles and have console restrictions built in from the beginning.
Haven't you noticed that the main improvements in graphics cards have been lower price and lower power consumption? I call that quite significant development; there is no real demand to overpower the average deskto
Re: (Score:2)
How could hardware not outpace software? I mean it is really hard to develop a game for what does not yet exist. The hardware has to come out, and in particular the API has to come out, then developers can develop for it. They do get engineering samples a little early but still.
Not really true. Most games have variable quality settings. You can test the playability at the low quality settings on a top of the line card during development and test the graphics output from the higher settings with a driver that emulates the missing features. You may only be able to get one frame every few seconds, but that's enough to check that the rendering looks right. Then you release the game and when the hardware improves people can just turn up the quality a bit. That way it keeps looking
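The "driver that emulates the missing features" part is essentially the D3D reference rasterizer. A tiny hedged sketch of asking for it instead of the hardware driver (illustrative only; it assumes the DirectX SDK's reference device is installed, and error handling is omitted):

    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    int main() {
        ID3D11Device *dev = nullptr;
        ID3D11DeviceContext *ctx = nullptr;
        D3D_FEATURE_LEVEL level;
        // D3D_DRIVER_TYPE_REFERENCE runs every feature in software: terribly
        // slow (a frame every few seconds), but enough to eyeball whether a
        // rendering path is correct before capable hardware is on your desk.
        HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_REFERENCE,
                                       nullptr, 0, nullptr, 0, D3D11_SDK_VERSION,
                                       &dev, &level, &ctx);
        if (ctx) ctx->Release();
        if (dev) dev->Release();
        return SUCCEEDED(hr) ? 0 : 1;
    }

Swap D3D_DRIVER_TYPE_REFERENCE back to D3D_DRIVER_TYPE_HARDWARE for the fast path; everything else in the renderer stays the same.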
Re: (Score:2)
The biggest problem for game makers is that people went from 17-19" 1280x1024 displays (about 1.3 megapixels) to 21-24" displays at 1680x1050 (about 1.8 megapixels). The old standard used to be 1024x768. For a long time it was 1280x1024 (a small step up). Now the standard (1680x1050) increased by about a third seemingly overnight. A card (8600GT 512MB) that could push Valve's TF2 (a two year old game at this point) at 45-60fps at 1280x1024 no problem with most of the settings at med
Re: (Score:2)
Wrong, the 5670 can NOT run Shattered Horizon smoothly at high resolutions; in fact I bet it has trouble at 1280x1024. Heck, my GTX 275 can barely run it at 1920x1200 maxed out and only gets ~20-25fps. Also, Shattered Horizon is a DX10 game, not a DX11 game.
Re: (Score:2)
Re: (Score:2)
then try rotating and moving them in three dimensions instead of two
Rotation and movement in three dimensions is nothing new for computer graphics. Even if you just walk up and down some stairs, and look around and up/down in a game, the graphics engine has to do complete calculations for three dimensions.
A well known game which actually limited things (mostly) to two dimensions was Doom:
In Doom, you cannot look up/down and there is no perspective adjustment for vertical parts of the environment - those always appear as parallel on the screen. That was one of the tricks wh
Whats the point? (Score:4, Informative)
Re: (Score:2)
While it does support DX11, it's not powerful enough to really do much with it, barely keeping 30 FPS at 1680x1050.
As I see it, the point of a bargain DX11 card is being able to run games that require DX11 at a lower res such as 1024x600 rather than at 0x0 because the game fails to start without features present.
Re: (Score:2)
Re: (Score:2)
Console game developers face the same problems: at launch few games are successful. Later in the cycle many games are still not successful because the installed base is not big enough. Only after the installed base has grown is there some certainty about how a particular game will do.
Therefore launch titles are often made with the support of the console manufacturer; only much later do truly independent titles start coming out. I wouldn't be surprised if Microsoft somehow sponsored a popular ga
Re:Whats the point? (Score:4, Informative)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Tom's Hardware seized to be informative years ago, nowadays they are just nVidia/intel advertizer.
Re: (Score:2)
Thanks for the correction, ceased is the word I should have used.
Re: (Score:1)
Re:Whats the point? (Score:4, Interesting)
In DirectX 9 mode, the Radeon HD 5670 is once again keeping pace with the GeForce 9800 GT, delivering playable performance all the way to 1920x1200. However, once DirectX 11 features are enabled, the latest Radeon slows to a crawl. Even the powerful Radeon HD 5750 has difficulty keeping the minimum frame rate above 30 fps at 1680x1050.
They pretty much tell us that they're testing these cards using higher settings for the ATI parts. Also, the review's front page tells us that they've under-clocked all the cards before testing. Why would anyone take their reviews seriously after actually reading that?
Not like I'm an ATI fanboy here either, my current and last 3 video cards were all Nvidia (was close to getting a 4850 about a year ago, but newegg had a sweet sale on the GTX260). It's just that this level of sleaze really pisses me off.
Re: (Score:2)
When did 30 FPS become bad?
Re: (Score:2)
30 fps at 1680x1050 sounds fucking amazing to me. I would probably just run it at 1280x1024 or something anyway. But when did 30 fps become bad?
Re: (Score:2)
30 fps average usually means that sometimes you're looking at 5 fps, that's when.
Re: (Score:2)
Re: (Score:2)
I doubt you'll be looking at 5 fps, and still, 30 fps doesn't sound too bad. Just turn down the resolution a couple of notches in case of lag, and the problem is solved.
So you're one of the three people who hasn't moved to an LCD yet? Is it that you like wasting desktop space, or electricity? The rest of us would like to use our displays at their native resolution. Also, I find resolution to be the single most important factor in what I can see. When I was playing TacOps I had a fairly beefy computer and I could often shoot people before they could even really see me, on distance maps. Reducing texture quality would make far more sense.
State of AMD for HTPC Use? (Score:5, Insightful)
I'm not a gamer, so the 3D features are not important to me. I am an HTPC user, and ATI has always been a non-factor in that realm. So, I haven't paid any attention to their releases for the last few years.
Has there been any change in video acceleration in Linux with AMD? Do they have any support for XvMC, VDPAU, or anything else usable in Linux?
Re: (Score:3, Informative)
From what I understand, hardware acceleration is now somewhat usable with the Catalyst drivers (source [phoronix.com]). But for the open source drivers there is nothing: there are no specs for UVD, and even though it should be possible to implement shader-based acceleration and the docs for that are out, no one has done it yet.
Re: (Score:3, Informative)
AFAIK, the open-source drivers are progressing at a breakneck pace, and hardware acceleration is very usable on some cards. One of the more recent kernel releases included a new driver, which is allegedly quite good.
Apologies for being unable to offer more specifics. The current state of affairs is rather confusing, although I'm fairly confident that we're very quickly progressing in the right direction.
Re: (Score:2)
AFAIK, the open-source drivers are progressing at a breakneck pace, and hardware acceleration is very usable on some cards.
You are referring to 3D acceleration, not video acceleration. There is no open source video acceleration for any card, neither UVD-based nor shader-based.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I am an HTPC user, and ATI has always been a non-factor in that realm.
Not in Windows. MPC-HC's hardware acceleration has worked better with ATI chips than with Nvidia until just recently. The biggest sticking point was that VC-1 bitstreaming (where you hand the entire bitstream to the GPU for decoding and display, rather than accelerating just parts of it like iDCT) didn't work on any Nvidia GPUs except the embedded ones. That did change with their most recent hardware release a couple of months ago, but ATI's had support for bitstreaming VC-1 in their regular cards for at l
Yeah, I can provide you the same thing for FREE! (Score:3, Funny)
It’s called a “software renderer”. ;)
Just like AMD, I did not say that it would actually render anything in real time, did I? :P
AMD -=- ATI (Score:2)
Anyone else still :%s/AMD/ATI/g when coming up on these stories?
Meanwhile, NVidia is renaming cards (Score:4, Informative)
With NVidia unable to release something competitive and therefore bringing a "new" 3xx series into being by renaming 2xx series cards [semiaccurate.com], the GTS 360M as well [semiaccurate.com], those with a clue will be buying ATI for the time being.
Sadly, the average consumer will only look at the higher number and is likely to be conned.
Re: (Score:2)
The average consumer isn't buying video cards, especially not "top of the line" ones. Whether Nvidia is doing this or not, I don't think it will have much effect on the market.
And it's not like you can compare model numbers of Nvidia cards to those of ATI's and figure things out, if they did that, everyone would just buy ATI anyway.
Re: (Score:2)
That is annoying.
Whenever I think of switching to an Intel CPU I give up, since I cannot figure out how to compare them to an AMD CPU.
I am sure I would have the same problem if I was switching from Intel to AMD.
Re: (Score:2)
Whenever I think of switching to an Intel CPU I give up since I cannot figure out how to compare them to an AMD CPU
Look at benchmarks (preferably from multiple sources), particularly of things you actually tend to use the CPU for? There are plenty of people out there who've already figured out how whichever CPUs you're considering compare to each other for <insert purpose here>, whether it's compiling code, playing games, encoding video, running simulations, or whatever else. Works for me. This past fall, I found a few different ones in the price/performance range I was looking for, then poked around on places
Re: (Score:2)
Re: (Score:2)
Charlie Demerjian is hardly an unbiased source of reporting on nVidia
And the understatement of the year award goes to...
Re: (Score:2)
Charlie doesn't know what he's talking about. Most of the 2xx cards at this point should be GT21x chips, which are based on the GT200 high end chip. The 9000/8000 series cards were all G9x or G8x architectures. Though I think at the low end they might have reused G92 as a G210 in the desktop segment.
Look I don't mean to be a cynical bastard but,... (Score:2)
We consistently see new hardware like this pitched at people: "DX10 cards now as low as $150", or in this case DX11 cards at the $100 price point.
Time and time again the game developers couldn't give a damn, and I don't blame them; they target the biggest possible audience.
I'll never forget the GeForce 3 announcement at one of the Apple Expos of all things. Carmack was there and showed off some early Doom 3; it was an absolute hype extravaganza. "Doom 3 will need a pixel shader card like the GF3!" So many people purch
Re: (Score:1)
My point is, any new tech like DX11, while great for all of us, is never fast enough in the first implementations. You'll see in 18 months' time, though: the DX12 cards will be bloody fantastic at DX11 features. This is just how it is.
If that's true, you should be glad to get a DirectX 11 card, because it will be bloody fantastic at DirectX 10 features, which your current DirectX 10 card must surely not ever be fast enough at...
Re: (Score:2)
Touche, absolutely touche! You're completely right.
Re: (Score:2)
5850? No. 5870? Maybe, and even then in 99.9% of games it's basically "here we see 90fps at 1920x1200 on the 4890 and we see 145 on the 5870!" Thing is, I'm hitting 90fps already at 1920x1200; I (and very few other people) have a 30" Apple display.
Not to say faster isn't better in the long run of course, but on a $/speed ratio right now the 5xxx series just isn't cutting it, far too overpriced. And to think it was ATI who saved us from Nvidia doing the exact same thing when the GT series came out 18 months ago. One
Re: (Score:2)
Re: (Score:2)
If you want to play games get a Famicom or that [subpar] new alternative, I believe it's called playstation or something.
What if I want to play indie games or games with mods? Consoles generally aren't made for that.
Re: (Score:2)
For someone who doesn't care about 3D games, you seem to have quite strong emotions about DirectX and gamers.
Re: (Score:3, Insightful)
Re: (Score:1)
And likewise, if it wasn't for porn, your VHS tape deck would have cost much more than it did.
Yay porn. Yah gamers.
Re:Compiz is all I need. (Score:4, Funny)
"We had a beautiful standard, called HTML. Micro$hit convinced people to use their stupid proprietary extensions, and in a few years we had destroyed the web."
Yes, XMLHttpRequest that MS came up with which made AJAX possible is just another stupid extension. We should use only "beautiful" HTML.
Re: (Score:1)
There's occasionally an exception that can be brought up, that gives Microsoft an excuse to exist.
Re: (Score:2)
Anything above 720p at distances greater than 10' is useless. Most people sit 18-24" away from their displays. You can most definitely tell the difference between a 1440x900, 1680x1050 and 1920x1200 pixel 24" diagonal display at 24" distance. You're correct that a 40", 1080p display for sports (i.e. general TV, not video games) in the living room is a waste of money, but for video games you will appreciate the 1080p (GUI, etc). High resolutions for 22-27" displays on the desktop are very much wanted and very
Re: (Score:2)
Oops, it was late when I posted that. I meant a 40" TV, which is more or less the standard size. Yes, 150" will take advantage of 1080p :P
Re: (Score:2)
AMD is now supporting the development of Open Source drivers, and has released a lot of specifications to make this possible. On the other hand, it is true that they have dropped support for older cards in their proprietary drivers. It seems they want to switch their Linux drivers from proprietary to Open Source.
Such Open Source Linux drivers are available by now for many ATI cards. For Ubuntu, see this list:
https://help.ubuntu.com/community/RadeonDriver [ubuntu.com]
The older cards are well supported while the new ones s