AMD Releases Register Specs For R5xx And R6xx 121
ianare writes "AMD has recently released register specifications for the ATI Radeon R5xx and R6xx graphics devices. This will (theoretically) allow the OSS community to develop drivers, given time. In fact, engineers from Novell have released a first alpha-quality Open Source driver which currently supports initial mode setting. Although current work is focused on 2D, rather than 3D acceleration, this type of information sharing could conceivably lead to an OSS 3D driver."
R250 (Score:2)
Re: (Score:1)
R300 opensource drivers (Score:3, Informative)
So either the GP poster will have to update to a more recent release of his favorite distro (the latest Ubuntu FF and openSUSE 10.2 have it enabled by default; I don't know about the other distros).
Or, if he wants to keep his current installation for some reason, he has to get the latest DRM (kernel [freedesktop.org]
Re: (Score:2)
It takes time and money to release specs. They are not making money from those chips so it is illogical to put any effort in to releasing the specs.
Re: (Score:2)
Too short-term and narrow a thought. Putting a little effort towards supporting older, no-longer-manufactured hardware can help profitability. How? Sure, no direct profit is made when someone buys a used video card from a third party, or just keeps using an old card. But if old equipment is still usable, it still has value. One thing some people look for, when deciding whether to buy a company's latest offerings all new and shiny, is longevity. If old equipment can still be used, then that makes i
Re: (Score:2)
A new computer has a life of around 3 years, and in 3 years you can pay the same price for four times the performance. People who buy new computers know this. They buy hardware that has support NOW. Support for an old, out-of-production video card just doesn't matter in the grand scheme of things. I didn't say it was nice or good, but that's just the way it is. ATI has been around long enough that it has no fear of being labeled a flash in the pan, so no need to spend any money or effort
Re: (Score:1)
ATI could win me back from nVidia; in fact I'm holding off replacing my 6600GT to see how long these OSS drivers take to become somewhat stable and 3D-capable.
But with 4 nVidia cards (one as old as the MX400) and only one ATI, it's obvious that nVidia has my business right now. One brother has had an nv card for ~1 year and the other just bought one. At least 2 friends of mine purchased nv cards within the last 2 years based almost exclusively on my opinions.
In the 3 year pe
Re: (Score:3, Insightful)
Even then, your top video card is only a 6600GT. Not even a 7 series. You are not a high-profit demographic, but even so they will win you over if they get a good FOSS driver out for the current cards.
Re: (Score:1)
And you refer to the 6600 as an older card and then say I want them to work on current drivers!
Re: (Score:2)
glxGears needs to be updated! (Score:3, Funny)
Re: (Score:2, Informative)
Re: (Score:3, Funny)
Isn't it written into the GL Spec that all OpenGL benchmarks must include some sort of tea kettle?
Re:glxGears needs to be updated! (Score:5, Interesting)
Re: (Score:2)
A good story, but unfortunately a false one (Score:2)
I would guess that you might be confused because the library named GLUT has a function call that will render a teapot. GLUT is not a part of the OpenGL standard however. (But GLUT is used in almost all OpenGL tutorials I've seen, so it is probably an easy mistake to make.)
Hurrah (Score:2)
Re:Hurrah (Score:5, Funny)
Re:Hurrah (Score:5, Insightful)
Re: (Score:3, Insightful)
Opening the source hasn't got much to do with gaming success. The so-called successful platforms, Windows and all the game consoles, are closed. Linux gaming could be pretty good right now with NVidia's drivers, for example. Opening the source will hardly make things less geeky, more attractive, or easier for the masses.
Quite another thing is your definition of success. For me, Linux has been a major success since 1999, and I hope I'd have discovered it even earlier. I'm not very interested in games, so
Re: (Score:2, Interesting)
Seriously though, I agree with you about closed source drivers showing gaming is still very weak on Linux, but OSS drivers will allow distros to include accelerated 3d by default and installing a game or 3d effects will be much easier. If only that was the obstacle to gaming on Linux!
Re: (Score:1)
See that "Port Windows-based games to the Linux platform" thing?
Re: (Score:2)
Note the emphasis on networking and lack of mention of anything like OpenGL or graphics. This is to maintain their existing dedicated se
Re: (Score:1)
Re: (Score:1)
If you really wanted open source... (Score:2)
Obviously your priority for open source is below your other priorities.
Old files? (Score:5, Informative)
Zonk doesn't post dupes! (Score:3, Funny)
Sorry, I RTFA (Score:1, Informative)
Oh, it's a "Zonk" story
And even the alpha driver is old news (Score:1, Insightful)
Yep, it's a Zonk story
Novell's engineer started earlier (Score:5, Informative)
So the driver isn't the result of only one week of work, even if it's still in an alpha state.
drivers (Score:1)
Re:drivers (Score:5, Insightful)
Re:drivers (Score:5, Insightful)
Re: (Score:3, Insightful)
You say that... (Score:2, Insightful)
Writing drivers for 3D cards is difficult work. "Release the specs and we'll write the drivers" has been the mantra of the open source community for years, but I think we're all in for a disappointment if we're expecting feature-complete, high-performance open-source drivers for these cards any time soon.
I think some kind of sponsorship of dedicated, full-time developers is going to be necessary if
Re: (Score:3)
I think that if you have a community making a decent driver, there will be enough 3D driver writers available who know the hardware well enough to delegate simple responsibilities, make good decisions, and code the hard stuff themselves. Probably. I ho
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
About time! (Score:4, Insightful)
But I think ATI made a smart move. Outsourcing driver development to the OSS community certainly cuts costs.
Re: (Score:2)
I can't comment on the quality of ATI's *nix drivers, but FWIW I've never had any trouble with their win32 drivers. My x800xt has served me well the last three years and it still ticks on nicely.
However, much to my regret the ATI of today seems to be a mere shadow of its former self. Given ATI's failure to meet expected release dates with the last two generations, the somewhat disappointing performance [dailytech.com] of both families when finally released, and the latest string of stories of senior employees signing of [beyond3d.com]
Bad move? (Score:1)
Re: (Score:2, Funny)
Re: (Score:2)
Re:Bad move? (Score:5, Insightful)
Re: (Score:1)
Re: (Score:3, Insightful)
Hardware makers do their thing and then they should pass the necessary info to the community so we can write the drivers.
Re:Bad move? (Score:4, Insightful)
Professional customers might still want a HW-developer-written driver.
Regardless of that, it's a better move than keeping the specs secret. Because in the latter case, you're totally at the mercy of the HW developer as far as driver availability and quality goes.
Re: (Score:3, Insightful)
Re: (Score:1)
Re: (Score:2)
So far there are zero modern 3D drivers that do not have a lot of manufacturer support.
I will bet that the FOSS ATI drivers will still have a lot of code from ATI.
Re: (Score:2)
In general, drivers that are open-source and included in the main distribution (be it the kernel, X, or whatever) don't bitrot, even if the manufacturer stops maintaining them. So what does it matter whether it was written by the community or started as a binary blob and got
Re: (Score:2)
To the end user, none at all. What it does mean is that the heavy lifting was all done by the hardware company. The only really big money savings from going FOSS are just bug fixes and maybe some small improvements to drivers for hardware that you don't sell anymore.
I am all for FOSS drivers. What I am sick of is the "all they need to do is give us the specs" driver fantasy. Heck
Re:Bad move? (Score:4, Insightful)
Look at the resources the r300 and nouveau projects have. If the manufacturers simply dumped the specs on them, they would be able to produce high quality drivers quickly. Even without the specs, they've proven their abilities to make decent drivers the hard way. Or do you have some reason to believe that they wouldn't be significantly more productive with specs? Is there something magic about ATI's programmers that makes them vastly more productive with the same specs to work from?
Re: (Score:2)
There's also a motivation for the graphics card companies to assist and help the community develop the drivers. After all, good drivers are a selling point for the card, and a lack of support is a turn-off. I see the future of this being not just opening of specs, but some work being done by the companies themselves under a free software license. It would help sell the cards, after all.
And for reference, I'm due to upgrade my system in a few months and I'll be getting an ATI card on principle.
Re: (Score:2)
Other companies that produce widely used hardware do not have drivers for Linux; when access to their documentation becomes available, the quality of the open source drivers will hopefully improve.
Calin
What's wrong with that? (Score:4, Insightful)
The only losers are the companies (eg. nvidia) that compete with companies clever enough to do this, and companies (eg. microsoft) who have a vested interest in there not being any Free Software drivers.
Re: (Score:1)
Re: (Score:2)
Talking of Nouveau, the 2D performance of their driver at one point exceeded that of the nVidia sponsored "nv" driver, but I think since they moved to the new EXA framework (some X acceleration API) they have negated these advantages.
Open source drivers (Score:2)
Note that, with the availability of Mesa3D's source, it's not that difficult to provide open source drivers for Windows, at least for OpenGL.
The 3dfx source community has been doing it for a long time. Because the source of the Glide API (the low-level layer talking to 3dfx hardware) has been available since the demise of 3dfx, and because Windows XP didn't provide any support for OpenGL on Voodoo boards, a st
Re: (Score:2)
Re: (Score:2)
Free Software asymptotically approaches perfection in a way that Caged Software never can. Only by having both the Source Code to the OS and the specifications of the hardware can anyone hope to write a
Re: (Score:2)
GPU's are mostly just linear arrays of FPUs with an optimized memory architecture, and they've been that way for some years now. If there is no speed benefit to hardwiring the notion of "triangle" and "texture" into the silicon, then there isn't a really compelling reason for it. 3D drivers these days include optimizing compilers for shader languages. Still, no one's asking for the s
Re: (Score:2)
These documents are not as exciting as you think (Score:5, Insightful)
Whilst the registers are essential for getting any kind of driver to work, the documents don't describe the exciting features of the graphics processor. They give you enough control over the memory-controller timings to convert any Radeon card into a smoking brick with a small kernel-mode driver, but they don't give instructions which actually make the graphics silicon do things. There's no indication of what the machine-code for the vector processors looks like.
If you compare this to the documentation that Intel has for its (obsolete) 845 graphics controller, you notice that the whole block of registers for controlling even something as basic as the blitter is missing, let alone the 'set instruction pointer for processing unit N' registers which actually let you set the high-performance processing units in the card to work.
These documents let you use an R500 or R600 card as a frame buffer. Not worth making a song and dance about that one.
Myself, I'd be fascinated to see documentation for the Intel G965 like the documentation for the 845; it clearly exists (there's a paper in the most recent Intel Technical Journal about low-level programming on the 965), it's just not available to mortals except by attempting to reverse-engineer the x.org 965 driver.
Re:These documents are not as exciting as you thin (Score:5, Informative)
Re: (Score:2)
patents seem to be a better excuse this time.
Promises mean nothing (Score:2)
Those specifications, promised nearly ten years ago, never arrived to my knowledge. If they ever did, it happened long after the G200 was obsolete.
About six months after wasting money on that G200, I bought a Riva TNT2 and have been an NVidia customer since then. They don't make empty promises that we MIGHT be able to write a decent workin
Re: (Score:1)
Re: (Score:3, Informative)
They did arrive, and for the G400 as well. The first driver to make use of this information was the Utah-GLX [sourceforge.net] module thingy for XFree86 3, which John Carmack helped develop. I think the specifications for some particular, programmable section of the cards (WARP setup engine?) weren't released, but microcode blobs for the necessary functionality were
Re: (Score:3, Interesting)
Re: (Score:2)
That would be the WARP triangle setup thingy - something like two custom RISC processors which would munch through data. Matrox did supply microcode to make 'em do all the necessary stuff, but I don't think specs were ever released, nor did anyone figure out how they worked.
Performance and features were about the same as with Windows, but
Re: (Score:2)
Complexity of modern GPUs (Score:2)
More seriously, once the 3D specs get released, due to the higher programmability of the latest generation of hardware, the 3D specs are probably going to contain much more information about building and uploading fragments, and setting up input and output streams, than occurrences of the words "blitter" and "texture". In fact, I don't actually know if
Absolutley _Spot On_ (Score:5, Informative)
These docs will let one do the following:
1 - Set up your own video mode
2 - Set up a video overlay (not video acceleration)
3 - Set up a full-colour mouse cursor
That's all. They do not explain how to blit, alpha blend, scale, do ROP2, ROP3 or ROP4, or perform any other transform.
This is useful, but not _that_ useful!
Hopefully there will be more to come: specifically, more on the memory/cache controller (essential to get performance up), more on the PCI/AGP bus control, and more on the 2D source/dest blit registers, pitch, and loop counters. I'd also like to know how much of the 2D guts is programmable. TBH I thought we'd have moved on to the point of (somewhat) programmable shaders for 2D these days, with loops etc. built into the HW (0-clock loops and addressing etc.).
Re: (Score:3, Interesting)
Nvidia (Score:1)
Video Acceleration (Score:1)
Re: (Score:1)
Eating my words (Score:4, Interesting)
And now they've released scads of docs - kudos. This was probably the only way to make a FOSS driver a reality without violating reams of licensed IP. On top of that, I believe their latest set of Linux drivers fixes a number of long-standing issues, as well as vastly increasing 3D performance (although obviously there are still QA problems).
Granted, it's almost all 2D stuff at the moment, but being able to ship a functional, fast and non-crash-prone driver for ATI cards with every modern distro will be another win for Linux in general.
I'm quite interested to hear about advanced features though - will implementing things like iDCT in XvMC for MPEG2, MPEG4 ASP and H.264 be a reality? Can these things be implemented with 2D registers, or do they need to be run through the 3D shaders nowadays? The low-end ATI cards, including the IGPs, would be ideal for HTPC boxes, especially with Intel dragging their feet on similar support/documentation for their (admittedly otherwise excellent) GMA X3xxx series.
Re: (Score:2)
Just like I was burned by Matrox's promises to release 3D specs shortly after they released 2D specs for the G200 nearly a decade ago. They never delivered the documentation that was promis
Re: (Score:2)
I just hope to hell ATI *do* release more useful docs, and that this supposed "alpha quality" driver marks a considerable performance/reliability improvement from the current F
Snooze (Score:1)
I agree that releasing specs is a step in the right direction, but all of their products are kinda bland right now, and they lost their price advantage over the summer when Intel went apeshit with deep price cuts, and that's before even considering that current Intel chips can be overclocked to ridiculous heights with little effort.
The Linux distros have always been good to low-end hardware, but the
Re: (Score:1)
The fact remains that every year brings more and more power to the desktop, and I choose to root for whomever's winning the race. I would be very impressed and proud if NVidia were to follow suit and release dev specs for their graphics hardware, because right now that's where my money is. Next year when I upgrade again, if ATI is on top, I'll switch back. My point is: I don't want to have to ch
What I want to know is... (Score:2)
and 2. Which features are going to take the longest to document (because of patents/3rd-party code)
Oh, ATi... (Score:2, Funny)
Little depth in the documents (Score:2)
The register descriptions are one thing. A little wording around that to explain what is what and how it works is another thing that changes things from very tedious to workable.
In a processor manual you see that the 0xaa opcode has the "ADD" mnemonic, does a = a + b, and sets the flags. This AMD manual is at the level of "0xaa is called ADD", and nothing more. With a bit of imagination and some previous knowledge of how things normally work you can probably figure
This is something we've been begging for! (Score:1, Insightful)
Money? Talent? Time?
> I don't get why the OSS community is rolling over and making these drivers.
We've only been begging for specs like these since forever, and we want lots more. Did you miss the whole story about the kernel team offering free drivers given sufficient specifications? If they make the driver, it'll almost have to be closed source, and then it won't be maintainable, so future kernels won't be able to use the hardware, et
I'd rather have them contribute (Score:3, Insightful)