AMD Releases 3D Programming Documentation 94
Michael Larabel writes "With the Free Open Source Developers' European Meeting (FOSDEM) starting today, where John Bridgman of AMD will be addressing the X.Org developers, AMD has this morning released their 3D programming documentation. This information covers not only the recent R500 series, but goes back in detail to the R300/400 series. This is another of AMD's open source documentation offerings, which they started at the X Developer Summit 2007 by releasing 900 pages of basic documentation. Phoronix has a detailed analysis of what is being offered with today's information, as well as information on sample code being released soon. This information will allow open source 3D/OpenGL work to get underway with ATI's newer graphics cards."
Makes me ask (Score:1)
Re:Makes me ask (Score:4, Interesting)
Re: (Score:2, Informative)
Re: (Score:1)
fglrx has seen massive improvement lately. It is supposed to be mostly in sync with the Windows Catalyst drivers these days. It's still a bit short of perfect, of course, but a lot better than it was.
Yes, with version 8.455.2 it only hangs my system after 30 minutes of using Google Earth, not immediately... Quake III seems to run stable at least, just without working brightness controls...
I even had compiz-fusion running on a recent version and it was reasonably stable, with the complete lockups being totally predictable (After logging off for the second time...)
It might just have been bad luck, and I did not really have time to look up all the possible settings to try to stabilize the system, but
Re: (Score:2)
Eventually I had to give up and get an Nvidia card.
Re: (Score:1)
Re: (Score:2)
Still... (Score:3, Informative)
- With the fglrx kernel module loaded, my laptop has never been able to suspend (using Ubuntu Gutsy)
- I have to set a goofy Virtual 1408x1050 resolution with fglrx to keep 3D applications from looking horribly corrupted. This is weird, but as long as I don't xrandr over to it, it's not a big deal. However...
- After doing the above trick, fglrx shows corruption in the lower right-hand corner and on the hardware cursor when running 3D apps at 1400x1050
Re: (Score:2)
I had the same issue at one stage. I had to put the following option in the fglrx Device section of xorg.conf:
Option "XAANoOffscreenPixmaps" "true"
Give it a try and see how it works for you.
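For reference, here is roughly how that option sits in a Device section of xorg.conf. The Identifier value below is just a placeholder; your existing fglrx Device section will already have its own Identifier and Driver lines, so only the Option line needs adding:

    Section "Device"
        Identifier "ATI Graphics"
        Driver     "fglrx"
        # work around 3D corruption at the panel's native resolution
        # (disables XAA offscreen pixmaps, at some cost in 2D acceleration)
        Option     "XAANoOffscreenPixmaps" "true"
    EndSection

You'll need to restart the X server for the change to take effect.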
Does it have info on Hybrid cross fire and other.. (Score:1)
Re:Does it have info on Hybrid cross fire and othe (Score:2)
Re: (Score:2)
Re:Does it have info on Hybrid cross fire and othe (Score:1, Informative)
Re: (Score:2)
Not entirely true. The RV570 (x1950 Pro) was the first internal Crossfire-capable GPU. But you're basically right, because every other card in the R500 range had the Crossfire glue logic external to the GPU die, and thus required special "Crossfire Edition" cards.
The RV570 got the internal Crossfire treatment because it was completely redesigned to (1) reduce power consumption and (2) create a cheap midrange card.
sad but true (Score:1)
I'd like to support AMD/ATI because of their friendliness to OSS, but they just can't cut it, although they are getting better. The R670s are _alright_ (CrossFireX is their only hope), and they should suck it up and release the 411 for that line, and not just for vintage ones that are very quickly becoming inadequate.
Re: (Score:2)
Not lame at all. (Score:4, Insightful)
Re: (Score:2)
Re: (Score:1)
It was about time (Score:1)
Re: (Score:2)
You might think that how the card works is something that is trivially reverse-engineered, but that is not always the case. While I am not a hardware or graphics card expert, I suspect that a lot of chips, GPUs for example, likely have instruction sets and features that the manufacturers don't want their competitors to have
Re: (Score:1)
Re: (Score:1)
Know about shader programs? Those are compiled by a JIT compiler in the driver, at runtime. If nVidia or ATi believes that they have a better compiler that implements some optimizations that the other's doesn't, that could make them very reluctant to release the code.
I would think that's more reason for specs.. (Score:5, Interesting)
I find an interesting perspective being hinted at by AMD in this context: that they support a common open source layer at the low level, and plug in their proprietary 'good stuff' as a replacement for things at higher layers. As an example, they feel their PowerPlay stuff isn't top secret, so putting it at a layer where everyone can bang on it and improve it is ideal for everyone. Same with things like display handling. AMD and nVidia both do bizarre things requiring proprietary tools to configure display hotplug, instead of supporting the full xrandr feature set, which has grown to include display hotplug.
In general, there are *many* things AMD has historically gotten wrong in their drivers. Mostly with respect to power management, suspend, stability with arbitrary kernels/X servers. One thing they seem to do better relative to the open source community is good 3D performance if all the underlying stuff happens to line up. If they can outsource the basic, but potentially widely varying work to the community, it would do wonders if their driver architecture lets them leverage that. And by giving open source 3D developers a chance to create a full stack, it's the best of all worlds. I would be delighted to see the Open Source 3D stack surpass the proprietary stack, but wonder what patents stand in the way of that being permitted...
Re: (Score:1)
Re: (Score:2)
Know about shader programs? Those are compiled by a JIT compiler in the driver, at runtime. If nVidia or ATi believes that they have a better compiler that implements some optimizations that the other's doesn't, that could make them very reluctant to release the code.
It's not really a JIT compiler (at least in OpenGL), just a compiler. (Well, technically, they are free to implement a JIT compiler, but that would be silly when they have the opportunity to make a real compiler instead.)
It is a part of the driver, though. Compilers are, however, something we have a lot of in the open source world, so I have no fears there.
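To make that concrete, here is a minimal C sketch of what an application actually hands the driver: a GLSL source string that the driver's built-in compiler turns into GPU code at runtime. It assumes an OpenGL 2.0+ context is already current (created with GLUT, SDL or similar) and Mesa-style headers that expose the GL 2.0 entry points; the shader itself is just a trivial example, not anything from a real driver.

    /* compile_shader.c -- sketch; assumes a current OpenGL 2.0+ context
     * and headers that provide the GL 2.0 prototypes (e.g. Mesa on Linux). */
    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <stdio.h>

    static const char *frag_src =
        "void main() {\n"
        "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"   /* plain red */
        "}\n";

    GLuint compile_fragment_shader(void)
    {
        GLuint sh = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(sh, 1, &frag_src, NULL);
        /* This is where the driver's GLSL compiler runs, at runtime,
         * turning the source above into vendor-specific GPU instructions. */
        glCompileShader(sh);

        GLint ok = GL_FALSE;
        glGetShaderiv(sh, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[1024];
            glGetShaderInfoLog(sh, sizeof(log), NULL, log);
            fprintf(stderr, "shader compile failed: %s\n", log);
        }
        return sh;
    }

The compiler quality argument is about what happens inside glCompileShader: two vendors can take the same source string and emit very different GPU code.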
Re: (Score:2)
There are standard ways of communicating with hardware (memory-mapped registers, IO ports, DMA transfers), so there isn't much that isn't known already.
Although, most of the optimisations in the use of 3D hardware seem to be related to memory mapping, caching, ordering and buffer
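As a rough illustration of the "memory-mapped registers" part, here is a C sketch that mmap()s the first PCI BAR of a device through sysfs and reads one 32-bit register. The device path and the register offset are placeholders made up for the example; knowing which offsets mean what is exactly the kind of information documentation like today's release provides.

    /* mmio_peek.c -- sketch: read one 32-bit MMIO register of a PCI device.
     * The sysfs path and REG_OFFSET are placeholders, not real values. */
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    #define REG_OFFSET 0x0000   /* hypothetical register offset */

    int main(void)
    {
        /* resource0 is BAR0 of the card; needs root. Path is an example. */
        int fd = open("/sys/bus/pci/devices/0000:01:00.0/resource0", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        size_t len = 4096;  /* map one page of the register aperture */
        volatile uint32_t *regs =
            mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        if (regs == MAP_FAILED) { perror("mmap"); return 1; }

        uint32_t val = regs[REG_OFFSET / 4];   /* memory-mapped register read */
        printf("reg 0x%04x = 0x%08x\n", REG_OFFSET, (unsigned)val);

        munmap((void *)regs, len);
        close(fd);
        return 0;
    }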
Re: (Score:2)
People have written open-source nVidia drivers (i.e. at the GLX level, not the DRI level). This forms the basis of current work in the area; nobody has gotten around to finishing it because nvdriver is good enough, aside from Ubuntu making you install it after first boot. The closed source drivers only put developer focus elsewhere.
Go play with IDA Pro, get a book called "Reversing: Secret
Hard Line (Score:2)
Way to go AMD (Score:5, Insightful)
According to TFA, the small group at AMD that has spent time clearing the docs for legal issues is going to speak at FOSDEM [phoronix.com], and the maintainer of the open source driver for AMD/ATI graphics (RadeonHD) will be giving an update.
And thanks also to Intel for putting out their 3D graphics specs last month. These are good days for Linux.
Re: (Score:2)
Re:Way to go AMD (Score:5, Insightful)
You should take a look at the existing 3D drivers. The folks reverse-engineering the R300 series did a pretty good job (well enough for it to be the development platform for Xgl). And the open-source drivers also guarantee that the card will continue to work just as well with software written long after the demise of the company (e.g. the 3dfx drivers).
Re: (Score:2, Informative)
Quickly and off the top of my head, here are two big ones:
1. Compiz/Fusion and the like are gaining popularity.
2. Some applications NEED good 3D or they crawl. See Blender for instance.
Of these, I would say gaming would be the least demanding - at least if my assumption that "stable is harder than fast" is correct.
Re:Way to go AMD (Score:5, Insightful)
Sure, Blender needs good OpenGL acceleration. But, nobody is going to be that concerned about getting an extra 1 fps in Blender. If proprietary drivers go twice as fast, or ten times as fast, then the open source devs would look like idiots. If the open source ones are ten percent slower, then 99% of people will be completely satisfied. Games are flashy, and they sell cards, and people will complain about getting killed by somebody with a faster machine because it couldn't possibly have anything to do with lack of skill. In Blender, you just need sufficient speed to work. If the guy next to you has an extra 2 fps, it doesn't make him appreciably more productive, and you certainly can't justify needing to display faster than the refresh rate of the monitor in Blender!
Re: (Score:1)
If Jimmie Joe Fragger crashes in a match, he gets mad and his team loses the round. That's it.
If Jimmie Joe Modeler crashes after tweaking a model for a time, there is no guarantee he can get it "just right" again - and that is lost productivity rather than just lost time.
And historically... (Score:2)
Re: (Score:1)
Very unstable, in my particular case.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Currently there are no drivers for modern cards that are not written, at least in large part, by the company that produces them. The FOSS Intel driver is mainly written by Intel and actually has several parts of it obscured.
I still doubt that a FOSS driver based just on the documentation will be more stable than the proprietary driver. FOSS isn't magic. I have had stability issues with Firefox and F-Spot. Nothing terrible, but problems nonetheless.
Hec
Re: (Score:2)
Re: (Score:2, Informative)
"These are good days for Linux."
These are good days for Xorg, which isn't Linux. Everyone running X will benefit, not just Linux. Linux isn't the only non-Windows platform.
H.264 acceleration included? (Score:3, Insightful)
Re: (Score:1)
Re: (Score:1)
Since I don't give a crap about software patents, I will use it and be happy. Since I'm not the one who would be violating the patent, I don't think I will be in legal trouble (but in this case I don't really care)
Re:H.264 acceleration included? (Score:5, Interesting)
ATI's cards that have H.264 acceleration (and all kinds of other good stuff like smart de-interlacing, all collectively branded as "UVD") are unlikely to ever have the specs for UVD disclosed, because they integrated the good stuff with the bad stuff (DRM) and are afraid that exposing how to use the good stuff in UVD will also expose how to circumvent the bad stuff on Microsoft Windows systems.
So, once again, those DRM apologists who say that DRM is purely optional, that if you don't want to use it, it won't hurt you, are proven wrong.
On the plus side, the next gen cards will have the DRM broken out into a separate part of the chip so that they can feel safe in publishing the specs for good video stuff while leaving the bad stuff locked away.
One of many such statements by ATI/AMD. [phoronix.com]
Yeeha!!!! (Score:5, Interesting)
I've been using only Nvidia cards since 2000 because they had the best 3D graphics card for my Linux box. I was willing to deal with binary drivers because there was nothing else available to me in my price range (loooow budget) for 3D graphics.
But.... over the years I would get burned every now and then when 1) I would upgrade the kernel and then the X server would get borked because the Nvidia kernel module didn't match the new kernel, or 2) some funky memory leak in the binary Nvidia module would lock up my box hard because of some damn NvAgp vs. Agpgart setting or some funky memory speed setting. Of course, this didn't happen with every Nvidia driver, so of course I wouldn't bother writing down what it took to fix the problem.
Finally, when I switched to Debian Linux in fall 2004 and had beautiful apt-get/synaptic take care of all of my needs, I thought I was done. But Nvidia doesn't sync its driver releases with kernel releases, so if I wanted to upgrade my kernel painlessly with apt-get/synaptic I would have to wait for Nvidia to get off its damn rocking chair, stop playing their damn banjo, and release a driver to go with the newer kernel.
The final straw for me was when all of my 5 Nvidia cards were suddenly listed in the "legacy driver" section. Can you guess what "legacy driver" means about Nvidia fixing their closed source driver? Yeah, that's exactly the point.
That's when I started looking around for open source 3D drivers. I know about Nouveau for Nvidia, but frankly I'm too pissed off at Nvidia to consider them. ATI had a long history of treating Linux customers like second-class scum. Intel, on the other hand, earned the "golden throne" by providing full open source for their graphics chipsets. So now that I'm looking at getting a dual core 64-bit CPU + 3D graphics chipset, the only viable choice was Intel, which I was happy to do business with.
Now that ATI has decided to come forth with 3D documentation, I'm willing to give an Intel/ATI or AMD/ATI combo serious consideration.
Way to go ATI!!!!
Re: (Score:1)
Re: (Score:2)
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2, Informative)
The internal kernel APIs are subject to change. Functions within the kernel for dealing with lists, interrupts, device drivers, etc. can and have changed many times in the past. The API (i.e. the syscall interface) which is exposed to userspace is very stable, and in many cases pre-dates Linux itself.
Typically userspace application developers do not need to worry about changes to the kernel, since the userspace APIs are mostly stable. Drivers within the kernel usually do not need to worry either, since chang
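A small illustration of that split, for what it's worth: the program below talks to the kernel only through the syscall interface (the libc wrapper for write() and a raw syscall() for getpid), so a binary built years ago keeps running across kernel upgrades, whereas an out-of-tree kernel module has to be rebuilt against each new kernel's internal headers. This is just a generic sketch, not tied to any particular driver.

    /* stable_abi.c -- userspace touches the kernel only via syscalls,
     * which stay stable across kernel versions. */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/syscall.h>

    int main(void)
    {
        const char msg[] = "hello from the stable syscall interface\n";
        /* write(2) has kept the same userspace contract for decades */
        write(STDOUT_FILENO, msg, sizeof(msg) - 1);

        /* the raw syscall numbers are stable too */
        long pid = syscall(SYS_getpid);
        printf("pid via raw syscall: %ld\n", pid);
        return 0;
    }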
Re: (Score:1, Insightful)
The upside of the Linux way is rapid development, with a constant stream of new features.
The downside is that since every kernel update might break binary compatibility for a previously compiled driver, third-party drivers must be recompiled for every update.
It's definitely a trade-off, one that isn't done
Re: (Score:2)
Re: (Score:2)
Go for anything with an Intel chipset, preferably a ThinkPad, if they will be upgraded to quad core. Always fun to laugh at the closed-source-binary-driver hell that is Nvidia and ATI.
Re: (Score:2)
But I've been wanting to game on this computer as well, and I miss my nVidia card. I was just about to break down and buy one... Maybe I'll just wait a while longer and see what happens with ATI's drivers. It would be -so- great to continue not having to deal with
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
1) I would upgrade the kernel and then the X server would get borked because the Nvidia kernel module didn't match the new kernel
I, too, have been an nVidia customer on Linux since 1999 (I don't do Windows at all). The nVidia driver package has an option to compile a new interface to match your current kernel. It leads you through the options one at a time with yes/no prompts.
The only time I had trouble was when I did a major kernel upgrade and forgot to install the headers for the new kernel. If you're in
Re: (Score:1)
So is this all for any chip? (Score:2)
The 3D programming documentation today is 300 pages and is made up of a programming guide and register specifications. Among the areas covered in this 3D guide are the command processor, vertex shaders, fragment shaders, Hyper-Z, and the various 3D registers.
Maybe the tcore code contains more, but doesn't 300 pages sound small when the previous drops have been 900 pages or so? I'd be very happy if this really is all they need to make a competitive driver (i.e. no hidden features only the closed source driver can use).
Re:So is this all for any chip? (Score:4, Informative)
Re: (Score:2)
One Moment While I Go Dance in the Streets (Score:4, Funny)
Now that AMD/ATI has come over from the Dark Side, I expect that Nvidia and all of the other graphics chip manufacturers are going to be close behind. It may take them a year or two to work out the logistics, but they'll be here.
More and more people are moving over to Linux/BSD Free/Open software, and letting yourself be locked out of a growing market is the kind of thing that CEOs and CTOs get fired for.
It used to be the case that manufacturers could peacefully close their eyes to the Open Source / Free communities and drink the Microsoft brand Kool-Aid, because all of their competitors were doing the same thing. Now, however, with one of the big guns having committed to solid support of the Open Source universe, their less responsive competitors have a massive open flank that is going to have to be responded to.
One Moment While I Go blow hot air. (Score:1, Funny)
Uh huh. And just how many CEOs and CTOs have been fired for using ATI's or Nvidia's binary blob? I suspect the number's between zero and your imagination.
"This is the end of the beginning. "
The total amount of hardware (and still growing) that's released with a binary blob is still greater than the total amount that has open sou
Re:One Moment While I Go blow hot air. (Score:4, Interesting)
Re: (Score:2)
Too late (Score:1, Funny)
Re: (Score:3, Interesting)
I've been lamenting for years that the R300 card in my G4 (now a G5, long story) would never get specs. I figured they'd start releasing only specs for R500 and up. So when I read this story, I LITERALLY jumped for joy. I'm so happy that I'm switching from nVidia to ATI in my next custom Linux box.
-:sigma.SB
Re: (Score:2)
Re: (Score:2)
What's left? (Score:4, Insightful)
Re: (Score:1)
Re:What's left?-experience (Score:4, Interesting)
Using the GPU to decode H.264 etc. is something I see as quite possible, but it is likely something we will have to implement ourselves (something I think we are capable of).
Time to support ATI / AMD is NOW! (Score:2, Insightful)
I guess it's time to make it AMD / ATI now.
If they have released what we needed to get the drivers made, which is what we have always wanted, it's time we reciprocated by supporting them.
This will show other graphics companies *hint hint* that releasing the specs = good business.
Great! (Score:1)
Thank you ATI! (Score:2)
I know everyone was skeptical when this was announced some months ago. I thought, "well, it could happen." The silence on the issue lately made me think I had spoken too soon. I was beginning to wonder where the specs were. Well, here they are.
Thank you ATI!