
AMD Releases 3D Programming Documentation 94

Posted by kdawson
from the fosdem-fossdoc dept.
Michael Larabel writes "With the Free and Open Source Software Developers' European Meeting (FOSDEM) starting today, where John Bridgman of AMD will be addressing the X.Org developers, AMD has this morning released their 3D programming documentation. This information covers not only the recent R500 series, but goes back in detail to the R300/400 series. This is another of AMD's open source documentation offerings, which began at the X Developer Summit 2007 with the release of 900 pages of basic documentation. Phoronix has a detailed analysis of what is being offered with today's information, as well as information on sample code being released soon. This information will allow open source 3D/OpenGL work to get underway with ATI's newer graphics cards."
  • Would fglrx work at last? Or will community devs do all the work to get a decent driver?
    • Re:Makes me ask (Score:4, Interesting)

      by bersl2 (689221) on Saturday February 23, 2008 @08:38PM (#22530760) Journal
      fglrx is probably a technical and legal mess unable to be cleaned up with less effort than it would take to re-write the drivers using good documentation.
    • Re: (Score:2, Informative)

      by hr.wien (986516)
      fglrx has seen massive improvement lately. It is supposed to be mostly in sync with the Windows Catalyst drivers these days. It's still a bit off perfect of course, but a lot better than it was.
      • by MoHaG (1002926)

        fglrx has seen massive improvement lately. It is supposed to be mostly in sync with the Windows Catalyst drivers these days. It's still a bit off perfect of course, but a lot better than it was.

        Yes, with version 8.455.2 it only hangs my system after 30 minutes of using Google Earth, not immediately... Quake III seems to run stably at least, just without working brightness controls...

        I even had compiz-fusion running on a recent version and it was reasonably stable, with the complete lockups being totally predictable (After logging off for the second time...)

        It might just have been bad luck, and I did not really have time to look up all the possible settings to try to stabilize the system, but

        • by Bert64 (520050)
          I had major trouble getting a radeon hd2400 working with mythtv... If it worked at all, it was laughably slow (~5 secs to redraw the menu).
          Eventually I had to give up and get an nVidia card.
          • by Zencyde (850968)
            Amen! I was running a Radeon 9800 Pro 128 MB before purchasing my Geforce 7600 GS 512 MB. The Radeon gave me nothing but trouble under Ubuntu! I managed to get it to work for a short while, but it messed up Compiz Fusion, and once I got Compiz Fusion to work, I lost the ability to render OpenGL. Now, I can't even render OpenGL with my new card under that installation. I'm running off of a 20 GB hard drive now and the card works flawlessly. : ) Let's just hope that nVidia will open up their specs, too.
      • by MrHanky (141717)
        It's still garbage, really. Doesn't properly support XVideo, crashes all the time, etc. For a professionally developed video driver, the quality really is shocking.
      • Still... (Score:3, Informative)

        by Junta (36770)
        Comparing my R500 part with fglrx with an R300 part with the open source driver:
        -With fglrx kernel module loaded, my laptop has not been able to suspend ever (using Ubuntu Gutsy)
        -I have to do a goofy Virtual 1408x1050 resolution with fglrx to make 3D applications not look horribly corrupted. This is weird, but as long as I don't xrandr over to it, it's not a big deal, however...
        -After doing the above trick, fglrx shows corruption in the lower right hand corner and hardware cursor if trying to do 3D apps at 1400x1050 (native resolution). Have to run at 1280x960 to prevent that corruption.
        • by deek (22697)

          fglrx shows corruption in lower right hand corner and hardware cursor if trying to do 3D apps at 1400x1050 (native resolution). Have to run at 1280x960 to prevent that corruption.


          I had the same issue at one stage. I had to put the following option in the fglrx Device section of xorg.conf :

          Option "XAANoOffscreenPixmaps" "true"

          Give it a try and see how it works for you.
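
For context, here is a sketch of how that option might sit in a typical fglrx Device section of xorg.conf (the Identifier and the rest of the section are placeholders; only the Option line comes from the comment above, and your actual config will differ):

```
Section "Device"
    Identifier  "ATI fglrx"           # placeholder; use your existing Identifier
    Driver      "fglrx"
    # Disables XAA offscreen pixmaps; reported to work around 3D
    # corruption at native resolution on some R500 parts.
    Option      "XAANoOffscreenPixmaps" "true"
EndSection
```

Restart the X server after editing for the change to take effect.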
  • Does it have info on Hybrid cross fire and other cross fire setups as well?
    • dude, let the devs get some single card setups going first before asking about crossfire o_0
    • by Anonymous Coward
      The R500 cards only have crossfire via the external cable.
      • The R500 cards only have crossfire via the external cable.

        Not entirely true. The RV570 (x1950 Pro) was the first internal Crossfire-capable GPU. But you're basically right, because every other card in the R500 range had the Crossfire glue logic external to the GPU die, and thus required special "Crossfire Edition" cards.

        The RV570 got the internal Crossfire treatment because it was completely redesigned to (1) reduce power consumption and (2) create a cheap midrange card.
  • I'm not a fan of open source. Or closed source. I'm purely agnostic on that field. I believe licenses should be chosen on a per-case basis. But hardware makers lose little by opening their software-accessible interface documentation. Actually they have more to gain, because with open documentation, other people can write drivers for software platforms where that piece of hardware has so far been unsupported. That may actually increase their market share.
    • Hardware can be just as competitive an industry as software, and from the manufacturer's point of view they stand to lose a great deal if their competitors get hold of their trade secrets.

      You might think that how the card works is something that is trivially reverse engineered but that is not always the case. While I am not a hardware or graphics card expert, I suspect that a lot of chips, GPUs for example, likely have instruction sets and features that the manufacturers don't want their competitors to have
      • by joaommp (685612)
        well, but if you think of it, in the case of such markets, if the problem is the competitors, then having the documentation opened or closed will not make a difference... the competitors can simply reverse engineer the driver... it may take longer, but we're talking about multi-billion dollar industries here. They HAVE the resources for that...
      • by EvanED (569694)
        Not just that, but graphics drivers are a significant piece of tech in their own right these days.

        Know about shader programs? Those are compiled by a JIT compiler in the driver, at runtime. If nVidia or ATi believes that they have a better compiler that implements some optimizations that the other's doesn't, that could make them very reluctant to release the code.
        • by Junta (36770) on Saturday February 23, 2008 @09:24PM (#22531116)
          I see that as a reason not to open source the existing drivers, but not to preclude releasing the details needed by the open source community to produce an open driver with their own shader programs, which may be lower performance, but good enough for default operation for a lot of distributions.

          I find an interesting perspective being hinted at by AMD in this context. That they approach a common open source layer at the low level, and plug in their proprietary 'good stuff' as a replacement for higher layer things. As an example, they feel their powerplay stuff isn't top secret, so putting it at a layer where everyone can bang on it and improve it is ideal for everyone. Same with things like display handling. AMD and nVidia both do bizarre things requiring proprietary tools to configure display hotplug, instead of the full xrandr feature set, which has grown to include display hot plug.

          In general, there are *many* things AMD has historically gotten wrong in their drivers. Mostly with respect to power management, suspend, stability with arbitrary kernels/X servers. One thing they seem to do better relative to the open source community is good 3D performance if all the underlying stuff happens to line up. If they can outsource the basic, but potentially widely varying work to the community, it would do wonders if their driver architecture lets them leverage that. And by giving open source 3D developers a chance to create a full stack, it's the best of all worlds. I would be delighted to see the Open Source 3D stack surpass the proprietary stack, but wonder what patents stand in the way of that being permitted...
        • Know about shader programs? Those are compiled by a JIT compiler in the driver, at runtime. If nVidia or ATi believes that they have a better compiler that implements some optimizations that the other's doesn't, that could make them very reluctant to release the code.

          It's not really a JIT compiler (at least in OpenGL). Just a compiler. (Well, technically, they are free to implement a JIT compiler, but that would be silly when they have the opportunity to make a real compiler instead.)

          It is a part of the driver, though. Compilers are, however, something that we have a lot of in the open source world, so I have no fears there.
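
To make the "compiler lives in the driver" point concrete, here is a toy sketch (plain Python, invented purely for illustration; no real driver works like this) of the kind of optimization pass a shader compiler might run: folding constant subexpressions before emitting hardware instructions.

```python
# Toy constant-folding pass, the sort of optimization a shader compiler
# inside a GPU driver might perform. Expressions are nested tuples of
# ('op', left, right); leaves are numbers or named shader inputs.

def fold(expr):
    """Recursively replace constant subtrees with their computed value."""
    if not isinstance(expr, tuple):
        return expr  # a literal number or a named input like 'texcoord'
    op, left, right = expr
    left, right = fold(left), fold(right)
    if isinstance(left, (int, float)) and isinstance(right, (int, float)):
        ops = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}
        return ops[op](left, right)  # both sides constant: evaluate now
    return (op, left, right)         # otherwise keep the node

# (2 * 3) + x  folds to  6 + x
print(fold(('+', ('*', 2, 3), 'x')))  # → ('+', 6, 'x')
```

A real driver compiler does this on hardware IR rather than tuples, plus register allocation, scheduling, and vendor-specific tricks, which is exactly the part vendors are reluctant to open.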

      • by mikael (484)
        Anyone involved in the design of ASIC chips has access to chip grinders and electron microscopes, which, while normally used for quality-assurance purposes, can be used to examine competitors' architectures. See Chip art [wikipedia.org] at Wikipedia.

        There are standard ways of communicating with hardware (memory-mapped registers, IO ports, DMA transfers), so there isn't much that isn't known already.

        Although, most of the optimisations in the use of 3D hardware seem to be related to memory mapping, caching, ordering and buffer
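
As a toy model of the memory-mapped register idea (pure software; the register names and offsets here are made up, not taken from any real GPU), the driver sees the hardware as a block of addresses it accesses with fixed-width loads and stores:

```python
import struct

# Toy model of a memory-mapped register block. Real hardware exposes these
# registers in physical address space and the driver touches them through an
# mmap'd pointer; here a bytearray stands in for the mapped region.
# Offsets and names are invented for illustration only.
REG_SCRATCH = 0x00   # hypothetical scratch register
REG_COMMAND = 0x04   # hypothetical command register

mmio = bytearray(256)  # stands in for the mapped register BAR

def write32(offset, value):
    """32-bit little-endian store into the register block."""
    struct.pack_into('<I', mmio, offset, value)

def read32(offset):
    """32-bit little-endian load from the register block."""
    return struct.unpack_from('<I', mmio, offset)[0]

write32(REG_SCRATCH, 0xDEADBEEF)
print(hex(read32(REG_SCRATCH)))  # → 0xdeadbeef
```

The mechanism really is this generic; what documentation releases like AMD's provide is the meaning of each offset, which is the part you cannot guess.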
      • You might think that how the card works is something that is trivially reverse engineered but that is not always the case.

        People have written open-source nVidia drivers (i.e. at the GLX level, not the DRI level). This forms the basis of current work in the area; nobody's gotten around to it because nvdriver is good enough, aside from Ubuntu making you install it after first boot. The closed source drivers only put developer focus elsewhere.

        Go play with IDA Pro, get a book called "Reversing: Secret

    • I'm a real hardliner on the belief that interfaces should not only be unprotectable, but should be required to be released in any and all cases, period, no exceptions.
  • Way to go AMD (Score:5, Insightful)

    by schwaang (667808) on Saturday February 23, 2008 @08:49PM (#22530866)
    For ages, the FOSS community has said "just give us the specs for your graphics cards and we'll write the drivers". Well, it looks like AMD is taking real steps in that direction, and I, for one, say thanks!

    According to TFA, the small group at AMD that has spent time clearing the docs of legal issues is going to speak at FOSDEM [phoronix.com], and the maintainer of the open source driver for AMD/ATI graphics (RadeonHD) will be giving an update.

    And thanks also to Intel for putting out their 3D graphics specs last month. These are good days for Linux.
    • by LWATCDR (28044)
      Well, I have had my doubts that the FOSS community really can write better drivers than AMD or NVidia, so this will be interesting. Of course, this is one of those times when I will be very happy to be wrong.
      • Re:Way to go AMD (Score:5, Insightful)

        by 644bd346996 (1012333) on Saturday February 23, 2008 @10:50PM (#22531684)
        Depends on what you mean by better. There's no doubt that the open source drivers will be more stable and have better software compatibility than the proprietary stuff. The 3d performance will really only matter to the Linux gamers (a very small market, that), as the performance should definitely be more than enough for simpler things like compiz, etc.

        You should take a look at the existing 3d drivers. The folks reverse-engineering the r300 series did a pretty good job (well enough for it to be the development platform for Xgl). And the open-source drivers also guarantee that the card will continue to work just as well with software written long after the demise of the company (e.g. the 3dfx drivers).
        • Re: (Score:2, Informative)

          by X0563511 (793323)
          Gamers are not the only ones who like 3D acceleration.

          Quickly and off the top of my head, here are two big ones:
          1. Compiz/Fusion and the like is gaining popularity.
          2. Some applications NEED good 3D or they crawl. See Blender for instance.

          Of these, I would say gaming would be the least demanding - at least if my assumption that "stable is harder than fast" is correct.
          • Re:Way to go AMD (Score:5, Insightful)

            by forkazoo (138186) <wrosecrans.gmail@com> on Sunday February 24, 2008 @12:43AM (#22532342) Homepage

            Gamers are not the only ones who like 3D acceleration.

            Quickly and off the top of my head, here are two big ones:
            1. Compiz/Fusion and the like is gaining popularity.
            2. Some applications NEED good 3D or they crawl. See Blender for instance.

            Of these, I would say gaming would be the least demanding - at least if my assumption that "stable is harder than fast" is correct.


            Sure, Blender needs good OpenGL acceleration. But, nobody is going to be that concerned about getting an extra 1 fps in Blender. If proprietary drivers go twice as fast, or ten times as fast, then the open source devs would look like idiots. If the open source ones are ten percent slower, then 99% of people will be completely satisfied. Games are flashy, and they sell cards, and people will complain about getting killed by somebody with a faster machine because it couldn't possibly have anything to do with lack of skill. In Blender, you just need sufficient speed to work. If the guy next to you has an extra 2 fps, it doesn't make him appreciably more productive, and you certainly can't justify needing to display faster than the refresh rate of the monitor in Blender!
            • by X0563511 (793323)
              True, but you missed stability.

              If Jimmie Joe Fragger crashes in a match, he gets mad and his team loses the round. That's it.

              If Jimmie Joe Modeler crashes after tweaking a model for a time, there is no guarantee he can get it "just right" again - and that is lost productivity rather than just lost time.
              • The open source drivers have had the reliability advantage, so I'm guessing you agree with the perspective of the parent post?
                • by X0563511 (793323)
                  Well, I'm speaking from the perspective of using a crappy laptop with a crappy ATI chipset. Not even sure if the chipset is related to what they are releasing.

                  Very unstable, in my particular case.
          • by Enleth (947766)
            That's interesting - I've been doing some modeling in Blender on a SiS 661 with the free driver (that is, no DRI at all). The "preview" window was next to useless, but the workspace itself was pretty fast. I was able to model some relatively simple objects and animations without any problems, on a Celeron M 1.5GHz. I've even tried playing with some more complicated scenes downloaded from blender.org examples and tutorials and everything was still fine, if not extra-smooth.
            • by X0563511 (793323)
              Well, I'm not that advanced in Blender, but when doing character work and sculpting, wireframe doesn't cut it - I need shading.
        • by LWATCDR (28044)
          Better means stable, fast, and full featured.
          Currently there are no drivers for modern cards that are not in at least some large part written by the company that produces them. The FOSS Intel driver is mainly written by Intel and actually has several parts of it obscured.
          I still doubt that a FOSS driver based just on the documentation will be more stable than the proprietary driver. FOSS isn't magic. I have had stability issues with Firefox and F-Spot. Nothing terrible, but problems nonetheless.
          Hec
      • Will they write better drivers than the current commercial drivers for Windows? In terms of sheer performance, probably not. In terms of reliability, maybe. Will they write better drivers than the current Linux open-source drivers? Damn skippy. And as I use the open-source nv driver myself, that's a very good thing as far as I'm concerned.
    • Re: (Score:2, Informative)

      "These are good days for Linux."

      These are good days for Xorg, which isn't Linux. Everyone running X will benefit, not just Linux. Linux isn't the only non-Windows platform.

  • by pyite69 (463042) on Saturday February 23, 2008 @09:07PM (#22530990)
    Feature parity with Windows must be the goal if they want to beat NVidia. I hope we can get some sort of media acceleration beyond the stale old XVideo & XV-MC.

    • by Jah-Wren Ryel (80510) on Sunday February 24, 2008 @03:26AM (#22533150)

      I hope we can get some sort of media acceleration beyond the stale old XVideo & XV-MC.
      You won't get it, and the reason is DRM.

      ATI's cards that have h.264 acceleration (and all kinds of other good stuff like smart de-interlacing, all collectively branded as "UVD") are unlikely to ever have the specs for UVD disclosed, because they integrated the good stuff with the bad stuff (DRM) and are afraid that exposing how to use the good stuff in UVD will also expose how to circumvent the bad stuff on Microsoft Windows systems.

      So, once again, those DRM apologists who say that DRM is purely optional, that if you don't want to use it, it won't hurt you, are proven wrong.

      On the plus side, the next gen cards will have the DRM broken out into a separate part of the chip so that they can feel safe in publishing the specs for good video stuff while leaving the bad stuff locked away.

      One of many such statements by ATI/AMD. [phoronix.com]
  • Yeeha!!!! (Score:5, Interesting)

    by Anonymous Coward on Saturday February 23, 2008 @09:10PM (#22531012)
    I'm the owner of 5 boxes all with Nvidia graphic cards.
    I've been using only Nvidia cards since 2000 because they had
    the best 3D graphics card for my Linux box. I was willing to deal
    with binary drivers because there was nothing else available to me
    at my price range (loooow budget) for 3D graphics.

    But.... over the years I would get burned every now and then
    when
    1) I would upgrade the kernel and then the X server would get borked
    because the Nvidia kernel module didn't match the new kernel, or

    2) Some funky memory leak in the binary Nvidia module would lock
    up my box hard because of some damn NvAgp vs. Agpart setting or
    some funky memory speed setting. Of course, this didn't happen with
    every Nvidia driver so of course I wouldn't bother writing down
    what it took to fix the problem.

    Finally when I switched to Debian Linux in fall 2004 and had
    beautiful apt-get/synaptic take care of all of my needs I thought
    I was done ... until I found out that Nvidia doesn't time its
    driver releases with kernel releases so if I wanted to upgrade
    my kernel painlessly with apt-get/synaptic I would have to
    wait for Nvidia to get off its damn rocking chair, put down its
    damn banjo, and release a driver to go with the newer kernel.

    The final straw for me was when all of my 5 nvidia cards were
    now listed in the "legacy driver" section. Can you guess what
    "legacy driver" means about Nvidia fixing their closed source
    driver? Yeah, that's exactly the point.

    That's when I started looking around for open source 3d drivers.
    I know about Nouveau for Nvidia, but frankly I'm too pissed off
    about Nvidia to consider them. Ati had a long history of treating
    Linux customers like second class scum. Intel on the other hand
    earned the "golden throne" by providing full open source for their
    graphic chipsets. So now that I'm looking for getting a dual core
    64 bit cpu + 3D graphic chipset the only viable choice was intel,
    which I was happy to do business with.

    Now that Ati has decided to come forth with 3D documentation I'm
    willing to give an intel/ATi or AMD/Ati combo serious consideration.

    Way to go ATI!!!!

     
    • by Sharth (621005)
      Or hell, maybe the kernel devs could make it easier to have binary modules stay compatible from version to version...
      • Or hell, maybe the kernel devs could make it easier to have binary modules stay compatible from version to version...
        I expect that this will happen automagically once they find a set of interfaces that's actually good enough that it doesn't *need* to be changed to let other parts of the kernel improve...
        • Re: (Score:3, Interesting)

          by LWATCDR (28044)
          That isn't the issue. The interfaces are pretty stable, otherwise you couldn't just recompile most drivers when the new kernel comes out. What is missing is a stable binary interface. I am all for a binary interface. The developers don't want a binary interface for what I feel are bad reasons. But they are the devs and they get to make that call even if I don't like it.
          • by xtracto (837672)
            That isn't the issue. The interfaces are pretty stable otherwise you couldn't just recompile most drivers when the new kernel comes out.

            In one of the most recent Linus Torvalds interviews I read (it was featured here on Slashdot), Linus specifically stated that they also DO NOT guarantee a stable API (yes, not the ABI, the API) for future versions of the kernel. He gave his reasons and I respect them, but I also think it is not fine. If you have no guarantees that your program will work at least during a
            • Re: (Score:2, Informative)

              by Ryan Mallon (689481)

              The internal kernel APIs are subject to change. Functions within the kernel for dealing with lists, interrupts, device drivers, etc., can and have changed many times in the past. The API (i.e. the syscall interface) which is exposed to userspace is very stable, and in many cases pre-dates Linux itself.

              Typically userspace application developers do not need to worry about changes to the kernel, since the userspace APIs are mostly stable. Drivers within the kernel usually do not need to worry either, since chang
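
That stability is easy to demonstrate: two independent paths into the same stable kernel interface must agree. A minimal sketch (Linux-oriented; assumes a standard C library reachable through ctypes):

```python
import ctypes
import os

# The kernel's userspace-facing syscall interface is stable, so two
# independent routes to the same syscall must return the same answer:
# ctypes goes through the C library's getpid(), while os.getpid() goes
# through Python's own wrapper.
libc = ctypes.CDLL(None)  # handle to the already-loaded C library

print(libc.getpid() == os.getpid())  # → True
```

In-kernel interfaces carry no such guarantee, which is exactly why out-of-tree binary modules need rebuilding while userspace programs keep working across kernel upgrades.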

        • Re: (Score:1, Insightful)

          by Anonymous Coward
          Not to pick a nit, but not being "good enough" isn't the reason the kernel devs have decided not to commit to a stable binary API. It's so that they have total flexibility to use the latest greatest code.

          The upside of the Linux way is rapid development, with a constant stream of new features.
          The downside is that since every kernel update might break binary compatibility for a previously compiled driver, third-party drivers must be recompiled for every update.

          It's definitely a trade-off, one that isn't done
    • by drinkypoo (153816)
      It's interesting that you should attack nvidia (or rather its regrettable lack of open source drivers) in a story with a link to Phoronix, which noted a rumor on January 10 that nvidia might open source their drivers too [phoronix.com]. Of course, we know what rumors are worth. I for one intend to get a new laptop shortly after the quad cores start rolling out, and I plan to get whatever laptop will let me use no closed source drivers. If I can find one. :P
      • by pipatron (966506)

        Go for anything with an intel chipset, preferably a ThinkPad if they will be upgraded to quad core. Always fun to laugh at the closed-source-binary-driver-hell that is nvidia and ati.

        • by Aladrin (926209)
          While my Intel gma3100 has been -amazing- for compiz, it's been absolute crap for gaming... Especially while running compiz. (Can't run opengl and compiz at the same time on Intel, it can't handle it.) The drivers are pretty amazingly solid.

          But I've been wanting to game on this computer as well, and I miss my nVidia card. I was just about to break down and buy one... Maybe I'll just wait a while longer and see what happens with ATI's drivers. It would be -so- great to continue not having to deal with
        • The Intel IGPs may be fine for general desktop usage but anything that uses the IGP to do much more than draw the desktop is MUCH better served by a discrete GPU.
        • by drinkypoo (153816)
          I am well aware that intel has open source drivers. They are not especially good drivers (not especially bad ones either) and are less stable than nvidia or fglrx in my experience. But the point is well taken. However, I will not be buying a Thinkpad. The prices are astronomical. For the same money I can get an Apple laptop which is much nicer what with the EFI boot and far superior case design.
    • by turgid (580780)

      1) I would upgrade the kernel and then the X server would get borked because the Nvidia kernel module didn't match the new kernel

      I, too, have been an nVidia customer since 1999 for Linux (I don't do Windows at all). The nVidia driver package has an option to compile a new interface to match your current kernel. It leads you through the options one at a time with yes/no prompts.

      The only time I had trouble was when I did a major kernel upgrade and forgot to install the headers for the new kernel. If you're in

    • by downix (84795)
      You forgot XGI, which has had open documentation for their chips for well over a year now.
  • The 3D programming documentation today is 300 pages and is made up of a programming guide and register specifications. Among the areas covered in this 3D guide are the command processor, vertex shaders, fragment shaders, Hyper-Z, and the various 3D registers.

    Maybe the tcore code contains more, but doesn't 300 pages sound small when the previous drops have been 900 pages or so? I'd be very happy if this really is all they need to make a competitive driver (i.e. no hidden features only the closed source driver can use).

  • This is the end of the beginning.

    Now that AMD/ATI has come over from the Dark Side, I expect that Nvidia and all of the other graphics chip manufacturers are going to be close behind. It may take them a year or two to work out the logistics, but they'll be here.

    More and more people are moving over to Linux/BSD Free/Open software, and letting yourself be locked out of a growing market is the kind of thing that CEOs and CTOs get fired for.

    It used to be the case that manufacturers could peacefully close their eyes to the Open Source / Free communities and drink the Microsoft brand Kool-Aid, because all of their competitors were doing the same thing. Now, however, with one of the big guns having committed to solid support of the Open Source universe, their less responsive competitors are left with a massive open flank that will have to be responded to.

    • by Anonymous Coward
      "More and more people are moving over to Linux/BSD Free/Open software, and letting yourself be locked out of a growing market is the kind of thing that CEOs and CTOs get fired for."

      Uh huh. And just how many CEO's and CTO's have been fired for using ATI or Nvidia's binary blob? I suspect the number's between zero and your imagination.

      "This is the end of the beginning. "

      The total number of hardware and still growing that's released with a binary blob is still greater than the total number that have open sou
      • by Junta (36770) on Sunday February 24, 2008 @12:27PM (#22535512)

        Uh huh. And just how many CEO's and CTO's have been fired for using ATI or Nvidia's binary blob? I suspect the number's between zero and your imagination.
        He was suggesting AMD's or Intel's CEO, not 'client' companies. I doubt it would get to C*O level, but I could see leadership being shuffled out of responsibility if they didn't, for example, make a correct strategy to get the GPUs sold into the HPC market for GPU computing while the competitor did. I.e. if someone takes the open source specs and designs a set of really kick-ass math libraries that cream anything done with nVidia's CUDA, that could lead to a lot of AMD GPUs being moved while nVidia rushes to leverage that. I doubt anyone would be fired though.

        The total number of hardware and still growing that's released with a binary blob is still greater than the total number that have open source drivers.
        Huh? I can count two families with binary blobs as the only option for full function: nVidia and AMD. This story hypothetically paves the way for the AMD half to go away, leaving only nVidia for now (rumor has it nVidia will follow suit). There exist some fakeraid cards that have binary-only drivers to use the same format as the firmware support, but overwhelmingly this is skipped for pure software RAID. There exist a few wireless chipsets without Linux drivers at all, but ndiswrapper has brought over the Windows drivers, so I guess you could say those are binary blobs. Even counting all that, you still have countless network adapters, graphics chips (current hardware is mostly Intel on that front), wireless adapters, storage controllers, audio devices, and USB devices which in no way require a binary blob. The binary blob portion of Linux support is a vast minority.
        • by darkonc (47285)
          In fact, if one chooses their graphics processor well, it's actually pretty rare to get a random box that has hardware requiring binary blobs for Linux functionality.
  • Too late (Score:1, Funny)

    by Anonymous Coward
    I ordered an Nvidia card yesterday.
    • Re: (Score:3, Interesting)

      by Solra Bizna (716281)

      I've been lamenting for years that the R300 card in my G4 (now a G5, long story) would never get specs. I figured they'd start releasing only specs for R500 and up. So when I read this story, I LITERALLY jumped for joy. I'm so happy that I'm switching from nVidia to ATI in my next custom Linux box.

      -:sigma.SB

      • by MrHanky (141717)
        But the free R300 driver is pretty good, at least far better than ATI's proprietary fglrx. Maybe not as fast for 3d, but much better for everything else, like video (I can no longer play back 720p x264 after upgrading to an RV570). Or is that driver, too, i386 only?
      • by Bert64 (520050)
        There's really no reason not to release specs for older cards; they've long been surpassed on the performance front, but these older chips are widely used in servers and embedded devices because they're cheap and still more than capable of doing the job.
  • What's left? (Score:4, Insightful)

    by sudog (101964) on Sunday February 24, 2008 @01:19AM (#22532552) Homepage
    So what's left before the complete documentation sets are in our hands?
    • by z0M6 (1103593)
      You won't get UVD because of DRM, but I think you are better off asking Bridgman on the phoronix forums.
  • by Anonymous Coward
    I used to buy/recommend mostly AMD CPUs and Nvidia graphics cards till now.

    I guess it's time to make it AMD / ATI now.

    If they have released what we needed to get the drivers made, which is what we have always wanted, it's time we reciprocated by supporting them.

    This will show other graphics companies *hint hint* that releasing the specs = good business.
  • Maybe now we can finally get a decent Windows driver!
  • You've followed through! My next video card purchase won't be for a while, so there's a good chance that free drivers will be available, and you just got yourself a customer!

    I know everyone was skeptical when this was announced some months ago. I thought "well, it could happen." The silence on the issue lately made me think I had spoken too soon. I was beginning to wonder where the specs were. Well, here they are.

    Thank you ATI!
