AMD Graphics Open Source Hardware Linux

AMD Open Sources Their Linux Video API

An anonymous reader writes "AMD has open sourced X-Video Bitstream Acceleration, their API by which they expose the Universal Video Decoder 2 GPU under Linux." They may be a little late with this move, and not everything you could wish is now open source, but it's better than nothing.

Comments Filter:
  • Okaaaaaay... (Score:5, Insightful)

    by MrEricSir ( 398214 ) on Saturday February 26, 2011 @07:51AM (#35322930) Homepage

    The ATI drivers for Linux were never perfect, but they worked decently. But ATI/AMD would drop support for older chips that were still in use. The open source community never provided a shim to let these older drivers work with newer builds of X.

    Does open sourcing the drivers really fix the compatibility problem? To me, not building a shim suggests a general lack of caring about ATI drivers. Do we really need the source to give a future to aging ATI/AMD chips?

    • Obviously it's not just about "open sourcing the drivers"; otherwise we'd have open-sourced Catalysts and all would be well.
      The reason the Catalysts work well and the open source drivers don't is that they're not the same code at all, and the Catalyst ones are a lot more advanced.

      Now they want their view of the standard to be implemented, so they open source it; but there are other standards already out there, so it feels a bit like "just throwing it around hoping it works".

        • Not necessarily; the graphics industry is a hotbed of patent litigation waiting to happen. Open sourcing the complete driver would open up a lot of proof for attacks through the courts. Opening any of it up is a huge deal, and it shows the continuing shift in manufacturers' willingness to work with the Linux Foundation to provide the best possible experience on the hardware for any potential use. It's not just ATI on Linux that's broken, it's the OpenGL support, which lags behind even in their Windo
    • Well that is too late I think... Something like using an umbrella after the rain has stopped... :)

      I have already had a very bad experience with a couple of laptops with old integrated ATI cards, whose Linux support ATI dropped two years after they were produced?!...

      So I have learned my lesson very well, and it is: NEVER buy anything even closely related to ATI (though it is now AMD :) )...

      No one should make the same mistake twice, should they?
      • by Bengie ( 1121981 )
        ATI has better drivers and better cards than nVidia for the Windows platform. Too bad they haven't invested much into Linux :*(

        My $300 6950 would out-perform a similar $300 card from nVidia, consume less power, and run A LOT cooler. Instead, I OC'd it 100MHz and unlocked a bunch of shaders. Now it wipes the floor with any other $300 nVidia card. The system has been perfectly stable. My friends with nVidia, on the other hand, still have black screen bugs from three generations ago on their new 470s. T
        • by Smauler ( 915644 )

          My Gigabyte GTX 460 is 100% stable and runs relatively cool. Under load it'll sometimes get up to 60-70 degrees (with pretty crappy case cooling ATM - my front fan has given up). The lowest I've ever seen it at was 10 degrees. That was in December, when it was cold out, and I don't heat my house when I'm at work. My PC is on 24/7 though... I was surprised it was running so cold - the CPU was at about 15 degrees IIRC. I'm guessing the ambient temperature must have been about 5 degrees.

          I had a look at all the

          • by Bengie ( 1121981 )
            Try an Antec 900 for a case. About $100, but hard to beat for its price. My video card went from 90C to 40C, but I too had a crappy case.

            Other than the lack of tool-less setup, it is an excellent case.
    • Re:Okaaaaaay... (Score:5, Informative)

      by slash.duncan ( 1103465 ) on Saturday February 26, 2011 @09:50AM (#35323404) Homepage

      Well, there's the proprietary drivers which AMD/ATI does what they want with, dropping support for old chips, etc, and there's the native xorg/kernel/mesa/drm and now KMS drivers, which are open. The open drivers support at least as far back as Mach64 and ATIRage, and while I never used those specific drivers, after I realized what a bad idea the servantware drivers were based on the nVidia card I had when I first switched to Linux, I've stuck with the Radeon native drivers. In fact, I was still using a Radeon 9200 (r2xx chip series) until about 14 months ago, when I upgraded to a Radeon hd4650 (r7xx chip series), so I /know/ how well the freedomware support lasts. =:^)

      And why would the free/libre and open source (FLOSS) folks build a shim for the servantware driver? The kernel specifically does NOT maintain an internal kernel stable ABI (the external/userland interface is a different story, they go to great lengths to maintain that stable), and if anyone's building proprietary drivers on it, it's up to them to maintain their shim between the open and closed stuff as necessary. Rather, the FLOSS folks maintain their native FLOSS drivers.

      And while for the leading edge it's arguable that the servantware drivers perform better, and for some months may in fact be the only choice, by the time ATI's dropping driver support the freedomware drivers tend to be quite stable and mature (altho there was a gap in the r3xx-r5xx time frame after ATI quit cooperating, before AMD bought them and started cooperating with the FLOSS folks again, part of the reason I stuck with the r2xx series so long, but those series are well covered now).

      So this /is/ good news, as it should allow the freedomware drivers to better support hardware video accel once the new information is merged in.

    • X.org and Linux kernel developers don't care about any closed source software. When somebody chooses to release software as closed source, they decide that nobody else can update it. Why should open source developers make their life easier by restricting the pace of development of their own software? Open source developers didn't force them to release the software as closed source. Open source software, on the other hand, can be easily updated to keep up with the pace of upstream development by anybody

      • This kind of "us vs. them" thinking is a failure both in politics and in software.

        If we're more concerned with the licenses than whether or not our computers work, then we've failed as programmers and become lawyers.

        • I disagree. Open Source is not just about the Ubuntu image you can download today, it is about how we create and use software 20 years from today. It took a long time to get hardware vendors to hand out specs or show genuine interest in delivering Free drivers. And here we are, with at least two big players (AMD and Intel) pledging their support and another (NVIDIA) at least playing along somewhat.

          • 20 years ago, FLOSS advocates were saying the exact same thing.

            And yet, my computer's graphics chip STILL doesn't work. I'm sick of the excuses.

            • And yet, my computer's graphics chip STILL doesn't work. I'm sick of the excuses.

              And which one is that? Because right now, the R300g driver (supporting R300-R500 chips) is about to pass Catalyst with a loud whoosh in 3D performance. It passed Catalyst with a much louder whoosh in 2D performance and stability ages ago. The R600c driver also makes Catalyst eat dust in 2D performance, and 3D works fine on older cards (HD5xxx support is still weak because ATI released the specs less than 6 months ago).

        • There is no "us vs. them" in this case. There are two software packages, one open source, the other proprietary. Why should developers of the open source package cripple their own software just to keep the proprietary one working? Developers of the proprietary one made the decision to prevent everybody else from contributing fixes and updates. If you're dissatisfied with the results when they can't or don't want to keep up with changes in related open source packages, blame the proprietary developers for making
    • The ATI drivers for Linux were never perfect, but they worked decently. But ATI/AMD would drop support for older chips that were still in use. The open source community never provided a shim to let these older drivers work with newer builds of X.

      Does open sourcing the drivers really fix the compatibility problem? To me, not building a shim suggests a general lack of caring about ATI drivers. Do we really need the source to give a future to aging ATI/AMD chips?

      As of January 19, Phoronix puts the average [phoronix.com]

    • The fglrx drivers were terrible. They were ludicrously unstable. From what I understand, they eventually got better, but they would crash the system (ie, straight to POST, not just X11) on a regular basis for years.

      Writing a compatibility layer for old drivers is a very tricky business. Specifically, it's the business of the writers of the old drivers. Only they know what arcane deprecated functionality their software uses, not the writers of the interface.

      Open sourcing the API to the drivers did fix the pr

  • this is good news (Score:4, Interesting)

    by bmalia ( 583394 ) on Saturday February 26, 2011 @07:53AM (#35322938) Journal
    I have always purchased nVidia cards solely because I knew that they provided Linux drivers. Lately though, the drivers don't seem to work quite right. It might be getting to be about time for me to give ATI a go.
    • I've used nothing but ATI and nothing but Linux for 6+ years now and I've never had any issues. I am of course using the proprietary ATI drivers though. And I never buy the latest, top of the line video cards.

    • If you don't need cutting edge graphics, give Intel Graphics a go. The drivers are free software -> distributors are permitted to integrate them properly -> installation is a breeze.

      • by bmcage ( 785177 )

        If you don't need cutting edge graphics, give Intel Graphics a go. The drivers are free software -> distributors are permitted to integrate them properly -> installation is a breeze.

        Unfortunately, everybody actually needs cutting edge graphics ...

        • Unfortunately, everybody actually needs cutting edge graphics ...

          For what?

          • by bmcage ( 785177 )
            You don't use those nice Desktop Effects, I presume. Did you take a look at how Apple promotes its upcoming Lion? My wife's laptop with Intel is a joke compared to mine. I need good graphics for work. Of course, it might be the open source drivers that suck.
            • by Knuckles ( 8964 )

              Intel's 3D is plenty capable of desktop effects and stuff like Google Earth. Compiz runs perfectly. When the GP wrote "cutting edge graphics" he was talking about stuff like Crysis 2 and maybe professional 3D use. Few people actually need that.

            • My laptop has an Intel Mobile Series 4 graphics card. KDE compositing rarely drops below 25 frames/second.

              • by tepples ( 727027 )

                KDE compositing rarely drops below 25 frames/second.

                If it drops below 60, it's not keeping pace with your monitor.

  • by arivanov ( 12034 ) on Saturday February 26, 2011 @07:54AM (#35322940) Homepage

    Sigh... That makes what? 4 or 5 different APIs.

    Original XvMC
    Via XvMC VLD extension
    Nvidia - three options - legacy, their bitstream and using CUDA
    Intel

    Sigh... Can't we just get along and agree on a single standard?

    • Fractured API standards are the standard in the open source world. Just look at A/V APIs, web rendering APIs, KDE vs. GTK, etc.

      As long as they can work together programmatically, it's not necessarily a bad thing to have different APIs.
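
      As an illustrative aside, one hedged sketch of what "working together programmatically" can mean: the application codes against one small internal interface, and a backend table maps it onto whichever vendor API (VDPAU, VA-API, XvBA, ...) happens to be available. Every identifier below is invented for illustration; none of this is a real library's API.

        /* Hypothetical shim: the app talks to one tiny interface, and a
         * backend table maps it onto whichever vendor API is present.
         * All identifiers are invented for illustration only. */
        #include <stdio.h>
        #include <string.h>

        struct hw_decoder {
            const char *name;
            int (*init)(void);
            int (*decode)(const unsigned char *bits, unsigned len);
        };

        /* Stubs standing in for real VDPAU / VA-API / XvBA glue code. */
        static int vdpau_init(void) { puts("init via VDPAU"); return 0; }
        static int vaapi_init(void) { puts("init via VA-API"); return 0; }
        static int stub_decode(const unsigned char *b, unsigned l)
        { (void)b; (void)l; return 0; }

        static const struct hw_decoder backends[] = {
            { "vdpau", vdpau_init, stub_decode },
            { "vaapi", vaapi_init, stub_decode },
        };

        /* Pick a backend by name; real code would probe the drivers. */
        static const struct hw_decoder *pick(const char *want)
        {
            unsigned i;
            for (i = 0; i < sizeof backends / sizeof backends[0]; i++)
                if (strcmp(backends[i].name, want) == 0)
                    return &backends[i];
            return NULL;
        }

        int main(void)
        {
            const struct hw_decoder *d = pick("vaapi");
            return d ? d->init() : 1;
        }
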

      • Re: (Score:2, Insightful)

        by BitZtream ( 692029 )

        And this is something that most people in the OSS world utterly fail to grasp (it has nothing to do with OSS per se, OSS just allows it to happen more easily).

        The bad thing with multiple APIs that all do essentially the same thing is that they give 'choice'. I realize that most OSS users, and indeed most techies, LOVE choice; the rest of the world doesn't. Or rather, it's not so much that they don't like choice, it's that they are not educated enough about the choices to answer them effectively.

        GTK vs Qt/Gnome vs

    • And, since it requires the Catalyst driver, I assume it's tied to X (like VDPAU) which means that integrating into something like DirectFB isn't going to be possible. As far as I can tell the APIs are not only different in detail but different in the way they are abstracted which means it's quite difficult to have them "work together" in any meaningful way.

    • This does seem to be a recurring theme in the open source world. On the one hand, it's great to have/try lots of approaches but we need a more effective way of elevating the most successful to the top. Seems like connecting social media more closely with these types of projects would enable discussion and opinions to act as a catalyst for promoting effective solutions.
    • by Kjella ( 173770 )

      Well XvMC will never do more than MPEG2, so it's not suited for much of anything.

      As far as modern codecs go, nVidia has VDPAU, Intel has VA API and ATI has XvBA. Why everyone needs to reinvent the wheel I don't know, but there it is. I figure eventually someone will write the right wrappers so apps only need to deal with one API.

      • Re:Yet Another API (Score:5, Informative)

        by u17 ( 1730558 ) on Saturday February 26, 2011 @09:15AM (#35323270)

        I figure eventually someone will write the right wrappers so apps only need to deal with one API.

        VA-API is the wrapper that you speak of. It has multiple backends [freedesktop.org], including backends for Intel cards, VDPAU and XvBA.
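
        For the curious, here is a minimal VA-API initialization sketch (a hedged example, assuming the libva and libva-x11 development headers are installed; error handling is abbreviated and the build line is an assumption). It just opens the display, initializes VA-API, and asks the active backend which profiles it supports:

          /* Minimal VA-API init sketch; build roughly as:
           *   gcc va_min.c -lva -lva-x11 -lX11
           * (adjust to your distro's packaging). */
          #include <stdio.h>
          #include <stdlib.h>
          #include <X11/Xlib.h>
          #include <va/va.h>
          #include <va/va_x11.h>

          int main(void)
          {
              Display *x11 = XOpenDisplay(NULL);       /* connect to X */
              if (!x11) { fprintf(stderr, "no X display\n"); return 1; }

              VADisplay va = vaGetDisplay(x11);        /* X11-backed VADisplay */
              int major = 0, minor = 0;
              if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS) {
                  fprintf(stderr, "vaInitialize failed\n");
                  return 1;
              }
              printf("VA-API %d.%d, vendor: %s\n",
                     major, minor, vaQueryVendorString(va));

              /* Which decode profiles does the underlying backend
               * (Intel, VDPAU, XvBA, ...) actually expose? */
              int n = vaMaxNumProfiles(va);
              VAProfile *profiles = malloc(n * sizeof *profiles);
              if (profiles &&
                  vaQueryConfigProfiles(va, profiles, &n) == VA_STATUS_SUCCESS)
                  printf("%d profiles supported\n", n);

              free(profiles);
              vaTerminate(va);
              XCloseDisplay(x11);
              return 0;
          }
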

        • In fact, up until this release it was only ever possible to use XvBA through VA-API. Now there's a possibility of that changing, but I doubt it. Instead I believe this will result in a more stable XvBA backend for VA-API, so that it'll end up easier to use.
      • by Xua ( 249955 )
        Actually, Intel's VA-API has backends that use VDPAU and something from fglrx. I am not sure these backends are well tested, but in theory an application that uses VA-API can use acceleration provided by all three major graphics hardware vendors. In addition to decoding, VA-API can be used to accelerate encoding and post-processing filters.
    • by Ant P. ( 974313 )

      VA-API is the only standard that makes sense to implement [freedesktop.org], unless you like limiting your apps to nvidia/ati users only, or like writing three times as much code.

  • by CajunArson ( 465943 ) on Saturday February 26, 2011 @08:20AM (#35323018) Journal

    Nvidia's VDPAU is already an open standard that other video drivers can implement in Linux for video acceleration, so I'm not sure what this buys us. VDPAU as implemented by Nvidia is also about the only video acceleration standard that isn't totally broken and that can accelerate videos beyond MPEG-2 as well.
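
    To make the comparison concrete, here is a hedged, minimal sketch of bringing up a VDPAU device (assuming the libvdpau headers are installed; building with something like gcc vdp_min.c -lvdpau -lX11 is an assumption). The notable design point is that only one symbol is linked directly; every other entry point is fetched through get_proc_address:

      /* Minimal VDPAU device-creation sketch (illustrative only). */
      #include <stdio.h>
      #include <stdint.h>
      #include <X11/Xlib.h>
      #include <vdpau/vdpau.h>
      #include <vdpau/vdpau_x11.h>

      int main(void)
      {
          Display *x11 = XOpenDisplay(NULL);
          if (!x11) { fprintf(stderr, "no X display\n"); return 1; }

          VdpDevice dev = VDP_INVALID_HANDLE;
          VdpGetProcAddress *get_proc = NULL;
          /* The only directly linked call; the driver hands back
           * get_proc, through which all other functions are obtained. */
          if (vdp_device_create_x11(x11, DefaultScreen(x11),
                                    &dev, &get_proc) != VDP_STATUS_OK) {
              fprintf(stderr, "vdp_device_create_x11 failed\n");
              return 1;
          }

          VdpGetApiVersion *get_api_version = NULL;
          if (get_proc(dev, VDP_FUNC_ID_GET_API_VERSION,
                       (void **)&get_api_version) == VDP_STATUS_OK) {
              uint32_t ver = 0;
              get_api_version(&ver);
              printf("VDPAU API version %u\n", (unsigned)ver);
          }

          XCloseDisplay(x11);
          return 0;
      }
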

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Independence from binary-blob drivers is what it buys us.

    • by Anonymous Coward

      It's not true that VDPAU is the only working standard for Linux. My notebook has an Intel Arrandale GPU which works fine with Intel's open-source video driver and can decode H.264 HD through VA-API just fine.

    • VA-API is an open standard, what's wrong with it? Allegedly both ATI and nVidia cards can backend it (nvidia through VDPAU, ATI through whatever wacky stuff they use.) And of course intel provided it so it works with intel.

      nVidia has the only accelerated OpenGL pipeline that works worth a crap on any platform. Now THAT is interesting.

  • Not open sourced (Score:5, Informative)

    by Kjella ( 173770 ) on Saturday February 26, 2011 @08:47AM (#35323130) Homepage

    This headline is wildly misleading. They've now documented their equivalent of nVidia's VDPAU blob, but it's only available when you run the closed source Catalyst driver. TFA says so quite clearly.

    Before anyone starts wondering, this won't do much good for those hoping to see AMD's UVD2 engine supported by the open-source Radeon graphics drivers.

    • by Anonymous Coward
      It's doubly misleading, because API stands for Application Programming Interface, and an interface in the context of computer programming means headers/protocols, which by definition aren't compiled and don't have any source. Hence you can't "open-source" an interface.
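
      To illustrate that point: an "API" release is essentially a set of declarations like the hypothetical header below. There is no implementation behind it to open-source; the code that fulfils these declarations still lives in the closed driver. (These names are made up for illustration and are not AMD's actual XvBA entry points.)

        /* example_decode_api.h - hypothetical decode-API header.
         * Declarations only; the implementation lives elsewhere
         * (in this analogy, inside the closed-source driver). */
        #ifndef EXAMPLE_DECODE_API_H
        #define EXAMPLE_DECODE_API_H

        #include <stddef.h>
        #include <stdint.h>

        typedef struct example_session example_session;  /* opaque handle */

        /* Create a decode session for a codec (fourcc); NULL on failure. */
        example_session *example_session_create(uint32_t codec_fourcc);

        /* Hand one chunk of compressed bitstream to the hardware decoder. */
        int example_decode_bitstream(example_session *s,
                                     const void *data, size_t len);

        void example_session_destroy(example_session *s);

        #endif /* EXAMPLE_DECODE_API_H */
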
  • they would release their internal hardware-accelerated build of FFmpeg
  • I feel that nVidia uses the same drivers for all operating systems. The core doesn't change; it just has a wrapper to interface with X/DirectX/Quartz. They just update the core significantly once in a while, to the point that it can't interface with older cards. That's why they occasionally have huge issues.
  • Mplayer-uau (basically mplayer with full multithreading) plays 1080p H.264 on an Atom D510 without any hardware decoding. I have given up with GPU video decoding on Linux, since software works so well even on fanless processors.
    • by godrik ( 1287354 )

      In my experience, Atom processors do not play high resolutions very well. But even if they did, having a version that uses the GPU would be a significant improvement. It would free the CPU to potentially do something else, including downclocking, which could improve energy efficiency significantly.

      On my PDA (Nokia N810), I used to decompress audio using a software lib. When I switched to a lib that uses the internal DSP, my battery life increased 300%.

        In my experience, Atom processors do not play high resolutions very well. But even if they did, having a version that uses the GPU would be a significant improvement. It would free the CPU to potentially do something else, including downclocking, which could improve energy efficiency significantly.

        Good point, but the power consumption of a Radeon is not exactly zero, even at idle. If you need to add a discrete GPU to shave off a few CPU watts, I believe the overall consumption increases.

        On my PDA (Nokia N810), I used to decompress audio using a software lib. When I switched to a lib that uses the internal DSP, my battery life increased 300%.

        Another good point. Unfortunately, this AMD announcement does not help much in the mobile space.

        There is also the general point that hardware acceleration is lagging behind new codec development. Software is much more flexible, even when "hardware" means new drivers/firmware for a general-purpose DSP. In this

"The medium is the message." -- Marshall McLuhan

Working...