
Basic Linux Boot On Open Graphics Card

David Vuorio writes "The Open Graphics Project aims to develop a fully open-source graphics card; all specs, designs, and source code are released under Free licenses. Right now, FPGAs (large-scale reprogrammable chips) are used to build a development platform called OGD1. They've just completed an alpha version of legacy VGA emulation, apparently not an easy feat. This YouTube clip shows Gentoo booting up in text mode, with OGD1 acting as the primary display. The Linux Fund is receiving donations so that ten OGD1 boards can be bought (at cost) for developers. Also, the FSF shows its interest by asking volunteers to help with the OGP wiki."
  • I can understand open sourcing the software, but can someone explain the benefits of opening the hardware as well?

    • by nurb432 ( 527695 ) on Saturday May 02, 2009 @04:48PM (#27800763) Homepage Journal

      Do you want to be tied to a vendor?

      If the answer is no, then you understand. If you don't mind being tied to a vendor and at their mercy, then I guess the answer for you is that there is no benefit.

      Open hardware has the same value as open software.

      • If the answer is no, then you understand. If you don't mind being tied to a vendor and at their mercy, then I guess the answer for you is that there is no benefit.

        Yeah, but open vendors are vendors too. That's the thing. Basically, what you're trying to do is suppress innovation for the sake of commoditization, and that's not a trade-off that most people want to make.

      • Do you want to be tied to a vendor?

        If the answer is no, then you understand. If you don't mind being tied to a vendor and at their mercy, then I guess the answer for you is that there is no benefit.

        Open hardware has the same value as open software.

        Right... every time I run a high-end game on my proprietary 3D driver I have to stop and flagellate myself halfway through to dull the guilt... not...

    • Re: (Score:3, Interesting)

      Well obviously it's of academic interest. American consumers have sunk billions into video card research, and for the most part the implementations are shrouded in mystery, locked up in labs. Nobody un-NDA-bound really knows how to build these things: computer graphics is a highly specialized and difficult problem for hardware engineers. The real interest is in making a hardware design that actually works well and then writing up the design in the abstract, not in actually making working video cards.

      Also I guess
      • Re: (Score:3, Insightful)

        by Jeff DeMaagd ( 2015 )

        Well obviously it's of academic interest. American consumers have sunk billions into video card research, and for the most part the implementations are shrouded in mystery, locked up in labs.

        The problem with this line is that while American consumers may have sunk billions into buying video cards, they were never promised any of the knowledge required to build one. In other words, you bought a product, not the product design process, and your line seems to suggest confusion on that part.

        • by BikeHelmet ( 1437881 ) on Saturday May 02, 2009 @05:49PM (#27801135) Journal

          and your line seems to suggest confusion on that part.

          Doesn't seem that way to me. He's just pointing out that when compared to other electronics, we have shockingly little info available.

          Even for CPUs, there are fully documented "open-source" microcontrollers available, but for GPUs there's basically nothing. It's a big mystery how it's all done. And now we've gone so far that GPUs are doing incredible things like juggling 10,000 threads that manage all the shading when you fire up a game.

          nVidia and ATI have stated that GPUs are many times more complex than CPUs, and I fully believe them.

          • It's not that big of a mystery, nor does it matter much in the grand scheme of things.

            There are AVR microcontrollers that can output a VGA signal via bitbanging (a rough sketch of the idea follows this comment), so that part is obviously simple enough.

            You don't really HAVE to continue legacy VGA support. Make a new standard that's good and solid and will in some way benefit PC manufacturers, and you'll have BIOSes and OSes that support it shortly afterwards.

            The problem isn't the technology, the problem is starting from scratch and lasting long
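
            Sketch of that bitbanged-VGA idea, assuming an ATmega-class AVR with the sync lines on PB0/PB1. The pin choices, the 20 MHz clock, and the delay-loop timing are all illustrative assumptions; a real implementation counts cycles or uses timer hardware, since delay loops jitter:

                #include <avr/io.h>

                #define F_CPU 20000000UL        /* assumed 20 MHz clock */
                #include <util/delay.h>

                #define HSYNC_PIN PB0           /* hypothetical wiring */
                #define VSYNC_PIN PB1

                int main(void)
                {
                    DDRB = _BV(HSYNC_PIN) | _BV(VSYNC_PIN);  /* sync pins out */

                    for (;;) {
                        /* One 640x480@60 frame: 525 lines of ~31.77 us. */
                        for (unsigned line = 0; line < 525; line++) {
                            /* VSYNC asserted (low) during lines 490-491. */
                            if (line == 490) PORTB &= ~_BV(VSYNC_PIN);
                            if (line == 492) PORTB |=  _BV(VSYNC_PIN);

                            PORTB &= ~_BV(HSYNC_PIN);  /* ~3.8 us hsync pulse */
                            _delay_us(3.8);
                            PORTB |=  _BV(HSYNC_PIN);
                            _delay_us(28.0);  /* porches + visible pixels */
                            /* Pixel data would be shifted out in here.  */
                        }
                    }
                }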

      • by Animats ( 122034 ) on Saturday May 02, 2009 @05:22PM (#27800993) Homepage

        There's not that much mystery about the things. Making a VGA emulator in an FPGA is no big deal. If all you implemented was text mode and mode 13H (sketched after this comment), it would probably boot Linux. Getting to a card that runs OpenGL is a big job, but not out of reach. The pipeline is well understood, and there are software implementations to look at. As you get to later versions of DirectX, it gets tougher, because Microsoft controls the documentation.

        But the real problem is that you'll never get anything like the performance of current generation 3D boards with an FPGA. There aren't anywhere near enough gates. You need custom silicon.
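
        To make the "mode 13H is no big deal" point concrete: it is a linear 320x200 buffer, one byte per pixel, at physical address 0xA0000, so the plot routine is one line. A minimal sketch; how the buffer pointer is obtained (far pointer in real-mode DOS, mmap of /dev/mem, or an emulator's own window) is left abstract:

            #include <stdint.h>

            /* Mode 13H: 320x200, 256 colors, one byte per pixel, mapped
             * linearly at physical address 0xA0000. */
            #define MODE13_W 320

            static void putpixel(uint8_t *vga, int x, int y, uint8_t color)
            {
                vga[y * MODE13_W + x] = color;
            }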

        • Just because you used a compiler-compiler toolchain to chew up your OpenGL book and spit out a hardware spec doesn't mean you have an OpenGL card. Implementing OpenGL efficiently isn't just a "big job"; it's essentially the entire field of computer graphics hardware.
          • Re: (Score:3, Insightful)

            by Animats ( 122034 )

            Implementing OpenGL efficiently isn't just a "big job"; it's essentially the entire field of computer graphics hardware.

            It's understood, though. And you can do it in sections. Start with an OpenGL implementation that does tessellation, geometry transforms, and fill in software. Build something that just does the fill part. (That's 1995 technology in PC graphics.) Then add the 4x4 multiplier and do the geometry on the board (that's 1998 technology on the PC, 1985 technology for SGI.). Once all that'
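
            The "4x4 multiplier" stage mentioned here is literally just this, repeated per vertex (a minimal sketch using the column-vector convention; a real pipeline follows it with the divide by w):

                #include <stdio.h>

                /* out = m * v for one homogeneous vertex; the entire
                 * fixed-function geometry stage is this, per vertex. */
                static void xform(const float m[4][4], const float v[4],
                                  float out[4])
                {
                    for (int i = 0; i < 4; i++) {
                        out[i] = 0.0f;
                        for (int j = 0; j < 4; j++)
                            out[i] += m[i][j] * v[j];
                    }
                }

                int main(void)
                {
                    const float identity[4][4] =
                        { {1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1} };
                    const float v[4] = { 1.0f, 2.0f, 3.0f, 1.0f };
                    float out[4];

                    xform(identity, v, out);
                    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
                    return 0;
                }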

            • It's a lot like Linux was ten years ago; a decade behind, but still useful. Fixed that for you.
            • Re: (Score:3, Interesting)

              by TheRaven64 ( 641858 )

              If anything, a modern GPU is easier to implement than an older one. Modern APIs basically ditch the entire fixed-function pipeline in favour of an entirely programmable one. The fixed-function OpenGL 1-style pipeline is implemented entirely in the drivers. The GPU is just a very fast arithmetic engine. Some are SIMD architectures, but a lot of the newer ones are abandoning even this and just having a lot of parallel ALUs. You could probably take an FPU design from OpenCores.org, stamp 128 of them on a
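
              A loose model of that "lots of parallel ALUs" contract: the driver compiles the old fixed-function state into a small program, and the hardware's only job is to run it over every element. Serial C for illustration only; no vendor's actual API is implied:

                  #include <stddef.h>

                  /* A 'shader' is just a function the driver generated. */
                  typedef float (*shader_fn)(float in);

                  /* The GPU's whole job: run the same small program over
                   * every element.  Hardware spreads the iterations across
                   * its ALUs; this loop is serial only for illustration. */
                  static void run_parallel(shader_fn shade, const float *in,
                                           float *out, size_t n)
                  {
                      for (size_t i = 0; i < n; i++)
                          out[i] = shade(in[i]);
                  }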

      • American consumers have sunk billions into video card research

        American consumers have sunk billions in buying video cards on which to play games. Then, the companies that designed the components for those video cards invested in video card research.

        It's not exactly the same thing.

        • I was just pointing out that tons of money is being spent so the technology is advancing. Graphics hardware really is amazing; is your car superseded every 2 years by newer models with 10 times as much horsepower? Do "car textbooks" still teach carburetors while mechanics scratch their heads and wonder what happened to the thing with the air holes? Nvidia is making their own products obsolete every few years by their frenetic pace of research, but nobody really knows what they're up to.
      • The tighter the NDA, the more you should suspect that the underlying tech is not rocket science.

        So to speak.

        In this case, graphics is not that hard. Fast graphics, even, is not all that hard.

        The cruft is the thing that is hard. Mechanisms to manage (emulate) the cruft are about the only thing non-obvious enough to get a good patent on, and much of that, if shown to the light of day, will be seen to be covered by prior art.

        A big part of the reason INTEL got so excited about ray-tracing was that they were/are

    • by Daemonax ( 1204296 ) on Saturday May 02, 2009 @04:57PM (#27800827)
      We're geeks... So the reason is "because we can". It provides a system where we don't have another black box. We can actually understand, down to the lowest level, how things are working. This is great for people who desire to understand how things work, and also for people who hope for a future of machines and hardware that are under the control of their owners.

      Sorry to get a bit crazy here, but imagine a world with technology like that in Ghost in the Shell. I would not go getting such implants and additions if I did not and could not have complete control and understanding over the stuff. This type of project is a small step in maintaining individual control.
    • by cduffy ( 652 ) <charles+slashdot@dyfis.net> on Saturday May 02, 2009 @04:58PM (#27800841)

      When a piece of music, or a play, enters the public domain, there are effects beneficial to the public:

      • Direct embodiments (sheet music, CDs, etc) become cheaper, and thus accessible to more of the public.
      • Derived works are easier (no licensing hassle) to create.

      These have analogs here. Having a Free video card design means that low-end video cards can become that much cheaper (and that there's more room for new entrants into the very-low-end market), and that there's a common, available base on which new and innovative work can be done.

      • Re: (Score:3, Insightful)

        Agreed. I'm not a gamer, but I like the idea of having an open implementation of a graphics card for my use. Lower the barriers to entry to the market, and things get really interesting.

        I hope this group of engineers can succeed in producing an open board that eventually provides high-end graphics capabilities.
    • Yes. OK, let's take a look at Intel graphics. They have free software drivers. Cool and all, but what if you wanted to write a Gallium3D driver (a totally different style of driver) for it? Well, you can't, because you do not know the hardware.

      How about you want to reprogram the GPU itself? It would be kinda nice to know what the graphics card is made of.

  • A milestone? (Score:4, Insightful)

    by Brian Gordon ( 987471 ) on Saturday May 02, 2009 @04:45PM (#27800749)
    Isn't VGA a very thoroughly documented and widely implemented standard?

    Also, they can't possibly compete with NVidia or ATI, and I doubt anyone's going to shell out a billion dollars to build a plant to make their cards. If they're just playing around with FPGAs then this isn't really a serious "Open Graphics Card" ... performance will be terrible.
    • Re: (Score:3, Funny)

      by ypctx ( 1324269 )
      Well, second step is Open Source Factories.
      • by Kotoku ( 1531373 ) on Saturday May 02, 2009 @04:57PM (#27800831) Journal
        Step 1: Open Graphics Card
        Step 2: Open Source Factories
        Step 3: ????
        Step 4: Communism!
      • Re:A milestone? (Score:5, Interesting)

        by auric_dude ( 610172 ) on Saturday May 02, 2009 @04:57PM (#27800835)
      • Not exactly the same thing, but there are a few IC manufacturers who specialise in low-yield jobs. Often they are a generation behind in terms of process technology (130nm stuff is really cheap now), but they are relatively cheap for lots as small as a few thousand, and a few of them will do lots as small as ten, in the corner of another customer's wafer (much more expensive per-unit). While not everyone can make an open source CPU or GPU in their home, anyone can contribute to the design, and with enough
    • Re:A milestone? (Score:5, Informative)

      by DavidR1991 ( 1047748 ) on Saturday May 02, 2009 @04:53PM (#27800797) Homepage

      The /. post gives the wrong impression about the VGA implementation - it was difficult because they wanted to implement it in an extremely simple fashion, not because VGA itself is complex.

    • Re:A milestone? (Score:5, Interesting)

      by iamacat ( 583406 ) on Saturday May 02, 2009 @04:56PM (#27800825)

      Also, they can't possibly compete with NVidia or ATI

      If you are running Windows on an x86 box, this may be true. Move to FreeBSD on an ARM embedded display, and getting the drivers becomes dicey. Want to optimize medical imaging requiring 48-bit color rather than a typical game? Bet you will have better luck with an FPGA than an off-the-shelf card.

      • You're gonna need 16-bit D/A, and you don't do that with FPGAs. What you really need is a 48-bit RAMDAC (a sketch of the pixel format follows this comment). The rest is easy; you don't even need any GPU acceleration if it would be too difficult to work it out, just use the CPU.

        I have written display drivers for several ARM embedded devices. I find it pretty easy, because when you make a system like that, you can get the entire spec for the display from the display controller vendor, something you can't get from NVidia or ATI.
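
        For what it's worth, the "48-bit color" side of this is nothing exotic on the CPU: it's just 16 bits per channel. A minimal sketch of the framebuffer format (names are ours); the FPGA's job would then only be to scan this buffer out through the 48-bit RAMDAC:

            #include <stdint.h>

            /* A 48-bit pixel: 16 bits per channel instead of the usual 8. */
            struct pixel48 {
                uint16_t r, g, b;
            };

            static void putpixel48(struct pixel48 *fb, int width, int x,
                                   int y, uint16_t r, uint16_t g, uint16_t b)
            {
                fb[y * width + x] = (struct pixel48){ r, g, b };
            }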

    • by hpa ( 7948 )

      VGA is reasonably well documented, although a lot of the quirks aren't. It is, however, a horrible and painful design which had a zillion rarely-used features.

      • by taniwha ( 70410 )

        Well, yes and no. Chunks of VGA are well documented, but it's a register spec that doesn't say what happens when you deviate from the documented register values. Over the years various programmers have stepped outside the spec and gone their own way (Doom, Microsoft, ...). Enough people have done it that unless your design does the right thing in all these architectural black holes, you're not 'compatible'.

    • Re:A milestone? (Score:5, Informative)

      by Jeff DeMaagd ( 2015 ) on Saturday May 02, 2009 @05:06PM (#27800891) Homepage Journal

      A lot of times, FPGAs are used for development. Once the design is proven, you can go to etching it into silicon. Almost nobody builds a fab for one chip; the good news is that chip fabs can make numerous different kinds of chips. There are many fabs that are willing to take any design that comes their way, as long as the money is there.

    • by zlogic ( 892404 )

      Both ATI and Nvidia are fabless companies. They only design chips and then send the specs to a plant in China.

    • ...you underestimate the capacity of volunteers, or even companies that allow their engineers to work on such products. As an example, it's been well documented that the cost to build Linux [developerfusion.com] would exceed a billion dollars. And few took Linux seriously in the beginning except the volunteers who believed in the project.

      So if companies and individuals worldwide are willing to free themselves from proprietary graphics card designs so that their software will work better, then they're probably willing to
      • So if companies and individuals worldwide are willing to free themselves from proprietary graphics card designs so that their software will work better, then they're probably willing to invest a billion dollars or more, for it.

        I think you're making an unwarranted connection between how much it would have cost to build Linux commercially and how much Linux is actually worth. Though it may have taken a billion dollars of work, if it only carves out 500 million of wealth in its lifetime then investors certain

        • How do you estimate 500 million? Linux has become a utility platform for computing for ventures large and small. And for a growing number, desktop computing. It could be that the value of Linux to many people is intangible. You know, freedom, transparency, that sort of thing. But that can translate into saved man-hours, which is money.

          As far as risk is concerned, most companies and individuals have taken many calculated risks since there is no guarantee of profits. This is true for any venture (oth
          • I do think Linux is worth a billion dollars, but "how much it would have cost" doesn't have anything to do with it.
            • Linux is communism built upon a capitalist foundation: perhaps the only way to achieve real communism by choice, and therefore maybe a small step in human society, but a huge leap for mankind.

              This 'virus' is now spreading to the physical world, to things you can trade. From services to goods, it puts capitalism to good use without needing a government for flawed communism; creating and freely sharing, just because we can and of our own free will.

              I see all sorts of tiny things without real visible impact happenin

    • Re: (Score:3, Informative)

      by Rockoon ( 1252108 )
      Don't listen to these people. VGA is very well documented, and the posters who hint at the "non-standard" video modes (the popular ones being 320x240, 320x400, and 360x480, as well as 80x50 text mode) are incorrect. While the VGA BIOS may not have an INT call which sets those "non-standard" modes, they are fully predictable and part of the standard, which is why they were very well exploited back in the DOS days. (A register-level sketch of the unchaining involved follows this comment.)

      They are only "non-standard" in popular belief, but very well THE STANDARD.

      It is true th
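
      For reference, the "unchaining" that unlocks those modes is a handful of documented register writes on top of BIOS mode 13H. The values below are the ones popularized by Abrash's Mode X articles; port_out is a stand-in for whatever I/O port write the environment provides (outportb() on DOS compilers, outb() with ioperm() on Linux):

          #include <stdint.h>

          /* I/O port write -- supplied by the environment, see above. */
          extern void port_out(uint16_t port, uint8_t val);

          #define SEQ_INDEX  0x3C4   /* VGA sequencer      */
          #define SEQ_DATA   0x3C5
          #define CRTC_INDEX 0x3D4   /* VGA CRT controller */
          #define CRTC_DATA  0x3D5

          /* Turn BIOS mode 13H into an unchained planar mode: all four
           * bitplanes become addressable, 256K instead of 64K. */
          static void unchain_vga(void)
          {
              port_out(SEQ_INDEX, 0x04);   /* memory mode register        */
              port_out(SEQ_DATA,  0x06);   /* chain-4 off, odd/even off   */
              port_out(CRTC_INDEX, 0x14);  /* underline location register */
              port_out(CRTC_DATA,  0x00);  /* doubleword mode off         */
              port_out(CRTC_INDEX, 0x17);  /* CRTC mode control register  */
              port_out(CRTC_DATA,  0xE3);  /* byte addressing mode        */
          }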
    • Re: (Score:3, Informative)

      by DrSkwid ( 118965 )

      All you pups who don't remember VGA modelines and frazzling your monitor with the wrong XFree settings.

      VGA only goes up to 640x480x16 with 256K RAM; after that, anything goes. IBM lost their grip on the market when they wrong-footed themselves trying to force end users onto their MCA bus and XGA. Motherboard and I/O card cloners started to design their own cards. VESA Local Bus was born and MCA was largely ignored.

      Intel became the trend setter and (after EISA) PCI became the BUS and 3Dfx stole the gamer market from und

  • by Anonymous Coward on Saturday May 02, 2009 @05:11PM (#27800925)
    From http://www.osnews.com/permalink?360100 [osnews.com]:

    As the original architect of the way VGA is done on this board, perhaps I can offer an explanation. There is perhaps a more straightforward way of implementing VGA than the way we did it. The direct route would require two components. One piece is the host interface that interprets I/O and memory accesses from PCI and manipulates graphics memory appropriately. The other piece is a specialized video controller that is able to translate text (which is encoded in two bytes as an ASCII value and color indices) in real time into pixels as they're scanned out to the monitor. This is actually how others still do it.

    To us, VGA is legacy. It should be low-priority and have minimal impact on our design. We didn't want to hack up our video controller in nasty ways (or include alternate logic) for such a purpose, and we didn't want to dedicate a lot of logic to it. Doing it the usual way was going to be too invasive and wasteful. Also, we eventually want to do PCI bus-mastering, which requires some high-level control logic, typically implemented in a simple microcontroller. So we thought, if we're going to have a microcontroller anyhow, why not give it a dual purpose?

    When in VGA mode, the uC we designed (which we call HQ) intercepts and services all PCI traffic to OGD1. Microcode we wrote interprets the accesses and stores text appropriately in graphics memory. Then, to avoid hacking up the video controller, we actually have HQ perform a translation from the text buffer to a pixel buffer over and over in the background. Its input is VGA text. Its output is pixels suitable for our video controller.

    Aside from the logic reduction, this has other advantages. The screen resolution as seen by the host is decoupled from the physical display resolution. So while VGA thinks it's 640x400, the monitor could be at 2560x1600, without the need for a scaler. It's easily programmable, and we have complete control over how the text is processed into pixels; for instance, we could have HQ do some scaling or use a higher-res font different from what the host thinks we're using. We call it emulation because, in a way, our VGA is implemented entirely in software, albeit microcode that's loaded into our own microcontroller.
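
    A loose C model of the background translation HQ performs: walk the two-byte text cells (ASCII value plus attribute), look each glyph row up in a font, and emit pixels. The names and the 8x16-font/80x25 assumptions are ours for illustration, not OGP's actual microcode:

        #include <stdint.h>

        #define COLS 80
        #define ROWS 25
        #define GLYPH_W 8
        #define GLYPH_H 16

        /* text: ROWS*COLS two-byte cells (ASCII value, then attribute).
         * font[ch][row]: one 8-pixel row of glyph 'ch', MSB = leftmost.
         * pixels: (COLS*8) x (ROWS*16) output, one palette index per byte.
         * Bit 7 of the attribute (blink) is folded into the background
         * here for simplicity. */
        static void text_to_pixels(const uint8_t *text,
                                   const uint8_t font[256][GLYPH_H],
                                   uint8_t *pixels)
        {
            for (int row = 0; row < ROWS; row++)
                for (int col = 0; col < COLS; col++) {
                    uint8_t ch   = text[(row * COLS + col) * 2];
                    uint8_t attr = text[(row * COLS + col) * 2 + 1];
                    uint8_t fg = attr & 0x0F, bg = attr >> 4;

                    for (int y = 0; y < GLYPH_H; y++) {
                        uint8_t bits = font[ch][y];
                        uint8_t *dst = pixels
                            + (row * GLYPH_H + y) * (COLS * GLYPH_W)
                            + col * GLYPH_W;
                        for (int x = 0; x < GLYPH_W; x++)
                            dst[x] = (bits & (0x80 >> x)) ? fg : bg;
                    }
                }
        }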
    • I approve.

    • Re: (Score:3, Interesting)

      by mako1138 ( 837520 )

      So does the host interface part reside in the Lattice FPGA, in 10K LUTs?

      • Re: (Score:3, Informative)

        by Theovon ( 109752 )

        Yes.

        The XP10 contains these parts:
        - PCI
        - Microcontroller that does VGA
        - PROM interfaces

        The S3 is mostly empty and contains these parts:
        - Memory controller
        - Video controller
        - Room for a graphics engine

  • This, running on a T1 or T2 [opensparc.net] machine, running ${FREEOSOFCHOICE}. yum.
  • unless the chips and expansion-card circuit boards can be made in mass quantities to make them more affordable. Volume is what drives the cost down.

    Nobody wants to buy a $300 open-source graphics card when a closed-source graphics card costs $100 and has better graphics.

    Still, this is a good idea: instead of Chinese companies stealing closed-source ideas and violating IP laws, they can make open-source graphics cards using the open-source license and be legal. I would like t

    • (Setting aside the idea that this should only be good enough for the undesirables in developing country X ...)

      For now, I'm not thinking about a game video controller.

      I'm thinking about an LCD video controller for a pocket calculator that costs less than JPY 5,000 and runs dc if I ask it to. And gforth and vi. Oh, and bash and gcc, of course.

      And maybe I can plug in an SD with an English--Japanese--Spanish--Korean--etc. dictionary on it.

  • BIOS (Score:3, Interesting)

    by Alex Belits ( 437 ) * on Sunday May 03, 2009 @06:40AM (#27804903) Homepage

    As a person who actually did proprietary BIOS development, I can tell you that:

    1. It's possible to make BIOS boot without VGA.
    2. It's usually a massive pain in the neck.

    One of my projects involved making one of the popular proprietary BIOSes boot on custom x86 hardware that lacked VGA. On the development board (where I could attach and remove PCI VGA card) all it took was setting console redirection in CMOS setup, turning the computer off, removing VGA and booting it again. On production board (with no built-in graphics adapter and no PCI slots) I also had to modify BIOS so console redirection was on by default.

    Then I had to spend weeks rewriting console redirection code to make it work properly -- I had to rely on console messages when debugging custom hardware support, and the existing implementation was way too crude to actually display all those messages. Existing implementations merely allocate a "VGA" buffer in memory, occasionally check it for changes, and send the updates to the serial port using VT100 escape sequences. "Occasionally" is the key word here.
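
    A sketch of that naive poll-and-forward scheme, to make the crudeness concrete. serial_write and vga_buf are stand-ins for the UART routine and the 0xB8000 text buffer; nothing here is from an actual BIOS:

        #include <stdint.h>
        #include <stdio.h>

        #define COLS 80
        #define ROWS 25

        extern void serial_write(const char *s);  /* UART output, assumed */
        extern const uint16_t *vga_buf;           /* "VGA" text buffer in
                                                     memory, assumed      */

        static uint16_t shadow[ROWS * COLS];

        /* Called "occasionally".  Anything printed and then overwritten
         * between two calls is simply lost -- the crudeness described
         * above. */
        static void redirect_poll(void)
        {
            char seq[32];

            for (int i = 0; i < ROWS * COLS; i++) {
                if (vga_buf[i] == shadow[i])
                    continue;
                shadow[i] = vga_buf[i];
                /* VT100: position cursor (1-based row;col), emit char. */
                snprintf(seq, sizeof seq, "\x1b[%d;%dH%c",
                         i / COLS + 1, i % COLS + 1,
                         (char)(vga_buf[i] & 0xFF));
                serial_write(seq);
            }
        }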

  • Uses of OpenGraphics (Score:3, Interesting)

    by starseeker ( 141897 ) on Sunday May 03, 2009 @12:32PM (#27806901) Homepage

    To all of those who keep saying this project is useless because it will never compete with NVIDIA/ATI:

    Although I agree with those who cite "because we can" as a perfectly valid reason, it is not the only reason. The lack of high-quality open source 3D graphics drivers has long been an issue with desktop applications of Linux/*BSD, and while NVIDIA's closed drivers do fairly well, they still limit the options of the open source community. If a bug in those drivers is responsible for a crash, it's up to NVIDIA to do something about it. The open source community is prohibited from fixing it. Remember the story about how Stallman reacted when he couldn't fix the code for a printer?

    Plus, who knows what optimizations might be possible if the major open source toolkit devs could sit down with the X.org guys and the OpenGraphics folk and really start to optimize the whole stack on top of an open graphics card? It wouldn't be up to the standards of NVIDIA or ATI for 3D games, but in this day and age you need decent 3D graphics for a LOT more than that! Scientific apps, graphics applications, CAD applications... even advanced desktop widget features can take advantage of those abilities if they are present. What if ALL the open source apps that need "good but not necessarily top of the line" graphics card support could suddenly get rock-solid, flexible support on an open card?

    The paradigm for graphics cards has been "whoever can give the most features for the newest game wins" for a long time now. But there is another scenario - what if maturity starts becoming more important for a lot of applications? For many years, people were willing to toss out their desktop computer and replace it with one that was "faster" because usability improved. Then the hardware reached a "fast enough" point and the insane replacement pace slowed. For some specialized applications, there is no such thing as a computer that is "fast enough", but for a LOT (perhaps the grand majority) of users that point is dictated by what is needed to run their preferred software well. If the open source world can get their applications running very well atop OpenGraphics, who cares what the benchmark performance is? If the user experience is top-notch for everything except the "latest and greatest games" (which usually aren't open source games, btw - most of the most advanced open source games are using variations on the Quake engines, which being open source could certainly be tuned for the new card) and that experience is better BECAUSE THE CARD IS OPEN AND OPEN SOURCE IS SUPPORTING IT WELL, it will have a market that NVIDIA and ATI can't hope to touch. Perhaps not a huge market, but niche products do well all the time.

    There is one final scenario, which is the open nature of this board's design allowing virtually all motherboard manufacturers to include it as a default graphics option on their boards at very low cost. That might allow for logic that uses that card for most things and fires up the newest cards specifically for games or other "high demand" applications if someone has one installed (presumably installed because they do have a specific need for the raw power). This would mean broader GOOD support for Linux graphical capabilities across a wide span of hardware as part of the cost of the motherboard, which is a Very Good Thing.

  • This is a graphics card DEVELOPMENT PLATFORM. That implies a few things:

    (1) This is a proving ground for designs that could be turned into a fast ASIC.
    (2) Graphics is only one of countless things you could use this for. How about using this as a basis for cryptographic offload, or high-end audio, or wifi?
