
Basic Linux Boot On Open Graphics Card 177

Posted by timothy
from the bumblebee-flies-anyway dept.
David Vuorio writes "The Open Graphics Project aims to develop a fully open-source graphics card; all specs, designs, and source code are released under Free licenses. Right now, FPGAs (large-scale reprogrammable chips) are used to build a development platform called OGD1. They've just completed an alpha version of legacy VGA emulation, apparently not an easy feat. This YouTube clip shows Gentoo booting up in text mode, with OGD1 acting as the primary display. The Linux Fund is receiving donations, so that ten OGD1 boards can be bought (at cost) for developers. Also, the FSF has shown its interest by asking for volunteers to help with the OGP wiki."


  • by Brian Gordon (987471) on Saturday May 02, 2009 @04:52PM (#27800793)
    Well obviously it's of academic interest. American consumers have sunk billions into video card research, and for the most part the implementations are shrouded in mystery, locked up in labs. Nobody who isn't bound by an NDA really knows how to build these things: computer graphics is a highly specialized and difficult problem for hardware engineers. The real interest is in making a hardware design that actually works well and then writing up the design in the abstract, not in actually making working video cards.

    Also I guess it's useful to hammer out some foundational "building blocks" and make them available freely so that entry into video card research is easier.
  • Re:A milestone? (Score:1, Interesting)

    by Anonymous Coward on Saturday May 02, 2009 @04:54PM (#27800807)

    There are custom fabs all over the world. This might also be a real boon to folks who make embedded devices; a low-cost video core that you can customize for your application or load onto your CPU/FPGA combo could be a "big deal".

    John

  • Re:A milestone? (Score:5, Interesting)

    by iamacat (583406) on Saturday May 02, 2009 @04:56PM (#27800825)

    Also, they can't possibly approach competing with NVidia or ATI

    If you are running Windows on an x86 box, this may be true. Move to FreeBSD on an ARM embedded display and getting the drivers becomes dicey. Want to optimize medical imaging requiring 48-bit color rather than a typical game? Bet you will have better luck with an FPGA than an off-the-shelf card.

  • by Daemonax (1204296) on Saturday May 02, 2009 @04:57PM (#27800827)
    We're geeks... So the reason is "because we can". It provides a system where we don't have another black box. We can actually understand, down to the lowest level, how things are working. This is great for people who desire to understand how things work, and also for people who hope for a future of machines and hardware that are under the control of their owners.

    Sorry to get a bit crazy here, but imagine a world with technology like that in Ghost in the Shell. I would not go getting such implants and additions if I did not, and could not, have complete control over and understanding of the stuff. This type of project is a small step toward maintaining individual control.
  • Re:A milestone? (Score:5, Interesting)

    by auric_dude (610172) on Saturday May 02, 2009 @04:57PM (#27800835)
  • Re:Hey (Score:3, Interesting)

    by marcansoft (727665) <hector@marcan s o f t . c om> on Saturday May 02, 2009 @06:10PM (#27801263) Homepage

    For what it's worth, nVidia cards can do this just fine and have been able to for a long time. See the "full GPU scaling" option in nvidia-settings. My HTPC's nVidia card also shows the BIOS on my HDTV at 1080p link resolution (while pretending to be VGA to the software).

  • Re:Hey (Score:4, Interesting)

    by DaleGlass (1068434) on Saturday May 02, 2009 @06:48PM (#27801447) Homepage

    But it probably still uses an 8x16 pixel font, which doesn't look that good on a 30" screen.

    I think the idea is that the video card could pretend it's VGA, while substituting an antialiased 32x64 font in its place. Nothing earthshaking of course, but that sure would look nice.

    Your text mode could look like this [omag.es]

  • by mako1138 (837520) on Saturday May 02, 2009 @06:56PM (#27801487)

    So does the host interface part reside in the Lattice FPGA, in 10K LUTs?

  • Re:Hey (Score:4, Interesting)

    by DaleGlass (1068434) on Saturday May 02, 2009 @07:38PM (#27801709) Homepage

    Ok, and how many people are going to run a desktop on it? It's server hardware.

    Again, you seem to be missing my point. Yes, Linux technically doesn't need the BIOS. Yes, there exist other architectures besides x86.

    But, a video card is a product for desktops, and the vast majority of desktops are x86. The vast majority of those start booting in text mode.

    Pretty much all other architectures are unimportant in comparison, because they're used in embedded hardware, or are technically outdated. If anybody is going to buy this thing, I doubt they're going to put it into a modern Sun server.

    It's already a project that's going to find it hard to gain wide adoption, so why make it even harder for it to find a use by making it incompatible with by far the most common hardware it could be plugged into?

  • Re:Hey (Score:2, Interesting)

    by farfield (1119449) on Saturday May 02, 2009 @08:39PM (#27802143)

    I've used Sparc desktops [sun.com] in the past. I even used one as my main home machine for a while. You could even get Sparc laptops.

    In their time they beat the Intel option imo and they are still in use in some places.

  • by reiisi (1211052) on Sunday May 03, 2009 @05:53AM (#27804749) Homepage

    The tighter the NDA, the more you should suspect that the underlying tech is not rocket science.

    So to speak.

    In this case, graphics is not that hard. Fast graphics, even, is not all that hard.

    The cruft is the thing that is hard. Mechanisms to manage (emulate) the cruft are about the only thing non-obvious enough to get a good patent on, and much of that, if shown to the light of day, will be seen to be covered by prior art.

    A big part of the reason Intel got so excited about ray tracing was that they were/are hoping there will be something in there that is hard enough and innovative enough to get some solid IP protection on.

    False hopes. (... besides IP being an oxymoron ...)

  • BIOS (Score:3, Interesting)

    by Alex Belits (437) * on Sunday May 03, 2009 @06:40AM (#27804903) Homepage

    As a person who actually did proprietary BIOS development, I can tell you that:

    1. It's possible to make BIOS boot without VGA.
    2. It's usually a massive pain in the neck.

    One of my projects involved making one of the popular proprietary BIOSes boot on custom x86 hardware that lacked VGA. On the development board (where I could attach and remove a PCI VGA card), all it took was setting console redirection in CMOS setup, turning the computer off, removing the VGA card, and booting again. On the production board (with no built-in graphics adapter and no PCI slots), I also had to modify the BIOS so console redirection was on by default.

    Then I had to spend weeks rewriting the console redirection code to make it work properly -- I had to rely on console messages when debugging custom hardware support, and the existing implementation was way too crude to actually display all of those messages. Existing implementations merely allocate a "VGA" buffer in memory, occasionally check it for changes, and send the updates to the serial port using VT100 escape sequences. "Occasionally" is the key word here.

  • by TheRaven64 (641858) on Sunday May 03, 2009 @10:09AM (#27805807) Journal

    If anything, a modern GPU is easier to implement than an older one. Modern APIs basically ditch the entire fixed-function pipeline in favour of an entirely programmable one. The fixed-function OpenGL 1-style pipeline is implemented entirely in the drivers. The GPU is just a very fast arithmetic engine. Some are SIMD architectures, but a lot of the newer ones are abandoning even this and just having a lot of parallel ALUs. You could probably take an FPU design from OpenCores.org, stamp 128 of them on a die, and have a reasonable (if not stellar) GPU that just needs drivers written. With an architecture like Gallium3D, even this is relatively easy: the driver is effectively a compiler from an intermediate bytecode to the hardware's instruction set, and you can create one by just implementing an LLVM back end for your core (typically around 10KLOC), reusing all of the OpenGL state-tracking from Mesa.

    This is essentially the approach Intel is taking with Larrabee. The GPU is just a load of relatively slow CPUs with beefy vector units, all on the same die.

  • Uses of OpenGraphics (Score:3, Interesting)

    by starseeker (141897) on Sunday May 03, 2009 @12:32PM (#27806901) Homepage

    To all of those who keep saying this project is useless because it will never compete with NVIDIA/ATI:

    Although I agree with those who cite "because we can" as a perfectly valid reason, it is not the only reason. The lack of high quality open source 3D graphics drivers has long been an issue with desktop applications of Linux/*BSD, and while NVIDIA's closed drivers do fairly well they still limit the options of the open source community. If a bug in those drivers is responsible for a crash, it's up to NVIDIA to do something about it. The open source community is prohibited from fixing it. Remember the story about how Stallman reacted when he couldn't fix the code for a printer?

    Plus, who knows what optimizations might be possible if the major open source toolkit devs could sit down with the X.org guys and the OpenGraphics folk and really start to optimize the whole stack on top of an open graphics card? It wouldn't be up to the standards of NVIDIA or ATI for 3D games, but in this day and age you need decent 3D graphics for a LOT more than that! Scientific apps, Graphics applications, CAD applications... even advanced desktop widget features can take advantage of those abilities if they are present. What if ALL the open source apps that need "good but not necessarily top of the line" graphics card support could suddenly get rock solid, flexible support on an open card?

    The paradigm for graphics cards has been "whoever can give the most features for the newest game wins" for a long time now. But there is another scenario - what if maturity starts becoming more important for a lot of applications? For many years, people were willing to toss out their desktop computer and replace it with one that was "faster" because usability improved. Then the hardware reached a "fast enough" point and the insane replacement pace slowed. For some specialized applications, there is no such thing as a computer that is "fast enough", but for a LOT (perhaps the grand majority) of users that point is dictated by what is needed to run their preferred software well. If the open source world can get their applications running very well atop OpenGraphics, who cares what the benchmark performance is? If the user experience is top notch for everything except the "latest and greatest games" (which usually aren't open source games, btw - most of the most advanced open source games are using variations on the Quake engines, which being open source could certainly be tuned for the new card) and that experience is better BECAUSE THE CARD IS OPEN AND OPEN SOURCE IS SUPPORTING IT WELL, it will have a market that NVIDIA and ATI can't hope to touch. Perhaps not a huge market, but niche products do well all the time.

    There is one final scenario, which is the open nature of this board's design allowing virtually all motherboard manufacturers to include it as a default graphics option on their boards at very low cost. That might allow for logic that uses that card for most things and fires up the newest cards specifically for games or other "high demand" applications if someone has one installed (presumably installed because they do have a specific need for the raw power). This would mean broader GOOD support for Linux graphical capabilities across a wide span of hardware as part of the cost of the motherboard, which is a Very Good Thing.
