Open Source Graphic Card Project Seeks Experts 370

An anonymous reader writes "Could this dream of many open source developers and users finally happen? A 100% open sourced graphic card with 3D support? Proper 3D card support for OpenBSD, NetBSD and other minority operating systems? A company named Tech Source will try to make it happen. You can download the preliminary specs for the card here (pdf). The project, though a commercial one, wants to become a true community project and encourages experts and everyone who has good ideas to add to the development process to join the mailing list. You can also sign a petition and say how much you would be willing to pay for the final product."

  • Great!! (Score:5, Interesting)

    by Anonymous Coward on Sunday November 28, 2004 @12:13AM (#10935189)
    I've kind of waited for this for years.

    In theory other companies might steal the design and build and sell the card on their own, but if the design is community-owned, then that actually works to lower prices...

    Anonymous Cow
  • Waste of time (Score:5, Interesting)

    by jarich ( 733129 ) on Sunday November 28, 2004 @12:16AM (#10935199) Homepage Journal
    This has come up before.

    Building a good open 2D card? Maybe... I doubt it's really feasible, but have at it. Chase that dream.

    But a 3D card? You are going to make a card to run the latest Quake and Doom? Or even games a release or two back? Do you realize how much time, how many thousands of man-hours go into these cards? The dollar amounts for the simulators, the fabs to make the prototypes, etc.?

    This could however, make a great teaching tool.

    I take it back... if the card can target elementary 3D and stellar 2D, it could (in a few years) be THE card to own for a commodity Linux box. Target your audience carefully and don't get caught up in the id Software upgrade cycle! :)

  • Great Idea (Score:5, Interesting)

    by mhaisley ( 410683 ) on Sunday November 28, 2004 @12:17AM (#10935204)
    This is a really great idea, but it will probably never work; a mailing list will bring in way too many points of view.

    Really, what a project like this needs is for the developer to shut out the open source community until the project is done. If Linus had made a large project out of the original kernel, I seriously doubt it would ever have been completed. This should be kept simple, and then open sourced only once there is a good code base to build from.
  • Re:Dupe! (Score:3, Interesting)

    by log2.0 ( 674840 ) on Sunday November 28, 2004 @12:17AM (#10935210)
    They are trying to get developer interest, not announcing the open card. It's a different story. Although I may have missed the story you are referring to? :)
  • by Goalie_Ca ( 584234 ) on Sunday November 28, 2004 @12:18AM (#10935214)
    I can understand that this card will never compete with ATI and Nvidia, which raises the question: is there any reason why ATI can't open source their old graphics cards, such as their 7000 series? Surely that technology is no longer critical to their lead. Sure, many of those cards aren't being sold any more, but there are still plenty around, and this may open up a niche market so they can produce some as a low-cost device.
  • by DaHat ( 247651 ) on Sunday November 28, 2004 @12:25AM (#10935246)
    I'd wager that even the older cards still bear some similarity to the newer ones, enough so that such designs could give a competitor a major head start in designing future cards. Opening up their plans is nice in theory, but in practice... it would almost certainly come back to bite them in the rear.
  • 2D and 3D Patents. (Score:1, Interesting)

    by Anonymous Coward on Sunday November 28, 2004 @12:31AM (#10935280)
    Well, as mentioned over on OSNews, there's always the issue of patents. The Vorbis people had to deal with them. These people will as well. I suggest looking through the archives for suggestions already discussed, e.g. DSPs.

    I recommended they buy their way in by obtaining the patents to the Tseng 2D chip and the PowerVR Kyro 3D chip and building from there.

    The other way is doing some truly innovative work (basically reinventing 2D and 3D graphics).
  • Re:Waste of time (Score:3, Interesting)

    by skids ( 119237 ) on Sunday November 28, 2004 @12:37AM (#10935304) Homepage
    Not to stomp on TechSource, but the proposed feature set is over 5 years old. So why would they have any market advantage over 5-year-old Matrox cards (especially given Matrox has quad-monitor cards)?

    I'm all for open-source hardware products, but let's make them something that isn't already readily available in a form open source folks find generally acceptable. They should at least give the thing *one* major feature advantage (how about quad DVI? No one is doing THAT yet... at least not in any reasonable price range.)

    Plus PCI-Express really wouldn't hurt.

  • by Anita Coney ( 648748 ) on Sunday November 28, 2004 @12:40AM (#10935317) Homepage
    Yeah, it COULD happen. But it will also be crap. Does anyone really think a company could simply start competing with nVidia or ATI on features and power?! Heck, 3dfx couldn't do it. Matrox essentially gave up. And what about Virge?! Dare I even mention bitboys?!

    Come on, folks, let's get real.

  • Re:False logic (Score:2, Interesting)

    by niteice ( 793961 ) <icefragment@gmail.com> on Sunday November 28, 2004 @12:46AM (#10935335) Journal
    If anything, development of a good "open-source" 3D card could be hampered by patents. What about Mesa? A custom version could be written for this card to provide at least basic 3D...actually, it'll be like running Quake 3 on an S3 Trio64V+, but you get my point.
  • Re:Waste of time (Score:2, Interesting)

    by wrecked ( 681366 ) on Sunday November 28, 2004 @12:51AM (#10935354)
    Well, the difference is that the proposed Tech Source card would be open source, and therefore would (hopefully) evolve, slowly but surely, in the same manner as other open source projects like Linux, Mozilla, OpenOffice.org, etc.

    While I love my Matrox G450, the fact is, Matrox will never release another card like it, nor will they improve on it. If the Tech Source project works, then one day, it will release a card that is superior to the G450.

  • Re:RTFA/RTFWS/RTFE! (Score:5, Interesting)

    by Tough Love ( 215404 ) on Sunday November 28, 2004 @12:58AM (#10935378)
    If you'd read up on this subject, you'd have seen that these folks *do* know their hardware. They are also not being overly ambitious. While they expect to be able to develop a card which has 3D acceleration for desktop applications, they make no bold claims about gaming.

    Falling anywhere short of, say, OpenGL 1.4 support would make it pretty much useless. In other words, it doesn't have to have pixel shaders, but it has to have good, filtered texture mapping, lighting, alpha, quite a bag of stuff. The Spartan 3 (not III, as the tech spec suggests) has 1.5 million gates and runs at 384 MHz, which ought to be enough for a decent 3D core, with one catch: it's got 32 18x18 multipliers, no dividers. Don't even think about floating point, obviously, but without dividers, perspective interpolation is going to be pretty tough. Without perspective interpolation... well, think "1970s".

    I just hope there's a standard way of getting around this. Any hardware hacks out there?
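
    A rough illustration of where that per-pixel divide comes from (a sketch in C; the struct and function names are invented, nothing here is from the Tech Source spec): perspective-correct texturing interpolates u/w, v/w and 1/w linearly across a span, then needs a reciprocal or divide at every pixel to recover u and v.

        /* Sketch: perspective-correct texture interpolation across one scanline.
         * Affine interpolation of u and v directly is cheap but gives the classic
         * texture warping; the correct version interpolates u/w, v/w and 1/w
         * linearly and divides per pixel -- the operation the FPGA has no
         * hardware unit for. All names here are illustrative only. */
        typedef struct { float u_over_w, v_over_w, one_over_w; } SpanEnd;

        void shade_span(SpanEnd a, SpanEnd b, int num_pixels)
        {
            for (int i = 0; i < num_pixels; i++) {
                float t   = (num_pixels > 1) ? (float)i / (float)(num_pixels - 1) : 0.0f;
                float uw  = a.u_over_w   + t * (b.u_over_w   - a.u_over_w);
                float vw  = a.v_over_w   + t * (b.v_over_w   - a.v_over_w);
                float oow = a.one_over_w + t * (b.one_over_w - a.one_over_w);
                float u   = uw / oow;   /* <-- per-pixel divide (or reciprocal) */
                float v   = vw / oow;
                /* sample the texture at (u, v) and write the pixel here */
                (void)u; (void)v;
            }
        }
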
  • by jeif1k ( 809151 ) on Sunday November 28, 2004 @01:15AM (#10935439)
    Some advice:
    • Get the hardware out quickly; if you wait too long, it will be obsolete before you ship.
    • Create a basic development platform (gcc, loader, etc.) and a basic framework with at least a little bit of useful functionality (2D acceleration, minimal 3D); it can be quite incomplete, but it should make it easy for contributors to add functionality one small piece at a time.
    • You can charge a little more than a comparable regular graphics card, but not a lot more. If this becomes a premium custom hardware product, it's dead on arrival.

  • Re:RTFA/RTFWS/RTFE! (Score:3, Interesting)

    by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Sunday November 28, 2004 @01:18AM (#10935452) Homepage
    The only question then is what the big advantage of such a card would be vs., say, a Matrox card, which also has limited 3D capabilities but is pretty good at plain 2D, or vs. an NVidia card with the open source drivers?

    There is of course also the question of whether an open source driver can compete with the quality of, say, the NVidia drivers; after all, they 'just work'[tm], which is not something that I can say about all the open source stuff I use.

    Overall I wish them luck, but I have a hard time imagining a market where such a card would really fit. Being open source is surely a plus, but it alone won't be enough. And so far I still haven't seen a Transmeta processor for sale over here in Germany, so I don't really expect this piece of hardware to have much more success.

  • Idea (Score:1, Interesting)

    by Anonymous Coward on Sunday November 28, 2004 @01:30AM (#10935490)
    Make sure you can "beowulf" them together. The GPUs are not going to be powerful, so there needs to be a way to make them more powerful.
  • by eigendude ( 563238 ) on Sunday November 28, 2004 @01:30AM (#10935491)
    Well, it seems that Xilinx and other big FPGA vendors have already thought of making cards on which you may try to create your own GPU.

    Boards such as the Multimedia Board http://www.xilinx.com/products/boards/multimedia/ [xilinx.com] contain everything you would need. Not cheap though...

    They have not put the whole thing on a PCI card, probably because it's even more fun to integrate a CPU core and build the whole system-on-chip on the FPGA while at it.

    Cheers!

  • Re:RTFA/RTFWS/RTFE! (Score:5, Interesting)

    by SSpade ( 549608 ) on Sunday November 28, 2004 @02:06AM (#10935619) Homepage

    Odds are that your CPU doesn't have a divider on it either.

    Google for Newton-Raphson.

    Fast hardware dividers are big and expensive - somewhat more expensive than a multiplier. But if you have a multiplier and you're not too concerned about performance, or are happy to trade off precision for performance, then you can do division using your multiplier, a small seed ROM and a microcode engine.
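
    A minimal software model of that trick (a sketch in C; the fixed seed and iteration count are arbitrary choices, not anything from a real design): Newton-Raphson refines a guess r for 1/d using only multiplies and subtracts, r' = r * (2 - d*r), roughly doubling the number of correct bits per step.

        #include <stdio.h>

        /* Newton-Raphson reciprocal: approximate 1/d using only multiply and
         * subtract. A hardware version would fetch the seed from a small ROM
         * indexed by the top mantissa bits; here a fixed 0.5 stands in, which
         * converges for d normalized to [1, 2). */
        float recip_nr(float d)
        {
            float r = 0.5f;                 /* pretend the seed ROM gave us 0.5 */
            for (int i = 0; i < 5; i++)     /* each step ~doubles the correct bits */
                r = r * (2.0f - d * r);
            return r;
        }

        int main(void)
        {
            float d = 1.7f;
            printf("1/%.3f ~= %.6f (exact %.6f)\n", d, recip_nr(d), 1.0f / d);
            return 0;
        }
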

  • Finally (Score:3, Interesting)

    by poptones ( 653660 ) on Sunday November 28, 2004 @02:33AM (#10935682) Journal
    I was scouring this thread looking for someone else to say this because I knew I couldn't be the only one to realize it.

    I have never understood this project. If they want to start with something at least equivalent to a five-year-old SGI graphics pipeline and build from there, then I'd say go for it. But the specs on this card don't look any better than the stuff you get right OOTB with an Intel chipset (which, after suffering with this goddamned Nvidia system for too long now, is the reason I'll not be buying another AMD system).

    So is the whole point of this card just to pick up the slack for AMD?
  • by macmurph ( 622189 ) on Sunday November 28, 2004 @02:52AM (#10935736)
    I've heard that 3D cards of today are increasing exponentially in number of transistors. It's been said that the problem of displaying 3D is "embarrassingly parallel" (a tiny sketch of what that means follows this comment). Hence, the performance of these cards far outstrips the CPU for parallel processing.

    Some of the thoughts expressed by experts are that 3D cards may become general purpose parallel computing cards.

    If it weren't for bottlenecks in the AGP bus, it would be possible to use 3D cards of today for more general purpose computing (I'm fuzzy on what the actual hold ups are here...timing issues?).

    There have been Slashdot discussions about using the graphics card for audio processing, because audio is usually less than a 32 bit stream. The problem is that audio and often general purpose computing have "real time" requirements.

    Also, make sure your open source card supports ARB_fragment!
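
    A trivial sketch of "embarrassingly parallel" in practice (plain C, nothing GPU-specific; the function is invented for illustration): every output element depends only on its own input, so all iterations could run at once, which is exactly the shape of work per-pixel graphics hardware is built for.

        /* "Embarrassingly parallel": each output sample depends only on the
         * corresponding input sample, so nothing stops all iterations from
         * running simultaneously on parallel hardware. */
        void apply_gain(const float *in, float *out, int n, float gain)
        {
            for (int i = 0; i < n; i++)     /* no cross-iteration dependencies */
                out[i] = in[i] * gain;
        }
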
  • Re:Waste of time (Score:4, Interesting)

    by ArbitraryConstant ( 763964 ) on Sunday November 28, 2004 @04:14AM (#10936010) Homepage
    Well, because they use an FPGA the card can potentially be reprogrammed to support just enough of OpenGL to do what Quartz Extreme does.

    When you think about it, Quartz Extreme only needs to handle a relatively small number of parallel polygons at basically a constant distance away. That's a much simpler job than millions of triangles at arbitrary angles to each other at varying distances and whatnot.

    The job the video card does can potentially be as simple as figuring out which window is exposed in a given area and grabbing pixels from the appropriate frame buffer. OpenGL is a good deal more complicated than that, but since both the driver and the FPGA are under our control, I would think it would be possible.
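
    Roughly what that reduced OpenGL subset boils down to, as a sketch (plain OpenGL 1.x immediate mode in C; the Window struct and function are invented for illustration, and an orthographic projection is assumed to be set up already): each window is a screen-aligned textured quad, blended back to front.

        #include <GL/gl.h>

        /* Each window's contents live in a texture; compositing the desktop is
         * just drawing screen-aligned quads back to front with alpha blending.
         * No arbitrary-angle triangles, no perspective, no lighting. */
        typedef struct { GLuint tex; float x, y, w, h, alpha; } Window;

        void composite_desktop(const Window *wins, int count)
        {
            glEnable(GL_TEXTURE_2D);
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

            for (int i = 0; i < count; i++) {          /* back to front */
                const Window *win = &wins[i];
                glBindTexture(GL_TEXTURE_2D, win->tex);
                glColor4f(1.0f, 1.0f, 1.0f, win->alpha);
                glBegin(GL_QUADS);
                glTexCoord2f(0, 0); glVertex2f(win->x,          win->y);
                glTexCoord2f(1, 0); glVertex2f(win->x + win->w, win->y);
                glTexCoord2f(1, 1); glVertex2f(win->x + win->w, win->y + win->h);
                glTexCoord2f(0, 1); glVertex2f(win->x,          win->y + win->h);
                glEnd();
            }
        }
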
  • by mikelambert70 ( 547007 ) on Sunday November 28, 2004 @08:14AM (#10936540)
    Clearspeed http://www.clearspeed.com/ [clearspeed.com] is just coming to market with their CSX600 'application accelerator' processor.

    It has 96 execution elements, so 96 ops happen simultaneously on your data. Sounds ideal for graphics processing. Power consumption is 5 watts, with 50 GFLOPS of computing power.

    And they make PCI-X cards for PC systems. You can have several cards in one system for compounded processing power. Now, all this monster would need is the graphics output parts and drivers. They even have a full development kit for both Windows and Linux. The card's programmed in C.

    Perhaps the PCs of the future will have two CPUs: one linear, general-purpose CPU (currently x86-based) for program code and system management, and one massively parallel CPU for tasks better suited to it. If there's no one true road to happiness, make it two then.
  • by mattr ( 78516 ) <mattr&telebody,com> on Sunday November 28, 2004 @08:30AM (#10936569) Homepage Journal
    First, I think this sounds like a wonderful project. The remainder of this post is dedicated to crazy ideas and maybe one interesting one. I want one or three.

    Might want to consider setting up a site for people to register their interest and potential orders - not just how much you would pay, but actually taking the orders.

    I don't remember if it was successful, but Sony has done this in the past. I know it failed once due to (I believe) a WebLogic crash caused by too many orders or a weak system.

    If the website is mentioned every time a story appears on Slashdot or some other site, you can continue to accumulate and update information. If you make the financials behind it transparent, people may rush in to get you over the threshold of a precalculated break-even point (including a reasonable profit, of course).

    Personally I am in the market for a graphics card in the next 6 months. I am planning on getting the best I can afford at the time, and am curious what this project might offer to sway me. Sure performance is not likely to beat the top of the line of the other competitors at the same price point.. at least that is what one would guess. Maybe not true? Well, the FPGA looks really cool.

    Consider that the fastest supercomputer in the world is the GRAPE-6 (GRAvity PipE) built on FPGAs for simulation of gravitational interactions (of globular clusters, etc.).

    I was thinking it might be closer to something insanely great if you go for the multiple channels now for example. Maybe if you ask about that on your site you'll get people to agree. (How much more would it cost? etc.).

    Also I don't know what the FPGA would promise, presumably quick firmware updates from the net of course. Could part of it be used for another purpose, or is that too difficult? Could an additional FPGA be turned into a chip that runs linux (use it on a PC) or perhaps be flashed with the results of another project (I'd love to have a Perl chip.. make it and they will come?) Could another chip or expanded memory provide say a video wall controller with edge blending for multiple screens in realtime? This kind of thing alone might sell enough to make it useful. What do commercial image processors have that this couldn't?

    I just saw a sexy video switching fabric thingy here [jupiter.com]

    I am curious about what exact "X.org eye candy" this would enable. I am guessing something like what one OSNews poster (Bryan Kagnime) wrote: "Brilliant, and about time. I don't really care so much for the 3d gaming aspect; distribute with the card an open source operating system like Slackware with some 2d desktop eye candy (translucency/transparency/OpenGL) and I'll buy a card for everyone I know with a comp. This'll show users *what* Linux is all about, distributing a superior product and opening the market share for innovators."

    One post on OSNews mentioned realtime encoding/decoding of video streams, and though I am not sure it would avoid impacting the rest of the machine, considering the design, that sounds neat!

    128MB is enough to hold a couple frames of 20 times the resolution of a 1024x768 screen and still have over 30 MB left over. What if it included support for edge/corner blending and warping for a video wall? Is it conceivable that this could take the output from a fast consumer card and provide 2D warping and other effects for displays using multiple projected patches? Consider what it is good at. How about talking over the network or other bus to other oss graphics cards for multiple projector support.

    If some nonvolatile memory was included, the card could remember a video wall wallpaper and open window/document information, or keep some megapixel images or something else always available. Would this be useful, say for quick startup or as a backup for important memories?

    How about selling with an external patchbay that can take many video sources and provi

  • Re:Waste of time (Score:5, Interesting)

    by Sentry21 ( 8183 ) on Sunday November 28, 2004 @09:09AM (#10936683) Journal
    The issue I see is that these interfaces don't need to be 3D accelerated - because they're not 3D. Why couldn't 2D acceleration accomplish the same thing? Store the windows in memory as textures and move them around in hardware. This doesn't require 3D itself, it just requires hardware compositing and alpha-blending, hardware-accelerated windowing, offscreen rendering, z-buffering, scaling & rotation, and so on (see the blending sketch after this comment).

    That being said, 3D provides a lot more possibilities - you could make windows be actual objects that could be moved forward or backwards, stacked up, leaned against each other, and so on. Implement Havok physics so I can grab an icon and smash it into my other icons and watch them scatter all over my desktop, or throw it and watch it bounce off the edge of the screen and land in my network drive.

    Eventually, all we'll need to do to solve the spyware problem is to use a wallhack and noclip and go bounce that crap to the curb. Sure, we'll have to endure the cries of spyware makers shouting 'lamer!' or 'wallhack' or 'aimbot', but we can just kick them off the network if it comes to that, or /ignore them.
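
    For reference, the per-pixel work behind the "hardware compositing and alpha-blending" mentioned above is the standard "over" blend; here is a minimal software version as a sketch (plain C, 8-bit non-premultiplied RGBA; the types and names are invented for illustration).

        #include <stdint.h>

        /* "Over" blend: composite a source pixel onto a destination pixel using
         * the source alpha -- the per-pixel operation a 2D compositing engine
         * (or a 3D card's blend unit) performs in hardware. */
        typedef struct { uint8_t r, g, b, a; } Pixel;

        static uint8_t mix8(uint8_t dst, uint8_t src, uint8_t alpha)
        {
            return (uint8_t)((src * alpha + dst * (255 - alpha)) / 255);
        }

        Pixel blend_over(Pixel dst, Pixel src)
        {
            Pixel out;
            out.r = mix8(dst.r, src.r, src.a);
            out.g = mix8(dst.g, src.g, src.a);
            out.b = mix8(dst.b, src.b, src.a);
            out.a = 255;   /* assume an opaque framebuffer */
            return out;
        }
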
  • Priorities (Score:3, Interesting)

    by evilviper ( 135110 ) on Sunday November 28, 2004 @10:43AM (#10936965) Journal
    Even though this is scheduled for a year in the future, I don't think standard TVs will have gone away by then, and good TV-out support is something absent from ATI/Nvidia video cards. S-Video is missing from the PDF spec.

    The absolute #1 focus for this card (if they hope to get people to pay more than $30 for it) needs to be full reprogrammability by mere mortals. It would be absolutely wonderful to get a general-purpose FPGA in a computer. People pay more than $100 for crypto cards, video capture cards, etc., because hardware is so much better at those tasks. This would wipe the floor with them, because you could program in a new codec or cipher.

    Even if it didn't have any video output at all, I'd still pay $100+ for a PCI card version. Once video encoding apps are optimized to send the processing that's hardest on the CPU to the FPGA instead, I expect we'll see huge increases in encoding speed. That, BTW, also leads to much more complex codecs (MPEG-6, anyone?) that reduce filesize/bitrate significantly.

    Besides that, I would also like to see a bit of effort in making sure it works on non-x86 hardware. Since this company makes video cards for SPARC systems, I think that surely would not be difficult for them to handle.

    If this thing actually sees the light of day, it will completely change what a video card is. This also strikes me as a potentially pivotal moment in computer hardware. Perhaps, a few years from now, the biggest graphics card maker will have a museum wing dedicated to remembering how it all started back in 2004. Yeah, I know it's a stretch, but this really does have that potential.
  • What about this??? (Score:3, Interesting)

    by g_braad ( 105535 ) on Sunday November 28, 2004 @12:30PM (#10937408) Homepage
    http://www.icculus.org/manticore/

    Manticore has already existed for some time, and it is also what they call Open Hardware. If they could work together, this could result in a good implementation for a Linux/Un*x hardware design.
  • Re:False logic (Score:1, Interesting)

    by isolation ( 15058 ) on Sunday November 28, 2004 @12:43PM (#10937456) Homepage
    If you really designed the interfaces for DirectX and GL in Windows and are not still under NDA, you could take a look at how we have implemented OpenGL.dll and the ICD support in ReactOS and tell us whether we got it right.

    http://cvs.reactos.com/cgi-bin/cvsweb.cgi/reactos/lib/opengl32/
