Open Source Graphics Card Project Seeks Experts
An anonymous reader writes "Could this dream of many open source developers and users finally happen? A 100% open-sourced graphics card with 3D support? Proper 3D card support for OpenBSD, NetBSD and other minority operating systems? A company named Tech Source will try to make it happen. You can download the preliminary specs for the card here (pdf). The project, though a commercial one, wants to become a true community project and encourages experts, and anyone with good ideas to contribute to the development process, to join the mailing list. You can also sign a petition and say how much you would be willing to pay for the final product."
Great!! (Score:5, Interesting)
In theory other companies might steal the design and build and sell the card on their own, but if the design is community-owned, then that actually works to lower prices...
Anonymous Cow
Waste of time (Score:5, Interesting)
Building a good open 2D card? Maybe... I doubt it's really feasible, but have at it. Chase that dream.
But a 3D card? You are going to make a card to run the latest Quake and Doom? Or even games a release or two back? Do you realize how much time, how many thousands of man-hours go into these cards? The dollar cost of the simulators, the fab runs to make the prototypes, and so on?
This could however, make a great teaching tool.
I take it back... if the card can target elementary 3D and stellar 2D, it could (in a few years) be THE card to own for a commodity Linux box. Target your audience carefully and don't get caught up in the id Software upgrade cycle! :)
Great Idea (Score:5, Interesting)
Really, what a project like this needs is for the developer to shut out the open source community until the project is done. If Linus had made a large project out of the original kernel, I seriously doubt it would ever have been completed. This should be kept simple, and then open sourced only once there is a good code base to build from.
Re:Dupe! (Score:3, Interesting)
ati & nvidia release old specs? (Score:5, Interesting)
Re:ati & nvidia release old specs? (Score:4, Interesting)
2D and 3D Patents. (Score:1, Interesting)
I recommend they buy their way in by obtaining the patents to the Tseng 2D chip and the PowerVR Kyro 3D chip and building from there.
The other way is doing some truly innovative work (basically reinventing 2D and 3D graphics).
Re:Waste of time (Score:3, Interesting)
I'm all for open-source hardware products, but let's make them something that isn't already readily available in a form open source folks find generally acceptable. They should at least give the thing *one* major feature advantage (how about quad DVI? No one is doing THAT yet... at least not in any reasonable price range.)
Plus PCI-Express really wouldn't hurt.
"Could this dream... really happen?" (Score:3, Interesting)
Come on folks, let's get real.
Re:False logic (Score:2, Interesting)
Re:Waste of time (Score:2, Interesting)
While I love my Matrox G450, the fact is, Matrox will never release another card like it, nor will they improve on it. If the Tech Source project works, then one day, it will release a card that is superior to the G450.
Re:RTFA/RTFWS/RTFE! (Score:5, Interesting)
Falling anywhere short of, say, OpenGL 1.4 support would make it pretty much useless. In other words, it doesn't have to have pixel shaders, but it has to have good, filtered texture mapping, lighting, alpha, quite a bag of stuff. The Spartan 3 (not III, as the tech spec suggests) has 1.5 million gates and runs at 384 MHz, which ought to be enough for a decent 3D core, with one catch: it's got 32 18x18 multipliers, but no dividers. Don't even think about floating point, obviously, but without dividers, perspective interpolation is going to be pretty tough. Without perspective interpolation... well, think "1970s".
I just hope there's a standard way of getting around this. Any hardware hacks out there?
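To make the worry concrete, here's a rough Python sketch (function names are mine, not from any spec) of why perspective-correct texturing wants a divide per pixel: you interpolate u/w and 1/w linearly in screen space, then divide to recover u. The naive screen-space lerp skips the divide and produces the texture "swim" of 1970s-era renderers.

```python
# Perspective-correct interpolation along a scanline: a minimal sketch.
# Screen-space linear (affine) interpolation of texture coordinates is
# wrong for surfaces tilted in depth; the fix is to interpolate u/w and
# 1/w linearly, then divide per pixel -- exactly where a hardware
# divider (or a multiplier-based reciprocal) is needed.

def affine_interp(u0, u1, t):
    # Naive screen-space lerp: cheap, but distorts tilted surfaces.
    return u0 + (u1 - u0) * t

def perspective_interp(u0, w0, u1, w1, t):
    # Interpolate u/w and 1/w linearly, then recover u with one divide.
    uw = (u0 / w0) + ((u1 / w1) - (u0 / w0)) * t
    rw = (1.0 / w0) + ((1.0 / w1) - (1.0 / w0)) * t
    return uw / rw  # the per-pixel division the Spartan 3 lacks

if __name__ == "__main__":
    # Endpoints at different depths: w0=1 (near), w1=4 (far).
    mid_affine = affine_interp(0.0, 1.0, 0.5)
    mid_persp = perspective_interp(0.0, 1.0, 1.0, 4.0, 0.5)
    print(mid_affine, mid_persp)  # 0.5 vs 0.2: the visible texture swim
```

At the midpoint of the span the affine answer is 0.5 but the correct answer is 0.2, because half the screen-space distance covers much less of the far (compressed) end of the surface.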
get it out quickly and create a framework (Score:5, Interesting)
Re:RTFA/RTFWS/RTFE! (Score:3, Interesting)
There is of course also the question of whether an open-source driver can compete with the quality of, say, the NVidia drivers; after all, they 'just work'[tm], which is not something I can say about all the open source stuff I use.
Overall I wish them luck, but I have a hard time imagining a market where such a card would really fit. Being open source is certainly a plus, but it alone won't be enough. And so far I still haven't seen a Transmeta processor for sale over here in Germany; I don't really expect this piece of hardware to have much more success.
Idea (Score:1, Interesting)
Re:Yay! It has an FPGA on it. (Score:3, Interesting)
Boards such as the Multimedia Board http://www.xilinx.com/products/boards/multimedia/ [xilinx.com] contain everything you would need. Not cheap though...
They have not put the whole thing on a PCI card, probably because it's even more fun to integrate a CPU core and build the whole system-on-chip on the FPGA while at it.
Cheers!
Re:RTFA/RTFWS/RTFE! (Score:5, Interesting)
Odds are that your CPU doesn't have a divider on it either.
Google for Newton-Raphson.
Fast hardware dividers are big and expensive - somewhat more expensive than a multiplier. But if you have a multiplier and you're not too concerned about performance, or are happy to tradeoff precision for performance, then you can do division using your multiplier, a small seed ROM and a microcode engine.
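As a rough illustration of that multiplier-plus-seed-ROM approach, here is a Python sketch of Newton-Raphson reciprocal: only multiplies and subtracts in the inner loop, with a linear seed standing in for the small ROM. The seed and iteration count are illustrative choices, not a specific hardware design.

```python
# Division with no divider: Newton-Raphson reciprocal, then one multiply.
# Each iteration is x = x * (2 - d*x), which uses only multiplies and a
# subtract, and roughly doubles the number of correct bits.

def nr_reciprocal(d, iterations=3):
    """Approximate 1/d for d in [0.5, 1), using only * and -."""
    assert 0.5 <= d < 1.0
    # Classic linear seed 48/17 - 32/17 * d (in hardware, a tiny ROM
    # of piecewise estimates would play this role).
    x = 48.0 / 17.0 - (32.0 / 17.0) * d
    for _ in range(iterations):
        x = x * (2.0 - d * x)  # quadratic convergence
    return x

def divide(n, d, iterations=3):
    """Compute n/d by normalizing d into [0.5, 1) and multiplying."""
    # Normalization is just a binary shift in hardware.
    while d >= 1.0:
        d /= 2.0
        n /= 2.0
    while d < 0.5:
        d *= 2.0
        n *= 2.0
    return n * nr_reciprocal(d, iterations)
```

With the linear seed good to about 6%, three iterations already get you well past single-precision accuracy; for a 3D rasterizer you might stop at one or two and trade precision for speed, as the parent suggests.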
Finally (Score:3, Interesting)
I have never understood this project. If they want to start with something at least equivalent to a five-year-old SGI graphics pipeline and build from there, then I'd say go for it. But the specs on this card don't look any better than the stuff you get right OOTB with an Intel chipset (which, after suffering with this goddamned nvidia system for too long now, is the reason I'll not be buying another AMD system).
So is the whole point of this card just to pick up the slack for AMD?
Trend: Graphics Cards become General Purpose Cards (Score:4, Interesting)
Some of the thoughts expressed by experts are that 3D cards may become general purpose parallel computing cards.
If it weren't for bottlenecks in the AGP bus, it would be possible to use 3D cards of today for more general purpose computing (I'm fuzzy on what the actual hold ups are here...timing issues?).
There have been Slashdot discussions about using the graphics card for audio processing, since audio samples typically fit within 32 bits. The problem is that audio, and often general purpose computing, have "real time" requirements.
Also, make sure your open source card supports ARB_fragment_program!
Re:Waste of time (Score:4, Interesting)
When you think about it, Quartz Extreme only needs to handle a relatively small number of parallel polygons at basically a constant distance away. That's a much simpler job than millions of triangles at arbitrary angles to each other at varying distances and whatnot.
The job the video card does can potentially be as simple as figuring out which window is exposed in a given area and grabbing pixels from the appropriate frame buffer. OpenGL is a good deal more complicated than that, but since both the driver and the FPGA are under our control, I would think it would be possible.
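That "simple job" can be sketched in a few lines of Python; the window layout and class names here are invented for illustration, not from any real compositor. For each screen pixel, find the topmost window covering it and copy from that window's private buffer.

```python
# A toy compositor: each window renders into its own back buffer, and
# scanout just picks, per pixel, the topmost window that covers it.

class Window:
    def __init__(self, x, y, w, h, pixels):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.pixels = pixels  # pixels[row][col]: the window's back buffer

    def covers(self, sx, sy):
        return (self.x <= sx < self.x + self.w and
                self.y <= sy < self.y + self.h)

def composite(screen_w, screen_h, windows, background=0):
    """windows is ordered bottom-to-top; later entries occlude earlier."""
    out = [[background] * screen_w for _ in range(screen_h)]
    for sy in range(screen_h):
        for sx in range(screen_w):
            for win in reversed(windows):  # test topmost first
                if win.covers(sx, sy):
                    out[sy][sx] = win.pixels[sy - win.y][sx - win.x]
                    break
    return out
```

A real Quartz-Extreme-style compositor adds alpha blending and texture scaling on top of this, but the core occlusion-and-fetch loop really is this simple, which is why it seems within reach of a modest FPGA.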
Why not reuse existing technology? (Score:2, Interesting)
It has 96 execution elements: 96 operations on your data simultaneously. That sounds ideal for graphics processing. Power consumption is 5 watts, with 50 GFLOPS of computing power.
And they make PCI-X cards for PC systems. You can have several cards in one system for compounded processing power. Now, all this monster would need is the graphics output parts and drivers. They even have a full development kit for both Windows and Linux. The card is programmed in C.
Perhaps the PCs of the future would have two CPUs: one linear general-purpose CPU (current x86 based) for program code and system management, and one massively parallel CPU for tasks better suited to it. If there's no one true road to happiness, make it two then.
Order-driven fabbing, other crazy ideas (Score:3, Interesting)
Might want to consider setting up a site for people to register their interest and potential orders, not just how much you would pay for but actually get the orders.
I don't remember if it was successful, but Sony has done this in the past. I know it failed once due to (I believe) a WebLogic crash caused by too many orders, or an underpowered system.
If the website is mentioned every time a story appears on slashdot or some other site, you can continue to accumulate and update information. If you make transparent the financials behind it, people may rush in to get you over the threshold of a precalculated breakeven point (including reasonable profit of course).
Personally, I am in the market for a graphics card in the next 6 months. I am planning on getting the best I can afford at the time, and am curious what this project might offer to sway me. Sure, performance is not likely to beat the top of the line from the other competitors at the same price point... at least that is what one would guess. Maybe not true? Well, the FPGA looks really cool.
Consider that the fastest supercomputer in the world is the GRAPE-6 (GRAvity PipE) built on FPGAs for simulation of gravitational interactions (of globular clusters, etc.).
I was thinking it might be closer to something insanely great if you go for the multiple channels now for example. Maybe if you ask about that on your site you'll get people to agree. (How much more would it cost? etc.).
Also I don't know what the FPGA would promise, presumably quick firmware updates from the net of course. Could part of it be used for another purpose, or is that too difficult? Could an additional FPGA be turned into a chip that runs linux (use it on a PC) or perhaps be flashed with the results of another project (I'd love to have a Perl chip.. make it and they will come?) Could another chip or expanded memory provide say a video wall controller with edge blending for multiple screens in realtime? This kind of thing alone might sell enough to make it useful. What do commercial image processors have that this couldn't?
I just saw a sexy video switching fabric thingy here [jupiter.com]
I am curious about what exact "X.org eye candy" this would enable. I am guessing something like what one commenter (Bryan Kagnime, posted 2004-11-28) put it: "I don't really care so much for the 3d gaming aspect, distribute with the card an opensource operating system like Slackware with some 2d desktop eyecandy (translucency/transparency/openGL) and I'll buy a card for everyone I know with a comp. This'll show users *what* linux is all about, distributing a superior product and opening the market share for innovators."
One post on osnews mentioned realtime encoding/decoding of video streams, and though I am not sure it would avoid impacting the rest of the machine, given the design, that sounds neat!
128MB is enough to hold a couple frames of 20 times the resolution of a 1024x768 screen and still have over 30 MB left over. What if it included support for edge/corner blending and warping for a video wall? Is it conceivable that this could take the output from a fast consumer card and provide 2D warping and other effects for displays using multiple projected patches? Consider what it is good at. How about talking over the network or other bus to other oss graphics cards for multiple projector support.
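For what it's worth, the arithmetic above checks out if you assume 24-bit (3-byte) pixels, which the comment doesn't state. A quick Python check:

```python
# Sanity-checking the "couple frames of 20x a 1024x768 screen" claim
# against 128 MB of card memory, assuming 3 bytes (24 bits) per pixel.

BYTES_PER_PIXEL = 3  # assumption: 24-bit color, not stated in the post
big_frame = 20 * 1024 * 768 * BYTES_PER_PIXEL  # one frame at 20x the pixels
couple = 2 * big_frame                          # "a couple frames"
left_mb = (128 * 1024 * 1024 - couple) / (1024 * 1024)
print(left_mb)  # 38.0 -- indeed "over 30 MB left over"
```

With 32-bit pixels the margin shrinks to about 8 MB, so the headroom for warping tables and wallpaper storage depends on the pixel format chosen.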
If some nonvolatile memory was included, the card could remember a video wall wallpaper and open window/document information, or keep some megapixel images or something else always available. Would this be useful, say for quick startup or as a backup for important memories?
How about selling with an external patchbay that can take many video sources and provi
Re:Waste of time (Score:5, Interesting)
That being said, 3D provides a lot more possibilities: you could make windows be actual objects that could be moved forward or backwards, stacked up, leaned against each other, and so on. Implement Havok physics so I can grab an icon and smash it into my other icons and watch them scatter all over my desktop, or throw it and watch it bounce off the edge of the screen and land in my network drive.
Eventually, all we'll need to do to solve the spyware problem is to use a wallhack and noclip and go bounce that crap to the curb. Sure, we'll have to endure the cries of spyware makers shouting 'lamer!' or 'wallhack' or 'aimbot', but we can just kick them off the network if it comes to that, or
Priorities (Score:3, Interesting)
The absolute #1 focus for this card (if they hope to get people to pay more than $30 for it) needs to be full reprogrammability by mere mortals. It would be absolutely wonderful to get a general-purpose FPGA in a computer. People pay more than $100 for crypto cards, video capture cards, etc. because hardware is so much better at those tasks. This would wipe the floor with them, because you could program in a new codec or cipher.
Even if it didn't have any video output at all, I'd still pay $100+ for a PCI card version. Once video encoding apps are optimized to offload the processing that's hardest on the CPU to the FPGA instead, I expect we'll see huge increases in encoding speed. That, BTW, also leads to much more complex codecs (MPEG-6, anyone?) that reduce filesize/bitrate significantly.
Besides that, I would also like to see a bit of effort in making sure it works on non-x86 hardware. Since this company makes video cards for SPARC systems, that surely would not be difficult for them to handle.
If this thing actually sees the light of day, it will completely change what a video card is. This also strikes me as a potentially pivotal moment in computer hardware. Perhaps, a few years from now, the biggest graphics card maker will have a museum wing dedicated to remembering how it all started back in 2004. Yeah, I know it's a stretch, but this really does have that potential.
What about this??? (Score:3, Interesting)
Manticore has already existed for some time, and it is also what they call Open Hardware. If they could work together, this could result in a good implementation for a Linux/Un*x hardware design.
Re:False logic (Score:1, Interesting)
http://cvs.reactos.com/cgi-bin/cvsweb.cgi/react