Nvidia Geforce 4 (NV25) Information
msolnik writes: "nV News has a brief article about the long-awaited NV25-based video adapters. These graphics processors have capabilities similar to the XGPU's, and are a lot more powerful than the GeForce3 Ti500. Since they are manufactured on a .13 micron process, they will probably be clocked at very high levels."
Re:eh? (Score:3, Interesting)
Six months ago, the GeForce 3 Ti200/500 came out. Six months before that, the original GeForce 3 came out, etc...
This kind of release schedule is what made 3dfx, a once-undisputed leader in 3D technology, lag so far behind. Consequently, it's also why Matrox doesn't even really care about the 3D market anymore.
Re:Power without Application? (Score:2, Interesting)
I fail to see what's so revolutionary about their hardware. They're basically building huge DSP-style chips where much of the operation is hardcoded for better optimization. If chips continue to do this, of course you're going to see games struggle to catch up. The 3D graphics market seems to be doing very little that's revolutionary--just bringing the chips up to the process limitations of transistor size and speed.
The problem with the current model is that the graphics card itself isn't expected to have any intelligence of its own. It's simply expected to render as much as possible in as little time as it can. Right now, we're expected to pass millions of triangles to the card to render, as well as megabytes of textures to slap on them, as fast as possible. Imagine if instead the developer handed the graphics card a mathematical description of the model, and the chip did the rest, filling in details based on fractal algorithms. Instead of applying a bumpy-looking texture to a wall, you could make the wall itself bumpy with potentially infinite detail. That would be revolutionary, and would require incredible engineering to design.
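To make the idea concrete, here's a minimal sketch (in Python, purely illustrative) of the kind of fractal fill-in being described: the application ships only two control heights plus a roughness parameter, and the hardware would synthesize arbitrarily fine bumpiness via midpoint displacement. The function name and parameters are my own invention, not anything from NVidia's hardware.

```python
# Hypothetical sketch: generate a bumpy 1D surface profile from a
# tiny description (two endpoint heights + roughness) using the
# classic midpoint-displacement fractal algorithm.
import random

def midpoint_displace(left, right, roughness, depth, rng=None):
    """Recursively subdivide the span between two heights,
    perturbing each midpoint by a shrinking random offset."""
    if rng is None:
        rng = random.Random(42)  # fixed seed so the detail is repeatable
    if depth == 0:
        return [left, right]
    mid = (left + right) / 2 + rng.uniform(-roughness, roughness)
    # Halve the roughness each level so coarse features dominate.
    first = midpoint_displace(left, mid, roughness / 2, depth - 1, rng)
    second = midpoint_displace(mid, right, roughness / 2, depth - 1, rng)
    return first[:-1] + second  # drop the duplicated midpoint

# Four subdivision levels turn 2 control points into 2**4 + 1 = 17
# samples; deeper recursion adds detail at no extra bandwidth cost.
profile = midpoint_displace(0.0, 0.0, 1.0, 4)
print(len(profile))  # → 17
```

The point of the sketch is the bandwidth asymmetry: the "description" is three numbers, while the rendered detail grows exponentially with recursion depth.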
Saying the current crop of graphics chips is revolutionary is like denying that SGI ever designed a Reality Engine in the first place. Just because greater integration allows it all to run insanely fast doesn't mean any of it is really new. NVidia is going to have to do something pretty amazing to keep from getting blown away when something truly revolutionary comes around.
Video capture (Score:3, Interesting)
I have an old Asus TNT3400/TV, and I never get to use the video capture feature.
Anyone have any recommendations?
Thanks,
Ian