The Return of S3
flynn_nrg writes "Just saw this article on ExtremeTech about S3's new graphics card. S3 is back on the scene with its first new GPU architecture in five years. Rather than take aim at the high-end, S3 has set its sights on the midrange price/performance category, which is currently dominated by ATI's Radeon 9600 XT and nVidia's GeForce FX 5700, both of which are under $200. Today S3 unveils the DeltaChrome S8 GPU, which represents the midrange of its upcoming line of DeltaChrome GPUs."
Also on Tech Report (Score:5, Informative)
It looks like they have half a product. Good enough hardware, absolutely horrible drivers.
And I'm not talking about drivers that don't run quickly. I'm talking about drivers that render things incorrectly or even crash! Ugh.
At least with Intel's integrated graphics (or Nvidia, or even ATI these days), they may not be the quickest on the block, but their drivers *work*.
The Matrox Parhelia (Score:3, Informative)
It seems the Parhelia was priced higher than most nVidia cards yet provided nowhere near the performance... and people still bought them. Why? I remember seeing the benchmarks, and the Parhelia was absolutely shocking. Supposedly the only great thing about it was the FSAA quality, but... you don't buy a card just for that, surely?
So, what was so great about Matrox coming back with the Parhelia? I must have missed the point.
OpenGL support? (Score:2, Informative)
Could one of the reviewers tell us which version of OpenGL the DeltaChrome supports? Which extensions does it support? How many instructions long can the fragment and vertex programs be?
GLInfo (a Win32 application) gives a complete list of all this.
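For anyone who would rather poke the driver directly than trust GLInfo's output, here's a minimal sketch in C (assuming GLUT is available to stand up a context) that prints the version, renderer, and extension strings the driver reports. The instruction limits asked about above live behind the ARB_vertex_program / ARB_fragment_program extensions and need their entry points loaded, so they're only noted in a comment here.

    /* Minimal capability dump: create a GL context, then ask the driver
       what it reports. Assumes GLUT is installed; link with -lglut -lGL. */
    #include <GL/glut.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutCreateWindow("caps");   /* a current context is required before glGetString */

        printf("GL_VENDOR:   %s\n", glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));
        printf("Extensions:  %s\n", glGetString(GL_EXTENSIONS));

        /* Fragment/vertex program instruction limits come from the
           ARB_vertex_program / ARB_fragment_program extensions, via
           glGetProgramivARB(..., GL_MAX_PROGRAM_INSTRUCTIONS_ARB, ...),
           once those function pointers have been loaded. */
        return 0;
    }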
5 Years!? (Score:5, Informative)
Re:Wow (Score:5, Informative)
In some games, Myst for instance, there's really no such thing as frame rate at all. In others, like shooters, the CPU requirements for handling the physics are fairly minimal and nice graphics sell games. These are the ones that require the latest hot card. If you're into sims, though, like IL-2 or NASCAR 2003, the physics calculations put the heaviest load on the system, and for these the fastest CPU, particularly its math coprocessor, will give you the best overall performance.
Everything is always tradeoffs and compromise. Many games even have "favorite" video cards, right down to the particular model and driver. The best you can really do is optimize for your favorite game and play the rest as well as you can.
KFG
Re:But wait! (Score:2, Informative)
We could ditch X if we could write our own drivers from specs.
The price better be low (Score:3, Informative)
Re:Wow (Score:4, Informative)
The sad part is that I suspect ATI's hardware is (and always has been) absolutely top-notch. They just don't seem to put much focus on debugging their drivers.
ATI video cards have been banned from my workplace for several years now, and I've not seen a reason to change my mind on that. (Yes, I get to make decisions like that)
Re:But wait! (Score:5, Informative)
Re:Wow (Score:3, Informative)
Re:Been there done That. (Score:2, Informative)
I don't know. My lab has several IBM laptops with ATI cards, and they all run UT (which we use for AI and robotics work) fairly well, that is, ~30 fps. Now, running the AcidUnreal renderer drops it to 2-4 fps.