The Return of S3 335
flynn_nrg writes "Just saw this article on ExtremeTech about S3's new graphics card. S3 is back on the scene with its first new GPU architecture in five years. Rather than take aim at the high-end, S3 has set its sights on the midrange price/performance category, which is currently dominated by ATI's Radeon 9600 XT and nVidia's GeForce FX 5700, both of which are under $200. Today S3 unveils the DeltaChrome S8 GPU, which represents the midrange of its upcoming line of DeltaChrome GPUs."
Wow (Score:2, Insightful)
Maybe it'll drive the prices down a bit.
Re:Wow (Score:5, Funny)
Re:Wow (Score:4, Informative)
The sad part is that I suspect that ATI's hardware is (and always has been) absolutely top notch. They just don't seem to put much focus on debugging the drivers.
ATI video cards have been banned from my workplace for several years now, and I've not seen a reason to change my mind on that. (Yes, I get to make decisions like that)
Re:Wow (Score:3, Informative)
Re:Wow (Score:3, Insightful)
Re:Wow (Score:5, Insightful)
Heck, even if you play cutting edge games, even that $75 card will serve you well unless you absolutely must have 1600x1200 resolution with 32bit color and 435FPS.
Eh, no (Score:2)
Unless you happen to play Deus Ex 2.... (Score:4, Funny)
Re:Unless you happen to play Deus Ex 2.... (Score:2)
Re:Wow (Score:3)
I guess the better upgrade for new games is a faster CPU, not a faster GPU... Who would have thunk it... seriously, for the last 3 years, it has always been the GPU that maxed out performance on my P4 1.4GHz... damn...
Re:Wow (Score:5, Informative)
In some games, Myst for instance, there's really no such thing as frame rate at all. In others, like shooters, the cpu requirements to handle the physics are fairly minimal and nice graphics sells games. These are the ones that require the latest hot card. If you're into sims though, like IL-2 or NASCAR 2003 the physics calculations put the hardest load on the system and for these the hottest cpu, particularly the math coprocessor, will give you the best performance overall.
Everything is always tradeoffs and compromise. Many games even have "favorite" video cards, right down to the particular model and driver. The best you can really do is optimize for your favorite game and play the rest as is possible.
KFG
Re:Wow (Score:2, Funny)
Your $400 GPU won't 'wow' you with its performance if it's just sitting around twiddling its proverbial thumbs, while the rest of your system has the electronic equivalent of a heart attack.
Re:Wow (Score:3, Interesting)
Back in my day we were happy to have textures...
Re:Wow (Score:2)
Re:Wow (Score:2)
Re:Wow (Score:2)
Re:Wow (Score:2)
Thanks for the reply, though.
Re:Wow (Score:5, Insightful)
Especially in a day and age where a hundred bucks more can buy you an entire PC.
Re:Wow (Score:2, Funny)
Perhaps if somebody released a "cutting edge game" that had the same enjoyment value as Quake 2, I'd consider upgrading from my TNT2 card.
Market forces that be, please start working (Score:3, Insightful)
Oh well, as long as cheap, junky, consumer-level computers are being made, S3 will always have a customer. It's all about the profit margin.
Re:Wow (Score:2)
But wait! (Score:5, Interesting)
Re:But wait! (Score:5, Interesting)
Re:But wait! (Score:3, Interesting)
Re:But wait! (Score:3, Interesting)
Re:But wait! (Score:2)
Re:But wait! (Score:2, Informative)
We could ditch X if we could write our own drivers from specs.
Re:But wait! (Score:5, Informative)
Re:But wait! (Score:2)
I'll take an Nvidia card on linux any day. And please, next time you dog Nvidia, throw in a link or two of hard proof before spouting FUD.
Re:But wait! (Score:2, Insightful)
Re:But wait! (Score:5, Insightful)
Imagine how many video cards are purchased off the shelf at computer stores. Then imagine how many video cards are purchased in new computer sales. I would imagine more video cards are moved by unit in new/refurb(card replaced) sales than individual sales for LOW/MID range cards.
Now I know people purchase high-end cards from stores (I did), but to sell mid-range cards you usually don't sell to the consumer, you sell to the manufacturer.
I would rather spend 'x' amount of money to produce a cheaper and comparable card to the current market norm and get a contract providing Dell w/ cards for their mid-range systems than spend '3x' the amount of money making the "newest and the greatest" card and then having to spend another '2x' just marketing the damn thing to a niche market..
I'd rather sell mid-range and more units.
Re:But wait! (Score:4, Insightful)
Of course they're probably not. His point, however, was that *not* having a high-end card to show off and impress people with will decrease their visibility, among other factors, and make it harder for them to sell midrange cards, even if they are comparable to or better than similarly-midrange cards from NVidia or ATI.
If you see some truly stunning demo from NVidia or ATI on their highest-end card, you're more likely to buy from them, even if you're not shopping for a card anywhere near what you saw. It may not be completely logical, but it's true.
Re:But wait! (Score:2)
Re:But wait! (Score:5, Interesting)
That's why S3 never really gave a rat's ass about 3D performance before. 3D is expensive to research, create, fabricate, and compete in. That's why there are only 2 players in the market and tons of little guys cranking out 2D cards. S3 would be happy to make a 2D card that can try to do a little 3D if you push it hard.
Look on the bright side though. With S3 texture compression, Quake3 and its descendants look much better.
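For anyone curious why S3TC mattered: it's a fixed-rate block scheme, so the savings are easy to compute. A back-of-the-envelope sketch of the DXT1 ratio (the standard block layout: two 16-bit endpoint colors plus sixteen 2-bit indices per 4x4 block):

```python
# DXT1/S3TC compresses each 4x4 texel block of 24-bit RGB down to a
# fixed 64-bit block: two 16-bit endpoint colors + 16 two-bit indices.
texels_per_block = 4 * 4
uncompressed_bits = texels_per_block * 24          # RGB888 source
compressed_bits = 2 * 16 + texels_per_block * 2    # 64 bits per block

ratio = uncompressed_bits / compressed_bits
print(f"{ratio:.0f}:1")  # 6:1 for RGB888 input
```

That fixed 6:1 ratio is why Quake 3-era engines could ship much larger textures in the same video memory.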
Been there done That. (Score:2, Insightful)
And the ViRGE GX2!
And the Savage!
And the Savage4!
And the Savage2000!
Seriously...they've said the same *damn* thing every time. The only inroads this chipset *might* make would be in low-cost laptops, where S3 already had a sizeable market until the GeForce 2 Go and Radeon Mobility started kicking butt.
Re:Been there done That. (Score:2)
(I have windows XP and the video is actually integrated)
Re:Been there done That. (Score:2)
I agree 100%. (Score:2)
Re:Been there done That. (Score:2)
Re:Been there done That. (Score:5, Interesting)
One nice thing about Nvidia's driver upgrades over the years is that each release has improved the performance of damn near every card they make. My assumption is that the drivers are 50% of the card's performance..which would make sense in the context of them being unable to fully open-source the driver.
Good for non-graphics use - and cheap! (Score:5, Interesting)
Basically, we have tons of these things and they were used back in the day when we didn't spend all of our money on expensive computer peripherals.
I would recommend these for anyone that does not use the computer as a workstation - such as a file server or, in my case, a home machine that I ssh into. Heck, I hardly ever turn on the monitor for that thing.
Go S3!
Re:Good for non-graphics use - and cheap! (Score:2)
If I were to put one of these cards in a machine I would choose ATI or, even better, Matrox, which both have very stable drivers (and very good 2D quality), something you do want in a server. But a lot of servers have an integrated graphics card, which is fine.
Obviously if you have tons lying around, great, use 'em.
Re:Good for non-graphics use - and cheap! (Score:2)
My favourite was the Nvidia A7N-266 but it's out of production now.
Re:Not good for any use except servers (Score:2)
Re:I WOULD LIKE TO SHIT IN YOUR HAT (Score:3, Interesting)
S3 who? (Score:5, Interesting)
Their 3D cards sucked back in '96 when I bought my S3 ViRGE. I figured it was going to be the de facto standard since Voodoo was new and unheard of. Just upgrading to NT4 and Linux from DOS, I assumed it was up to the game makers to provide the drivers and not up to DirectX and OpenGL to provide support.
But I have upgraded to 2 newer PCs since. I forgot all about them and assumed they went under. I doubt they will support FreeBSD/Linux and X as they did in the past with their own X server.
hmm (Score:5, Funny)
Re:hmm (Score:4, Interesting)
Re:hmm (Score:5, Interesting)
Diamond was one of the more prominent aftermarket expansion card marketers of the nineties. They were very successful selling mostly video cards, based first on S3's chipsets, which were very competitive until 3D acceleration became popular, and later on nVidia's and 3dfx's. They branched out into a wide array of products, including SCSI controllers, motherboards (after acquiring Micronics), modems (after acquiring Supra), and audio cards. They invented the portable MP3 player, with the original Rio, and developed some of the first telephone-line and power-line home networking products. But, largely because of acquisitions and competition, they were constantly losing money.
S3 was probably in a much worse bind. They were also losing money, but had none of the innovation that characterized Diamond's last years. They had been surpassed by new competition in graphics chipsets, and had no real other business. But through a lucky investment in a TSMC fabrication plant, they had some cash on hand, and decided to buy out Diamond. At the time everyone assumed they were going to follow 3dfx's lead and produce and sell graphics cards based on their own chipsets directly. But the truth is, they were looking for an exit both from Diamond's core computer component business, and their own graphics chipset line. After the rushed-to-market, broken Savage 2000 was a market failure, they abandoned expansion cards entirely, throwing away the legacy of two PC hardware pioneers in favor of the Rio MP3 players, and another technology they had acquired, ReplayTV's personal video recorders. At the same time, the graphics chipset operation was spun off as a joint venture with VIA. This is what is now known as S3. The rest of the company was renamed SonicBlue. Completing the trajectory set by S3 management since the days of the ViRGE, they went bankrupt recently, and the Rio and ReplayTV units changed hands yet again, hopefully to more competent management. Best Data apparently picked up the old Diamond brand at the same time.
As to this new graphics chipset...I wouldn't take it seriously unless it is proven to perform decently (well, actually I wouldn't take it seriously unless it also had Linux support on par with the old Matrox card I use now, but I digress...). As far as I can see VIA is just looking for some paying beta testers to work out the bugs in the core before they embed it in their next-generation southbridge chips, so don't look for a renewed commitment to serious graphics hardware from "S3".
Re:hmm (Score:2)
S3 hasn't been cool... (Score:5, Insightful)
I'm A Little Disappointed (Score:5, Interesting)
Overall, I have to agree with the consensus that S3 is back, and may be primed to stay in the market for some time. The article mentions that they are using a
Either way, the video card market may just be heating up for 2004.
Re:I'm A Little Disappointed (Score:5, Interesting)
Re:I'm A Little Disappointed (Score:2)
Re:I'm A Little Disappointed (Score:2, Insightful)
Interesting for Home Theatre Applications (Score:3, Interesting)
Indeed.
I find this card interesting for home theatre applications, where 3D capabilities (while nice and IMHO necessary for a complete entertainment system, including xmame and 3D simulation support) don't have to be cutting-edge fast. Of particular note are this card's component output capabilities and ability to do 1080p, 1080i, 720p, etc. Right now my home theatre PC has an ATI card connect
Give us drivers... (Score:5, Insightful)
Right now, I have an NVidia card in my workstation and I hate it. Why? Because I have to choose between using the OpenGL renderer and staying true to my beliefs about software freedom. This basically means that I paid extra for a card that I can only halfway use.
S3, take heed. Give us a product that we can use and we'll support you. Do it. It's the right thing.
Re:Give us drivers... (Score:2)
How many folks, would you estimate, would be willing to pay this 'freedom tax' of a lower performance card in exchange for access to driver internals? I'm genuinely curious, because I wouldn't have thought it would be anywhere near high enough for S3 to bother doing the paperwork, let alone even begin to weigh up IP ramifications.
It's the right thing.
Just as an aside, why is this "the right thing"? The right thing, according to the all-software-should-be-free ethos, sure, but S3 is a hardware company,
Re:Give us drivers... (Score:5, Insightful)
This may come as a bit of a shock, I know, but there are some of us out there actually _not_ willing to have the bleeding edge in graphics performance at great cost (in money, noise and power draw). My main machine is currently a laptop with an NVIDIA GF4 420 Go with 32MB memory. It can handle anything I throw at it with no problems. True, I do not play the latest "QuakerDoom 40,000 - Bloody Dismemberment" - if gaming was the primary focus for me, I'd have a Windows partition (or, preferably, a PS2).
Oh, and about "the right thing": you are right - they are a hardware company. Their business is selling hardware to people. Drivers are a cost, not a source of revenue. Anything they do is geared towards driving hardware sales and lowering the cost of providing said hardware. If releasing drivers or specs for Linux will increase sales more than it costs them to do the release, it is a net win.
Re:Give us drivers... (Score:2)
They could be using software techniques they don't want their competitors to know about.
They certainly don't want to risk their IP by divulging hardware information.
They have to write drivers anyway, since no one would buy a card they couldn't see run.
Drivers do have value to the bottom line.
Re:Give us drivers... (Score:3, Funny)
Re:Give us drivers... (Score:2)
Well, when the company doesn't have to pay staff to maintain the drivers, they can lower their prices and offer better performance in an even lower price range while still maintaining profitability. Doesn't seem like a "tax" to me.
Re:Give us drivers... (Score:3, Interesting)
Sure, that's a good answer, but I doubt they're likely to just fire all their driver staff ( even if they do deserve it ) and turn the whole thing out in the open, right? At the very least, I can't see the windows driver being replaced with an open effort ( call it cultural resistance ), and Windows is where th
Re:Give us drivers... (Score:2)
Sure, so there's no profit motive, such as selling competing closed drivers, to keep them from opening up. Even if they don't write a single line of code, they can get free community support and goodwill by providing good documentation to the XFree team. As far as losing a proprietary edge, I don't think they're planning to compete with the high-end NVidia or ATI cards; I doubt that they have much to hide from the "big guys".
Re:Give us drivers... (Score:2, Interesting)
I'm a Linux user, and I believe in/contribute to "the open source movement". When it comes down to it, however, I care a lot more about things working right than whether or no
Re:Give us drivers... (Score:4, Interesting)
Sometimes I'm reminded of why RMS draws a hard line between Open Source and Free Software. :-)
NVidia's drivers work (relatively) well
For some applications, maybe. For others, the closed drivers are clearly inferior to XFree's "nv" module. For example, if you're running Linux on non-Intel hardware, or running a non-Linux Unix on Intel, then you're pretty much out in the cold. Sure, they release a FreeBSD module every now and then, but that's no help for NetBSD or OpenBSD folks. Do they offer binaries for PowerPC Linux? I'm not sure, and not interested enough to look it up at the moment.
I'm a good programmer. I have some experience debugging hardware drivers and submitting source patches. However, if the "NVidia" module crashes, there's nothing I can do except send in a half-informed bug report and hope that enough other people gripe about the same problem to motivate someone to fix it. Remember, the FSF started as a consequence of RMS not being allowed to fix a broken printer driver. :-)
So, if "work[s...] well" means "usually executes without crashing and offers decent performance", then I won't argue. However, that's not the standard of "works well" that I use for myself and my employer.
Re:Give us drivers... (Score:3, Interesting)
Don't get me wrong -- I'd love to see a completely open driver from NVidia, but because of patent issues and licenses they have with other companies, it simply will not happen. Ever. But I need to do actual work on my computer that requires
Re:Give us drivers... (Score:2)
That's quite true. I hope for the sake of S3 and the Linux community that this is not the case; I'd really like to see some legitimate competition.
Why buy mid-range? (Score:5, Interesting)
Why buy something mediocre but brand new, when you could buy something that absolutely kicked ass six months ago for a similar amount of money?
Re:Why buy mid-range? (Score:3, Insightful)
Well, the Radeon 9700s have been out for over a year now, and they are still well over $200. I think that a mid-range 9600 Pro for $130 or so is a good investment. You usually get 70-80% the performance of the high end, but at less than 50% the price.
When you talk about "buying a 6 month old top-end card for a fraction of the price" you are talking about buying a Radeon 9800 for $290 that cost $450 six months ago. Yes, it's a lot less than it was, but that's still too much for the above-casual/below-fanatic buyer.
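Taking the parent's numbers at face value, the value argument is easy to check. These figures are the poster's rough 2003 street prices and performance estimates, not benchmark data:

```python
# Price/performance from the figures in the post (poster's estimates,
# not measured results): a 9600 Pro at ~$130 giving ~70-80% of high-end
# performance, vs. a Radeon 9800 at ~$290 as the 1.0 baseline.
cards = {
    "Radeon 9600 Pro": (130, 0.75),
    "Radeon 9800":     (290, 1.00),
}
for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 1000:.1f} perf points per $1000")
```

By that (admittedly rough) metric the midrange card delivers well over half again as much performance per dollar, which is the whole argument for buying midrange.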
Re:Why buy mid-range? (Score:5, Insightful)
Why do people buy refurb'd computers?
Why do people go to yard sales?
Why do people go to dollar stores?
Maybe the secretary down the hall doesn't need a Radeon 9800?
Maybe I don't want my kid to use 'this' PC for gaming and only for school work?
There is a market for mid-range cards...
Don't just assume everyone wants to buy the best of everything. (Why isn't Mercedes-Benz the largest car manufacturer in the world?)
Re:Why buy mid-range? (Score:2)
Re:Why buy mid-range? (Score:3, Insightful)
Future support? Driver updates? (Score:5, Interesting)
Flash forward a couple of years, and while NVidia and ATI are still willing to release updated drivers for their cards of that era, the Kyro lingers unsupported, even though NEC (the chip designer) and Guillemot/Hercules (the card manufacturer) are still going strong. My friend wanted to play Halo, and even though the card should've been able to support the game (albeit at a lower resolution/framerate), he can't because his card is basically ignored and unsupported by the game manufacturers and the source companies for the card itself.
The moral of the story: S3 is a reasonably well-known name. So is Hercules/Guillemot/NEC. It's gonna take a hell of a price/performance ratio to get me to recommend a video card not based on ATI or NVidia after the Kyro debacle.
Re:Future support? Driver updates? (Score:2)
Actually, I think previous PowerVR chips before the Kyro II also had tile-based rendering, but I could be wrong. This presentation on TBR [pvrdev.com] suggests it was present in the Naomi arcade board's and Sega Dreamcast's rendering pipelines, and I'm pretty sure the DC didn't have a Kyro inside, but some earlier PowerVR.
Bitch about the drivers though, I agree.
YLFI
Re:Future support? Driver updates? (Score:2)
Also on Tech Report (Score:5, Informative)
It looks like they have half a product. Good enough hardware, absolutely horrible drivers.
And I'm not talking about drivers that don't run quickly. I'm talking about drivers that render things incorrectly or even crash! Ugh.
At least with Intel's Integrated Graphics (or Nvidia or even ATI these days) even though they may not be the quickest on the block at least their drivers *work*.
Driver Issues (Score:5, Insightful)
Wouldn't this be the perfect situation to open the source and get the community to squeeze every last bit of performance outta their chip? It helps them save money on paying people to code the driver, and it gets the most outta their hardware. In addition, it would also give them a healthy community that would recommend this solution to friends/family who aren't into bleeding-edge gaming machines.
The Matrox Parhelia (Score:3, Informative)
It seems the Parhelia was a card that was priced higher than most nVidia cards, yet provided nowhere near the performance... yet people still bought them. Why? I remember seeing the benchmarks and the Parhelia was absolutely shocking. Supposedly the only great thing was the FSAA quality but... you don't buy a card just for that, surely?
So, what was so great about Matrox coming back with the Parhelia? I must have missed the point.
Re:The Matrox Parhelia (Score:2)
complex
Unfortunately... (Score:3, Funny)
OpenGL support? (Score:2, Informative)
Could one of the reviewers give us a report of what version of OpenGL the deltachrome supports? What extensions does it support? How many instructions long can the fragment and vertex programs be?
GLInfo (w32 application) gives a complete list of all this.
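For what it's worth, the data GLInfo reports ultimately comes from glGetString(GL_VERSION) and glGetString(GL_EXTENSIONS); the latter is just one big space-separated token list. A minimal sketch of checking it correctly (the sample string below is made up for illustration, not taken from a DeltaChrome), since naive substring tests are a classic pitfall:

```python
# glGetString(GL_EXTENSIONS) returns a single space-separated string of
# extension names. Match whole tokens; a plain substring check gives
# false positives. Sample string is hypothetical, not real hardware.
sample = "GL_ARB_multitexture GL_ARB_texture_compression GL_S3_s3tc"

def has_extension(ext_string: str, name: str) -> bool:
    return name in ext_string.split()

print(has_extension(sample, "GL_S3_s3tc"))      # True
print(has_extension(sample, "GL_ARB_texture"))  # False: whole tokens only
```

Instruction limits for fragment and vertex programs come from separate queries and aren't visible in the extension string, so a reviewer would still need to report those numbers directly.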
Where is my card? (Score:3, Interesting)
Re:Where is my card? (Score:2)
BTW, this is their cheapest card:
Millennium G450 PCI
G45FMDVP32DB
It's $115 in bulk.
If you don't mind a several generation old card, $20 will get you this: http://tekgems.com/Products/matrox-g200-millenium-agp-driver.htm
One generation newer than that, and $42+s&h will get you http://store.yahoo.com/compuvest/330000119-00.html
And, one
5 Years!? (Score:5, Informative)
Re:5 Years!? (Score:2)
The article says it's the "first new GPU architecture in five years." Not the first new GPU.
Re:5 Years!? (Score:2)
I lost track (Score:3, Funny)
I don't know who's who anymore!
$200? (Score:3, Insightful)
Since when is $200 and under the midrange? Isn't that where video cards top out for most of the market?
I only purchased one video card in my life that was over $100 and it was nothing spectacular compared to video cards in older systems I had around the house with half the video memory. What are you people doing with video? Heck, I had a system with a 16 meg Voodoo card that can play DVDs. And they are selling on eBay for 10 bucks.
Re:$200? (Score:2, Insightful)
My 5900 Ultra has twice as many transistors as my Pentium 4 (both
You can't expect a fast video card for $80 because you can't
Good, there needs to be some more competition (Score:3, Interesting)
Maybe now, with more competition in this segment of the market, the card makers will start putting out a good final product, and not make the buyers be the beta testers!
The good old fashioned S3 -- perfect for servers (Score:4, Insightful)
Observations of an insider on S3's chances... (Score:5, Interesting)
To use an overused buzzword, let's assume that the S3 chip has the best "price/performance ratio" of any chip. S3 still has little chance to gain any real market share, mostly because they have little chance to get into OEM systems.
Let me explain. The retail market (where you go to BestBuy or newegg.com) makes up a very small percentage of the overall market. I can't give real numbers (I don't know if they're NDA'd or copyrighted by the research company, so better safe than sorry), but let's just say, it's the OEM sales that pay everyone's salaries and keep the investors happy.
Since OEM sales are so important, let's jump into the mind of the OEM. There are 3 major things that the OEMs care about when choosing the chip to put in their computers.
1) Does this chip perform SIGNIFICANTLY better than what we're already using?
2) Is there any benefit to using company X over company Y?
3) Are we getting a better deal from the new company?
So, what does this mean for S3 (let's throw in XGI also)? To put it simply, change is difficult and expensive. Assembly lines need to be retooled, software needs to be changed and re-validated. There needs to be a good reason for an OEM to change.
Going down the checklist:
1) They do not, and never will, have a part that performs that much better than nVidia's or ATI's midrange part (if they keep the "we only want the midrange" strategy). This is because the big 2 can generate a better midrange part by either lowering the price on a higher-end part, or by tweaking the binning of the higher-end parts (a high-end part that fails may be able to run as a mid-range part). Obviously, the low and mid-range parts make up the bulk of sales (and therefore contribute most to market share), so there's no way ATI or nVidia would give up any market share without a fight...and both companies have much more ammo (graphics IP) than S3 or XGI.
2) Positive mindshare in the IT world is a HUGE thing. Most of the time it is more important than the quality of the product. Though a good product usually generates greater mindshare, that's not always the case (read: Microsoft...to the uneducated masses). In graphics, it's been shown that the easiest way to generate positive mindshare is to have the fastest & most stable product. nVidia built its reputation on its Riva and GeForce lines. ATI got back in the game with its 9700. For S3 or XGI to gain mindshare, it can't elicit an "ooh, it's competitive" remark. It needs a "holy shit, that's fast" remark...that or some kick-ass marketing.
3) This would have to be one hell of a deal. Switching involves a risk that they will not sell as many PCs (and make as much money) as they already are. If money alone is driving the deal, the OEM would have to feel that there is a good chance of making more money while selling fewer PCs...it doesn't take an economics major to see what that would mean for S3's or XGI's profit margins.
So, how could S3 or XGI really take market share from ATI and nVidia? Simple, make the fastest part out there at a price that rivals what nVidia and ATI sell their high-end parts for. Can one/both of them do that? Maybe, but it won't be easy. If they can do that, then they will have a solid foundation for deriving the mid-range parts, and the mid-range parts will practically sell themselves.
Re:Observations of an insider on S3's chances... (Score:4, Interesting)
Compare and contrast: Number of Radeons sold in boxes at retail vs number of GeForce class chips shipped in Dells. Doesn't bear thinking about. And, as we all suspected, the very high end videocard business *actually* *is* a dickwar.
The thing I don't quite get is why S3, who I think have a healthy business licensing IP into embedded chipsets, northbridges and what have you, would want to be involved in the consumer shitfight? Probably just trying to build a little market presence, eh?
Dave
Tech Report, too (Score:2, Interesting)
The price better be low (Score:3, Informative)
My ATI Radeon 9800 (Score:3, Interesting)
I think if S3 can build a card with drivers stable on the first install... they'll have my money. From what I know the latest GeForce FX 5900 has the same problems. It's just mind-boggling having to pay so much and still dealing with such a bad out-of-box experience.
I am playing some of the most common games (RTCWET, battlefield 1942, call of duty) and they all took a massive amount of driver tweaks and install sequence to work right. The market is flooded with premature products if you ask me.
ATI and NVIDIA (Score:3, Insightful)
I don't know when the DeltaChrome will be on the market, but it looks like ATI and nVidia will have some new cards out, possibly by April, which will push the price of the 5900 and 9800 way down, which will in turn push the price of the 5700 and 9600 down, which is going to put some serious pressure on everybody else.
I see XGI's Volari as the biggest competition to S3's DeltaChrome.
Give us documentation. (Score:4, Interesting)
What's the point of not releasing documentation when your card is not "high speed"? What do you have to hide?
By opening the driver source and releasing documentation, the company could gain:
And it means money, because better drivers and better karma mean bigger sales.
HDTV set top box (Score:4, Insightful)
The stated market for this thing is OEM sales to Mainboard producers. Doesn't it seem obvious that the inclusion of passable 3d and the ability to output to HDTV natively is positioning this for the set top box market?
How many discussions have there been of the new set top box market, or how to build your own PVR, on Slashdot in the last couple of months?
This chipset isn't for playing doom 3 on your dual monitor winxp system (though it might do that too), it is for using as a capable midrange chip in mini-itx systems, etc.
Just my $.02.
K