The Return of S3

Posted by michael
from the not-dead-yet dept.
flynn_nrg writes "Just saw this article on ExtremeTech about S3's new graphics card. S3 is back on the scene with its first new GPU architecture in five years. Rather than take aim at the high-end, S3 has set its sights on the midrange price/performance category, which is currently dominated by ATI's Radeon 9600 XT and nVidia's GeForce FX 5700, both of which are under $200. Today S3 unveils the DeltaChrome S8 GPU, which represents the midrange of its upcoming line of DeltaChrome GPUs."
This discussion has been archived. No new comments can be posted.
  • Wow (Score:2, Insightful)

    by Bruha (412869)
    Welcome back S3..

    Maybe it'll drive the prices down a bit.
    • Re:Wow (Score:5, Funny)

      by Anonymous Coward on Sunday December 21, 2003 @10:06PM (#7783266)
      Well given their rep it'd almost have to, since they're not going to be driving up quality.
      • Re:Wow (Score:4, Informative)

        by bonehead (6382) on Monday December 22, 2003 @01:02AM (#7784003)
I can't speak for everybody, but personally I've never owned an S3 card that I was unhappy with. nVidia has been hit or miss, and ATI has been a nightmare.

        The sad part is that I suspect that ATI's hardware is (and always has been) absolutely top notch. They just don't seem to put much focus on debugging the drivers.

        ATI video cards have been banned from my workplace for several years now, and I've not seen a reason to change my mind on that. (Yes, I get to make decisions like that)
        • Re:Wow (Score:3, Informative)

          by arivanov (12034)
A not very well known fact is that ATI cards are extremely picky about thermals. I found this out the hard way, and have since been extremely careful not to put an ATI card into a case without good cooling. Small form factor cases, and riser mounts (so the card sits chip-down), are a definite no-no. Once you follow this it is usually more or less OK (depending on what you do with it, of course).
    • Re:Wow (Score:3, Insightful)

      by after (669640)
If they are going to be making pricey cards, then they might as well make them superior to the home-user-aimed cards (think ATI, NVIDIA). This is just like SGI with their high-priced chips.
    • Re:Wow (Score:5, Insightful)

      by toddestan (632714) on Sunday December 21, 2003 @10:19PM (#7783327)
      Prices are already pretty reasonable. Unless you play cutting edge games, a $75 video card will do everything you want.

      Heck, even if you play cutting edge games, even that $75 card will serve you well unless you absolutely must have 1600x1200 resolution with 32bit color and 435FPS.
      • by Nazmun (590998)
Umm, no. As a gamer I can tell you that to play at 1024x768 you'd still need at least 2x that amount, around $150, for a decently performing DX9 card (with hardware support for the new shaders). The best by far in this range is ATI with their Radeon 9600 Pro, and even better is their older model, the 9500 Pro, if you can find it.
      • by Rahga (13479) on Sunday December 21, 2003 @10:54PM (#7783517) Homepage Journal
Deus Ex 2 players, generally, can expect their speeds to cap out at 15 fps, regardless of the video card in use or screen resolution.
      • Well, let me tell you, I recently replaced my $75 video card with a $400 video card... And it didn't help game performance at all!

        I guess the better upgrade for new games is a faster CPU, not a faster GPU... Who would have thunk it... seriously, for the last 3 years, it has always been the GPU that maxxed out performance on my P4 1.4Ghz... damn...
        • Re:Wow (Score:5, Informative)

          by kfg (145172) on Monday December 22, 2003 @12:04AM (#7783798)
          Well, it depends on the game really. A game is not a game is not a game.

          In some games, Myst for instance, there's really no such thing as frame rate at all. In others, like shooters, the cpu requirements to handle the physics are fairly minimal and nice graphics sells games. These are the ones that require the latest hot card. If you're into sims though, like IL-2 or NASCAR 2003 the physics calculations put the hardest load on the system and for these the hottest cpu, particularly the math coprocessor, will give you the best performance overall.

          Everything is always tradeoffs and compromise. Many games even have "favorite" video cards, right down to the particular model and driver. The best you can really do is optimize for your favorite game and play the rest as is possible.

          KFG
        • Re:Wow (Score:2, Funny)

          by MachDelta (704883)
Err... I think that's called a bottleneck.
          Your $400 GPU won't 'wow' you with its performance if it's just sitting around twiddling its proverbial thumbs while the rest of your system has the electronic equivalent of a heart attack.
      • Re:Wow (Score:3, Interesting)

        by MP3Chuck (652277)
        A $75 card won't last you too much longer now... I find my Radeon 8500 struggling with Halo at 800x600, and DeusEx 2 at 640x480 (with shadows completely off!). All these games now with their realtime physics and shadows...

        Back in my day we were happy to have textures...
Yes, my $55 MX440 gives me 30fps in UT2003 with high detail at 1152x864. However, I can't get AGP to work (if anyone has a KT400 chipset and nvidia opengl working under linux, tell me!), so 30fps is a bit low. $200 to me seems like WAY too much to pay for a graphics card (although I would like to have 85fps [85Hz refresh on my monitor] at detail maxed out).
        • You may not be able to depend on the AGPGART module that's hardware based, but the NVGART nvidia module should let you do some sort of AGP on that board. This is in the docs that come with the nvidia driver from their site, and they have linux forums on their site as well. As usual, RTFM.
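For reference, switching to the NVIDIA-internal AGP path instead of agpgart is a one-line option in the Device section of XF86Config (option name and values per nvidia's README; the rest of the section is just a generic sketch, your Identifier will differ):

```
Section "Device"
    Identifier "nvidia-card"
    Driver     "nvidia"
    # 0 = disable AGP, 1 = use NVIDIA's internal AGP support,
    # 2 = use the kernel's AGPGART, 3 = try AGPGART, then fall back
    Option     "NvAGP" "1"
EndSection
```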
          • Going off topic, I've tried everything NVAGP, AGPGART in 2.6, beta drivers, etc, etc. Googling has been mostly useless, so have forums. I think I might as well post in a forum, eh :-D

            Thanks for the reply, though.
        • Re:Wow (Score:5, Insightful)

          by bonehead (6382) on Monday December 22, 2003 @01:18AM (#7784086)
$200 to me seems like WAY too much to pay for a graphics card

          Especially in a day and age where a hundred bucks more can buy you an entire PC.
      • Re:Wow (Score:2, Funny)

        by bonehead (6382)
        Unless you play cutting edge games, a $75 video card will do everything you want.


        Perhaps if somebody released a "cutting edge game" that had the same enjoyment value as Quake 2, I'd consider upgrading from my TNT2 card.

OMG, you mean S3 isn't dead yet? I would have hoped their crappy products would have driven them out of business long ago.

        Oh well, as long as cheap, junky, consumer-level computers are being made, S3 will always have a customer. It's all about the profit margin.

  • But wait! (Score:5, Interesting)

    by 77Punker (673758) <<spencr04> <at> <highpoint.edu>> on Sunday December 21, 2003 @10:04PM (#7783253)
    Without some razzle-dazzle high end cards to "wow" people with, they probably won't get the publicity needed to sell these midrange cards.
    • Re:But wait! (Score:5, Interesting)

      by Naffer (720686) on Sunday December 21, 2003 @10:23PM (#7783353) Journal
      You have a point. I was at an electronics store today. I watched in horror as someone picked up an ATI 9600 Pro only to return it to the shelf and grab an Nvidia 5200 because it had 256 Megabytes of RAM. To get the high end market, all you need to do is produce a damn fast card. The midgrade market is tougher to deal with because most people grab the card with the most RAM and the prettiest box.
      • Re:But wait! (Score:3, Interesting)

        by 77Punker (673758)
Yeah...I guess it will be really good though if they can challenge ATI and NVidia by writing drivers that don't cheat. It'll be good if they write good Linux and *BSD drivers like NVidia does, instead of leaving people high and DRI.
        • Re:But wait! (Score:3, Interesting)

          by Phexro (9814)
          It'd be even better if they just release the specs instead of providing a buggy, incompatible, crash-prone driver like nVidia does.
          • Hey - it's difficult writing a bug free, completely compatible, non-crashable graphics driver when people have so many different hardware/ software combinations! You can't test the driver on every one!
          • I've yet to see it buggy and incompatible on any machine I've built or the mutations my main box has seen over the years. ATI, on the other hand, is STILL hard to install and STILL has issues.

            I'll take an Nvidia card on linux any day. And please, next time you dog Nvidia, throw in a link or two of hard proof before spouting FUD.
    • Re:But wait! (Score:2, Insightful)

      by Anonymous Coward
      The truth is that the vast majority of cards are sold to OEMs, such as Dell, HP, and Gateway. What these guys care about above all else is price, so if S3 can make a card that performs as well as the type of midrange Nvidia and ATI cards that the OEMs usually use at a cost that's $10 less, S3 will sell a huge volume.
    • Re:But wait! (Score:5, Insightful)

      by Peridriga (308995) on Sunday December 21, 2003 @11:01PM (#7783541)
Maybe they're not aiming for the high-end market.

      Imagine how many video cards are purchased off the shelf at computer stores. Then imagine how many video cards are purchased in new computer sales. I would imagine more video cards are moved by unit in new/refurb(card replaced) sales than individual sales for LOW/MID range cards.

Now I know people purchase high-end cards from stores (I did) but, to sell mid-range cards you usually don't sell to the consumer, you sell to the manufacturer.

      I would rather spend 'x' amount of money to produce a cheaper card comparable to the current market norm and get a contract providing Dell with cards for their mid-range systems than spend '3x' making the "newest and the greatest" card and then spend another '2x' just marketing the damn thing to a niche market..

      I'd rather sell mid-range and more units.
      • Re:But wait! (Score:4, Insightful)

        by Hanji (626246) on Sunday December 21, 2003 @11:54PM (#7783761)
Maybe they're not aiming for the high-end market.
        Of course they're probably not. His point, however, was that *not* having a high-end card to show off and impress people with will decrease their visibility, among other factors, and make it harder for them to sell midrange cards, even if they are comparable to or better than similarly-midrange cards from NVidia or ATI.

        If you see some truly stunning demo from NVidia or ATI on their highest-end card, you're more likely to buy from them, even if you're not shopping for a card anywhere near what you saw. It may not be completely logical, but it's true.
        • I think more of my point was to the effect that maybe their card sales are going to the people that couldn't tell you what kind of video card is in their system.
    • Re:But wait! (Score:5, Interesting)

      by Afrosheen (42464) on Monday December 22, 2003 @01:06AM (#7784018)
      Believe it or not, the home market is small and insignificant to manufacturers like S3. S3's bread and butter (as is most companies') is the OEM market. If you can put an S3 in a million Dells, Gateways or whatever, corporate desktops, Emachines, you get the picture..then you can make a ton of cash.

      Hence why S3 never really gave a rat's ass about 3d performance before. 3d is expensive to research, create, fabricate, and compete with. That's why there are only 2 players in the market and tons of little guys cranking out 2d cards. S3 would be happy to make a 2d card that can try to do a little 3d if you push it hard.

Look on the bright side though. With S3 texture compression, Quake3 and its descendants look much better.
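(For the curious: S3TC/DXT1 packs each 4x4 texel block into 8 bytes, two RGB565 endpoint colors plus sixteen 2-bit palette indices, a 6:1 ratio against 24-bit color. A quick decoder sketch in Python, just to illustrate the format, not production code:)

```python
import struct

def rgb565_to_rgb888(v):
    # Expand the 5/6/5-bit channels to 8 bits each.
    r = (v >> 11) & 0x1F
    g = (v >> 5) & 0x3F
    b = v & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1/S3TC block into a 4x4 grid of RGB tuples."""
    c0, c1, bits = struct.unpack('<HHI', block)  # two endpoints + index bits
    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    if c0 > c1:
        # Four-color mode: two extra colors interpolated at 1/3 and 2/3.
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:
        # Three-color mode: midpoint color plus a slot DXT1 reserves
        # for transparent texels (decoded here as black).
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]
    # 2-bit indices, texel (x, y) stored at bit offset 2*(4*y + x), LSB first.
    return [[palette[(bits >> (2 * (4 * y + x))) & 3] for x in range(4)]
            for y in range(4)]
```

The hardware does this per-block lookup at texture fetch time, which is why compressed textures also save bandwidth, not just memory.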
  • by rkz (667993) *
    Yeah! Just like the S3 ViRGE!
    And the ViRGE GX2!
    And the Savage!
    And the Savage4!
    And the Savage2000!

    Seriously...they've said the same *damn* thing every time. The only inroads this chipset *might* make would be in low-cost laptops, where S3 already had a sizeable market until the GeForce 2 Go and Radeon Mobility started kicking butt.
    • And the S3Graphics ProSavage DDR...I'm not a troll...just want to know...WHERE THE HECK CAN I GET A PROPER DRIVER THAT IS NOT BUGGY LIKE MINE IS???? I'm tired of seeing a "windows blinds" effect on some emulators/3d games I have!!!
      (I have windows XP and the video is actually integrated)
    • Sad to say, S3's new DeltaChrome technology is just a bit too late to the game to compete against ATI and nVidia. The only way S3 can compete is to price their cards at an extremely attractive price; if they don't do it, S3 will not be able to take marketshare from graphics cards that use ATI's and nVidia's lower-end graphics chips.
    • Wow, I remember the S3 ViRGE. I think I found one directX game that actually worked with that card (Shogo) and it ran so incredibly slow and had no ability to texture map anything. It was quite funny. And yet, it was called a 3D chip set?
  • by after (669640) on Sunday December 21, 2003 @10:06PM (#7783268) Journal
I have been using our (outrageously large) supply of S3 cards in servers for a long time. And it doesn't get any better than that.

    Basically, we have tons of these things and they were used back in the day when we didn't spend all of our money on expensive computer peripherals.

I would recommend these for anyone that does not use the computer as a workstation - such as a file server or, in my case, a home machine that I ssh into. Heck, I hardly ever turn on the monitor for that thing.

    Go S3!
    • Eh, since when does it matter which kind of card you are putting in a server machine?

If I were to put a card in one of these machines I would choose ATI or, even better, Matrox, both of which have very stable drivers (and very good 2D quality) - something you do want in a server. But a lot of servers have an integrated graphics card, which is fine.

      Obviously if you have tons lying around, great, use 'm.
I just use the GPU that's built into the motherboard. They don't add very much to the price of the board.

      My favourite was the Nvidia-based A7N266, but it's out of production now.
  • S3 who? (Score:5, Interesting)

    by Billly Gates (198444) on Sunday December 21, 2003 @10:07PM (#7783271) Journal
    Wow.

Their 3d cards sucked back in '96 when I bought my S3 Virge. I figured it was going to be the de facto standard since Voodoo was new and unheard of. Just upgrading to NT4 and Linux from DOS, I assumed it was up to the game makers to provide the drivers, and not up to DirectX and OpenGL to provide support.

    But I have upgraded to 2 newer pc's since. I forgot all about them and assumed they went under. I doubt they will support FreeBSD/Linux and X as they did in the past with their own Xserver.

  • hmm (Score:5, Funny)

    by SQLz (564901) on Sunday December 21, 2003 @10:09PM (#7783278) Homepage Journal
    DeltaChrome. Sounds like a cheap mod you can buy for your Civic. I wish S3 would die and Diamond would come back.
    • Re:hmm (Score:4, Interesting)

      by rsmith-mac (639075) on Sunday December 21, 2003 @11:52PM (#7783750)
      Diamond is back [diamondmm.com]. Best Data picked them up and relaunched the company(sans the audio division, which someone else owns).
      • Re:hmm (Score:5, Interesting)

        by foonf (447461) on Monday December 22, 2003 @01:07AM (#7784022) Homepage
        Wow...they even brought back the same logo. Brings a tear to the eye. Not that I have entirely fond memories of Diamond products...the Stealth II was nice, but I was always annoyed at the complete lack of support for the original proprietary Monster Sound cards (never even wrote a driver for Windows NT/2K/XP, much less released specs to the linux community -- but I wouldn't have cared at all if it didn't have pretty decent analog output quality, and more power than almost any other PCI card I've used). But the circumstances of their demise left a rather nasty taste in my mouth. The story involves S3 to a large extent, although like Diamond, S3 then was not S3 now.

Diamond was one of the more prominent aftermarket expansion card marketers of the nineties. They were very successful selling mostly video cards, based first on S3's chipsets, which were very competitive until 3D acceleration became popular, and later on nvidia and 3dfx chips. They branched out into a wide array of products, including SCSI controllers, motherboards (after acquiring Micronics), modems (after acquiring Supra), and audio cards. They invented the portable MP3 player, with the original Rio, and developed some of the first telephone-line and power-line home networking products. But, largely because of acquisition and competition, they were constantly losing money.

S3 was probably in a much worse bind. They were also losing money, but had none of the innovation that characterized Diamond's last years. They had been surpassed by new competition in graphics chipsets, and had no real other business. But through a lucky investment in TSMC's fabrication plants, they had some cash on hand, and decided to buy out Diamond. At the time everyone assumed they were going to follow 3dfx's lead and produce and sell graphics cards based on their own chipsets directly. But the truth is, they were looking for an exit both from Diamond's core computer component business, and their own graphics chipset line. After the rushed-to-market, broken Savage 2000 was a market failure, they abandoned expansion cards entirely, throwing away the legacy of two PC hardware pioneers in favor of the Rio MP3 players, and another technology they had acquired, ReplayTV's personal video recorders. At the same time, the graphics chipset operation was spun off as a joint venture with VIA. This is what is now known as S3. The rest of the company was renamed SonicBlue. Completing the trajectory set by S3 management since the days of the Virge, they went bankrupt recently, and the Rio and ReplayTV units changed hands yet again, hopefully to more competent management. Best Data apparently picked up the old Diamond brand at the same time.

        As to this new graphics chipset...I wouldn't take it seriously unless it is proven to perform decently (well, actually I wouldn't take it seriously unless it also had Linux support on par with the old Matrox card I use now, but I digress...). As far as I can see VIA is just looking for some paying beta testers to work out the bugs in the core before they embed it in their next-generation southbridge chips, so don't look for a renewed commitment to serious graphics hardware from "S3".
  • by pw700z (679598) on Sunday December 21, 2003 @10:11PM (#7783286)
    ...since VESA local bus (VLB) video died. Now THOSE were the days. Even AMD was really, really cool in a mainstream sort of way - anyone remember the 486DX2-80MHz? Or the 120MHz which was faster than the Pentiums at the time? A DX4 120 + a fast S3 VLB video kicked serious butt, at least in 2D and text modes.
  • by rice_web (604109) on Sunday December 21, 2003 @10:14PM (#7783305)
But for only $150, nothing should hold this card back aside from name recognition. The $150 price point almost seals the deal for me, except that I'm holding out for better offerings from ATi and NVidia before moving up from my GeForce2 MX (I'm not much of a gamer).

Overall, I have to agree with the consensus that S3 is back, and may be primed to stay in the market for some time. The article mentions that they are using a .13 micron manufacturing process, the same as ATi and NVidia, which should allow them to crank out higher-speed cards within the next few months, at least allowing S3 to remain competitive.

    Either way, the video card market may just be heating up for 2004.
    • by pw700z (679598) on Sunday December 21, 2003 @10:18PM (#7783322)
Something just occurred to me about what might hold it back... I somehow remember s3's video driver quality going down the tubes in a big way towards the end. If they can make a quality product, with quality drivers, and maybe even focus on really great 2d performance, they could be on to something.
      • The article mentions terrible driver support, but I personally think of this as something that can be fixed. Heck, both ATi and NVidia have proven this time and again [and again]
        • The question is how long will it take? How about all the 3dfx owners that never got decent drivers? Did Matrox ever get around to decent OpenGL drivers for the G400 or did they just have their Quake OpenGL->d3d wrapper? Did SiS ever give good drivers for their Xabre? Did Trident ever release good drivers for its products? Are the Kyro drivers good enough to run all applications? Only recently has ATI started producing stable drivers, and even then 7x00 users seem to be experiencing problems sometim
Overall, I have to agree with the consensus that S3 is back, and may be primed to stay in the market for some time.

      Indeed.

I find this card interesting for home theatre applications, where 3d capabilities (while nice and IMHO necessary for a complete entertainment system, including xmame and 3d simulation support) don't have to be cutting-edge fast. Of particular note are this card's component output capabilities and ability to do 1080p, 1080i, 720p, etc. Right now my home theatre PC has an ATI card connect
  • Give us drivers... (Score:5, Insightful)

    by Just Some Guy (3352) <kirk+slashdot@strauser.com> on Sunday December 21, 2003 @10:17PM (#7783318) Homepage Journal
    ...and we will buy. I mean that. Provide either Open Source drivers for X, or the full specs required to implement them, and you will sell hundreds of thousands of cards to those of us who are more interested in non-proprietary kernel modules than raw performance.

    Right now, I have an NVidia card in my workstation and I hate it. Why? Because I have to choose between using the OpenGL renderer and staying true to my beliefs about software freedom. This basically means that I paid extra for a card that I can only halfway use.

    S3, take heed. Give us a product that we can use and we'll support you. Do it. It's the right thing.

    • How many folks, would you estimate, would be willing to pay this 'freedom tax' of a lower performance card in exchange for access to driver internals? I'm genuinely curious, because I wouldn't have thought it would be anywhere near high enough for S3 to bother doing the paperwork, let alone even begin to weigh up IP ramifications.

      It's the right thing.

      Just as an aside, why is this "the right thing"? The right thing, according to the all-software-should-be-free ethos, sure, but S3 is a hardware company,

      • by JanneM (7445) on Sunday December 21, 2003 @10:40PM (#7783440) Homepage
        No "freedom tax". It is a somewhat lower performance card, with a lower price tag.

This may come as a bit of a shock, I know, but there are some of us out there actually _not_ willing to have the bleeding edge in graphics performance at great cost (in money, noise and power draw). My main machine is currently a laptop with an NVIDIA GF4 420 GO with 32Mb memory. It can handle anything I throw at it with no problems. True, I do not play the latest "QuakerDoom 40,000 - Bloody Dismemberment" - if gaming was the primary focus for me, I'd have a Windows partition (or, preferably, a PS/2).

        Oh, and about "the right thing": you are right - they are a hardware company. Their business is selling hardware to people. Drivers are a cost, not a source of revenue. Anything they do is geared towards driving hardware sales and lowering the cost of providing said hardware. If releasing drivers or specs for Linux will increase sales more than it costs them to do the release, it is a net win.

Not true.

          They could be using software techniques they don't want their competitors to know about.

          They certainly don't want to risk their IP by divulging hardware information.

          They have to write drivers anyway, since no one would buy a card they couldn't see run.

          Drivers do have value to the bottom line.

Hey, my PS/2 was a 12MHz! It ran Wolfenstein 3D from a floppy in a window the size of a minidisc in monochrome like no other! Thanks to that machine, I was all too ready for ...LOADING... on the Playstation...
      • How many folks, would you estimate, would be willing to pay this 'freedom tax' of a lower performance card in exchange for access to driver internals?

Well, when the company doesn't have to pay staff to maintain the drivers, they can lower their prices and offer better performance in an even lower price range while still maintaining profitability. Doesn't seem like a "Tax" to me.
Well, when the company doesn't have to pay staff to maintain the drivers, they can lower their prices and offer better performance in an even lower price range while still maintaining profitability.

          Sure, that's a good answer, but I doubt they're likely to just fire all their driver staff ( even if they do deserve it ) and turn the whole thing out in the open, right? At the very least, I can't see the windows driver being replaced with an open effort ( call it cultural resistance ), and Windows is where th

      • but S3 is a hardware company

        Sure, so there's no profit motive, such as selling competing closed drivers, to keep them from opening up. Even if they don't write a single line of code, they can get free community support and goodwill by providing good documentation to the XFree team. As far as losing a proprietary edge, I don't think they're planning to compete with the high-end NVidia or ATI cards; I doubt that they have much to hide from the "big guys".

    • by Eamon C (575973)
If you want to make it a political issue, that's fine -- more power to you. But recognize that you're among a minority. I'm not sure I believe that there are "hundreds of thousands" of *desktop* Linux users, and I refuse to believe that any preponderance of them "are more interested in non-proprietary kernel modules than raw performance."

      I'm a Linux user, and I believe in/contribute to "the open source movement". When it comes down to it, however, I care a lot more about things working right than whether or no
      • by Just Some Guy (3352) <kirk+slashdot@strauser.com> on Sunday December 21, 2003 @11:46PM (#7783725) Homepage Journal
        I'm a Linux user, and I believe in/contribute to "the open source movement". When it comes down to it, however, I care a lot more about things working right than whether or not I have the source code.

        Sometimes I'm reminded of why RMS draws a hard line between Open Source and Free Software. :-)

        NVidia's drivers work (relatively) well

        For some applications, maybe. For others, the closed drivers are clearly inferior to XFree's "nv" module. For example, if you're running Linux on non-Intel hardware, or running a non-Linux Unix on Intel, then you're pretty much out in the cold. Sure, they release a FreeBSD module every now and then, but that's no help for NetBSD or OpenBSD folks. Do they offer binaries for PowerPC Linux? I'm not sure, and not interested enough to look it up at the moment.

        I'm a good programmer. I have some experience debugging hardware drivers and submitting source patches. However, if the "NVidia" module crashes, there's nothing I can do except send in a half-informed bug report and hope that enough other people gripe about the same problem to motivate someone to fix it. Remember, the FSF started as a consequence of RMS not being allowed to fix a broken printer driver. :-

        So, if "work[s...] well" means "usually executes without crashing and offers decent performance", then I won't argue. However, that's not the standard of "works well" that I use for myself and my employer.

        • by Eamon C (575973)
          I see where you're coming from, but not that many people (including the majority of Linux users) will eschew hardware just because they can't fix their own driver. Maybe there are a couple thousand such people in the entire world, definitely not "hundreds of thousands".

          Don't get me wrong -- I'd love to see a completely open driver from NVidia, but because of patent issues and licenses they have with other companies, it simply will not happen. Ever. But I need to do actual work on my computer that requires

  • Why buy mid-range? (Score:5, Interesting)

    by mu-sly (632550) on Sunday December 21, 2003 @10:19PM (#7783329) Homepage Journal
    Mid-range graphics cards seem a slightly pointless purchase, given that you can buy top-of-the-range cards from 6 months ago for a fraction of their original prices (not to mention the second hand prices).

    Why buy something mediocre but brand new, when you could buy something that absolutely kicked ass six months ago for a similar amount of money?

    • Well, the Radeon 9700s have been out for over a year now, and they are still well over $200. I think that a mid-range 9600 Pro for $130 or so is a good investment. You usually get 70-80% the performance of the high end, but at less than 50% the price.

      When you talk about "buying a 6 month old top-end card for a fraction of the price" you are talking about buying a Radeon 9800 for $290 that cost $450 six months ago. Yes, it's a lot less than it was, but that's still too much for the above-casual/below-fan
    • by Peridriga (308995) on Sunday December 21, 2003 @11:05PM (#7783563)
      Why do people buy used cars?
      Why do people buy refurb'd computers?
Why do people go to yard sales?
      Why do people go to dollar stores?

      Maybe the secretary down the hall doesn't need a Radeon 9800?
      Maybe I don't want my kid to use 'this' PC for gaming and only for school work?

      There is a market for mid-range cards...

      Don't just assume everyone wants to buy the best of everything. (Why isn't Mercedes-Benz the largest car manufacturer in the world?)
  • by UserChrisCanter4 (464072) on Sunday December 21, 2003 @10:20PM (#7783335)
    Remember the Kyro II? The chip used a unique tile-based rendering system that produced performance similar to the then-current Geforce 2s (although some synthetic benchmarks indicated otherwise) while being priced more in line with the MX line of cards. After much reading and research, a buddy of mine decided to pick one up for his machine, his reasoning being that he wasn't a super hardcore gamer, but wanted to be able to throw down with us every once in a while.

Flash forward a couple of years, and while NVidia and ATI are still willing to release updated drivers for their cards of that era, the Kyro lingers unsupported, even though NEC (the chip designer) and Guillemot/Hercules (the card manufacturer) are still going strong. My friend wanted to play Halo, and even though the card should've been able to support the game (albeit at a lower resolution/framerate), he can't because his card is basically ignored and unsupported by the game manufacturers and the source companies for the card itself.

    The moral of the story: S3 is a reasonably well-known name. So is Hercules/Guillemot/NEC. It's gonna take a hell of a price/performance ratio to get me to recommend a video card not based on Ati or NVidia after the Kyro debacle.
    • Remember the Kyro II? The chip used a unique tile-based rendering system

      Actually, I think previous PowerVR chips before the Kyro II also had tile-based rendering, but I could be wrong. This presentation on TBR [pvrdev.com] suggests it was present in the rendering pipelines of the Naomi arcade board and the Sega Dreamcast, and I'm pretty sure the DC didn't have a Kyro inside, but some earlier PowerVR chip.

      Bitch about the drivers though, I agree.

      YLFI
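The tile-based approach the parent describes can be sketched briefly: instead of rasterizing each triangle across the whole framebuffer, the scene is first binned into small screen tiles, and each tile is then rendered entirely in fast on-chip memory. Here is a minimal illustrative sketch of the binning stage (tile size and function names are invented for illustration; this is not PowerVR's actual pipeline):

```python
# Minimal sketch of the binning stage in a tile-based renderer:
# each triangle's screen-space bounding box decides which tiles it
# lands in; rasterization then proceeds tile by tile, keeping each
# tile's color/depth buffer in fast on-chip memory.

TILE = 32  # tile edge in pixels (illustrative; real chips vary)

def bin_triangles(triangles, width, height):
    """Map each tile (tx, ty) to the list of triangles overlapping it."""
    bins = {}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        # clamp the triangle's bounding box to the screen, in tile units
        x0 = max(0, int(min(xs)) // TILE)
        x1 = min((width - 1) // TILE, int(max(xs)) // TILE)
        y0 = max(0, int(min(ys)) // TILE)
        y1 = min((height - 1) // TILE, int(max(ys)) // TILE)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins
```

Binning first is what lets a tile-based chip resolve hidden surfaces per tile and avoid most external framebuffer traffic, which is how the Kyro kept up with brute-force rasterizers on less memory bandwidth.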

    • Tell him to check the web site. There are new drivers in the last month or two. Get them from Powervr.com, not Guillemot. I did the same thing and bought one. While it mostly rocks, it does run into the not-supported-game problem, and I'm about to have to get my THIRD fan for it. :(
  • Also on Tech Report (Score:5, Informative)

    by Anonymous Coward on Sunday December 21, 2003 @10:20PM (#7783337)
    http://www.tech-report.com/etc/2003q4/deltachrome-s8/index.x?pg=1

    It looks like they have half a product. Good enough hardware, absolutely horrible drivers.

    And I'm not talking about drivers that don't run quickly. I'm talking about drivers that render things incorrectly or even crash! Ugh.

    At least with Intel's Integrated Graphics (or Nvidia or even ATI these days) even though they may not be the quickest on the block at least their drivers *work*.
  • Driver Issues (Score:5, Insightful)

    by miracle69 (34841) on Sunday December 21, 2003 @10:23PM (#7783356)
    So they're releasing a card with serious driver issues, where the top of the line model is expected to compete in the mid-price range market.

    Wouldn't this be the perfect situation to open the source and get the community to squeeze every last bit of performance outta their chip? It helps them save money on paying people to code the driver, and it gets the most outta their hardware. In addition, it would also give them a healthy community that would recommend this solution to friends/family that aren't into bleeding-edge gaming machines.

  • The Matrox Parhelia (Score:3, Informative)

    by wackybrit (321117) on Sunday December 21, 2003 @10:24PM (#7783360) Homepage Journal
    Perhaps someone with some real knowledge could fill me in here.. but does anyone else remember Matrox 'coming back' less than a year ago with the Matrox Parhelia? This S3 return sounds like it could be the same, unless they make good on their promise of lower prices (and considering the price you can get a GeForce 4 MX for now.. it's a hard fight).

    It seems the Parhelia was a card that was priced higher than most nVidia cards, yet provided nowhere near the performance.. yet people still bought them. Why? I remember seeing the benchmarks and the Parhelia was absolutely shocking. Supposedly the only great thing was the FSAA quality but... you don't buy a card just for that, surely?

    So, what was so great about Matrox coming back with the Parhelia? I must have missed the point.
      The Parhelia sells well in one niche area I know of: stock trading displays. Dual- or triple-head trading systems are simply a matter of dropping in the card and loading one driver. It has sane multi-monitor defaults and exceptional 2D performance. The stock traders on the forums I frequent love them.

  • by SlyDe (247694) on Sunday December 21, 2003 @10:42PM (#7783458)
    ... the new architecture is based on 31-bit integer datestamps and is expected to roll over to zero before it is released.
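For anyone who missed the joke: a signed 32-bit seconds counter (31 usable bits) measured from the Unix epoch runs out in early 2038. A quick sketch of where the rollover lands:

```python
# A signed 32-bit time_t holds at most 2**31 - 1 seconds past the
# Unix epoch (1970-01-01 UTC); one second later it wraps negative.
from datetime import datetime, timedelta

EPOCH = datetime(1970, 1, 1)
rollover = EPOCH + timedelta(seconds=2**31 - 1)
print(rollover)  # 2038-01-19 03:14:07
```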
  • OpenGL support? (Score:2, Informative)

    by Anonymous Coward
    http://delphi3d.net/hardware/

    Could one of the reviewers give us a report of what version of OpenGL the DeltaChrome supports? What extensions does it support? How many instructions long can the fragment and vertex programs be?

    GLInfo (w32 application) gives a complete list of all this.
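The raw data a tool like GLInfo reports comes from calls such as glGetString(GL_VERSION) and glGetString(GL_EXTENSIONS) inside a live GL context, the latter returning one long space-separated string. A small sketch of turning that string into a readable list (the sample value below is invented for illustration; a real program would query it from the driver):

```python
# In a real program this string would come from
# glGetString(GL_EXTENSIONS) inside a live GL context; the sample
# value below is invented for illustration.
sample = "GL_ARB_multitexture GL_EXT_vertex_array GL_ARB_texture_env_combine"

def parse_extensions(ext_string):
    """Split the space-separated GL extension string into a sorted list."""
    return sorted(ext_string.split())

for ext in parse_extensions(sample):
    print(ext)
```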
  • Where is my card? (Score:3, Interesting)

    by Anonymous Coward on Sunday December 21, 2003 @10:49PM (#7783497)
    I want a new video card, but no one makes the card I want. I do not give a crap about playing games; I want the modern equivalent of a Matrox card, one that is cheap yet renders beautiful color for 2D apps. Hell, I still prefer the ATI Rage card with all 8MB of RAM to the crappy GeForce with 64MB. I want clean, bright color; I do not care how fast the card is. I am willing to pay about 50 bucks for it. So where is it? Maybe I need to find some new-old-stock Matrox cards? Any pointers towards those?
    • Hmm, have you asked Matrox yet? Last I checked, they were still around. Oh, wait, you said $50... hmm... what's wrong with an ATI Radeon 7000, or integrated graphics?

      BTW, this is their cheapest card:
      Millennium G450 PCI
      G45FMDVP32DB

      It's $115 in bulk.

      If you don't mind a several-generation-old card, $20 will get you this: http://tekgems.com/Products/matrox-g200-millenium-agp-driver.htm

      One generation newer than that, and $42+s&h will get you http://store.yahoo.com/compuvest/330000119-00.html

  • 5 Years!? (Score:5, Informative)

    by rsmith-mac (639075) on Sunday December 21, 2003 @11:01PM (#7783540)
    Someone's math is a little off here on how long it's been since the last S3 video card. The last card they produced (not counting numerous mobile parts) was the Savage2000, a DX7-class card designed to compete with the GeForce256 in late 1999/2000. The S2K of course had its infamous issues (defective T&L unit; S3/Diamond was accepting S2Ks in trade for TNT2Us), but the point is that it has barely been 4 years, not 5.
    • "Someone's math is a little off here on how long it's been since the last S3 video card."

      The article says it's the "first new GPU architecture in five years." Not the first new GPU.
  • by Stonent1 (594886) <stonent AT stone ... intclark DOT net> on Sunday December 21, 2003 @11:35PM (#7783678) Journal
    of Diamond/Supra/Micronics/S3/Sonic Blue/Rio/Via/Cyrix

    I don't know who's who anymore!
  • $200? (Score:3, Insightful)

    by John Seminal (698722) on Sunday December 21, 2003 @11:39PM (#7783695) Journal
    S3 has set its sights on the midrange price/performance category, which is currently dominated by ATI's Radeon 9600 XT and nVidia's GeForce FX 5700, both of which are under $200.

    Since when is $200 and under the midrange? Isn't that where video cards top out for most of the market?

    I only purchased one video card in my life that was over $100, and it was nothing spectacular compared to video cards in older systems I had around the house with half the video memory. What are you people doing with video? Heck, I had a system with a 16 meg Voodoo card that can play DVDs. And they are selling on eBay for 10 bucks.

  • by C. Alan (623148) on Sunday December 21, 2003 @11:48PM (#7783735)
    I recently bought an ATI AIW 9600 Pro card, and it is one of the biggest computer letdowns I have ever seen. The drivers were crap, and it took me the better part of two weeks to get all of the 'features' on the card to work.

    Maybe now, with more competition in this segment of the market, the card makers will start putting out a good final product, and not make the buyers be the BETA testers!
  • by billsf (34378) <billsf&cuba,calyx,nl> on Monday December 22, 2003 @12:05AM (#7783801) Homepage Journal
    Don't we computer professionals deserve a usable video card for Eur 10,-- or so? Gamer/lamer crap and 5.1 Dolby sound have no place in the bulk of the computing world.
  • by Anonymous Coward on Monday December 22, 2003 @12:10AM (#7783819)
    I recently quit one of the big 2 GPU companies to pursue other opportunities...which one is irrelevant, but this is an AC post nonetheless. This is a brief look at the business end...I'll leave the "it's great" or "it's garbage" discussions to others.

    To use an overused buzzword, let's assume that the S3 chip has the best "price/performance ratio" of any chip. S3 still has little chance to gain any real market share, mostly because they have little chance to get into OEM systems.

    Let me explain. The retail market (where you go to Best Buy or newegg.com) makes up a very small percentage of the overall market. I can't give real numbers (I don't know if they're NDA'd or copyrighted by the research company, so better safe than sorry), but let's just say it's the OEM sales that pay everyone's salaries and keep the investors happy.

    Since OEM sales are so important, let's jump into the mind of the OEM. There are 3 major things that OEMs care about when choosing the chip to put in their computers.

    1) Does this chip perform SIGNIFICANTLY better than what we're already using?

    2) Is there any benefit to using company X over company Y?

    3) Are we getting a better deal from the new company?

    So, what does this mean for S3 (let's throw in XGI also)? To put it simply, change is difficult and expensive. Assembly lines need to be retooled; software needs to be changed and re-validated. There needs to be a good reason for an OEM to change.

    Going down the checklist:

    1) They do not, and never will, have a part that performs that much better than nVidia's or ATI's midrange part (if they keep the "we only want the midrange" strategy). This is because the big 2 can generate a better midrange part by either lowering the price on a higher-end part, or by tweaking the binning of the higher-end parts (a high-end part that fails may be able to run as a mid-range part). Obviously, the low and mid-range parts make up the bulk of sales (and therefore contribute most to market share), so there's no way ATI or nVidia would give up any market share without a fight...and both companies have much more ammo (graphics IP) than S3 or XGI.

    2) Positive mindshare in the IT world is a HUGE thing. Most of the time it is more important than the quality of the product. Though a good product usually generates greater mindshare, that's not always the case (read: Microsoft...to the uneducated masses). In graphics, it's been shown that the easiest way to generate positive mindshare is to have the fastest & most stable product. nVidia built its reputation on its Riva and GeForce lines. ATI got back in the game with its 9700. For S3 or XGI to gain mindshare, it can't elicit an "ooh, it's competitive" remark. It needs a "holy shit, that's fast" remark...that or some kick-ass marketing.

    3) This would have to be one hell of a deal. Switching involves a risk that they will not sell as many PCs (and make as much money) as they already are. If money alone is driving the deal, the OEM would have to feel that there is a good chance of making more money while selling fewer PCs...it doesn't take an economics major to see what that would mean for S3's or XGI's profit margins.

    So, how could S3 or XGI really take market share from ATI and nVidia? Simple, make the fastest part out there at a price that rivals what nVidia and ATI sell their high-end parts for. Can one/both of them do that? Maybe, but it won't be easy. If they can do that, then they will have a solid foundation for deriving the mid-range parts, and the mid-range parts will practically sell themselves.
    • by WasterDave (20047) <davep&zedkep,com> on Monday December 22, 2003 @03:36AM (#7784554)
      I've heard these arguments - particularly the one about how retail sales are basically irrelevant - in a number of places, and hearing it again just confirms my suspicions about how true they are.

      Compare and contrast: Number of Radeons sold in boxes at retail vs number of GeForce class chips shipped in Dells. Doesn't bear thinking about. And, as we all suspected, the very high end videocard business *actually* *is* a dickwar.

      The thing I don't quite get is why S3, who I think have a healthy business licensing IP into embedded chipsets, northbridges, and what have you, would want to be involved in the consumer shitfight? Probably just trying to build a little market presence, eh?

      Dave
  • Tech Report, too (Score:2, Interesting)

    by Anonymous Coward
    Tech Report also has DeltaChrome preview [techreport.com] with screen shots of just how messed up S3's drivers are in some applications.
  • by doormat (63648) on Monday December 22, 2003 @12:58AM (#7783990) Homepage Journal
    Because the card is only an "adequate" performer so far. Of course, that review left a lot to be desired; synthetic benchmarks aren't a good basis. More real games, less 3DMark2xxx. nVidia showed how easy it is to cheat at synthetic benches.
  • My ATI Radeon 9800 (Score:3, Interesting)

    by superpulpsicle (533373) on Monday December 22, 2003 @01:14AM (#7784060)
    Ok, I just bought this card, and it seriously took over a week of configuring to get things stabilized. I jumped around among Catalyst 3.7, 3.8, 3.9, and 3.10 before things would work decently.

    I think if S3 can build a card with drivers stable on the first install... they'll have my money. From what I know the latest geforce FX5900 has the same problems. It's just mind boggling having to pay so much and still dealing with such a bad out-of-box-experience.

    I am playing some of the most common games (RTCWET, battlefield 1942, call of duty) and they all took a massive amount of driver tweaks and install sequence to work right. The market is flooded with premature products if you ask me.
  • ATI and NVIDIA (Score:3, Insightful)

    by Saville (734690) on Monday December 22, 2003 @01:49AM (#7784239)
    http://www.digitimes.com/NewsShow/Article.asp?datePublish=2003/12/19&pages=A7&seq=47

    I don't know when the DeltaChrome will be on the market, but it looks like ATI and nVidia will have some new cards out possibly by April, which will push the price of the 5900 and 9800 way down; that will in turn push the price of the 5700 and 9600 down, which is going to put some serious pressure on everybody else.

    I see XGI's Volari as the biggest competition to S3's DeltaChrome.
  • by Jacek Poplawski (223457) on Monday December 22, 2003 @02:02AM (#7784286)
    IMHO, for the Linux community drivers don't matter so much as free documentation. S3, please release detailed documentation for your card, so people can create Free Drivers, both 2D (XFree86, kdrive, framebuffer, etc.) and 3D (DRI).

    What's the point of not releasing documentation when your card is not "high speed"? What do you have to hide?

    By opening source of drivers and releasing documentation - company could gain:

    • better drivers (because coders will find bugs and send patches)
    • new drivers (for Linux and other OSes)
    • karma


    And it means money, because better drivers and better karma mean bigger sales.
  • HDTV set top box (Score:4, Insightful)

    by Cuchullain (25146) on Monday December 22, 2003 @10:40AM (#7786292) Homepage
    I think that everyone who is comparing this chipset with the high end ATI and Nvidia chipsets is missing the point.

    The stated market for this thing is OEM sales to mainboard producers. Doesn't it seem obvious that the inclusion of passable 3D and the ability to output to HDTV natively positions this for the set-top box market?

    How many discussions have there been of the new set top box market, or how to build your own PVR, on Slashdot in the last couple of months?

    This chipset isn't for playing doom 3 on your dual monitor winxp system (though it might do that too), it is for using as a capable midrange chip in mini-itx systems, etc.

    Just my $.02.

    K
