Hardware

SiS Releases 0.13-micron Xabre600 GPU

EconolineCrush writes "NVIDIA may be struggling to bring the GeForce FX to market on a 0.13-micron manufacturing process, but it looks like SiS has beaten them to the punch. Tech Report has a review of the new Xabre600, which is the first mainstream GPU that I know of to be manufactured using 0.13-micron process technology. The Xabre600's performance isn't overly impressive, even when compared to a low-end Radeon 9000 Pro, but it's nice to see SiS one-upping the graphics giants when it comes to process technology."
This discussion has been archived. No new comments can be posted.

  • by cbcbcb ( 567490 ) on Tuesday November 26, 2002 @08:57AM (#4758223)
    Do SiS still support the DRI project [sf.net]?
  • Ummm (Score:5, Insightful)

    by Clue4All ( 580842 ) on Tuesday November 26, 2002 @09:00AM (#4758249) Homepage
    I'm really confused by this article.

    NVIDIA may be struggling to bring the GeForce FX to market on a 0.13-mircon [sic] manufacturing processes, but it looks like SiS has beat them to the punch. Tech Report has a review of the new Xaber600 [sic], which is the first mainstream GPU that I know of to be manufactured using 0.13-micron process technology.

    nVidia's GeForce FX is already in production.

    The Xabre600's performance isn't overly impressive, even when compared to a low-end Radeon 9000 Pro, but it's nice to see SiS one-upping the graphics giants when it comes to process technology.

    Okay, if it's not that great, and nVidia is already producing theirs, how exactly are they beating them to the punch? It's nice to see another article on the 0.13-micron process, but I really have no idea what your point is supposed to be.
    • Re:Ummm (Score:3, Insightful)

      by C_To ( 628122 )
      It's stating that although the Nvidia GeForce FX is in production, we most likely won't see any version of the card available to consumers until mid-January or February.

      I agree, who cares about the process, but it's nice to see an alternative video card that may offer decent performance for a decent price compared to ATI and Nvidia (who also make quality cards). Sounds like something OEMs might use in some machines for its price.
    • All it means is that they happened to get their GDS to TSMC or whomever sooner. And judging by the limited performance of said design, that doesn't seem too difficult.
    • Nuff said. If they're on the market now, they beat nVidia.
    • Ok, so it can't even outperform low-end Radeon boards, and they're 'one-upping' and beating someone to the punch??

      Ok, so now it's thin, but it's still a crappy product in general... gg SiS.

      Sorry, try building one that works properly with OpenGL (i.e. doesn't lock up while playing OpenGL games, like about 70% of their boards do), or provide a Linux driver... heck, then we'd have something to be impressed about.

      Sorry, that is trollish, but I think they've got a long way to go to restore consumer confidence in a product line that's been as bad as theirs for as long as it has.

  • A matter of time (Score:3, Insightful)

    by GeckoFood ( 585211 ) <[geckofood] [at] [gmail.com]> on Tuesday November 26, 2002 @09:01AM (#4758255) Journal

    It is just a matter of time if the bottleneck is in the drivers. It would be great to see SiS get seriously competitive at the top end of the GPU battle and give both nVidia and ATI something about which to worry. If it's in the chip instead, though, all the driver tweaks in the world will not help it catch up.

    Quickly supplying Linux drivers is a good move on their part, too. Wait too long, and they will cut themselves out of an important market. Windows ain't the only game in town anymore...

    Good luck to SiS!!

    • by ceejayoz ( 567949 ) <cj@ceejayoz.com> on Tuesday November 26, 2002 @09:13AM (#4758320) Homepage Journal
      Quickly supplying Linux drivers is a good move on their part, too. Wait too long, and they will cut themselves out of an important market. Windows ain't the only game in town anymore...

      Sorry, but Windows might as well be the only game in town, at least for graphics cards. What's the main market for graphics cards? Gamers. How many new games come out for Linux? Very few.
      • Older games don't need gfx cards?

        The few new games that are coming to Linux don't need gfx cards? The games that are already available on Linux don't need gfx cards?

        If there is even one game, there is a market for a card to play it with.

        Text output sucks seriously for movies (yeah, mplayer's aalib output doesn't look that good). Software scaling sucks seriously too.

        Windows is the _main_ player in game town, but not the only one.
      • Re:A matter of time (Score:5, Informative)

        by Patoski ( 121455 ) on Tuesday November 26, 2002 @10:09AM (#4758738) Homepage Journal
        Actually, it's not as bleak as that. The push to get hardware-accelerated drivers for Linux is gathering momentum, thanks at least in part to the companies that create CGI for the entertainment industry. That industry is an early adopter of Linux and is moving fairly aggressively to incorporate Linux into its organizations, which is why you now see most of the professional 3D packages with Linux versions.

        The same companies that buy these expensive modeling packages (like Maya, which costs $10,000) also buy those *really* expensive professional-level graphics cards. Profit margins on these professional cards are usually very large, so the video card manufacturers tend to bend over backwards to appease these customers, since the profit margin on the consumer side is wafer thin. This is one large reason why you're seeing more accelerated video drivers for Linux pop up nowadays from the larger video card manufacturers.

        ATI just released their unified accelerated drivers for Linux, and Nvidia has had Linux drivers for quite a while. Heck, even my trusty little Kyro II card has had Linux drivers for quite some time. Things are looking up for Linux video drivers, and it's only going to get better as time goes on.
  • by larsoncc ( 461660 ) on Tuesday November 26, 2002 @09:02AM (#4758266) Homepage
    I think I'll reserve my "dancing in the street" jubilation for when a .13-micron process starts benefiting the consumer.

    Manufacturing processes change quite frequently. Although a .13-micron process will mean that these companies can yield more chips per wafer, the pricing model on high-end graphics cards has remained static over the past few years.

    When the top-of-the-line graphics card costs half of what it does today (heck, say... $150, instead of $300, or even $400), THEN that's cause to celebrate new manufacturing processes.

    Until then, it's an incremental improvement that's designed to maximize profit.
    • by archeopterix ( 594938 ) on Tuesday November 26, 2002 @09:09AM (#4758304) Journal
      Manufacturing processes change quite frequently. Although a .13-micron process will mean that these companies can yield more chips per wafer, the pricing model on high-end graphics cards has remained static over the past few years.
      What? I think that .13 micron isn't about more chips per wafer - in fact, it probably yields fewer chips per wafer - the thinner your tracks are, the lower your success rate. As far as I know it's all about power consumption and clock rate - the smaller your stuff, the less power it needs and the faster it can run without overheating.
      • Actually, the less area a chip takes up, the less likely it is to have a defect in it. It yields more chips per wafer because there are more chips on a wafer and each individual chip is less likely to have a defect.
        • Actually, the less area a chip takes up, the less likely it is to have a defect in it. It yields more chips per wafer because there are more chips on a wafer and each individual chip is less likely to have a defect.
          The thinner the paths, the more likely it is for them to become underetched - the acid solution gets under the mask and breaks the paths.
          • by Pii ( 1955 )
            And to continue the point:

            The .13 micron process uses less raw material, which is a cost savings for the manufacturers.

            The smaller die also results in a cooler-running chip, which can be a boon for performance.

            The trouble, as archeopterix points out, is that this process requires a great deal of precision. Early yields will likely be prone to failures until the kinks are worked out. Once the line has been straightened out, the accountants will be pleased, as will the speed freaks.

          • Ah, but smaller die size means it is less likely to be damaged by particulate matter. I'm sure it's not a linear relationship, but smaller feature size does increase yield.

            -Brandon
        • Environmental defects (a meteoric dust particle crash-lands on a few dozen traces) per chip decrease as chips decrease in size. Manufacturing defects per chip increase as feature sizes decrease. What the overall effect is depends on the manufacturing plant in question, the chip design in question, and probably the geologic stability of the continent as a whole during critical manufacturing steps. (A rough numeric sketch of this tradeoff follows this sub-thread.)

          The original poster is correct: smaller traces => faster chips | lower power.
      • OK, I can see that the manufacturing process is a bone of contention. I had heard from sources like this one: http://techzone.pcvsconsole.com/news.php?tzd=1313 that the .13-micron manufacturing process increases efficiency.

        The industry wouldn't pursue this unless there was financial gain from it, like PERHAPS better failure rates.
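
To put numbers on the yield argument above: a minimal back-of-the-envelope sketch in Python, using a standard gross-die estimate and a simple Poisson yield model. Every number in it (wafer size, die area, defect density) is invented for illustration, and it assumes the 0.13-micron line achieves the same defect density as the mature process, which is exactly what the posters above are debating.

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Approximate number of candidate dies on a circular wafer,
    using the common edge-loss correction term."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: probability that a die catches zero random defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Hypothetical design: 100 mm^2 at 0.15 micron, shrunk linearly to 0.13 micron,
# so die area scales by (0.13 / 0.15)^2.
for label, area in [("0.15-micron die", 100.0),
                    ("0.13-micron shrink", (0.13 / 0.15) ** 2 * 100.0)]:
    gross = gross_dies_per_wafer(200.0, area)  # 200 mm wafer
    y = poisson_yield(area, 0.005)             # assumed 0.5 defects per cm^2
    print(f"{label}: {gross} gross dies, {y:.1%} yield, ~{gross * y:.0f} good dies")
```

At equal defect density the shrink wins on both counts (about 270 gross dies at ~61% yield versus about 370 at ~69% with these made-up numbers); the open question in the thread is whether a brand-new 0.13-micron line actually keeps defect density that low.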
    • >When the top-of-the-line graphics card costs
      >half of what it does today (heck, say... $150,
      >instead of $300, or even $400), THEN that's
      >cause to celebrate new manufacturing processes.

      For the record: the $400 you talked about is the PRICE not the COST.

      Just don't buy the newest top-of-the-line. I always find the top-of-the-line from 6 months ago to be the best deal.
      • Sure. I guess my point is that I'd celebrate a new pricing model (if cheaper) over a new manufacturing process any day.

        It's similar to how computing costs have dropped over the past few years. Slimmer margins have encouraged increased volume, which in turn forces manufacturers to find better processes, which lowers manufacturing costs, which starts the process anew.

        When there's very little price difference between top of the line and next-to top of the line, people usually pick top of the line, or as close as they can afford.

        I see this happening with the "PC commodities" - HDD, RAM, etc. Hopefully the trend will continue to encompass all components.
    • Yes, the price of top-spec EVERYTHING in computing remains the same, whether it be servers, PCs, graphics cards, or whatever. Moore's law states that things improve, which means....

      Ready for it....

      WHAT WAS TOP SPEC LAST YEAR ISN'T THIS YEAR

      Sorry for the shouting, but really, what a silly thing to say. The top-spec item is the most expensive to manufacture because it's new, has R&D to pay off, and the volumes are lower. As the technology improves, it becomes cheaper to make the old top-spec thing, since it's now possible to make better things, so making the older item becomes simpler; the volumes also go up as the cost comes down, which again makes it cheaper to produce.

      Saying "top spec should cost half of what it does today" is just silly; the top-spec card TODAY will cost HALF of what it does now in 18 months or so, because it will be the commodity item by then. The top-spec card from 18 months ago is now HALF the price it was then.

      Welcome to computing, it's nice to have you aboard.
      • Really?

        What did you pay for a top-of-the-line computer 10 years ago? I still have an ad for a PS/2 with XGA graphics and an 80 MB HDD that retailed for over $10,000.

        It would be difficult to find a computer over $5,000 today (unless you're buying a Mac!), and with present value in mind, that means it's less than half the price for top of the line. (A rough present-value sketch follows.)
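
A quick sanity check on that present-value claim, assuming a flat average inflation rate rather than actual CPI figures (the rate below is an assumption for illustration only):

```python
# Assumed ~2.5% average annual inflation from 1992 to 2002 (illustrative only).
price_1992 = 10_000.0
factor = (1 + 0.025) ** 10
print(f"${price_1992:,.0f} in 1992 is roughly ${price_1992 * factor:,.0f} in 2002 dollars")
# About $12,800 - so a $5,000 top-end machine today really is well under
# half the price in real terms.
```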

    • When the top-of-the-line graphics card costs half of what it does today (heck, say... $150, instead of $300, or even $400), THEN that's cause to celebrate new manufacturing processes.

      Funny thing, I just purchased a new AOpen Xabre 400 that performs beautifully (with signed, WHQL drivers that XP doesn't complain about!) - and I paid $105.00. This is a 64MB card with 8x AGP. It also has DVI and SVGA out. This is top of the line, as far as mass market hardware goes.

      I looked at the ATI- and nVidia-based cards and their features, and the Xabre trounced them on price/performance. (Caveat: I'm not into FPS games.) The deciding factor was that it was the cheapest card with 1080*720 resolution DVI output, and OMFG do DVDs look good like this! :-)
    • I believe the point is that these cards using the SiS chipset are already cheap and getting cheaper.
      These cards (Xabre) will cost you about fifty bucks. I have an SiS AG315 that I paid forty for, with 64MB of RAM and TV out. Try looking around.

      Perhaps you should be dancing in the streets already.

      Unfortunately, most people out there are blinded by advertising and think they need a godawful expensive card to do 300+ fps in games.

      1. Go to newegg, get a card for $42 (ECS AG315T) and have fun.

      2.??????

      3. Dance in the streets
    • When the top-of-the-line graphics card costs half of what it does today
      Why would high-end graphics cards come down in price? For instance, 3DLabs Wildcat-series cards are, conceivably, always going to be priced in the several thousand dollar range. This is the high-end. What you are referring to is the upper level of gaming video cards. IMO the upper level of that segment has come down, but it's a matter of economics, as always. In these days of 3 GHz processors, people aren't so willing to pay $400 for a good gaming card anymore, so economic factors have caused the "high-end" or near high-end of gaming cards to come down. This has nothing to do with the 0.13 micron process, and I don't think anybody has been saying that the 0.13 micron process would drive costs down. If anything, it's to cram more transistors onto a chip and reduce heat/power. Look, there will always be a high-end. For example, even nowadays with the $200 Wal-Mart PC, you can still pay as much as you want at the high-end.
    • It's weird though. Back around 1997-1998, a high-end graphics card cost about $200 (e.g. the Voodoo 2 at its release). These days, it's closer to $400. Meanwhile, CPUs have reversed: they used to cost close to $600; now it's more like $300.
  • Conclusion... (Score:4, Informative)

    by swordboy ( 472941 ) on Tuesday November 26, 2002 @09:02AM (#4758271) Journal
    For those who didn't make it through all 14 pages (just asking for a whoopin')... this card has nothing to do with GeForce FX capabilities:

    Conclusions
    The Xabre600's pixel shaders give it an obvious edge over the GeForce4 MX in a feature category that will only become more important as time goes on. Sure, the GeForce4 MX 460 is faster now, but it may not support all the new eye candy in future DirectX 8.1 titles.

    Against the Radeon 9000 Pro, the Xabre600 starts to look a lot worse. Here the DirectX compatibility playing field is level, but the Radeon 9000 Pro's pixel shader performance is much better, as is its performance in real world applications. Even if the Xabre600 is able to achieve price parity with the Radeon 9000 Pro, ATI's value offering is still going to be a better deal.

    Let's not even get into how the Xabre600 compares with the GeForce4 Ti 4200, because it really doesn't. The GeForce4 Ti 4200 is likely to be the most expensive of the Xabre600's closest competitors, anyway.

    The fact that the Xabre600's performance can't keep up with the competition doesn't mean that there isn't value to the part. That SiS is able to produce the chip on a 0.13-micron process is impressive in itself, and I'm happy to see that the new drivers have fixed all the compatibility problems. With the improved compatibility of the latest drivers, SiS at least has a DirectX 8-class graphics chip with the Xabre600, even if it's not the most competitive one.

  • by SeanTobin ( 138474 ) <byrdhuntr@hCHEETAHotmail.com minus cat> on Tuesday November 26, 2002 @09:03AM (#4758277)
    Gotta love this quote from the article:
    The Xabre600 is positioned against more mid-range offerings like the Radeon 9000 Pro, which is why you won't find it running at 500MHz with a Dustbuster strapped on its back.
  • 0.13 micron? (Score:3, Insightful)

    by Erpo ( 237853 ) on Tuesday November 26, 2002 @09:09AM (#4758302)
    Does this reduction in size really matter? I mean, it's great that graphics companies can use lower quantities of resources to feed consumer demand (the environment, remember?), but does this particular advance really matter? I'll get excited when price points for new high-end graphics cards get much lower, performance significantly exceeds the curve, or a switch is made to a _much_ smaller manufacturing process (e.g. two digit nanometers).

    I guess it just has to do with how much you need to have a faster graphics card than all your friends?
    • Does this reduction in size really matter?

      Yes, size matters quite a bit when it comes to laptops and other mobile devices - even set-top boxes, for that matter. When you want low power consumption and low heat output, smaller chips are the answer. Advances like this always trickle down to the mobile market, and consumers benefit as a result. (A first-order power sketch follows this thread.)

    • The ENVIRONMENT??? Hello, they're making these out of SAND for crying out loud! Silicon==SAND. Last time I checked, silicon makes up roughly a quarter of the earth's crust, and we don't have to worry about running out!

      Ok, I'm just joking, sure there's other stuff involved like aluminum... etc.
        The ENVIRONMENT??? Hello, they're making these out of SAND for crying out loud!

        Yeah, the environment. :)

        Chips may be made out of silicon, but there are a whole host of other toxic chemicals that are used in the manufacturing process that you don't receive in the box when you buy your shiny new graphics accelerator.
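
On the power-and-heat point above: the first-order dynamic power of a CMOS chip is P ≈ a·C·V²·f, so a process shrink helps twice, through lower switched capacitance and a lower supply voltage (which enters squared). A minimal sketch; the capacitance, voltage, and clock figures below are invented purely for illustration:

```python
def dynamic_power_watts(cap_farads: float, volts: float, hertz: float,
                        activity: float = 0.1) -> float:
    """First-order CMOS dynamic power: P = activity * C * V^2 * f."""
    return activity * cap_farads * volts ** 2 * hertz

# Hypothetical part: 40 nF of switched capacitance at 1.8 V on 0.15 micron,
# versus a shrink with ~13% less capacitance running at 1.5 V.
base = dynamic_power_watts(40e-9, 1.8, 250e6)
shrunk = dynamic_power_watts(40e-9 * 0.87, 1.5, 250e6)
print(f"baseline: {base:.2f} W, shrunk: {shrunk:.2f} W "
      f"({shrunk / base:.0%} of baseline at the same 250 MHz clock)")
```

With these made-up numbers the shrunk part burns about 60% of the baseline power at the same clock, which is the whole appeal for laptops and set-top boxes.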
  • by qoncept ( 599709 ) on Tuesday November 26, 2002 @09:14AM (#4758322) Homepage
    Sega always had a knack for "beating competitors to the punch." The Sega Master System, Genesis, Saturn, and Dreamcast were all the first 8-, 16-, 32- and 128-bit systems, respectively. What they all had in common was that when a competitor came out with their system a few months later, Sega's was never as good (from a technology standpoint, of course; I won't go into which systems were best).
  • How many mircons in a micron? :)

    </nitpick>
  • Does anyone else find it a bit ironic that the smaller (.13 micron) video cards require the much larger heatsink/cooling systems?
  • by BillLeeLee ( 629420 ) <bashpenguinNO@SPAMgmail.com> on Tuesday November 26, 2002 @09:28AM (#4758397)
    If SiS is going to spend millions on the 0.13-micron fab process, they should really also attempt to make something that can compete with ATI's and Nvidia's new cards. As it stands, being able to pump out .13-micron chips seems like it's only for bragging rights, because this chip barely compares with a Radeon 9000, which (I think) is only .15 micron. But hey, maybe SiS really likes spending the money.
    • They're probably aiming at the embedded market - motherboards. With an embedded graphics core there's no hope of really competitive performance, so getting something that needs no heatsink is important. It would be nice if they didn't lock the clocks, though, so you could overclock it well. It sure would have potential.
  • There have been a lot of new GPUs and video cards lately, but this one is a little unusual. Although this chip doesn't have quite as high a performance level as some of the newest cards, it also doesn't require a frickin' liquid nitrogen refrigeration unit in order to operate safely. In fact, compared with the newest video cards, its small heat sink and fan seem downright anemic. That's a good thing.
  • by binaryDigit ( 557647 ) on Tuesday November 26, 2002 @09:39AM (#4758477)
    I think some people are missing the point of this. Some have said ".13u, big whoop, how does that help me, on such a slow *ss chip?" Well, the point is that SiS is not trying to compete with the big boys on the high end; that's not their schtick. They want to push motherboards, especially to OEMs, and this product allows them to offer "higher-end" graphics to their customers. It won't be long before they shrink the puppy enough to integrate it directly into their chipsets, thereby offering OEMs an attractive compromise between speed and price.

    So in the end, the fact that they can "push the envelope" as far as their production process goes does bode well for the consumer. You just have to look at this product in the context for which it was intended.
  • I think that the statements concerning the article are a bit flawed, considering that manufacturing of the FX line is already underway at Nvidia. However, the introduction of SiS (or any competitor) into the market will give consumers more options. With only two graphics card manufacturers worth mentioning, the cost of the hardware will continue to follow the same trends. A third player may shake that formula up a bit. Somehow I doubt it, though.
  • Maybe the benchmarks aren't that impressive, but I don't think that's the point. Having a third major manufacturer step up to give ATI and nVidia some competition can only be a good thing, if indeed SiS starts making some high-end cards in the future. Hopefully it will prevent the others from resting on their laurels like 3dfx seemed to do when it introduced the Voodoo 5 and it flopped.

    ~S
  • When is Cirrus Logic coming out with their new GPU?
  • Hmmm (Score:2, Insightful)

    by atari2600 ( 545988 )


    Where do I begin?

    The dead horse must be turning in its grave, but do you really want to play games on Linux? Sure, I ran Soldier of Fortune and it looked better than it did in Windows, and Descent III and UT were awesome. But many Linux users (including me) prefer Windows for gaming - why? Because I cannot play Tactical Ops on Linux or any other Unix - simple reason - there is not much gaming software available for baby Linux right now. I am not trolling, but the point is I think SiS can forget Linux drivers and make their Windoze drivers better... at least for now.

    I doubt this card would give the ATI 9000 Pro much of a challenge, going by the benchmarks in the article. It did give the GeForce4 MX card a run for its money, but that's why ATI's 9000 card entered the market, and the ATI 9000 can be bought for $85 (64MB version)... so unless this Xabre card costs somewhere around $40-50, I don't see why a gamer would want it.

    I like to play my games at 1024x768 and 16-bit color, and they usually run great on my Athlon 900. What card do I have? An nVidia GeForce2 GTS with 32MB DDR - even UT2003 ran great. Given a choice, I, a budget-conscious gamer, would get a card somewhere near $80-100, which is where the GeForce3 Titanium and the 9000 Pro cards are right now. Leave the Ti 4600s and 9700 Pros to the really rich kids - getting a card faster than your friends' cards is a cliché - it's about how much you can shell out and how much you can extract from that little graphics card of yours. I still remember how my Alliance ProMotion SVGA PCI card with 2MB VRAM kicked ass while my friend's 8MB AGP Trident sucked in most software rendering modes :). See the point?

    It would be good for SiS to bring out this card and price it low enough that they still make a profit - if not, well, ATI and nVidia aren't stupid. Although they make a lot of money on their high-end cards, more cards are sold in the $70-100 price range.

  • Am I the only one who doesn't give a damn about how many microns the manufacturing process is? It's the result that counts, not how they do it. Since it seems this card isn't up to beating the competition on the features people are looking for, just like other SiS cards in the past, I don't see what's so special about it.
  • It's SiS, it's graphics, who cares?

    I really doubt that SiS will really compete with nVidia and ATi in the near future.

    What sucks has sucked and will always suck.
  • by dgenr8 ( 9462 ) on Tuesday November 26, 2002 @11:56AM (#4759674) Journal

    You simply have to hand it to a company whose latest financial report [sis.com] has the words Zeon PDF Driver Trial emblazoned across every page.
  • Great, ANOTHER video card manufacturer who goes buggering with standards. Here's hoping they get AGP right this time instead of hosing those with certain Intel motherboards.
  • Real computer scientists only write specs for languages that might run
    on future hardware. Nobody trusts them to write specs for anything homo
    sapiens will ever be able to fit on a single planet.

    - this post brought to you by the Automated Last Post Generator...
