Hardware

GeForce3 and Linux

Posted by michael
from the fast-faster-fastest dept.
IsyP writes: "I noticed on evil3D.net that they have posted the first benchmarks of the newly available GeForce3 in Linux. Funny how the marginal performance increase coincides nicely with shipping delays and a $150 price cut to ~$350 from the original $500. Either way, it is nice to see performance of that level in Linux."
  • by Anonymous Coward
    I really can't understand this fascination with FSAA. Maybe if it was done by using good texture filters, and analytically along the edges of primitives, but all we have is brute force supersampling.

    You say that running in 1600 vs 800 doesn't make a difference, but that's exactly what you do with FSAA: the scene is rendered at 2-4 times the original resolution and then filtered back down to the screen resolution.

    The other point is: how many times have you been in the middle of a Quake firefight and thought, "Damn, I wouldn't have died then if it hadn't been for that tiny bit of dot crawl over there on that wall"? It doesn't happen. In fact it was an education for me to go to a LAN party a little while ago and see all the serious Quake heads running at 512x300 (or whatever it is) with all the textures switched off, and this was with GeForce 2s in their boxes. All they care about is frame rate, frame rate, frame rate. FSAA takes that away.

    FSAA is a gimmick. It's a way of using up all the extra fill rate that these boards have without needing Carmack and Co. to rewrite all their code to use it up in sensible ways.
  • by Anonymous Coward
    They don't provide source. They provide a wrapper as a means to defeat the GPL. Legally it wouldn't stand up in court, but Linus won't take action because he thinks that ignoring all linux-kernel posts from users of binary-only drivers will be sufficient to kill the driver.
  • But...well... I prefer to just have the raw detail of higher resolutions, rather than FSAA.

    I agree. When you filter 1600x1200 down to 800x600 using 4x antialiasing, you're simply throwing away information in the image.

    Antialiasing only makes sense when you start to bump against the resolution limits of the display. If the card is capable of rendering 3200x2400 at a good framerate, it doesn't help me much. So that's when antialiasing can be used to give me a better 1600x1200 image.

    Coincidentally, we _are_ just now hitting those limits. The GF3 is fast enough to render high quality scenes at a good framerate at 2048x1536, which is beyond the capability of most monitors. So 1024x768 w/4x AA becomes a useful mode.

  • I have a Matrox Mystique 220 PCI and I'm still happy with that. I don't care much about 3D performance, please compare the 2D performance of the cards also!
  • I've recently been digging around trying to figure out which GF2 I'm going to buy (since there will obviously be a price drop on them shortly) and came across multiple sites with Q3A benchmarks from different-speed CPUs (www.tomshardware.com being one of them). 850MHz seems to be just a hair under the line for what Q3A really needs to scream. At 850MHz performance still appears to bottleneck at the CPU for the Windows version (and likely the Linux version as well)... 900MHz is where most of them show that performance tops out. Going above that doesn't seem to make the frame rate go significantly higher, but there's a fairly sizeable jump in performance from 850 to 900MHz.
  • I very much doubt those programs that run on the graphics card are ever going to work on anything but Nvidia chipsets. If games start to use them (and it appears the effects possible with them are really something), doesn't that create the same lock on the game hardware market that 3DFX had with Glide?

    Doesn't that worry you just a bit?

    Go you big red fire engine!

  • Which 3rd party Kernel Module?
  • Quake 3 Arena, yesterday's game, is a poor demonstration of the capabilities of a new card like the GeForce 3. Perhaps a game which is pushing the limits of graphics cards and CPU's, like Tribes 2, should have been used (with detail, distance, and texture settings maxed).

    Quake 3 Arena easily gets 100 fps on the previous-generation GeForce 2; what do you expect, 200?
  • A GeForce2 MX is perfectly good for UT or Q3A -- GeForce 3 is aimed at something better, something that does not yet exist perhaps.
  • I guess I like keeping my gaming off my computers for the most part as I usually have something better to waste my spare cycles on. However I did finally get around to buying Heroes of Might & Magic III for Linux and have been quite pleased by how well it runs. I've been playing several days without any kind of a crash. It's as reliable as playing on a console it seems. Because of this I'll probably be spending a lot more money on Linux/PC games. Can't wait for some of the new stuff like Black & White to get ported over. :)
  • by _Gnubie_ (14485) on Thursday May 10, 2001 @03:44AM (#232981)
    Yes, the extensions do make the difference, but guess what: those extensions are SUPPORTED IN LINUX. Nvidia have been great about supporting all their groovy extensions as OpenGL extensions (no official ARB ones yet). Take a look at the extensions offered next time you boot up Quake 3 in the Driver Info. In some cases the OpenGL extensions NVIDIA supply surpass the DirectX feature set.

    NVIDIA's OpenGL offering is IMHO a GREAT driver. I'm doing OpenGL programming using it and getting great speed and visual accuracy. Also, for a 3rd party kernel module it's damn stable - can't remember when it last crashed (never on the 0.96 release, I think)
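    If you'd rather check from a shell than from Quake's Driver Info screen, glxinfo (shipped with the XFree86 utilities) dumps the same extension list; a quick filter for the NV-specific ones might look like this (the pipeline is my own sketch, not an official tool):

```shell
# List only the GL_NV_* extensions the installed OpenGL driver advertises.
# Requires glxinfo and a running X server; the tr splits the space/comma
# separated extension list onto one name per line.
glxinfo | tr -s ' ,' '\n' | grep '^GL_NV_'
```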

  • by Osty (16825) on Thursday May 10, 2001 @02:30AM (#232982)

    In 3 months time, NV will announce the "upcoming" GEF4, every Tom, Dick & Harry will be saying "Ahh man I gotta wait for the new NV, the GEF3 is just outdated", prices will plummet and in 12 months time you won't be able to give away your spanking new tech GEF3.

    Actually, nVidia tends to work on 6-8 month refresh cycles. The "fall refresh" for the GF3 (if we see one this year, considering that the GF3 is only now becoming available for purchase by the masses) would be a GF3 Ultra or GF3 Pro. As well, you'll see GF3 MX (neutered like all the other MX cards -- half the rendering pipelines cut out), Quadro (high-end workstation version), and Go (mobile) versions.

    However, unless you love to live on the bleeding edge (which admittedly many people do, and those that do should know what they're getting into), there's no point in upgrading your card with every refresh. If you've got a GeForce 256, or any MX card, the GF3 might be a good buy for you in a few months. If you're running a GF2 (anything but MX), you shouldn't bother with the GF3. If you're still on a TNT2 or older, the GF3 is the board to get. Amortize the $400 price over three years of not buying a video card ($125 for the GF 256 you didn't buy, $125 for the GF2 you also didn't buy, $150 for the GF3) and it becomes easily palatable.

  • by Osty (16825) on Thursday May 10, 2001 @02:23AM (#232983)

    That being said, I think 3dfx getting killed was about the worst thing that could happen to the 3D industry. Competition is much less now, and 3dfx always showed a very strong commitment to Open source.

    Bah. 3dfx had sucked for quite a while prior to their death. Half-assed competition is not competition at all. However, believe it or not, there's still competition in the 3D accelerator market. ATi's Radeon line is going strong, and a new rev is expected later this year. The upcoming Kyro II board (boards? Or is Hercules the only board manufacturer?) looks to really push nVidia on the low end, as well.

    With that said, you'd have to be blind not to acknowledge that nVidia is currently the leader in high-end, gaming, mid-market, low-end, and even moving into mobile video for a reason -- damn good technology. The GeForce 3 continues along that line. The only problem is that we've currently hit a bandwidth bottleneck, so you're not going to see ever-increasing frame rates. What you are going to see are higher framerates at higher detail levels when developers begin taking advantage of the new features.

    As for 3dfx being "good" because they supported Open Source, all I can say to that is "bah". If you want to make your purchasing decisions based on something so ephemeral, that's fine by me. I'll continue to purchase top of the line hardware because I like getting the most for my money, all philosophical differences aside.

  • by Quarters (18322) on Thursday May 10, 2001 @06:40AM (#232984)
    Tribes 2 is also "yesterday's game". The 3D engine in it is just an evolutionary upgrade of the Tribes 1 engine.

    Until games come out that specifically make use of the GeForce 3's new capabilities (per pixel shaders, vertex shaders, etc...) then there won't be any program that gives a total picture of what the card can do.

  • I have been using an Nvidia GeForce2 MX for about 6 months. Quite simply put it beats the pants off of my former Voodoo - which never quite worked correctly. A few of my friends have purchased the GeForce3 just to make me drool at the slickness and ease at which it draws... (mmm gaming) While on the topic of Nvidia I do have a question - has anyone had a chance to play with the GeForce2 Go? I am considering buying a new Dell Inspiron with it and have been unable to find success stories / etc under Linux - anyone?

  • Just choose the closest model and don't worry about it too much. You may even want to skip the X configuration and go directly to the new drivers - you can download them from www.nvidia.com (they have an ftp site with all of the drivers at ftp://ftp1.detonator.nvidia.com/pub/drivers/english/XFree86_40/). There is 1) a kernel driver and 2) an OpenGL driver for XFree86 - you need both of them. The latest is 0.9-769 last time I checked. If you have updated your kernel or anything in RedHat, get the source rpm for the kernel module.
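    The fetch-and-install steps can be sketched as a few shell commands. The exact filenames here are my guess at the naming pattern; check the ftp directory listing for the real names of the current release before copying any of this:

```shell
# Hypothetical install sketch for the two NVIDIA driver pieces on Red Hat.
# Adjust VER to whatever 0.9-x release is current.
VER=0.9-769
BASE=ftp://ftp1.detonator.nvidia.com/pub/drivers/english/XFree86_40

wget "$BASE/NVIDIA_kernel-$VER.i386.rpm"   # 1) the kernel module
wget "$BASE/NVIDIA_GLX-$VER.i386.rpm"      # 2) the OpenGL/GLX driver
rpm -ivh "NVIDIA_kernel-$VER.i386.rpm" "NVIDIA_GLX-$VER.i386.rpm"

# Then switch the Driver line in /etc/X11/XF86Config-4 from "nv" to "nvidia"
```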

  • The drivers support the API calls, but none of the games do.

  • I notice that a lot of people are complaining that they can't get access to the features of the card under Linux. I would think that we should really get the performance up before we care about features. For instance, they didn't mention how fast it could do 2D blits, but I can tell you that it's MUCH slower than on Windows, as are most cards in Linux. How about reading back from the frame buffer? NVidia has been known to be really slow here.

    It's amazing to me that people don't pay attention to 2D performance any more. I know a lot of you are going to say that it's good enough, and that X Windows works just fine for you. But really, think about it: you're wasting thousands, no, millions of clock cycles waiting around for blits. And don't even think about getting hardware-accelerated alpha-transparent blits. I have been looking around for any support under Linux for that and it just doesn't exist.

    Well, that's enough ranting for me.
  • NVidia is basically doing what Intel used to do: play with MHz figures and charge a hefty premium for their latest "new" chip.

    Anyone remember the P133 vs P150, which only offered about a 5% speed increase but carried a hefty price premium?

    Just like AMD came along and bashed about Intel, someone needs to come along and bash about NVidia in the Video graphics arena. Good hardware, but expensive with crap drivers.
  • Yeah, you just proved my point (in a sense). All this cool new stuff was just like MMX, a few years later MMX is fairly common and used now.

    But all this kewl pixel shading will mean zilch until some game makes use of it and becomes popular.

    yeah I will probably change my mind, but all this new hype means nothing until it is fairly widespread tech.

    At the moment, it's just something for those on the NG to say "Hey, I got the big bad latest NV, is 3000 fps in Q3 good?" or "How do I overclock the new NV?". So in other words, the card is useless other than for those who want the "latest" and are prepared to pay the premium.

    BTW: I am sure those features have been available in high end cards for a long time.
  • But are those "feats of engineering" worth the hefty price?

    Definitely NOT.

    In 3 months time, NV will announce the "upcoming" GEF4, every Tom, Dick & Harry will be saying "Ahh man I gotta wait for the new NV, the GEF3 is just outdated", prices will plummet and in 12 months time you won't be able to give away your spanking new tech GEF3.

    I re-iterate, NV is doing what Intel used to do when it had the market to itself in the "MMX" days.

  • Yes, they only perform better in Quake benchmarks. I am sure Carmack whinged a while back that the vendors were only optimizing their drivers for Quake.
  • Those people playing in 512x384 with the textures blurred aren't playing you, they're playing each other.

    What's a real pain in Quake is when you're in the middle of a battle and get a quarter-second pause when running into a new room as the computer loads textures. If you die because of that, it takes all the fun out of it. For you and for your opponent.

    The point of dropping the detail is to remove the computer from the equation as much as possible, so it's a game of skill between the players, not their machines.

    In something like Myst, the graphics are everything. In something like Quake3, the graphics are just a way of describing the world that contains you, your opponent, and the weapons.
  • Some people are doing very interesting things with Blender [blender.nl], which is, ostensibly, a game development modeller and renderer.
  • Also notice the distinct lack of details about the benchmark... the only details given are the system specs, so I would be inclined to question how valid the results really are.

    If you read the introduction to the benchmark you'll notice it says 'teaser'. The rest of the details will be coming later, after which you'll be able to get a better idea about what is going on. I'd take them with a pinch of salt for now until we see the rest of the data, but no need to completely disregard them.
  • I was thinking about this last night... Microsoft is having a 'GPL is evil' crack at the moment; I expect their cronies, other companies who rely on restricting information flow and use, to follow suit.

    Having good-quality drivers for advanced hardware is critical to keeping Linux an acceptable choice for many home users. Linux already suffers from poor or non-existent support for some 'killer' hardware, especially USB devices. This does not affect the server space, but probably has a huge impact in the home.

    Nvidia have a close relationship with Microsoft (Xbox). How long before we see large companies that want to crush the concept of open source/open standards/open information applying pressure on their partners to ensure that support for the latest hardware is not provided?

    While the antitrust suit was ongoing this might have been suicide, but now, especially with George Jr in place, they might decide they can get away with it.

    Actually I was bracing myself for Nvidia to announce no decent drivers for open-source OSes, saying 'we won't encourage un-American (their definition, not mine) activities' as they turn round and bend over to Microsoft and its ilk. I thought that Mundie's crap was in preparation for this, and I still worry that they may try this in the future.

    So congrats Nvidia, you just reduced my paranoia by a bit.


    EZ
  • Is there source code and/or documentation for writing drivers? NVidia has been kind of a pain lately with their TNT drivers (XF86-3 drivers have source, XF86-4 drivers are binary-only), so are they going to get on the train or not?
    ------
  • Make sure you have the right drivers installed and AGP functioning correctly. It sounds like you don't. I won't say it's always easy to get it going right either.... Grab something like CPUID and check the AGP settings.
  • The reason games don't run as well under Linux as Windows isn't the multitasking. Windows 2000 is a true multitasking OS, and it runs most games faster than Win9x.

    The problem is maturity. The drivers and libraries for Windows have been tweaked, retweaked, and then tweaked again. Give the drivers and libraries on Linux the same treatment and I'm sure we'll see equally good results!

    ---
    Go to http://www.sexiestgeekalive and vote for Angie this month! Yes, she knows they screwed up the link to her homepage.
  • are you smoking crack? I have a GF2 GTS w/ 64 megs, and we also have a voodoo5 w/ 64 megs in my house, both running off athlons, mine an 850, the other an 800.. and at 800x600, FSAA 2x, in OpenGL, they look roughly the same, but the GeForce has 10-20 more FPS (this is in counterstrike).. when you push them to 1024x768, the V5 drops significantly in FPS.. as does the GF2, but not as noticeably.

    Still, why would you use a GF2 in 4x mode? I see using the V5 in 4x, b/c that's what the voodoo chips are good at.. looking pretty, w/ some hit in FPS.. the GF2s are better at high-fps, low FSAA applications.. don't try to make the GF2 look like the V5.

    //Phizzy

  • Well ... new chipset and graphics power is all nice and that, but I got one major problem with the GF3: It's lacking features I currently have and use.

    Yep, I'm one of those fools who bought a G400 when it came out, and now I'm used to having 2 VGA outputs on one card, combined with a TV-out for playing my DVDs or just running presentations on projectors... When going to a GF3 I would have to pay double the price my old G400 cost when it was new, only to lose my dual-head functionality.

    Call me dumb, but I actually prefer staying with Matrox, especially as their cards are still being properly supported with new driver and BIOS releases, something which really annoys me with the nVidia Chipsets. (Let's face it, you buy a card from Elsa, Asus or whoever and they support it for 6 months after which any new drivers you'll be seeing are the nVidia reference ones, which of course lack all the extra features your specific card has.)

    Anyway, the above is just my 2 cents. I'll be staying with my MGA G400 as long as I can and I hope somebody not nVidia releases a GF3 competitor card before mine gets unbearable.
  • I doubt there will ever be a GLX driver that will make use of these new shaders on the chip

    Well, the glx module *has* to make use of the shaders, otherwise you won't see anything. But you can't make use of the power of these new features without directly programming the card, of course. So yes, hand-coding is necessary, but that is a plus! That's like the difference in power between notepad and emacs ... a scriptable graphics card! That's ingenious! So please stop whining that it doesn't go over 120 fps in QuakeIII ...
  • I don't understand the problem. Yes, I agree, there is too much hype about the wrong features. And yes, current games don't make use of these features. So what? The card isn't even out yet! (or is it? Well not for long, anyway ...)
    And for the overclocking and "I've got a bigger cock"-factor ... well, you get that with every new piece of hardware.

    And AFAIK, not even high-end hardware has features comparable to the vertex and pixel shaders in the GeForce3. They have different stuff you don't find anywhere else (like the color matrix on SGI, or hardware support for the accumulation buffer), but no really programmable hardware at that level.
  • No, it doesn't worry me. Because these extensions are a documented part of both DirectX and OpenGL, and they can be implemented in any other graphics card. I don't think nVidia can keep others from doing this (and it certainly isn't in Microsoft's interest to support a monopoly by nVidia). So I am quite optimistic that this isn't going to be a big problem.
  • by ghoti (60903) on Thursday May 10, 2001 @01:03AM (#233005) Homepage
    NVidia is basically doing what Intel used to do, play with MHz figures and charge a hefty premium for their latest "new" chip.

    Bullshit. The GeForce3 has a bunch of new features that other graphics cards don't even come close to. Ever heard of vertex and pixel shaders? Now you can write your own little program that runs on the graphics card for every vertex or for every pixel drawn. And it's a powerful language, too!
    Current games don't take advantage of that, but wait a year or so, and you will change your mind. An area where these things are already used (at least in prototypes) is visualization. It is now possible to do 3D volume rendering etc. at very high speeds using these features.
    So comparing the GeForce2/3 to the P133/150 is ridiculous. Drivers are a different matter, though ... (they're not crap, they're just not free)
  • The licensing of the kernel was specifically clarified to ensure that binary only drivers could be used.
  • I'm sorry, but that is just plain false. Do the research. The GeForce3 includes lots of features that simply do not exist in earlier generations. I'm impressed by the fact that the '3 beats the older cards, running older non-tuned software, while being clocked far slower (-50MHz when going from the GeForce2 Ultra to the GeForce3). That says something about the engineering involved, I think.
  • by Emil Brink (69213) on Thursday May 10, 2001 @01:21AM (#233008) Homepage

    "Easy". Check out the first paper on this page [unc.edu]. It's from SIGGRAPH 2000, where it rocked my world. ;^) It describes how OpenGL, with two (fairly simple, although not supported by today's[*] hardware) extensions, can be used to execute RenderMan shaders.

    [*]: Check out what a certain id Software programmer typically says when asked about desirable future directions for rendering hardware, and extrapolate. ;^)
  • nVidia doesn't make money directly out of driver sales but their drivers are a major selling point for their products.

    The ability to use the nVidia Detonator drivers is a huge boost for anyone who owns a GeForce card. The drivers that came with my Asus 6800 and the new versions on the Asus website are amazingly poor. They are pretty much unusable; not only do they have stunning incompatibilities (RealPlayer, for god's sake) but they make my system crash very regularly.

    Aside from quality issues, the drivers can also yield some pretty big performance gains; I know I saw way better frame rates after switching from the Asus drivers to the nVidia Det 3 drivers.

    I can understand why they are releasing binary only linux drivers. I'm not very happy about it but I do understand.

  • by 0xA (71424) on Thursday May 10, 2001 @02:43AM (#233010)
    You have a really good point here.

    I run RH 7.1 and Quake 3 is okay under Linux (GeForce 256 DDR, P3 500) but it's still a touch better under Windows. The problem is, when something like updatedb kicks off, it slows to a crawl.

    I really can't think of a good way to deal with this; when I'm starting a game I don't always think about everything that's scheduled to run in the next hour, or could be started for some reason. What I'd like to have is a script I could run that would automatically knock everything else down priority-wise before I launched the game.

    I guess what this comes down to is me not really understanding enough about how process priority is handled by the kernel, so I'm not sure how to fix this. Has anyone else ever tried to set something like this up before? If there were tools out there to do this, I think it would do a lot to improve gaming on Linux.
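    A minimal version of such a launcher can be sketched in a few lines of shell. The process name and nice value below are placeholders, and reniceing below 0 needs root; this is a sketch, not a tested tool:

```shell
#!/bin/sh
# game-launch.sh -- hypothetical wrapper: freeze the usual background
# offender, run the game at a boosted priority, thaw everything after.
GAME="${1:-quake3}"                  # whatever binary you actually launch

killall -STOP updatedb 2>/dev/null   # pause it; SIGCONT resumes it later

nice -n -10 "$GAME"                  # negative nice values need root

killall -CONT updatedb 2>/dev/null   # let updatedb pick up where it stopped
```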
  • That being said, I think 3dfx getting killed was about the worst thing that could happen to the 3D industry. Competition is much less now, and 3dfx always showed a very strong commitment to Open source.

    That's crap. They had NO COMMITMENT to open sourcing a DAMNED thing until they started grasping at straws for ways to survive!

    Originally, they were very strict on the terms and conditions concerning use of GLide. In fact, I remember quite well how they went after Glide Underground for posting Glide wrappers.

    3dfx only started preaching about Open Source after they realized their CLOSED API was losing market to Microsoft's closed API and OpenGL.

    On the negative note -- without 3DFX... TAGOR has less to complain about.

    "Everything you know is wrong. (And stupid.)"
  • All the GeForce3 functionality available in DX8 is also available under OpenGL (under Windows). Given that NVidia's drivers are basically identical under Windows and Linux, I *assume* that all of the GeForce3 functionality is also available under Linux via OpenGL.

    I don't have my GeForce3 yet, so I don't know for sure though.
  • Offtopic here, but...

    Hey bro, if you like Meshuggah, please check out my buddy's (LegionsInHiding) guitar medley from Chaosphere:

    Meshuggah_-_Chaosphere_-_Guitar_Medley_by_LegionsInHiding.mp3 [apk.net]

    Mike Roberto
    - GAIM: MicroBerto

  • (and 4x is only available in Windows... *sigh*)

    export __GL_ENABLE_FSAA=1
    export __GL_FSAA_QUALITY=2

    That should put it in 4xFSAA (aka 2x2) with no LOD bias (pretty).
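    Since they're ordinary environment variables, you can also scope them to a single run instead of exporting them (quake3 here stands in for whatever binary you launch):

```shell
# One-shot FSAA: the variables apply only to this invocation,
# leaving the rest of the session unchanged.
__GL_ENABLE_FSAA=1 __GL_FSAA_QUALITY=2 quake3
```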

    ------

  • Game companies are not going to write games that take advantage of a chip if hardly any of their customers own it. It's a chicken-and-egg problem. Normally, you'd expect innovation to go nowhere in this situation. However, NVidia has consistently taken the risk and implemented technology that they knew wouldn't be fully used until a year or two later, and thus kept the industry moving forward. And all you can do is blast them for it? *sigh*

    ------

  • Tribes 2 currently has no method of recording demos, so it's not possible to get *exactly* the same scene rendered.

    Also, Tribes 2 is far more sensitive to non-video-card-related issues affecting fps... try talking to all my buddies whose GF2s perform way below my GF1 DDR.

  • And since NVidia makes their GFX cards comply with M$ Direct3D, expect it to fare better in Windows when the added features are used.
    So if you don't want those extras, get a GeoForce 2 Ultra card and wait for GeoForce 4, or for a company that makes GFX cards directed more towards Linux.
  • Actually, it's 3 words: "two thousand three". Even if one incorrectly put "and" in there, it'd be 4.
  • by Animats (122034)
    The NV20 chip (the GeForce 3 to consumers) isn't any faster than the GeForce 2 on operations the GeForce 2 can support, despite the "up to 7 times faster" claims in NVidia ads. What it does have is a whole new set of texture-related operations. These require both driver-level and application-level support. It's possible to get Renderman-level quality out of the thing in real time, as the chameleon demo shown at the GDC demonstrated, but it's not clear how widely used that capability will be.

    The NV20 architecture will be in the XBox, so it's getting considerable development attention.

    The price drop has happened. It's $357.12 at ProVantage. The $550 price was probably just to hold down initial demand while production ramped up; the product has only been available for a few weeks. Carmack's comment was that developers should get one immediately; others should wait.

  • You don't need graphics-card/Linux-specific hacks. It was reported on OpenGL.org [opengl.org] recently that nVidia added the functionality to OpenGL via its extension system [sgi.com].

    All we need now is an implementation of their extensions in Mesa - if they're going to go to all the trouble of developing OpenGL extensions, you'd expect nVidia to help there as well.

  • Bullshit. The GeForce3 has a bunch of new features that other graphics cards don't even come close to. Ever heard of vertex and pixel shaders? Now you can write your own little program that runs on the graphics card for every vertex or for every pixel drawn.

    Does anyone else think about those Amiga games that actually used the Copper to draw graphics by changing the color registers on the fly, instead of actually drawing pixels in the conventional way? In many ways the GF3 programmable shaders bring Copper to my mind, although that poor old component only had three instructions and seriously more limited capabilities. (Well, has... my Amiga still works ;)

  • Right now, FSAA is just done by downsampling the image from a higher resolution. That's why 800x600 with 2x FSAA benchmarks within a single percent or so of 1600x1200. FSAA looks nicer on than off, but I find that the higher res looks nicer than the downsampled version, so unless you have such a fast card that you get decent frames per second at a resolution higher than your monitor can handle, there isn't much point. I still find it amusing that the company that pushed the minimal graphic improvement that is FSAA is the same one that spent years telling us that 32-bit colour is a waste of power.
  • The point of running games as benchmarks is that, for the majority of card buyers, that's the real-world application that matters most. Some people will be using it for Maya/Lightwave/3DS, and viewperf or a couple of preset scenes are ideal for them, but for the market the GeForce is aimed at, UT and Q3A are used for the same reason the ZD Business tests use Office - artificial benchmarks give artificial results.
  • 1) It's GeForce, not GeoForce (picky, but helpful to have the correct spelling when websearching)

    2) If you are buying a cheap card now in order to wait for the GeForce4 or a price drop on the 3, then the GeForce2 MX is much more of a price/performance sweet spot than the Ultra.

    3) While it's true that the new per-pixel shaders are a good match with DirectX 8, they are also available to OpenGL, so there is nothing to stop their use on Linux if you are prepared to code for them.
  • Aah, I see your point now. Unfortunately, as you point out, games that fully exploit vertex and pixel shaders don't exist to any great degree yet, and certainly not with good benchmarking features. My post was pointing out that these are the programs currently used by the target market; these and CS, anyway. Since CS runs fine on my system (a GeForce DDR) with all details turned on at the highest resolution my monitor allows, I've no inclination to buy the new card yet. On another note about this benchmarking: part of the problem is that I don't anticipate anyone emulating vertex and pixel shading in software, so benchmarks will have to be about picture quality as well as fps, and that's a whole lot more difficult to measure.
  • According to the Tribes 2 dev team, a representative from nVidia is supposed to be coming to help them integrate technology into the T2 engine that will take advantage of the GeForce3.
  • This means that if you are playing some 3D game, and someone accesses your server, or perhaps some program you are running just needed disk access, that program gets all the CPU (for just a very short time). It results in a sudden drop in framerate.

    Well, what happens on Windows when someone accesses your web server while you're playing a game?

    If you're running stuff in the background then either:

    • It will interrupt your game, or
    • It won't work while you play.
    If you want to stop something (eg cron, apache) from running at all, you can send it a SIGSTOP to freeze it. Unfreeze with SIGCONT. Or don't run updatedb from cron - run it manually instead when you've got some free time.
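    The freeze/thaw trick is easy to try with a throwaway process standing in for apache; while stopped, ps reports it in state T and it consumes no CPU:

```shell
# Freeze a process, observe the stopped state, thaw it, clean up.
sleep 60 &
pid=$!

kill -STOP "$pid"
ps -o stat= -p "$pid"   # first letter is 'T' (stopped) while frozen

kill -CONT "$pid"       # back to normal, exactly where it left off
kill "$pid"             # done with the stand-in process
```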
  • hehe, maybe one day we'll see HAL-15 hypercomputers (check out starbridgesystems.com) running Quake on their custom FPGA video accelerators. :)

    The only thing I don't like is that the vertex shader language is a spec that's essentially dictated by Microsoft. That means that when it comes to these l33t new extensions there's only one standard and that is DirectX 8. Yeah, it'll be supported as an afterthought in OpenGL, in a proprietary extension that's different from card to card. Then again, if it weren't for Carmack, and possibly the independent popularity of Linux among SGI users, OpenGL would have been dead and buried, folded into DX7 (remember Fahrenheit?) so I could be lamenting a standard that's doomed anyway. Shame though; OpenGL had a nice clean API whereas DirectX is a mess.
  • I dunno about you, but to me it r0x that Tribes 2 shipped for Linux, right out of the starting gate. I take it as a good sign for the future of Linux gaming.
  • by BitwizeGHC (145393) on Thursday May 10, 2001 @03:33AM (#233030) Homepage
    Carmack's been highly critical of game technology for as long as I've been reading his plans. It's not fanboy-like; the guy's a tech-head and evaluates things from a more rational perspective than the hype and ad copy we're used to seeing. Let's face it: the usual sources, e.g., IGN, PC Gamer, Penny Arcade, don't sweat the technical details; as long as the staff can frag in increased visual glory they're pleased as punch. If Carmack gets enthused about something, it's going to be good.
  • I promised myself that my TNT was the last nvidia card, after less than impressive Linux support followed their pie-in-the-sky promises. For some stupid reason, I then bought a TNT2. Well, was I ever sorry. It's been responsible for 5/5 of my re-installs. Crashing to the point of fucking my filesystem. Not what Linux is about. I will now pay a $300 premium (Australian) for a Radeon... as soon as I can get the cash together. And this time I mean it. No more nvidia crap.
  • I would recommend the new Mandrake 8 (ISOs are available from www.mandrake.com). This is an excellent desktop OS for people who are either fairly new to Linux or just like the nice graphical interface (Graphical LILO, Aurora, etc). Plus with the hardware detection and the Windows-like config tools in X, it beats the pants off using linuxconf.

  • 1. Debian
    2. Redhat

    3. Slackware

    Ftp to ftp.slackware.com. Look in /pub/mirrors/slackware/slackware-7.1/iso. Get install.iso.

    What are people's experiences with these two?

    My experience is that I always come crawling back to slackware in a month or two... YMMV.
  • I realize that the card itself has a higher clockspeed than the GF2, but what I'm saying is that you can't just compare the GF3 and the GF2 benchmarks when it comes to the quality of the extensions.

    The reason I specified those two new extensions specifically is because they are the ones that people will notice dramatically. From a programmer's perspective all the new API calls kick ass, but I'm not expecting all slashdotters to keep up on nVidia's new extensions. The GF3 is so cheap because of the market right now. nVidia needs to keep showing that they are moving inventory to stay on the upside of this turbulent market.

    P.S. I dig your site, man! :)

  • by M3shuggah (162909) on Thursday May 10, 2001 @05:17AM (#233035)
    (in a win32 environment...)

    PER-PIXEL SHADING : What is per-pixel shading? It's a method of applying special rendering effects... per pixel. It allows material and real-world effects to be applied individually to a pixel for more accuracy and intensity. Per-pixel shading will redefine the visual look and feel of imagery for PC graphics. Per-pixel shading has long been used in film production to create a more realistic and lifelike appearance for computer-generated imagery. If you've seen Toy Story, you'll definitely remember Buzz Lightyear. Remember the translucent reflection on Buzz's helmet? How the environment and light streaks reflected off the glass but also let the image underneath show through? That was done with per-pixel shading. Until now, it wasn't practical to use per-pixel shading on a PC because of the intense power and processing requirements. Sure, you could have done that in 3D Studio, but could you have done it in real time? Could the effect be applied to an entire frame at high resolution in 1/60th of a second? Not until now.

    Per-pixel shading is useful for simulating natural phenomena and accurate surface attributes such as fur, cloth, metals, glass, rock, and other highly detailed surfaces. Traditionally, effects were done on an entire triangle and sometimes an entire texture using a technique called interpolation. Special effects were done using calculations based on the vertices of the triangle and interpolating the entire area from the vertices. The end result is a generalized visual appearance... like an estimate or approximation of the final image. The key benefit of using interpolation is that it is fast and easy to apply. But, the downside to it is that with large triangles, the resulting image contains artifacts, which degrades overall image accuracy and quality.

    Using per-pixel shading, effects and calculations are applied to individual pixels. Since the triangle will be composed of many pixels, the resulting image is highly accurate in representing what the image was intended to be. Let's assume that a generic triangle is drawn (including its area) using 100 pixels. Now, we also have an effect palette of 10 effects. Each pixel can then accept any one of the ten that are available. With ten choices for each of 100 pixels, that's 10^100 possible combinations for that one triangle. If interpolation is used, then the effect is fixed to one of the ten and generalized across the entire triangle. Below is a visual comparison between interpolation and per-pixel shading.

    PROGRAMMABLE PIXEL SHADERS : The GeForce3 can handle four or more textures at a time. Logically, the GeForce3 would have to be able to handle them independently to accomplish the "infinite" number of effects that Nvidia claims it is capable of doing. Besides juggling textures independently, it is also able to apply effects to each texture independently using the DirectX 8 shader, as Nvidia has said.

    With the new engine, it is possible to have effects like a texture surface that's shiny, bumpy, and dynamically changing. Also, with the nfiniteFX engine (programmable pixel and vertex shaders), the developer can custom-program the engine itself to do whatever they want from an unlimited number of combinations and permutations.

    Once the texture combination calls are completed, there is an unlimited number of combinations you can do with the 8 texture blends. All of this wraps under the DirectX 8 pixel shaders.

    MY DISPUTE : The drivers that Evil3d used weren't using any of the extra API calls. Granted, they aren't out yet. But by disregarding these new features along with the GF3, it makes the card look like it is just an overpriced GF2!

    If anyone has seen the presentation that John Carmack made at MacWorld this year, he unveiled his next 1st-person shooter. It looks quite realistic, and not to mention it is full of these new API calls. (It isn't just wasted coding; it does have a purpose.)

  • Pro/ENGINEER. Hopefully the rumors I keep hearing that something is in the works will come true.
  • No.. you should have updatedb set to a low priority to start with. If it's something that's important to you, it's simple enough to modify your crontab + a few startup scripts to give the game more CPU...
  • > But all this kewl pixel shading will mean zilch until some game makes use of it and becomes popular.

    By the time it becomes "popular", it's common and hence isn't really as "fun", imho ;)
  • by boaworm (180781) <boaworm@gmail.com> on Thursday May 10, 2001 @01:04AM (#233039) Homepage Journal
    Imho, whether I play at 800x600 or 1600x1200 doesn't really make a difference. Sure, it's a bit better looking, but what I really want is good, fast FSAA. I can't really see why you'd benchmark a chip at 1600x1200 without FSAA enabled (1.5 or 2.0).

    That's what makes the real difference: the ability to play games in full color, at a reasonable screen size (800 or 1024), with heavy FSAA.

    Guess I've got to wait for those benchmarks though :-/

  • It's true that visual quality won't win a game of Quake, but for high visual quality, full anti-aliasing is the way to go. If you want to see some really bad visual "quality", turn off texture filtering: replace the trilinear mipmapping (GL_LINEAR_MIPMAP_LINEAR) with mipmapped nearest neighbor (GL_NEAREST_MIPMAP_NEAREST). Now that sucks. But interestingly, edge aliasing is not a problem in that mode. Why isn't it? Because the filtering is the same for all pixels: almost none. With texture filtering enabled but edge antialiasing missing, the edges stick out (even at high resolutions) and become the most obvious signal that you are looking at polygons, not stairs and arches. FSAA really is a great leap for immersion in not-so-fast games.
  • Do you know what renice(8) is for?
    You could just man renice and gen up on what you need to do.
    One of my friends runs Seti@home reniced to -19 on his K6-2/500 and it completes a unit in about 8 hours...
  • by Gingko (195226) on Thursday May 10, 2001 @03:16AM (#233042)
    The graphics card area of interest has moved away somewhat from the super-high fill rate battles that were all the rage in the days of the Voodoo cards.

    It's all about interesting and orthogonal features now. GeForce 3 brings vertex and pixel shaders in hardware to the mix, as well as hardware shadow map support. The disappointing thing is that 3D textures (despite word otherwise from John Carmack) don't appear to be accelerated in hardware, at least with latest drivers (see a recent thread in the advanced section of www.opengl.org [opengl.org] on that unfolding story - NVidia are soon to make an announcement on what the deal really is).

    Being able to program in a pseudo-assembler language for custom per-pixel effects harks back to the old days when you had complete freedom over everything you could do, but most of it was slow. Now we have a better mix where we are hardware accelerated, but pretty flexible down to a programmable level. *However* the current revision of pixel shaders (1.0? 1.1? can't remember) on DirectX (and very similarly and more relevantly on OpenGL) isn't as flexible as some may like (notably John Carmack), since to paraphrase him, "You can't just do a bunch of maths and look up a texture". Hopefully that will get better with time.

    Yes these things are important to games mostly. And yes they are arguably the biggest step forward in consumer graphics tech since the original 3dfx card... certainly since hardware TnL. Wait for the price to come down (since initial pricing is aimed at developers and the *really* hardcore gamer), and in the meantime amuse yourself with some of the demos from Nvidia's developer site [nvidia.com]. Nvidia are by far the most developer friendly company I've ever encountered, so short of Open Sourcing their drivers (which we have no right to expect them to do), they are almost ideal from my (developer's) perspective.

    Henry

  • Vertex Shaders/nFiniteFX engine aren't the only new 'features' of the GeForce3 [replacing the GF2's pixel shader]. Performance is increased because it processes the data more efficiently thru the use of the crossbar memory controller, z-occlusion culling, etc., which are usable in games today. When the 0.9-8 drivers are released with HRAA [High-Resolution Anti-Aliasing] support, then the GeForce3 will show its true worth. [See our GeForce3 Guide [evil3d.net] for more details].

    Also, it is cheaper than most GeForce2 Ultra cards... so it is an incredible buy. At $350 USD, it is the cheapest launch of a flagship NVIDIA product in over 18 months.
  • Most of our reviews are accompanied with a link to our benchmarking guide [evil3d.net] that describes our methodology in detail. Since you were 'inclined to question' the validity, hopefully the above link will provide the answers.
  • Whether or not the GF3 will be a good buy is questionable. I'm probably not going to purchase one, only because I just dropped US$400 on a GF2.
    But regardless of how it's going to sell, it's ridiculous to say that it won't be supported. With the new X server I get similar performance in Linux and Windows with my GF2. And yes, that's with a custom GLX module (provided by NVidia), which will probably either work with the GF3 or will be modified. And yes, with an alternate kernel module (provided by NVidia), which I took fifteen minutes out of my day to download and install.
    This is Linux we're talking about. You _always_ have to work a little harder to make things work just right. That's why it's fun. Having a much better X server and screaming 3D gaming is worth the extremely small amount of work one has to personally do: just _wait_ for a download and then _wait_ for a compile.
  • by aussersterne (212916) on Thursday May 10, 2001 @01:35AM (#233046) Homepage
    No doubt!

    I originally bought a Voodoo5 card and played everything at 1024x768 at 2x FSAA. Beautiful!

    Eventually I sold it and went GeForce2 because Linux didn't support the Voodoo5 well. Unfortunately, the FSAA quality isn't as good -- I have to play 4x FSAA on the GF2 to get the same visual effect most of the time (and 4x is only available in Windows... *sigh*) so I have to play at 800x600 most of the time.

    But it's worth it.

    I couldn't imagine going back to playing non-FSAA, even at 1600x1200. People who still haven't seen FSAA... It's worth the cost of a hardware upgrade, IMO.

    Now I'm dying to get my hands on a GF3 to try the new HRAA (is that the right abbreviation?) alongside the nifty lighting improvements.

    Here's to Doom 3 on GF3 with AA.
  • while you are right to a point, the GeForce3 does have new features. A good number of these features simply need developer support.

    That being said, I think 3dfx getting killed was about the worst thing that could happen to the 3D industry. Competition is much less now, and 3dfx always showed a very strong commitment to Open source.

    Scott
  • I don't count the Radeon as much of a threat; there is still the problem of the name ATI, which many people simply do not like. At their current level of play, I just don't see ATI as much of a threat right now. Nvidia is encroaching on (or has already conquered) every single territory that ATI used to dominate.

    as for kyro...again we'll see.

    don't forget things like HSR--there are lots of tricks left to reduce bandwidth, but agreed, that is the major bottleneck right now.

    I own an Nvidia GeForce2 right now. I used to own a 3dfx. 3dfx had great support under Linux and was easy to use. There's no doubt 3dfx hadn't been a serious contender for a while; however, don't forget that even in the months leading directly up to 3dfx shutting their doors, 3dfx cards dominated in retail. They made popular cards; their failure was primarily OEM and volume based.

    Scott
  • Yes. Well... not 200, but how about 60fps at max settings with 4X AA at 1024? [sharkyextreme.com] Yeah, that's what I thought.
  • Maya for Linux is here. I have already switched to it (from IRIX) as my primary Maya development platform. It is basically complete; it only lacks a couple of odd features from the IRIX or win32 version, like QuickTime movie generation.

    Whitney Battestilli
    Alias/Wavefront
  • What exactly is the point of playing Q3 with no textures in 512*300? "Well it looks like shit, and I'm bored as hell but I'm owning those losers who are busy having fun with the game." I honestly can't understand the mentality of people who will sacrifice the game experience for a small boost in their frag count.
  • by James Foster (226728) on Thursday May 10, 2001 @01:33AM (#233052)
    I'd argue that the reason why the performance is only marginally better is due to the Linux drivers probably being VERY early drivers.
    On Windows (which has the more developed drivers at this point in time, since NVidia would have ranked that as a priority), at 1600x1200 the GeForce 3 has a healthy increase over any previous video card, whereas in this benchmark the performance actually gets worse as the resolution increases! (Windows benchmarks of the GeForce 3 are the inverse of this.)
    Give NVidia some time and then benchmark the GeForce 3 on Linux, the performance increase should be a nice gap.
    Also notice the distinct lack of details about the benchmark... the only details given are the system specs, so I would be inclined to question how valid the results really are.
  • Just like AMD came along and bashed about Intel, someone needs to come along and bash about NVidia in the Video graphics arena.

    How about ATI? I'd love to see them do it.
  • What games are there for Linux that will play on a laptop (Sony Vaio XG29 with 256 megs of RAM)?

    I find that games run nicely in Linux if you run the said two above; otherwise it's harder to get high fps.

    I ask this because I have a desktop that this card is going to land in and a laptop that I want to be able to play this game with. Both are for LAN parties, but I can't tell what distro I want to use. What are the pros and cons? I am stuck with two distros as my choices, since I can only use distros whose ISOs I can download (so that means SuSE is out):
    1. Debian
    2. Redhat
    What are people's experiences with these two?


    Are you on the Sfglj [sfgoth.com] (SF-Goth EMail Junkies List) ?
  • I do. And all of you should too. There's no better reason to own a computer than gaming. Beats the hell outta the PS2(Sony, not IBM) and the Dreamcast.

    With more games available on Linux, this is leading more and more into a situation where I can kill a certain partition on my hard-drive.

    Well, I guess it's not so bad as long as the evil overlords don't require me to upgrade. That partition is still running '98, since they haven't added any killer-app features. That may well be their downfall.

    Here's to more and better and faster games on *nix. May they not crash or close on the windows button.
    [/karmawhore]
  • Wow. Where did you pull that quote from? I believe I said "Beats the hell outta the PS2(Sony not IBM) and the Dreamcast." Where do you get a reference to OS/2 from that?
  • Hey, if it's not supported now, what's the point in forking over the cash now? I'll wait a year, and when games DO start supporting it I'll think about buying it at a much cheaper price.
  • It's probably safe to say that it beats the hell outta the IBM PS/2 as well...
    I'm going to assume you meant OS/2, not the ill-fated PS/2 PC architecture.

    BdosError

  • I got the IBM thingy from the reply to yours. But I appear to have been brain dead. My bad.

    BdosError

  • Just like AMD came along and bashed about Intel, someone needs to come along and bash about NVidia in the video graphics arena. Kyro 2 is about to do just that. The recent benchmarks are quite impressive: the same speed or even faster at high resolutions, at about 1/2 of the GF3 price. Not too bad.
  • NVidia drivers are free (as in cost), and they do provide the source code as well, though you aren't allowed to change/redistribute the code. I don't know how hard they'd enforce this though.
    You'll find the drivers at http://www.nvidia.com/Products/Drivers.nsf/Linux.html [nvidia.com]

    Andrew
  • by andrewscraig (319163) on Thursday May 10, 2001 @01:00AM (#233062)
    Judging from the benchmark results, the GF3 doesn't strike me as a particularly good buy right now... I doubt there will ever be a GLX driver that makes use of these new shaders on the chip, as it would seem that the code must be customized for each particular game. As graphics-card-specific hacks are quite rare in Linux, I doubt that the GF3 will ever become the graphics card of choice for it (especially given that you have to go and download a kernel module before it'll even work in 3D!!)

    Andrew
  • ATI might, if 1. they can catch up in hardware, and 2. they write better drivers.
  • What are the chances of being able to use the board's calculation speed for actual pre-rendered graphics? I'm sure some of the card's features (being able to work out dot products in one line of asm, etc...) could be used for rendering or at least assisting with the calculations... surely someone's thought of this and done a feasibility analysis before... right?

    i was angry:1 with:2 my:4 friend - i told:3 4 wrath:5, 4 5 did end.
  • In general, I think people should use fewer games for benchmarks. Yes, I know a lot of people love games, but it'd be better to use e.g. viewperf, so you can compare the GeForce to *real* graphics cards.
  • nVidia is not overpriced. The GeForce cards come very close to professional cards that cost several thousand dollars. For example, a TNT2 Ultra or a GeForce MX beats a Visualize fx6 in many benchmarks.
  • by stew77 (412272) on Thursday May 10, 2001 @02:13AM (#233067)
    Now that Linux comes closer to professional 3D solutions, we need more software that makes use of it.
    No, not Quake. Real software.
    Maya is coming soon, but there are still a few other things that you need to have a complete 3D solution, like proper NLE and PostPro software. Plus, a bit of competition wouldn't be bad: How about Cinema4D or Imagine? It'd also be cool to see Elias or Eclipse on Linux.
  • The one thing that still disturbs me is that an untweaked Linux system can't get the same gameplay as an average Windows system (yes I know; `system' may be a bit high up the ladder :)
    Sure, you can get higher average and maximum framerates; Windows just has better gameplay.

    The reason for this is that linux is a real multitasking system. Unlike in windows, where the system is totally taken over by the program that uses the most cpu-time (or is in the foreground), linux just leaves it all running at the same performance.
    This means that if you are playing some 3d game, and someone accesses your server, or perhaps some program you are running just needed disk access, that program gets all the CPU (for just a very short time). It results in a sudden drop in framerate.
    One can try to set the priority of the 3d game to -20 for example, but another program will always get the cpu for a short time.
    The current solution for this (the problem also shows up when decoding DVD or MPEG) is running only those programs that are really needed. Or of course buying a dual-processor system :).

    But perhaps something can be done at the multitasking-priority level. Getting the system to switch faster between the programs maybe (because an actual multitasking system would need a CPU for every possible task it's running), or perhaps a whole different approach: instead of giving a program the CPU for a short period of time, processing tiny bits of the lower-prioritized program in between the other program (the game in this case). It could be, however, that this causes higher cache usage, but I am not that well informed about that...

    In any case, this is the real reason why people still want Windows for playing games, and as long as this problem isn't handled, it will always be a bit more comfortable fragging on Windows.
  • Surely the Open Source Movement is once again hindered in the creation of drivers that make full use of the GeForce3's capabilities because it was developed for DirectX 8.0.

    I don't know how closely guarded a secret the methods of DX8 are by their owners (I'll tactfully not mention their name :-) ), and so it's difficult to say whether we're looking to nVidia to provide drivers for Linux, or we hackers will have to develop them ourselves.

    Which brings to the fore once more the issue of driverless hardware being largely redundant. Can we ask nVidia to take the same care over their Linux drivers as those for Windows? And then, will we get them as Open Source?

    (come on, you and I know that nVidia don't make money out of driver sales, and so it's going to be okay for them to write the drivers so they sell the card to all you hardcore gamers who also choose Linux.)

    Take care,
    Ken.
