GeForce3 and Linux
IsyP writes: "I noticed on evil3D.net that they have posted the first benchmarks of the newly available GeForce3 in Linux. Funny how the marginal performance increase coincides nicely with shipping delays and a $150 price cut to ~$350 from the original $500. Either way, it is nice to see performance of that level in Linux."
Re:i'd be glad to see some FSAA (Score:1)
You say that running in 1600 vs 800 doesn't make a difference, but that's exactly what you do with FSAA. The scene is rendered at 2-4 times the original resolution and then is filtered back down to the screen res.
The other point is how many times have you been in the middle of a quake fire fight and thought "Damn, I wouldn't have died then if it hadn't been for that tiny bit of dot crawl over there on that wall." It doesn't happen. In fact it was an education for me to go to a LAN party a little while ago and see all the serious quake heads running at 512x300 (or whatever it is) with all the textures switched off, and this was with Geforce 2's in their boxes. All they care about is frame rate, frame rate, frame rate. FSAA takes that away.
FSAA is a gimmick. It's a way of using up all the extra fill rate that these boards have without needing Carmack and Co. to rewrite all their code to use it up in sensible ways.
Stop spreading NVIDIA's lies! (Score:1)
Re:Heavy Price for kewl new gear (Score:1)
Re:Not everyone wants FSAA... (Score:2)
I agree. When you filter 1600x1200 down to 800x600 using 4x antialiasing, you're simply throwing away information in the image.
Antialiasing only makes sense when you start to bump against the resolution limits of the display. If the card is capable of rendering 3200x2400 at a good framerate, it doesn't help me much. So that's when antialiasing can be used to give me a better 1600x1200 image.
Coincidentally, we _are_ just now hitting those limits. The GF3 is fast enough to render high quality scenes at a good framerate at 2048x1536, which is beyond the capability of most monitors. So 1024x768 w/4x AA becomes a useful mode.
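For what it's worth, the "filter back down" step this thread keeps mentioning is usually just a box filter: render at twice the width and height, then average each 2x2 block of samples into one screen pixel. A toy sketch in C (greyscale only, hypothetical buffers, nothing NVIDIA-specific):

/* 4x supersampling reduced to its essence: src is a (2*w) x (2*h)
 * greyscale render, dst is the w x h image that reaches the screen.
 * Each output pixel is the average of a 2x2 block of samples. */
void downsample_2x(const unsigned char *src, unsigned char *dst,
                   int w, int h)
{
    int sw = 2 * w;                 /* width of a supersampled row */
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int sum = src[(2 * y)     * sw + 2 * x]
                    + src[(2 * y)     * sw + 2 * x + 1]
                    + src[(2 * y + 1) * sw + 2 * x]
                    + src[(2 * y + 1) * sw + 2 * x + 1];
            dst[y * w + x] = (unsigned char)(sum / 4);
        }
    }
}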
Matrox Mystique (Score:1)
850Mhz CPU test only? (Score:2)
Locking ourselves in (Score:1)
Doesn't that worry you just a bit?
Go you big red fire engine!
Re:It's the new extensions that makes a difference (Score:1)
Q3A a poor benchmark for GF3 (Score:2)
Quake3Arena easily gets 100 fps on the previous-generation GeForce 2; what do you expect, 200?
Re:Q3A a poor benchmark for GF3 (Score:2)
Re:I like gaming goodness. (Score:2)
Re:It's the new extensions that makes a difference (Score:4)
NVIDIA's OpenGL offering is IMHO a GREAT driver. I'm doing OpenGL programming using it and getting great speed and visual accuracy. Also, for a 3rd-party kernel module it's damn stable - can't remember when it last crashed (never on the 0.96 release, I think).
Re:Heavy Price for kewl new gear (Score:4)
Actually, nVidia tends to work on 6-8 month refresh cycles. The "fall refresh" for the GF3 (if we see one this year, considering that the GF3 is only now becoming available for purchase by the masses) would be a GF3 Ultra or GF3 Pro. As well, you'll see GF3 MX (neutered like all the other MX cards -- cut out half the rendering pipelines), Quadro (high-end workstation version), and Go (mobile) versions.
However, unless you love to live on the bleeding edge (which admittedly many people do, and those that do should know what they're getting into), there's no point in upgrading your card with every refresh. If you've got a GeForce 256, or any MX card, the GF3 might be a good buy for you in a few months. If you're running a GF2 (anything but MX), you shouldn't bother with the GF3. If you're still on a TNT2 or older, the GF3 is the board to get. Amortize the $400 price over three years of not buying a video card ($125 for the GF 256 you didn't buy, $125 for the GF2 you also didn't buy, $150 for the GF3) and it becomes easily palatable.
Re:Heavy Price for kewl new gear (Score:5)
Bah. 3dfx had sucked for quite a while prior to their death. Half-assed competition is not competition at all. However, believe it or not, there's still competition in the 3D accelerator market. ATi's Radeon line is going strong, and a new rev is expected later this year. The upcoming Kyro II board (boards? Or is Hercules the only board manufacturer?) looks to really push nVidia on the low end, as well.
With that said, you'd have to be blind not to acknowledge that nVidia is currently the leader in high-end, gaming, mid-market, low-end, and even moving into mobile video for a reason -- damn good technology. The GeForce 3 continues along that line. The only problem is that we've currently hit a bandwidth bottleneck, so you're not going to see ever-increasing frame rates. What you are going to see are higher framerates at higher detail levels when developers begin taking advantage of the new features.
As for 3dfx being "good" because they supported Open Source, all I can say to that is "bah". If you want to make your purchasing decisions based on something so ephemeral, that's fine by me. I'll continue to purchase top of the line hardware because I like getting the most for my money, all philosophical differences aside.
Re:Q3A a poor benchmark for GF3 (Score:3)
Until games come out that specifically make use of the GeForce 3's new capabilities (per pixel shaders, vertex shaders, etc...) then there won't be any program that gives a total picture of what the card can do.
Love it! (Score:2)
Re:Love it! (Score:2)
Re:USELESS MARKS (vertex and per pixel shaders any (Score:1)
Why care about features when performance is so low (Score:1)
It's amazing to me that people don't pay attention to 2D performance any more. I know a lot of you are going to say that it's good enough, and that X Windows works just fine for you. But really, think about it: you're wasting thousands, no, millions of clock cycles waiting around for blits. And don't even think about getting hardware-accelerated alpha-transparent blits. I have been looking around for any support under Linux for that and it just doesn't exist.
Well, that's enough ranting for me.
Heavy Price for kewl new gear (Score:1)
Anyone remember the P133 vs. P150, which only offered about a 5% speed increase but had a hefty price premium?
Just like AMD came along and gave Intel a beating, someone needs to come along and do the same to NVidia in the video graphics arena. Good hardware, but expensive, with crap drivers.
Re:Heavy Price for kewl new gear (Score:1)
But all this kewl pixel shading will mean zilch until some game makes use of it and becomes popular.
yeah I will probably change my mind, but all this new hype means nothing until it is fairly widespread tech.
At the moment, it's just something for those on the NG to say "Hey I got the big bad latest NV, is 3000 fps in Q3 good?" or "How do I overclock the new NV?". So in other words, the card is useless other than for those who want the "latest" and are prepared to pay the premium.
BTW: I am sure those features have been available in high end cards for a long time.
Re:Heavy Price for kewl new gear (Score:1)
Definitely NOT.
In 3 months time, NV will announce the "upcoming" GEF4, every Tom, Dick & Harry will be saying "Ahh man I gotta wait for the new NV, the GEF3 is just outdated", prices will plummet and in 12 months time you won't be able to give away your spanking new tech GEF3.
I re-iterate, NV is doing what Intel used to do when it had the market to itself in the "MMX" days.
Re:Heavy Price for kewl new gear (Score:1)
Re:i'd be glad to see some FSAA (Score:2)
What's a real pain in Quake is when you're in the middle of a battle and get a quarter-second pause when running into a new room as the computer loads textures. If you die because of that, it takes all the fun out of it. For you and for your opponent.
The point of dropping the detail is to remove the computer from the equation as much as possible, so it's a game of skill between the players, not their machines.
In something like Myst, the graphics are everything. In something like Quake3, the graphics are just a way of describing the world that contains you, your opponent, and the weapons.
Re:More Software! (Score:1)
Re:Marginal increase... (Score:1)
If you read the introduction to the benchmark you'll notice it says 'teaser'. The rest of the details will be coming later, after which you'll be able to get a better idea about what is going on. I'd take them with a pinch of salt for now until we see the rest of the data, but no need to completely disregard them.
This will piss off the orgs who want to kill GPL (Score:1)
Having good quality drivers for advanced hardware is critical to keeping Linux an acceptable choice for many home users. Linux already suffers from this: poor or non-existent support for some 'killer' hardware, especially USB devices. This does not affect the server space, but probably has a huge impact in the home.
Nvidia have a close relationship with Microsoft (Xbox). How long before we see large companies that want to crush the concept of open source/open standards/open information applying pressure on their partners to ensure that support for the latest hardware is not provided?
While the antitrust suit was ongoing this might have been suicide, but now, especially with George Jr in place, they might decide they can get away with it.
Actually I was bracing myself for Nvidia to announce no decent drivers for open-source OSes, saying 'we won't encourage un-American (their definition, not mine) activities' as they turn round and bend over to Microsoft and its ilk. I thought that Mundie crap was in preparation for this, and I still worry that they may try this in the future.
So congrats Nvidia, you just reduced my paranoia by a bit.
EZ
Is there source/specs? (Score:2)
------
Re:Insightful? (Score:1)
Insightful? (Score:2)
The problem is maturity. The drivers and libraries for Windows have been tweaked, retweaked, and then tweaked again. Give the drivers and libraries on Linux the same treatment and I'm sure we'll see equally good results!
---
Go to http://www.sexiestgeekalive and vote for Angie this month! Yes, she knows they screwed up the link to her homepage.
Re:i'd be glad to see some FSAA (Score:2)
Still, why would you use a GF2 in 4x mode? I see using the V5 in 4x, b/c that's what the voodoo chips are good at.. looking pretty, w/ some hit in FPS.. the GF2s are better at high-fps, low FSAA applications.. don't try to make the GF2 look like the V5.
//Phizzy
Re:Heavy Price for kewl new gear (Score:1)
Yep, I'm one of those fools who bought a G400 when it came out, and now I'm used to having 2 VGA outputs on one card, combined with a TV-out for playing my DVDs or just running presentations on projectors.
Call me dumb, but I actually prefer staying with Matrox, especially as their cards are still being properly supported with new driver and BIOS releases, something which really annoys me with the nVidia Chipsets. (Let's face it, you buy a card from Elsa, Asus or whoever and they support it for 6 months after which any new drivers you'll be seeing are the nVidia reference ones, which of course lack all the extra features your specific card has.)
Anyway, the above is just my 2 cents. I'll be staying with my MGA G400 as long as I can, and I hope somebody other than nVidia releases a GF3 competitor card before mine gets unbearable.
Re:Only incremental performance (Score:2)
Well, the glx module *has* to make use of the shaders, otherwise you won't see anything. But you can't make use of the power of these new features without directly programming the card, of course. So yes, hand-coding is necessary, but that is a plus! That's like the difference in power between notepad and emacs
Re:Heavy Price for kewl new gear (Score:2)
And for the overclocking and "I've got a bigger cock"-factor
And AFAIK, not even high-end hardware has features comparable to the vertex and pixel shaders in the GeForce3. They have different stuff you don't find anywhere else (like the color matrix on SGI, or hardware support for the accumulation buffer), but no really programmable hardware at that level.
Re:Locking ourselves in (Score:2)
Re:Heavy Price for kewl new gear (Score:4)
Bullshit. The GeForce3 has a bunch of new features that other graphics cards don't even come close to. Ever heard of vertex and pixel shaders? Now you can write your own little program that runs on the graphics card for every vertex or for every pixel drawn. And it's a powerful language, too!
Current games don't take advantage of that, but wait a year or so, and you will change your mind. An area where these things are already used (at least in prototypes) is visualization. It is now possible to do 3D volume rendering etc. at very high speeds using these features.
So comparing the GeForce2/3 to the P133/150 is ridiculous. Drivers are a different matter, though
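To make "your own little program" concrete: through NVIDIA's GL_NV_vertex_program OpenGL extension it looks roughly like the sketch below, which just transforms each vertex by the tracked modelview-projection matrix and passes the colour through. This is only an illustrative sketch - it assumes a current GL context and that the driver exposes the NV entry points directly (on the Linux drivers you may have to fetch them with glXGetProcAddressARB instead):

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>
#include <string.h>

/* The simplest useful vertex program: transform position by the
 * tracked modelview-projection matrix in c[0]..c[3], copy the colour. */
static const char vp[] =
    "!!VP1.0\n"
    "DP4 o[HPOS].x, c[0], v[OPOS];\n"
    "DP4 o[HPOS].y, c[1], v[OPOS];\n"
    "DP4 o[HPOS].z, c[2], v[OPOS];\n"
    "DP4 o[HPOS].w, c[3], v[OPOS];\n"
    "MOV o[COL0], v[COL0];\n"
    "END";

void setup_vertex_program(void)
{
    GLuint id;
    glGenProgramsNV(1, &id);
    glBindProgramNV(GL_VERTEX_PROGRAM_NV, id);
    glLoadProgramNV(GL_VERTEX_PROGRAM_NV, id,
                    (GLsizei) strlen(vp), (const GLubyte *) vp);
    /* Ask the driver to keep c[0]..c[3] filled with modelview*projection. */
    glTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                    GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);
    glEnable(GL_VERTEX_PROGRAM_NV);
}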
Re:Stop spreading NVIDIA's lies! (Score:1)
Re:Heavy Price for kewl new gear (Score:1)
OpenGL offline rendering (Score:3)
"Easy". Check out the first paper on this page [unc.edu]. It's from SIGGRAPH 2000, where it rocked my world. ;^) It describes how OpenGL, with two (fairly simple, although not supported by today's[*] hardware) extensions, can be used to execute RenderMan shaders.
[*]: Check out what a certain id Software programmer typically says when asked about desirable future directions for rendering hardware, and extrapolate.
Re:Incremental performance (but a driver issue?) (Score:2)
The ability to use the nVidia Detonator drivers is a huge boost for anyone who owns a GeForce card. The drivers that came with my Asus 6800 and the new versions on the Asus website are amazingly poor. They are pretty much unusable; not only do they have stunning incompatibilities (Real Player, for god's sake) but they make my system crash very regularly.
Aside from quality issues, the drivers can also yield some pretty big performance gains. I know I saw way better frame rates after switching from the Asus drivers to the nVidia Det 3 drivers.
I can understand why they are releasing binary only linux drivers. I'm not very happy about it but I do understand.
Re:multitasking games; windows vs linux (Score:3)
I run RH 7.1 and Quake 3 is okay under linux (Geforce 256 DDR, P3500) but it's still a touch better under windows. The problem is, when something like updatedb kicks off, it slows to a crawl.
I really can't think of a good way to deal with this; when I'm starting a game I don't always think about everything that's scheduled to run in the next hour, or could be started for some reason. What I'd like to have is a script I could run that would automatically knock everything else down priority-wise before I launched the game.
I guess what this comes down to is me not really understanding enough about how process priority is handled by the kernel, so I'm not sure how to fix this. Has anyone else ever tried to set something like this up before? If there were tools out there to do this I think it would do a lot to improve gaming on Linux.
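Not a complete answer, but one crude approximation of the "knock everything else down" idea is to do the opposite and boost the game itself just before it starts. A hypothetical little launcher sketch (the name and the -10 value are arbitrary; raising priority with a negative nice value needs root):

#include <stdio.h>
#include <sys/resource.h>
#include <unistd.h>

/* Bump our own priority, then exec the game so it inherits it.
 * Usage (hypothetical): ./boost quake3 */
int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <game> [args...]\n", argv[0]);
        return 1;
    }
    if (setpriority(PRIO_PROCESS, 0, -10) != 0)
        perror("setpriority");      /* not fatal, just less effective */
    execvp(argv[1], argv + 1);
    perror("execvp");
    return 1;
}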
Re:Heavy Price for kewl new gear (Score:2)
That's crap. They had NO COMMITMENT to Open Sourcing a DAMNED thing until they started grasping at straws for ways to survive!
Originally, they were very strict on the terms and conditions concerning use of Glide. In fact, I remember quite well how they went after Glide Underground for posting Glide wrappers.
3dfx only started preaching about Open Source after they realized their CLOSED API was losing market to Microsoft's closed API and OpenGL.
On the negative note -- without 3DFX... TAGOR has less to complain about.
"Everything you know is wrong. (And stupid.)"
Re:Incremental performance (but a driver issue?) (Score:2)
I don't have my GeForce3 yet, so I don't know for sure though.
Re:USELESS MARKS (vertex and per pixel shaders any (Score:1)
Hey bro, if you like Meshuggah, please check out my buddy's (LegionsInHiding) guitar medley from Chaosphere:
Meshuggah_-_Chaosphere_-_Guitar_Medley_by_LegionsInHiding.mp3 [apk.net]
Mike Roberto
- GAIM: MicroBerto
Re:i'd be glad to see some FSAA (Score:1)
export __GL_ENABLE_FSAA=1
export __GL_FSAA_QUALITY=2
That should put it in 4xFSAA (aka 2x2) with no LOD bias (pretty).
------
Re:Heavy Price for kewl new gear (Score:1)
Game companies are not going to write games that take advantage of a chip if hardly any of their customers own one. It's a chicken-and-egg problem. Normally, you'd expect innovation to go nowhere in this situation. However, NVidia has consistently taken the risk and implemented technology that they knew wouldn't be fully used until a year or two later, and thus kept the industry moving forward. And all you can do is blast them for it? *sigh*
------
Re:Q3A a poor benchmark for GF3 (Score:1)
Also, Tribes 2 is far more sensitive to non-video-card-related issues affecting fps... try talking to all my buddies whose GF2s perform way below my GF1 DDR.
It's the new extensions that makes a difference (Score:1)
So if you don't want those extras, get a GeForce2 Ultra card and wait for the GeForce4, or for a company that makes GFX cards directed more towards Linux.
Re:Actually... (Score:1)
NV20 (Score:2)
The NV20 architecture will be in the XBox, so it's getting considerable development attention.
The price drop has happened. It's $357.12 at ProVantage. The $550 price was probably just to hold down initial demand while production ramped up; the product has only been available for a few weeks. Carmack's comment was that developers should get one immediately; others should wait.
NVidia extensions available in OpenGL (Score:2)
You don't need graphics-card/Linux-specific hacks. It was reported on OpenGL.org [opengl.org] recently that nVidia added the functionality to OpenGL via its extensions system [sgi.com].
All we need now is the implementation of their extension in Mesa - if they're going to go to all the trouble of developing OpenGL extensions you'd expect nVidia to help there as well.
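In the meantime, checking whether a driver actually advertises one of these extensions is straightforward: scan the GL_EXTENSIONS string for the token. A small sketch (the helper name is mine; it needs a current GL context):

#include <string.h>
#include <GL/gl.h>

/* Return non-zero if `name` appears as a complete token in the
 * space-separated GL_EXTENSIONS string, e.g. "GL_NV_vertex_program". */
static int has_gl_extension(const char *name)
{
    const char *all = (const char *) glGetString(GL_EXTENSIONS);
    const char *p = all;
    size_t len = strlen(name);

    while (p != NULL && (p = strstr(p, name)) != NULL) {
        if ((p == all || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;               /* whole token matched, not a prefix */
        p += len;
    }
    return 0;
}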
Re:Heavy Price for kewl new gear (Score:1)
Bullshit. The GeForce3 has a bunch of new features that other graphics cards don't even come close to. Ever heard of vertex and pixel shaders? Now you can write your own little program that runs on the graphics card for every vertex or for every pixel drawn.
Does anyone else think about those Amiga games that actually used the Copper to draw graphics by changing the color registers on the fly, instead of actually drawing pixels in the conventional way? In many ways the GF3 programmable shaders bring Copper to my mind, although that poor old component only had three instructions and seriously more limited capabilities. (Well, has... my Amiga still works ;)
Re:i'd be glad to see some FSAA (Score:1)
Re:Q3A a poor benchmark for GF3 (Score:1)
Re:It's the new extensions that makes a difference (Score:1)
2) If you are buying a cheap card now in order to wait for the GeForce4 or a price drop on the 3, then the GeForce2 MX is much more of a price/performance sweet spot than the Ultra.
3) While it's true that the new per-pixel shaders are a good match with DirectX 8, they are also available to OpenGL, so there is nothing to stop their use on Linux if you are prepared to code for it.
Re:Q3A a poor benchmark for GF3 (Score:1)
Re:Q3A a poor benchmark for GF3 (Score:1)
Re:multitasking games; windows vs linux (Score:1)
Well, what happens on Windows when someone accesses your web server while you're playing a game?
If you're running stuff in the background then either:
Re:geforce3 isn't about MHz (Score:1)
The only thing I don't like is that the vertex shader language is a spec that's essentially dictated by Microsoft. That means that when it comes to these l33t new extensions there's only one standard and that is DirectX 8. Yeah, it'll be supported as an afterthought in OpenGL, in a proprietary extension that's different from card to card. Then again, if it weren't for Carmack, and possibly the independent popularity of Linux among SGI users, OpenGL would have been dead and buried, folded into DX7 (remember Fahrenheit?) so I could be lamenting a standard that's doomed anyway. Shame though; OpenGL had a nice clean API whereas DirectX is a mess.
Re:Why bother? Fucking nerds. (Score:2)
Re:Heavy Price for kewl new gear (Score:4)
No more nvidia crap for me (Score:1)
Re:Other then unreal and quake? What distro? (Score:1)
Re:Other then unreal and quake? What distro? (Score:1)
2. Redhat
3. Slackware
Ftp to ftp.slackware.com. Look in
What are people's experiences with these two?
My experience is that I always come crawling back to slackware in a month or two... YMMV.
Re:USELESS MARKS (vertex and per pixel shaders any (Score:1)
The reason I specified those two new extensions specifically is that they are the ones people will notice dramatically. From a programmer's perspective all the new API calls kick ass, but I'm not expecting all slashdotters to keep up on nVidia's new extensions. The GF3 is so cheap because of the market right now; nVidia needs to keep showing that they are moving inventory to stay on the upside of this turbulent market.
-----
P.S. I dig your site man! :)
USELESS MARKS (vertex and per pixel shaders any1?) (Score:4)
PER-PIXEL SHADING : What is per-pixel shading? It's a method of applying special rendering effects... per pixel. It allows material and real world effects to be applied individually to a pixel for more accuracy and intensity. Per-pixel shading will redefine the visual look and feel of imagery for PC graphics. Per-pixel shading has long been used in film production to create a more realistic and lifelike appearance for computer generated imagery. If you've seen Toy Story, you'll definitely remember Buzz Lightyear. Remember the translucent reflection on Buzz's helmet? How the environment and light streaks reflected off the glass but also let the image underneath show through? That was done with per-pixel shading. Until now, it wasn't practical to use per-pixel shading on a PC because of the intense power and processing requirements needed. Sure, you could have done that in 3D Studio but could you have done it in real-time? Could the effect be applied to an entire frame at high resolution in 1/60th of a second? Not until now.
Per-pixel shading is useful for simulating natural phenomena and accurate surface attributes such as fur, cloth, metals, glass, rock, and other highly detailed surfaces. Traditionally, effects were done on an entire triangle and sometimes an entire texture using a technique called interpolation. Special effects were done using calculations based on the vertices of the triangle and interpolating the entire area from the vertices. The end result is a generalized visual appearance... like an estimate or approximation of the final image. The key benefit of using interpolation is that it is fast and easy to apply. But, the downside to it is that with large triangles, the resulting image contains artifacts, which degrades overall image accuracy and quality.
Using per-pixel shading, effects and calculations are applied to individual pixels. Since the triangle will be composed of many pixels, the resulting image is highly accurate in representing what the image was intended to be. Let's assume that a generic triangle is drawn (including its area) using 100 pixels. Now, we also have an effect palette of 10 effects. Each pixel, then, can accept any one of the ten that are available. That's 10^100 possible combinations for that one triangle. If interpolation is used, then the effect is fixed to one of those ten effects and generalized across the entire triangle.
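Spelled out in code, the difference is just *where* the lighting math runs: once per vertex and then blended across the triangle, or once for every covered pixel. A minimal sketch with made-up helper types and a single directional light (not how the hardware is literally wired, just the idea):

/* Hypothetical types, purely for illustration. */
typedef struct { float x, y, z; } Vec3;

static float clampf(float v) { return v < 0.0f ? 0.0f : v; }
static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Per-vertex ("interpolated") shading: light each corner once, then
 * blend the three results with barycentric weights w0+w1+w2 == 1.
 * Any detail between the corners is lost. */
float gouraud_pixel(Vec3 n0, Vec3 n1, Vec3 n2, Vec3 light,
                    float w0, float w1, float w2)
{
    float i0 = clampf(dot3(n0, light));
    float i1 = clampf(dot3(n1, light));
    float i2 = clampf(dot3(n2, light));
    return w0*i0 + w1*i1 + w2*i2;
}

/* Per-pixel shading: interpolate the *normal* instead, then run the
 * lighting math once for every pixel - roughly what the GeForce3
 * pixel pipeline lets you program. */
float perpixel_pixel(Vec3 n0, Vec3 n1, Vec3 n2, Vec3 light,
                     float w0, float w1, float w2)
{
    Vec3 n = { w0*n0.x + w1*n1.x + w2*n2.x,
               w0*n0.y + w1*n1.y + w2*n2.y,
               w0*n0.z + w1*n1.z + w2*n2.z };
    /* (a real implementation would renormalize n here) */
    return clampf(dot3(n, light));
}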
PROGRAMMABLE PIXEL SHADERS : The GeForce3 can handle four or more textures at a time. Logically, the GeForce3 has to be able to handle them independently to accomplish the "infinite" number of effects that Nvidia claims it is capable of. Besides juggling textures independently, it can also apply effects to each texture independently using the DirectX 8 shaders, as Nvidia has said.
With the new engine, it is possible to have effects like a texture surface that's shiny, bumpy, and dynamically changing. Also, with the nfiniteFX engine (programmable pixel and vertex shaders), developers can program the engine itself to do whatever they want from an unlimited number of combinations and permutations.
Once the texture combination calls are completed, there is an unlimited number of combinations you can do with the 8 texture blends. All of this wraps under the DirectX 8 pixel shaders.
MY DISPUTE : The drivers that Evil3d used weren't exercising any of the extra API calls. Granted, they aren't out yet. But by disregarding these new features, the benchmark makes the GF3 look like just an overpriced GF2!
If anyone has seen the presentation that John Carmack made at MacWorld this year, he unveiled his next 1st-person shooter. It looks quite realistic, and not to mention it is full of these new API calls. (It isn't just wasted coding; it does have a purpose.)
Re:More Software! (Score:1)
Re:multitasking games; windows vs linux (Score:1)
Re:Heavy Price for kewl new gear (Score:1)
By the time it becomes "popular", it's common and hence isn't really as "fun", imho
i'd be glad to see some FSAA (Score:3)
That's what makes the real difference, the ability to play games in full color, with reasonable screensize (800 or 1024) and heavy FSAA.
Guess I'll have to wait for those benchmarks though :-/
Re:i'd be glad to see some FSAA (Score:2)
Re:multitasking games; windows vs linux (Score:1)
You could just man renice and gen up on what you need to do.
One of my friends runs Seti@home reniced to -19 on his K6-2/500 and it completes a unit in about 8 hours...
Who cares about the speed ups? (Score:5)
It's all about interesting and orthogonal features now. GeForce 3 brings vertex and pixel shaders in hardware to the mix, as well as hardware shadow map support. The disappointing thing is that 3D textures (despite word otherwise from John Carmack) don't appear to be accelerated in hardware, at least with the latest drivers (see a recent thread in the advanced section of www.opengl.org [opengl.org] on that unfolding story - NVidia are soon to make an announcement on what the deal really is).
Being able to program in a pseudo-assembler language for custom per-pixel effects harks back to the old days when you had complete freedom over everything you could do, but most of it was slow. Now we have a better mix where we are hardware accelerated, but pretty flexible down to a programmable level. *However* the current revision of pixel shaders (1.0? 1.1? can't remember) on DirectX (and very similarly, and more relevantly, on OpenGL) isn't as flexible as some may like (notably John Carmack), since to paraphrase him, "You can't just do a bunch of maths and look up a texture". Hopefully that will get better with time.
Yes these things are important to games mostly. And yes they are arguably the biggest step forward in consumer graphics tech since the original 3dfx card... certainly since hardware TnL. Wait for the price to come down (since initial pricing is aimed at developers and the *really* hardcore gamer), and in the meantime amuse yourself with some of the demos from Nvidia's developer site [nvidia.com]. Nvidia are by far the most developer friendly company I've ever encountered, so short of Open Sourcing their drivers (which we have no right to expect them to do), they are almost ideal from my (developer's) perspective.
Henry
Re:USELESS MARKS (vertex and per pixel shaders any (Score:1)
Also, it is cheaper than most GeForce2 Ultra cards... so it is an incredible buy. At $350 USD, it is the cheapest launch of a flagship NVIDIA product in over 18 months.
Re:Marginal increase... (Score:2)
Re:Only incremental performance (Score:1)
But regardless of how it's going to sell, it's ridiculous to say that it won't be supported. With the new X server I get similar performance in Linux and Windows with my GF2. And yes, that's with a custom GLX module (provided by NVidia), which will probably either work with the GF3 or be modified. And yes, with an alternate kernel module (provided by NVidia), which I took fifteen minutes out of my day to download and install.
This is Linux we're talking about. You _always_ have to work a little harder to make things work just right. That's why it's fun. Having a much better X server and screaming 3D gaming is worth the extremely small amount of work it takes to just _wait_ for a download and then just _wait_ for a compile.
Re:i'd be glad to see some FSAA (Score:3)
I originally bought a Voodoo5 card and played everything at 1024x768 at 2x FSAA. Beautiful!
Eventually I sold it and went GeForce2 because Linux didn't support the Voodoo5 well. Unfortunately, the FSAA quality isn't as good -- I have to play 4x FSAA on the GF2 to get the same visual effect most of the time (and 4x is only available in Windows... *sigh*) so I have to play at 800x600 most of the time.
But it's worth it.
I couldn't imagine going back to playing non-FSAA, even at 1600x1200. People who still haven't seen FSAA... It's worth the cost of a hardware upgrade, IMO.
Now I'm dying to get my hands on a GF3 to try the new HRAA (is that the right abbreviation?) alongside the nifty lighting improvements.
Here's to Doom 3 on GF3 with AA.
Re:Heavy Price for kewl new gear (Score:1)
That being said, I think 3dfx getting killed was about the worst thing that could happen to the 3D industry. Competition is much less now, and 3dfx always showed a very strong commitment to Open source.
Scott
Re:Heavy Price for kewl new gear (Score:1)
as for kyro...again we'll see.
don't forget things like HSR--there are lots of tricks left to reduce bandwidth, but agreed, that is the major bottleneck right now.
I own an Nvidia GeForce2 right now. I used to own a 3dfx. 3dfx had great support under Linux and was easy to use. There's no doubt 3dfx hasn't been a serious contender for a while; however, don't forget that even in the months leading directly up to 3dfx shutting their doors, 3dfx cards dominated in retail. They made popular cards; their failure was primarily OEM- and volume-based.
Scott
Re:Q3A a poor benchmark for GF3 (Score:1)
Re:More Software! (Score:1)
Whitney Battestilli
Alias/Wavefront
Re:i'd be glad to see some FSAA (Score:1)
Marginal increase... (Score:4)
On Windows (which has the more developed drivers at this point in time, since NVidia would have that ranked as a priority) at 1600 * 1200 the GeForce 3 has a healthy increase over any previous video cards, whereas in this benchmark the performance is actually worse as the resolution increases! (Windows benchmarks of the GeForce 3 are the inverse of this)
Give NVidia some time and then benchmark the GeForce 3 on Linux, the performance increase should be a nice gap.
Also notice the distinct lack of details about the benchmark... the only details given are the system specs, so I would be inclined to question how valid the results really are.
Re:Heavy Price for kewl new gear (Score:1)
How about ATI? I'd love to see them do it.
Other then unreal and quake? What distro? (Score:1)
I find that games run nicely in Linux if you run the said two above; otherwise it's harder to get high fps.
I ask this because I have a desktop that this card is going to land in and a laptop that I want to be able to play this game with. Both are for LAN parties, but I can't tell what distro I want to use - what are the pros and cons? I am stuck with two distros as my choices (it has to be one I can download the ISOs for, so that means SuSE is out):
Are you on the Sfglj [sfgoth.com] (SF-Goth EMail Junkies List) ?
I like gaming goodness. (Score:1)
With more games available on Linux, this is leading more and more into a situation where I can kill a certain partition on my hard-drive.
Well, I guess it's not so bad as long as the evil overlords don't require me to upgrade. That partition is still running '98, since they haven't added any killer-app features. That may well be their downfall.
Here's to more and better and faster games on *nix. May they not crash or close on the windows button.
[/karmawhore]
Re:I like gaming goodness. (Score:1)
Re:Heavy Price for kewl new gear (Score:1)
Re:I like gaming goodness. (Score:1)
BdosError
Re:I like gaming goodness. (Score:1)
BdosError
Re:Heavy Price for kewl new gear (Score:1)
Re:but no drivers in the kernel (Score:2)
You'll find the drivers at http://www.nvidia.com/Products/Drivers.nsf/Linux.
Andrew
Only incremental performance (Score:3)
Andrew
Re:Heavy Price for kewl new gear (Score:1)
Re:geforce3 isn't about MHz (Score:1)
i was angry:1 with:2 my:4 friend - i told:3 4 wrath:5, 4 5 did end.
Re:Q3A a poor benchmark for GF3 (Score:1)
Re:Heavy Price for kewl new gear (Score:1)
More Software! (Score:4)
No, not Quake. Real software.
Maya is coming soon, but there are still a few other things that you need to have a complete 3D solution, like proper NLE and PostPro software. Plus, a bit of competition wouldn't be bad: How about Cinema4D or Imagine? It'd also be cool to see Elias or Eclipse on Linux.
multitasking games; windows vs linux (Score:1)
Sure, you can get higher average and maximum framerates, but Windows just has better gameplay.
The reason for this is that linux is a real multitasking system. Unlike in windows, where the system is totally taken over by the program that uses the most cpu-time (or is in the foreground), linux just leaves it all running at the same performance.
This means that if you are playing some 3D game, and someone accesses your server, or perhaps some program you are running just needed disk access, that program gets all the CPU (for just a very short time). It results in a sudden drop in framerate.
One can try to set the priority of the 3d game to -20 for example, but another program will always get the cpu for a short time.
The current solution for this (the problem also shows up when decoding DVD or MPEG) is running only programs that are really needed. Or, of course, buying a dual-processor system.
But perhaps something can be done at the multitasking-priority level. Getting the system to switch faster between the programs, maybe (because an actual multitasking system would need a CPU for every possible task it's running), or perhaps a whole different approach: instead of giving a program the CPU for a short period of time, getting tiny bits of the lower-prioritized program processed in between the other program (the game in this case). It could be, however, that this causes higher cache use, but I am not that well informed about that.
In any case, this is the real reason why people still want Windows for playing games, and as long as this problem isn't handled, it will always be a bit more comfortable fragging on Windows.
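For anyone who wants to experiment today, short of scheduler changes: the POSIX real-time scheduling classes already let a game pre-empt every ordinary timesharing process, updatedb included. A minimal sketch (needs root, and a game stuck in a busy loop can wedge the whole box, so treat it strictly as an experiment):

#include <sched.h>
#include <stdio.h>

/* Move the calling process (the game) into the SCHED_FIFO real-time
 * class so normal SCHED_OTHER tasks can no longer interrupt it. */
int make_realtime(void)
{
    struct sched_param sp;
    sp.sched_priority = 1;   /* lowest RT priority is already above all timesharing tasks */
    if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0) {
        perror("sched_setscheduler");
        return -1;
    }
    return 0;
}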
Re:Incremental performance (but a driver issue?) (Score:3)
I don't know how closely guarded a secret the methods of DX8 are by their owners (I'll tactfully not mention their name).
Which brings to the fore once more the issue of driverless hardware being largely redundant. Can we ask nVidia to take the same care over their Linux drivers as those for Windows? And then, will we get them as Open Source?
(come on, you and I know that nVidia don't make money out of driver sales, and so it's going to be okay for them to write the drivers so they sell the card to all you hardcore gamers who also choose Linux.)
Take care,
Ken.