
## Tom's Hardware on The GeForce256

~fly~ writes "Tom's has a detailed review with benchmarks of Nvidia's new GeForce256 'GPU'." The synopsis: high expectations, but it appears to meet them.

• #### Re:Tomzilla (Score:1)

by Anonymous Coward
Well, here are some links and quotes to give you more of an idea of what went down. I'm having problems finding the original start of the conflict, but others have summed it up just as well.

But most importantly, here are some of the LINKS you requested.

From what I understand of the issue, the first Q3Arena (Q3A) beta was released for Windows, and Tom's Hardware used this program to benchmark a whole bunch of cards.

Brian Hook, a developer at Id, makers of Q3A, didn't much like Q3A being used at such an early stage of development, and blasted Tom for basing the evaluation of 3D Cards on Q3A.

At least that's what I remember, someone correct me if I'm wrong.

And for what it is worth, I fell on the side of Brian Hook. I mean, it's their software, they know Q3A inside out.

I've heard Brian speak at Siggraph, one of the premiere Computer Graphic conferences.

And I've followed his .plan and editorials for a long time, and while he can be very blunt and truthful at times, he has always conducted himself in a manner that speaks well for him, IMHO.

Who would you choose to listen to?

http://www.tomshardware.com/blurb/99q3/990802/blurb-01.html

3D Chips, Quake3 and more ... Talking about 'being reckless to only serve oneself' brings me to another thing. Have you heard the 'news'? In the new version of Quake3-Test TNT2-boards are still beating Voodoo3-boards. Does that surprise anyone? Well, it should! I remember somebody making a big fuss over me claiming exactly that, when I ran two different versions of Quake3 before. I remember crap like 'inaccurate test' and 'software piracy', which was all only covering up for the fact that some individuals didn't like the TNT2 looking a lot better than Voodoo3 in Q3. I don't know what nice benefits the key person who ranted about my article received from 3Dfx and I can only guess which benefits were received by the publications who used his ridiculous accusations to crucify me, but now we can see the crystal clear truth, I was right all along. Thus I am looking forward to receive some nice apology letters from all the individuals who accused me of all kinds of crap. Let the apologies run in, I look forward to one in particular. In case you don't remember that person anymore, he was once taking advantage of the fact that he worked for Id Software, and luckily for all of us he's quieted down in the last months since nobody cares about him anymore.

>>>

And here is the link to the original benchmarks Tom made that sparked the debate.... http://www.tomshardware.com/releases/99q2/9905111/index.html

Here, for contrast, is Brian Hook's response to Tom's statements; it appeared in a column on VoodooExtreme called 'Ask Grandmaster B'.

>>> May 14th, 1999 - (3:00am MDT)

Snipped from tomshardware.com

"This Brian Hook, or 'Grandmaster B', as he likes to call himself very modestly, has said a whole lot in the past, some of it was true, other stuff wasn't. 'Timedemo' DOES work if you do it right, and Brian is unfortunately incorrect. Sorry Brian, but even 'Grandmaster B' is not perfect"

For several months now it seems as if his site has taken a very biased attitude towards certain video card manufacturers. Tom Pabst will go to any lengths to "prove" the superiority of one card over another. All personal opinions aside, how does it make you feel when you read comments such as this one that exhibit blatant bias and even disregard your comments towards a game that you programmed? This is just one of the many examples of the rampant biases in his "review" site.

David C.

Dave,

I've seen those comments, and I think they pretty much speak for themselves. My track record speaks for itself within this industry, and unlike others, I don't have a resume that basically consists of "I've done a lot of HTML, therefore I must know my shit, right?". I actually do this stuff for a living, so maybe I'm not full of shit. And I don't advertise products that I'm also reviewing on my Web page. I don't make ANY money from Grandmaster B ads or from .plan files, so you know there's no conflict of interest when I tell it like it is. I do not own any shares of stock in 3D accelerator companies, so there's no conflict there.

When I write this stuff, I do it because I enjoy it and because I like to educate others. It's that simple. It's not a job, it's not something I have to do to feed my dogs or pay off my car. I do it for the love of writing, communication, and education. It's that simple.

Tom's numbers might be valid. Then again, they might not. There's some variability there, depending on how paranoid you are about his advertiser revenue affecting his findings.

But all that is irrelevant -- what is going to happen now is that id will publish authoritative numbers using up to date drivers and production hardware. Those numbers will be the unarguable truth, using the most up to date code possible. There will be no illicit overclocking, no conflict of interest, and no advertiser revenue from chip manufacturers to dilute any discovery.

Tom's comments about _me_ are irrelevant -- pissing matches over personalities are pretty much useless; people will take sides over who they like the most and rarely will listen to the issues. In the end, what matters is getting good, honest numbers to the public, and that is something that id has promised will be around Real Soon Now.

>>>

• #### G400 whippin' GeForce at 32bit (Score:1)

by Anonymous Coward
It is interesting to compare the 32-bit performance of the G400 and the GeForce:

640x480 GeForce has a slim lead
1600x1200 G400 slapping the GeForce silly

Expendable 1280x960 [anandtech.com]

Q3Test 1.08 1600x1200 [anandtech.com]

both from Anand's review [anandtech.com]
• #### Opinions? (Score:1)

by Anonymous Coward
Here is what happened: Tom posted Quake3test benchmarks at a time when the current test did not correctly calculate the fps scores from timedemo, a fact stated by John Carmack and Brian Hook. Hook stated in his .plan that Tom's benchmarks were inaccurate. Tom responded by saying Hook didn't know what he was talking about (keep in mind Hook was the #2 programmer on the quake3 project up until he moved over to Verant), and Tom refused to release any facts about how he ran the timedemos, other than that he did something or other to make it work. I could have made up numbers and claimed the same thing, and I have no reason to believe that this was not what Tom did. Eventually it boiled down to a pissing contest, Tom vs. Hook, as to who knew how the Quake3 engine worked. What Tom was doing was the equivalent of me going up to Linus and telling him he doesn't know jack shit about Linux. This had nothing to do with "Tom's opinions"; this had to do with the fact that he was posting lies that favoured a certain company, using unverifiable techniques that the developers said were invalid. Tom has no credibility, and I now refuse to go to any page containing "tomshardware.com" in the URL. Btw, I hope you get your moderator status revoked, as I specifically recall CmdrTaco stating that said status was not to be used in the manner you suggested, and that it was also to be anonymous (i.e. it was never intended that moderators threaten to use their powers publicly).
• #### Re:Linux Compatibility? (Score:1)

The GLX driver was written by David Schmenk of nVidia.
• #### Re:Why? (Score:1)

Interesting idea, but you can probably get more performance out of the GPU the closer it is to the graphics chip.

Still, this does exist in some form right now. Think 3dnow, AltiVec, etc. (Although this has much wider usage than just geometry processing)
• #### Re:Linux Compatibility? (Score:1)

There will be a Myth 2 patch in the near future that will add full Mesa/OpenGL compatibility.
• #### Re:Who cares? Tom is crazy! (Score:1)

Yes, I know - Anand's about one year younger than me. Maybe I'm naturally a better writer than him, but I can recognise that he needs to work on it a lot. Michael is better, but both need a good proofreader/editor to go over their work before they post it.

As it is, though, they get good information out quickly, and once you can look past the piles and piles of irrelevant benchmarks and instances of terrible writing, it's a very good site for tech information.

• #### Re:Who cares? Tom is crazy! (Score:1)

I'm sorry, Anandtech is a good site, but they do not write well. Mixed metaphors, confused tenses, and awkward sentences are the rule of the day there. That's not to say I don't read the reviews with interest (though I do skip over the very long and tedious piles of benchmarking most of the time), but it's very painful reading for someone who knows how to write well, or has read a lot of good writing.
• #### Re:i liked... (Score:1)

Matrox has released to the linux community all the specs they do not consider to be IP. Meaning that we do not have the specs for the 'WARP engine', the triangle setup engine(s) on the G200/G400. Luckily there are some rather good-hearted and persistent engineers inside of Matrox; they got a binary-code version of the setup code out, and specs for loading it in.

3Dfx has released full 2D specs. It is a shame they do not release 3D specs, but it is definitely not for lack of people asking :) 3Dfx was also the first to release, and the only people to actually keep their product up-to-date (glide on Linux is just as fast as glide on windows; I believe the entire thing is in assembly with some MMX and 3DNow optimizations for bandwidth reasons, since Glide does not do transformation).

Compare to Matrox's zero released drivers, or NVidia's driver release. Will they release another one? Who knows, hopefully. Maybe when XFree 4.0 comes out.

The only thing you could do with the glide source code is port it to another architecture. Really. Or inline it with Mesa. There is absolutely no optimization left that can be done to the glide API implementation (that human beings can comprehend).

NVidia has released no register specs whatsoever. They did release a special version of the internal toolkit that they use to make the windows drivers (i.e. their version of glide, 'cept more object-oriented and all-encompassing; not just 3d but 2d and video). But they have been rather bad about promoting this: the binary drivers on their page for the API are for an old kernel, and the toolkit is not up-to-date with their own internal version. Because of this (and because the Riva XServer/GLX don't even use it) there haven't been that many people messing around with it.

But some people on the GLX mailing list (including me) have been trying to lobby for the release of the 'real' nVidia toolkit. If this gets released, things like the GeForce would be supported day of RTM. So keep your fingers crossed. :)
• #### Re:Linux Compatibility? (Score:1)

Detonator is Win9x and WinNT 4.0 only. All other drivers are written differently. The XFree support was not done by NVidia AFAIK, it was done by one of the people working for ST (Their chip producer, at least at the time). The GLX driver that is out was written by a third party I believe, and does not use any of the NVidia code at all, just #def'd register location/vars.
• #### gforce may not work/ no overlay plane? (Score:1)

I saw nothing on nvidia's site about using the GeForce for anything besides gaming. No mention of an overlay plane, which many 3d apps need.

If they really do have a good 3d engine for serious work, hopefully they will make a high-end board.
• #### Re:Why? (Score:1)

Ugh... Well, that attempted workaround for the Mozilla M10 word-wrap problem didn't work. Sorry about the formatting.
• #### Re:GPU Acceleration (Score:1)

I think the benchmarks show how poorly Direct3D scales: a well-written OpenGL-based game (Q3) scaled with the hardware and driver quality available, while Direct3D games require the next version of Direct3D to make use of any new enhancements.
• #### why not open source? (Score:1)

When it comes to drivers, I don't understand why the company wouldn't want to release the source. It's actually a lot easier for them, since some people might actually look at the source, find bugs, and/or add improvements.

Hardware manufacturers are in business to sell hardware. Drivers are something they must write to go with their hardware, but they make *no profit* on them. Thus releasing the source would allow them to leverage the OS model of development, and hence lower their production costs, while losing *no* revenue!

At least that's the way I see it. Comments / corrections are welcome.
• #### Re:why not open source? (Score:1)

Perhaps it's in the interest of certain software companies (naming no names, obviou~1) that drivers are released single-platform, binary-only?

Hamish
• #### Yes, NDA expired @ 9:00 am today [nt] (Score:1)

[nt] == No Text, get it?
• #### Re:MPEG 2 support (Score:1)

I think the problem with most software MPEG-2 decoders is that they DON'T take advantage of CPU multimedia extensions like the Pentium III's SSE or the Athlon's enhanced 3DNow!

I wish someone would write one that does use SSE or 3DNow!, because both of these extensions are well-suited for more efficient MPEG-2 video decoding.
• #### Re:Opinions? (Score:1)

These are the facts that you should have put in the original post. This is informative.

Before, you made it sound like you had a personal beef with something he wrote.

Personally, I don't read Tom's Hardware Guide, I find it a bit sensational.

Thank you for clarifying. This is hella more useful than your previous post.

• #### Re:Linux Compatibility? (Score:1)

I was wondering the same thing. My good old Viper550 (TNT) card seems to work with Mesa-3.0 and NVidia's "glx 1.0" driver, but I'm under the impression this is only a "partial" set of drivers. (Myth II supposedly only works with 3dfx brand boards because they're the only ones that have a 'complete' set of OpenGL drivers - though I've been trying to find out if that's really true or not...)

Has Nvidia said or done anything on the Linux front since the initial release of their drivers?
• #### Re:random, possibly baseless points and conjecture (Score:1)

At the rate things are going, graphics cards will soon be the most expensive component in every system

Maybe the most expensive component in every *gaming* system. Most business PCs (which are most PCs) have pretty crappy graphics - stuff you could buy retail for $10-$20.

The video card is practically the last point of differentiation between systems - most of which ship with similar CPUs, the same Intel-based motherboard, similar EIDE disks, and similar sound hardware.
• #### Re:Why? (Score:1)

Ain't it a Graphical Processing Unit?

• #### Re:Athlon Motherboards (Score:1)

(making about 1 Gb/s of bandwidth, about as much as a PC133 SDRAM can churn out).

If that's true that's kinda weak. Sun's UPA pushes well above that. The U2 which has been around for a few years now can do 1.6Gb/s.

Of course the h/w costs a ton more and only runs Solaris, Linux, etc... So I doubt there are any games that use it... yet.

• #### Linux Compatibility? (Score:1)

What's the word on running the GeForce256 under linux? I've looked around but have not been able to find a definitive answer.
• #### Drivers are still in beta! (Score:1)

How can Tom possibly make a scientifically sound benchmark using beta drivers? nVidia themselves say that the GeForce256 [I still like the name NV10] currently does not have a complete set of [functionality/optimisation] drivers. Wouldn't it make sense to wait for decent software before running your benchmarks?

Bah.
• #### Re:Opinions? (Score:1)

The Quake3 benchmarks were a bit 'weird' but not broken.

The numbers were as close between runs as could be expected, texture caching and multitasking variables notwithstanding.

The numbers were also as would be expected when compared from one computer to another. A P2-300 and a P2-500 scored only a bit closer together than they would in Q2 benchmarks, etc.

The 'flaw' in the benchmarking was that the demos weren't using the final product, and Q3Test's performance changed significantly from 1.05 to 1.08, let alone to the final, with regard to one 3d card compared to another.

This problem, where a TNT might be more handicapped than a Voodoo, created a situation where you couldn't fairly compare Q3 TNT numbers with Q3 Voodoo numbers. But the numbers weren't meaningless. No more than any other benchmark is. You just needed to understand the fundamental point: the only benchmark that accurately indicates performance in your application of choice is that application.

If Q3Test was what you wanted to play (and, as we knew at the time, it would be around for at least one month and probably closer to six), then buying a 3d card based on Q3Test numbers made sense, and many people probably did.

And, benchmarks using Q3test can also show you where cards have problems. Even if the TNT worked at half the speed of a V3 due to driver problems, you could get a pretty good indication of where the TNT was limited. Did it have a polygon problem, or was it fill limited, etc.

So Brian Hook was partially in the wrong when he slammed Tom. And he slammed Tom partly because of Tom's use of the Q3IHV (which was pirated) in benchmarks.

Then they started flaming each other and they both came out looking like idiots.

So, to summarize. Broken benchmarks can still be of value if you take a minute to understand them and how they are 'broken'. As long as the numbers aren't derived with a call to random(), they have some meaning.
• #### Comparison to Pro Graphics? (Score:1)

How does a chipset like this compare to what SGI or Intergraph graphics workstations use?

Also, what is the price for those adaptors?

• #### Re:Tom's...and every other hardware site too (Score:1)

This is NOT true. Anand tests boards for things such as upgradability and reliability. Try reading the motherboard comparisons, where he had winners in different categories: he recommended an Abit board for tweakers and overclockers, while saying at the same time that it was not the most stable solution, and not the best option for end-users who do not overclock or for servers. In another category, he had the best all-around board for non-tweakers, an Asus (I THINK - not sure).

Anand is by far the most reliable and objective reviewer around. I have been following his development since he began his site, literally. He is decent and unbiased, as others are. His tests are methodical and reproducible, and he cares a lot more about quality than volume. Now and then he voices his opinions, but he is very aware of the community and its needs, and that not everyone is an overclocker.
• #### Re:random, possibly baseless points and conjecture (Score:1)

My policy about upgrading is that I stick to what I have for as long as I can comfortably stand it. Right now I'm still using an eight-month-old PII350, TNT, 128mb RAM, SCSI rig at home. I will probably be able to use this machine as my primary workstation for about one and a half more years. Then I will make a similar investment and bump up several generations of technology (eg: new athlon, geforce, etc).

Once I upgrade, my old workstation gets relegated to an honorable server role (FreeBSD). That's the way I'm using my old P166 now. I have found that this works very well, since you not only get a new machine, but you realize that your old machine isn't worthless. You'd be amazed by how many cool projects you can do if you have an extra machine sitting around the house that you're willing to experiment with, without fear of losing data or a critical resource.

So, for me at least, the normally vicious cycle of PC upgrades really isn't that bad after all.

• #### Re:Tomzilla (Score:1)

My friend works at 3dfx, and basically the word is that Tom is on the payroll of nvidia.... 3dfx gave him a look at one of their marketing docs and he went and leaked it to the world. 3dfx is not angelic, but Tom essentially is biased due to being sponsored by nvidia.
• #### Re:pricewatch.com (Score:1)

Those prices are only for the Creative 3D Blaster Annihilator. Here are listings based on the keyword "GeForce". (They include the Guillemot 3D Prophet, Elsa Erazor X and LeadTek WinFast GeForce 256, among others.)

Listings on PriceWatch [pricewatch.com]

As it turns out, the cheapest pre-orders are (as of now) for the Creative 3D Blaster Annihilator.

• #### Re:Tom's...and every other hardware site too (Score:1)

Don't forget AnandTech.com :) I kinda like this site... Does anyone have any reviews of his reviews? :) Anyways, his review (part 1) is at http://www.anandtech.com/html/review_display.cfm?document=1056 [anandtech.com]
• #### Re:Athlon Motherboards (Score:1)

>133 Mhz x 64 bit = 1 Gb/s

My calculator says that 133*64 equals 8512, not 1000 or 1024. Try again.
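For what it's worth, both figures are consistent once the units are converted: 133 x 64 gives megabits per second, and dividing by eight bits per byte lands back at roughly 1 GB/s. A quick sanity check (assuming one 64-bit transfer per clock and no protocol overhead):

```python
# Back-of-the-envelope bus bandwidth check.
clock_hz = 133_000_000   # 133 MHz bus clock
bus_bits = 64            # 64-bit data path

bits_per_sec = clock_hz * bus_bits        # 8,512,000,000 bit/s -- the "8512" (in Mbit/s)
bytes_per_sec = bits_per_sec / 8          # convert bits to bytes
gigabytes_per_sec = bytes_per_sec / 1e9   # ~1.06 GB/s -- the "about 1 Gb/s" claim

print(f"{bits_per_sec / 1e6:.0f} Mbit/s = {gigabytes_per_sec:.2f} GB/s")
```

So the "8512" and the "about 1" are the same bandwidth measured in different units.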
• #### Re:Why? (Score:1)

He did read it. He said "on the motherboard" - the GPU is on the graphics card. No mistake

-----------

"You can't shake the Devil's hand and say you're only kidding."

• #### Let's use those spare GigaFlops (Score:1)

Given the amount of FP processing power and memory on recent graphics cards, they could probably run a variety of non-graphics tasks faster than the host computer.

For example, you could run seti@home on your graphics card, instead of just using it to display the results.

How about it NVIDIA? You could leap to the top of the seti@home CPU statistics!

• #### G400 (Score:1)

Check out some benchmarks. I did before buying my video card, and G400 is the fastest video card under linux according to all the sites I saw.
(Fastest meaning: fastest under Xfree, I didn't look at the performance under the commercial X servers)
Sorry I don't have a link. I've been looking for those pages that I found the benchmarks on for the past week because my friend is looking at buying one, and I want to show him how much better the G400 is...
So if you do find a page with that on it, please let me know.
• #### Re:Why? (Score:1)

Dude. They DO call it a GPU. Read the article. Then post.

• #### Re:Who cares? Tom is crazy! (Score:1)

Agreed, Anand is too verbose and tends to ramble about barely related stuff. He needs to be more concise. But go easy on the kid - he's only 16 or something (check out the about section if you don't believe me!)

Daniel.
• #### Re:i liked... (Score:1)

I suppose you support 3dfx's lack of a full OpenGL ICD still as well, despite the original Voodoo chip coming out HOW long ago was it now?

At least nVidia/Matrox have written full OpenGL ICDs, complying with what is considered "THE" standard for 3D graphics - 3dfx continue with their half-baked OpenGL ICD/proprietary Glide system - I know who I'll be supporting......

I used to have a Canopus Pure3D, utilising the Voodoo chip, but when the time came to upgrade, the TNT was the best option in my opinion - and nVidia's products, again IMHO, continue to set the standard by which the others are all measured, and for good reason.

• #### Re:Why? (Score:2)

Because the Geometry Processor Unit (what GPU stands for) will be optimized for processing geometry, which currently is a task of the CPU. With the GPU, all the processor will be responsible for is feeding geometry data to the GPU (well, it's the only graphics function it'll be responsible for).

In the end, the GPU should be faster at geometry than the CPU, which is the goal.

• #### Re:Linux Compatibility? (Score:2)

I was wondering the same thing.

Me too.

My good old Viper550 (TNT) card seems to work with Mesa-3.0 and NVidia's "glx 1.0" driver, but I'm under the impression this is only a "partial" set of drivers.

It's a complete OpenGL driver AFAIK, but it doesn't do direct rendering (it goes over the X pipe), and it's not nearly as optimized as the Windows drivers yet.

(Myth II supposedly only works with 3dfx brand boards because they're the only ones that have a 'complete' set of OpenGL drivers - though I've been trying to find out if that's really true or not...)

It's not. Myth II uses Glide. 3Dfx is actually unique in being one of the last people to *not* have a complete set of OpenGL drivers; that's why they have their "MiniGL" to run Quake* games.

Has Nvidia said or done anything on the Linux front since the initial release of their drivers?

There was an interview where an Nvidia rep said they'd have GeForce Linux drivers (but X server? Mesa drivers? Who knows?) when the card shipped, but I haven't heard anything since.
• #### Re:MPEG 2 support (Score:2)

but I would've liked to see some more support for DVD playback. HDTV support is cool but

Proper HDTV support requires mostly a superset of what is required for DVD support; everything but the subpicture decode.

I think that a full MPEG 2 decoder, that would be a little excessive since that would mean adding an audio output to the card.

No, adding an MPEG 2 decoder to the card doesn't necessitate adding an audio output.

This can be done two ways:

1. Leave the demux of the MPEG transport or program stream to the main CPU, and only hand the video PES (or ES) to the card. Demux is fairly easy and won't suck up but a tiny fraction of the main CPU as compared to doing full decode.
2. Let the card demux the transport or program stream, and hand buffered audio PES (or ES) data back to the main CPU.
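To make option 1 concrete, here is a toy sketch of the host-side split, routing packets by the standard MPEG-2 stream-ID ranges (0xE0-0xEF video, 0xC0-0xDF audio). It is illustrative only: it just locates PES start codes and skips real packet-length parsing.

```python
def route_pes_packets(data: bytes):
    """Scan a byte stream for PES start codes (00 00 01 <stream_id>)
    and classify each packet start by stream ID.  Toy demux: it only
    finds packet boundaries; it does not parse PES packet lengths."""
    video, audio = [], []
    i = 0
    while True:
        i = data.find(b"\x00\x00\x01", i)
        if i < 0 or i + 3 >= len(data):
            break
        sid = data[i + 3]
        if 0xE0 <= sid <= 0xEF:      # MPEG-2 video elementary streams
            video.append(i)
        elif 0xC0 <= sid <= 0xDF:    # MPEG audio elementary streams
            audio.append(i)
        i += 4
    return video, audio

# Two fake packets: one video (0xE0), one audio (0xC0).
print(route_pes_packets(b"\x00\x00\x01\xe0AAAA\x00\x00\x01\xc0BBBB"))
```

Option 2 is the same split performed on the card, with only the audio list handed back to the host.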
As for software decoders, the ones for Windows don't seem very optimized.

They're not very good, but that's not because they're not very optimized. Just compare any of the Windows players to the NIST code if you want to see the huge difference between majorly optimized and non-optimized decoders.
• #### Re:Tomzilla (Score:2)

Why should he have to respond? Because some people question the validity of his OPINIONS?

Q: How do we evaluate someone elses OPINIONS?

A: The same way we evaluate anything else, if you want to, go for it, if you don't like him, don't read him.

Apparently, not reading his articles has expanded to the realm of trash-talking him from the comfort and safety of the AC post.

If you're going to lay into somebody, please, have the courage to accept personal responsibility, and link to the allegations instead of giving a vague, biased (but presented as unbiased) description of what these allegations were.

You're lucky I ran out of moderator points already.

• #### Tom should have mentioned... (Score:2)

The reason why most of the benchmarks were so close is that none of these games (with the exception of parts of Quake3) are using the OpenGL T&L pipeline; at the time they were made there were no hardware T&L engines, so by 'rolling their own' T&L they could get significant speedups.

The nVidia Tree demo should be evidence to anyone of what a dramatic difference having hardware T&L can make. That tree demo has far more complexity than your average shoot-em-up game, and these are the kind of things we can expect when developers make games for hardware T&L (most new games will use the hardware).

So the real problem with the benchmarks was running a bleeding edge graphics card on yesterday's software. It does well, even better than the competition, but don't expect a 3X increase... you can't get much faster than 100FPS no matter how you try. But the GeForce should be able to do 60FPS with 10X the polygon count of current cards (assuming the developer is handling T&L with OpenGL).
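The transform half of T&L is, at bottom, a 4x4 matrix multiply applied to every vertex every frame -- the per-vertex work that OpenGL's transformation pipeline lets a card like the GeForce take off the CPU. A minimal sketch of that workload; the function and matrix names are illustrative, not from any real engine:

```python
def transform_vertex(m, v):
    """Apply a 4x4 row-major matrix to a vertex in homogeneous
    coordinates -- the per-vertex work a hardware T&L unit offloads."""
    x, y, z, w = v
    return tuple(m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3]*w
                 for r in range(4))

# Identity transform leaves a vertex unchanged.
IDENTITY = [[1, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 1]]

# A translation by (tx, ty, tz) lives in the last column.
def translation(tx, ty, tz):
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

print(transform_vertex(translation(5, 0, 0), (1, 2, 3, 1)))  # (6, 2, 3, 1)
```

A game that "rolls its own" T&L runs this loop on the CPU; one that submits untransformed vertices through OpenGL lets the hardware do it.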
• #### Best place to buy? (Score:2)

Where is the cheapest place to buy one of these things? (including shipping)
• #### Re:pricewatch.com (Score:2)

Thanks, I was searching for Ge Force and nothing was turning up. Here is a link to the complete listing at pricewatch.

Prices at Pricewatch [pricewatch.com]

• #### Why? (Score:2)

Why not just slap another processor on the motherboard and call it a "GPU" instead?

• #### G400 Scores (Score:2)

Kinda off subject, but shame, shame, Tom, for not using the new Matrox G400 drivers that were released on Oct 8th and include the new Turbo GL (mini GL) drivers. I would have liked to see how the G400 Max performed with the newest drivers compared to the GeForce at the higher resolutions. From some of the benchmarking I have seen, it is giving the TNT2 Ultra a run for its money in OpenGL games at higher resolutions.
• #### Re:Why? (Score:2)

"Why not just slap another processor on the motherboard and call it a "GPU" instead?"

Theoretically, you could do that but you would need a mighty powerful CPU to achieve the level of performance of the GeForce since CPUs aren't optimized for graphics processing (note: GPU stands for Graphics Processing Unit not Geometry Processing Unit as someone earlier posted.) The GeForce is a much more cost effective solution for graphics processing than getting another CPU.

According to Nvidia web page about the GPU [nvidia.com], their technical definition of a GPU is:

"a single-chip processor with integrated transform, lighting, triangle setup/clipping and rendering engines that is capable of processing a minimum of 10 million polygons per second."

The review of the GeForce 256 [aceshardware.com] at Ace's Hardware [aceshardware.com] has good info comparing CPUs to GPUs. As another poster mentioned, graphics processing exists in a limited form in CPUs (3DNow!, etc.). Possibly in the future CPUs will integrate more advanced graphics processing functions. But even if you had a CPU with complex graphics processing functions, you would still need some sort of display adapter. Personally, I think that it makes more sense to have the display adapter and graphics processing integrated in one unit.
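Nvidia's "10 million polygons per second" floor from the definition above is easier to picture as a per-frame budget. A quick calculation (the frame rates here are just illustrative):

```python
MIN_POLYS_PER_SEC = 10_000_000   # Nvidia's stated minimum for a "GPU"

# Per-frame polygon budget at a few common frame rates.
for fps in (30, 60, 100):
    per_frame = MIN_POLYS_PER_SEC // fps
    print(f"at {fps:3d} fps: {per_frame:,} polygons per frame")
```

At 60 fps that is roughly 166,000 polygons per frame, which puts the "10X the polygon count" claims elsewhere in this thread in context.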

• #### GPU Acceleration (Score:3)

by Anonymous Coward on Monday October 11, 1999 @04:41PM (#1622314)
A lot of confusion seems to be going around about that whole GPU T&L thing when applied to Quake3, well Shugashack [3dshack.com] amazingly enough has the answer from one of the developers working with Quake3 technology. Here you go, right from the Shack. His benchmarks [3dshack.com] of the card are pretty good too.

Quake 3 does indeed use T&L and will take advantage of any hardware supporting it. It uses OpenGL's transformation pipeline for all rendering operations, which is exactly what T&L cards such as the GeForce accelerate.

Well what if Q3 used the other stuff besides the transform engine? The other three real features are the per-vertex lighting, the vertex blending, and the cube environment mapping. Since Quake 3 has static world lighting, one of the only places for the lighting to be useful would be for the character lighting, especially for dynamic lights. The current character lighting implementation is pretty quick though, I don't really see *too* much of an improvement there, though it is worth mentioning. The vertex blending may help skeletal animation, but since the current test has no skeletal animation, it would not help it at all in the current benchmarks. And the cube environment mapping won't help the game at all, since the game doesn't use cube environment mapping to begin with.

While I'm at it, the use of OpenGL doesn't necessarily mean that all games will be accelerated by the GeForce's T&L. Such examples are Unreal engine (including UT) based games. Its architecture is very different from QuakeX's and cannot benefit from T&L hardware without rearchitecting the renderer, as Tim Sweeney has said before.

• #### Tomzilla (Score:3)

by Anonymous Coward on Monday October 11, 1999 @02:44PM (#1622315)
Tom Pabst (of Tom's Hardware) has gotten himself mixed up in a lot of tough questions about his journalistic integrity (or lack thereof). There have been many accusations that he was a little too generous with certain reviews in exchange for getting hardware to review before anyone else on the 'net, and there was a big stink about him rushing to publish Q3Test "benchmarks" without even looking into whether such a thing would have any basis in reality. Tom has responded to some of these allegations, and his responses have not been particularly professional.

Anyone on the net can put up a web site and review products. Don't take other people's reviews (which are really opinions) as truth. Seriously question those doing the writing. Many online authors haven't displayed much professionalism, and those types are probably best avoided.

• #### Re:Comparison to Pro Graphics? Here: (Score:3)

on Monday October 11, 1999 @04:17PM (#1622316)
Riva3D [riva3d.com] ran the GeForce256 through Sense8's Indy32 benchmark. The results are here [riva3d.com].

As far as I can gather, it looks pretty promising (with the right CPU; they used an Athlon).
• #### Next step (Score:3)

on Monday October 11, 1999 @04:45PM (#1622317)
Seems to me the next logical step is a graphics card that can handle more of the game's duties. If a box is built right, the CPU can be slow but everything flies, because the work is handed out to chips specialized for different tasks (see: the Amiga, mid-1980s, which is still a superior design to any current PC). This chip is a good first step in that direction, taking over lighting and such, eliminating the need for faster AGP transfers.
Ideally, I would like to see a graphics board that actually takes over some of the program itself. Of course, it would be even better to have a NUMA motherboard with one chip dedicated to I/O, another to graphics, another to sound (not through an ISA/PCI slot), so the CPU itself wouldn't have to be the latest and greatest to turn out incredible results. These guys are turning out a chip in the ballpark of $100/piece wholesale that runs circles around any CPU. The whole computer needs to get that way. The only time you should ever need a fast CPU is for science/math, not for a normal desktop machine.

***Of course, Transmeta might change the whole scenario, because if their chip can be reprogrammed on the fly to do things like graphics, then there's no need for so much dedicated hardware.
• #### random, possibly baseless points and conjecture (Score:3)

on Monday October 11, 1999 @05:54PM (#1622318) Homepage Journal
At first, I thought the moderators were all smoking crack again, but I see that they probably ran out of moderation points... Why is it that the subject of 3D graphics cards seems to bring out such obnoxious folk? Frankly, I'm just not interested in these new components. Is an extra $100 enough to justify a 5% increase in performance, and if so, how many generations should be skipped after that before upgrading? Nvidia is talking about a six-month schedule (though nine months to a year seems more realistic).

At the rate things are going, graphics cards will soon be the most expensive component in every system, even with RAM at its current prices. I'm also willing to bet that NetBSD will be ported to exclusively use the GPU, bypassing most components altogether, before the product is even released...

For me at least, I can't justify the costs of upgrading my system every six months just so I can play the newest rehash of a ten year old game. It doesn't impress me that the *new* version gives you more control, gore, levels, and/or 3D graphics -- I liked the *old* game just fine.

CPU and component speed haven't been the bottleneck in games for a long, long time. Game developers' imaginations have been occupied with exploiting the hardware-acceleration buzzword of the moment, not with developing groundbreaking new ideas...

My US\$0.01 (lousy Canadian pennies :)
• #### Re:Linux Compatibility? (Score:4)

on Monday October 11, 1999 @03:05PM (#1622319) Homepage
Doing a search for geforce on www.linuxgames.com revealed this snippet from an irc log:
-----------
([Jar]2) (orlock) WIll they still be supporting Xfree86/Mesa3D/glx/linux/etc like they have in the past?
(nvdaNick) Yes.

(MicroDooD) (LaRz17) Will drivers for multiple operating systems be released at the same time?
(nvdaNick) As for driver releases, I think NVIDIA is planning to release all drivers at once.

([Jar]) (MfA) Will the non windows drivers be open source? (ie not run through the pre-processor)
(nvdaNick) What would you want with open source drivers, by the way?
(nvdaNick) I'm not sure what our plans will be regarding that.
-----------------

\begin{speculation}
Anyway, if this is correct and nVidia is going to have official support for Linux, they are probably going to use SGI's sample implementation and thus cannot release their driver as open source.
\end{speculation}
• #### Boy, does he hate 3dfx o_O (Score:4)

on Monday October 11, 1999 @05:15PM (#1622320) Homepage Journal
What happened with that? Did they make fun of him or not give him cards to test or something? Like anybody, I have pet vaporware that I'd like to see succeed and become real, and for me that's the next-generation 3dfx stuff with the antialiasing and motion blur (the former of which would work with old games too). It's OK with me if it doesn't fly; I'll still wait and see what happens with it. But it's pretty boggling to see this guy kicking at 3dfx so badly. He was coming up with these big benchmarks for a GeForce card that people can't even get yet, and making nasty remarks about how poorly the Voodoo3 measured up (when actually it ran competitively under Glide, where available). And how old is the V3 by now, compared with a GeForce that people can't even get at the moment?
• #### MPEG 2 support (Score:4)

<eric.brouhaha@com> on Monday October 11, 1999 @03:13PM (#1622321) Homepage Journal
It's nice to see that they've apparently added some of the MPEG 2 motion compensation support that ATI has had for a while. But I really wish they would bite the bullet and add a full MPEG 2 decoder. It would only take about a half million transistors; no one would even notice the extra die area.

Software MPEG 2 decoders for Windows basically suck, and there aren't (yet) any real-time decoders for Linux anyhow. Hardware decode is the way to go.

I keep hoping someone will ship an inexpensive VIP-compatible MPEG 2 decoder daughterboard that I could use with my Asus V3800 TNT2 card, but it hasn't happened yet; simply building it into the next-generation nVidia chip would be even better.

Eric

• #### Tom's...and every other hardware site too (Score:5)

on Monday October 11, 1999 @02:54PM (#1622322)
Did an NDA expire today or something?

Anandtech GeForce 256 Review [anandtech.com]
Ace's Hardware GEForce 256 Review [aceshardware.com]
RivaExtreme GeForce 256 DDR Review [rivaextreme.com]
GA Source Guillemot 3D Prophet Review [ga-source.com]
3DGPU Geforce 256 DDR Review [3dgpu.com]
Fast Graphics Guillemot 3D Prophet Review [fastgraphics.com]
CGO GeForce 256 Preview [cdmag.com]
Shugashack GeForce, V3 and TNT2 benchmark roundup [shugashack.com]
Riva3D Full GeForce 256 DDR Review [geforceddr...argetblank]
GeForce 256 DDR Review at Planet Riva [planetriva.com]

Any others?


