
Nvidia GeForceFX (NV30) Officially Launched
egarland writes "Tom's Hardware has a new article previewing the new GeForceFX chip and discussing its architecture: 0.13-micron process, 16 GB/s memory bandwidth, 128-bit DDR2 memory interface, 125 million transistors, support for 8x FSAA. Sounds like an interesting chip. They stuck with a 128-bit memory bus, so ATI's R300 still has more memory bandwidth (19.8 GB/s), but NVidia has new lossless memory compression, so we'll have to wait for benchmarks to see whether NVidia comes out a winner here. The reference card also sports a massive new cooling system which is worth a look."
Readers Oliver Wendell and JavaTenor add links to additional stories at The Register and at AnandTech.
Alas.... (Score:2, Funny)
That's nice ... (Score:3, Interesting)
That's nice. Now maybe NVidia will find the time to FIX THEIR FUCKING DRIVERS. Christ, they're becoming the new Diamond when it comes to shitty software.
Re:That's nice ... (Score:2, Insightful)
Why is this flamebait? He's dead on - Detonator has started sucking lately. Flamebait would be "NV IS POOP ATI RULES", or it would be if his claims were false - but Detonator really has been sucking ass with the past few releases.
Doom III (Score:2, Interesting)
Anyone know how it works with Doom III?
Not like Tom's would post benchmarks, but maybe "someone" has tried it
Re:Doom III (Score:3, Informative)
Re:Doom III (Score:5, Informative)
Good now I can afford a Ti4600 (Score:4, Insightful)
Re:Good now I can afford a Ti4600 (Score:5, Informative)
Re:Good now I can afford a Ti4600 (Score:2)
"but honey, if I get this, then you can put the 64 meg in your slot"!
Better grab one soon though. (Score:3, Informative)
In short, better get that Ti4600 card very soon, because they could be gone in a matter of months.
Re:Better grab one soon though. (Score:2)
Where'd you read that?
Re:Better grab one soon though. (Score:2, Funny)
I am glad you summarized that, because I was having trouble digesting the entire *sentence* before that.
Re:Good now I can afford a Ti4600 (Score:5, Insightful)
Not necessarily...
nVidia likes to announce things well in advance of shipment in order to convince people to wait. This is perfect timing to keep those gamers from scooping up 9700s for the Christmas season.
Note that nVidia announced the nForce 2 way back in July [nvidia.com] and you still can't buy one.
With business practices like that, I like to take my dollar to the competition. ATI is very good about keeping products hush-hush until they are close to shipment. I wouldn't expect the FX anytime soon.
So the prices of the 4600s won't be dropping as a result of nVidia announcing something that won't be on shelves until next spring.
Re:Good now I can afford a Ti4600 (Score:3, Interesting)
Bull. ATI has only recently stopped sucking in many areas, and that used to be one of the worst. Ask anyone who had an ATI Rage Pro or another card that very clearly stated OpenGL support on the box, while a visit to the website merely announced upcoming support. For nearly a year it was "soon to be released," until finally support for the card was almost totally dropped.
Hush hush my ass. ATI have always made some good products with some bad features, and they've always talked a whole lot more shit than they should have been able to get away with.
In the past year and a half things have been going really well for ATI, but I'm very convinced ATI would still be breaking promises if they hadn't bought ArtX.
I would also like to say I never really thought ATI's older cards sucked, because on paper they should have been excellent cards, but crappy drivers almost always seemed to be the limiting factor. I owned a few ATI's but broken promises several times over drove me to NVIDIA. Yes, ATI currently makes the fastest card, but you know what? I still get plenty of satisfaction out of my current NVIDIA card and I feel no need to replace it just quite yet, not even with another NVIDIA card.
When the time comes to upgrade, I'll look over my options and decide then. But NVIDIA hasn't let me down in the past, and I still haven't forgotten what ATI was like just a very short time ago.
So will it work on my system? (Score:2)
Nothing to fear from NVidia: ever since the GF2 Ultra they've produced Linux drivers for all their cards before they even hit the shelves. But does anyone have any info?
Doom III (Score:2, Funny)
Well, I can see that it allows blood to drip 2x as fast as my 128 Meg Geforce 4400.
And, wow! You can totally see the eyelids blur as characters blink!!!
What great features in this cool cool engine. I think I can even see the blood polygons underneath the characters' pixelated skin!
Don't even get me started about the quality of reflections in the moving water.
DAAAMN!
-S
Some other useful links (Score:5, Informative)
NVNews has a large group of links to previews [nvnews.net] (scroll down to the "Geforce FX Preview" article)
Some impressive images from the release demos [nvidia.com]
Re:Some other useful links (Score:3, Funny)
Kind of adds a whole new meaning to the "Force" in GeForce.
The GeForceFX - so fast it leaves skid marks in your wallet!
Cooling System (Score:4, Interesting)
I'd think it would make more sense to use air inside the case and blow it out the back. With a grill/fan on the front of the PC, you're helping to improve the overall air-flow inside the system instead of just recycling your heat-wash.
Re:Cooling System (Score:2)
It's always easier to work within the confines of a self-contained system such as the one they've created than rely on outside factors being just right.
Re:Cooling System (Score:2)
I hate to imagine... (Score:2)
Have you seen the cooling systems some of these manufacturers have attempted with their Ti4600 cards?! (eek.) I can just see the enormous cooling monstrosities on GeForce FX cards when the production models come out in late January 2003. They could make CPU coolers look downright conservative in comparison.
Re:Cooling System (Score:5, Informative)
It's similar to how you can't feel the air blowing towards a fan intake as well as you can feel the air blowing out. Try it with a household fan sometime. Orient your hand parallel to the intake/output so that you're not blocking the flow much.
So, if they can get the cool air from outside, it's a better solution than using the pre-heated air from in the case.
Re:Cooling System (Score:3, Insightful)
Oops.
You can't suck in air from the case because you can't be sure that there's enough ventilation to let you suck the air in -- you always want to maintain an equal ratio of input and output airflow. The only way Nvidia could do this is to put the intake and the output on the card itself, which leads to the situation we see currently.
Preventing the output from being sucked back into the intake is pretty trivial, though - take a piece of cardboard and put it between the two. That will solve the majority of the problem. Yes, it's inelegant. But if the cooling problem has gotten to the point where you need a heat pipe with a blower separate from the rest of the system, you're pretty much SOL on elegant solutions anyway.
Wattage (Score:3, Interesting)
I'm just thinking of the power economics of today's 3D accelerators...
lossless compression (Score:2, Redundant)
Re:lossless compression (Score:2)
Re:lossless compression (Score:5, Informative)
How many watts? (Score:3, Interesting)
I'd also love to hear how loud this video card is... blowers are generally pretty noisy.
Re:How many watts? (Score:2)
It seems that the latest GPU's from both major manufacturers are favoring a brute force approach to performance, rather than improving their architecture.
Did you read the article? The NV30 is a completely different design than previous GeForces.
Re:How many watts? (Score:2)
Re:How many watts? (Score:3, Informative)
woot. (Score:5, Funny)
There I was with my Beowulf cluster of GeForceFX(NV30) cards..
The duct tape glistened in the weak 40 watts of light in my parents' basement. "g1bb0r m3 T-Fl0p5!" I screamed but it was not to be. There was no joy in Mudville, the mighty cluster had blown a fuse.
Love the cooling system (Score:5, Funny)
I could hook that thing up to my ductwork and save a fortune on natural gas this winter.
Don't stop there. (Score:2)
I definitely want good graphics, but the cooling problems that these new cards bring with them are just getting ridiculous.
Re: (Score:3, Funny)
To Late For The Fall Leaves (Score:5, Funny)
I've been searching for years for a leaf blower that could run Doom III at acceptable frame rates.
cooling excess... (Score:5, Insightful)
Obviously inserting it won't be easy - expect a lot of breakage and damage returns.
Re:cooling excess... (Score:5, Insightful)
1) Improve airflow to the video card
2) That first PCI slot often shares an IRQ with the AGP slot - uncool, performance-wise.
So for the gamers that the card is targeted at, business as usual.
For everyone else, I'm sure it'll be implemented with a more 'normal' cooler.
If a 1.3 GHz Tualatin P3 and a 1.8 GHz P4 can run a low-profile cooling setup in a 1U rack, so can this.
Or they could place the GPU back on the 'top' of the card so that heat can rise off it and out of the case, equip it with a more conventional GF4 style sink/fan, and there ya go.
Also note, that this is an optimized, hopped up reference board for Tom, and not something we'll ever be buying. It's like a concept car at a car show.
I've been burned enough with Tom's special 'reviewer edition' hardware ad-hype pieces. Wait for the real thing.
Re:cooling excess... (Score:3, Informative)
Can't do that -- there's not enough clearance between the AGP slot and the CPU slot or other MB components to put in a HS/fan, much less this monstrosity.
Heck, I bet the heatsink on the back renders it incompatible with some motherboards because there are large caps too close to the AGP slot.
Re:cooling excess... (Score:5, Interesting)
Which spec? Would you care to give references? While the heatpipe/blower is indeed massive, I see nothing to indicate that it does not comply to the ATX 2.03 spec.
Since when do I need to free up two slots to add a graphics card?
Well, with the Voodoo2 I had to clear up 3 - the main video card and 2 more for the dual V2 setup.
And who uses all their slots anyway? Excepting micro-ATX systems like Shuttles, how many people actually have an AGP card and 4-5 PCI cards? Oh, sure, there will be some here, since this is Slashdot.
Another poster made some good comments about why you should leave the PCI slot next to your video card empty anyway.
Oh, and would you like to take a guess at how many current cards prevent use of the adjoining PCI slot because of the normal fan/heatsinks? Most of the high-end Ti4600 designs fall into this category.
Obviously inserting it won't be easy - expect a lot of breakage and damage returns
Doubt it. About the only problem with inserting it will be the mass - it's going to be rather ungainly compared to a normal card. The distance between slots is spec'd, so actually lining it up is a non-issue. And it's not actually plugging into the PCI slot either, so alignment isn't a problem there either.
Of course, if this whole thing scares you, or makes too much noise (which it probably will - sigh), then don't buy it. There will be a slower version available that has a more normal profile. I still wouldn't recommend utilizing the PCI slot next to it though.
Dawn demo looks awesome (Score:5, Funny)
1 [nvidia.com]
2 [nvidia.com]
3 [nvidia.com]
Re:Dawn demo looks awesome (Score:3, Funny)
Re:Dawn demo looks awesome (Score:2)
Re:Dawn demo looks awesome (Score:3, Funny)
Re:Dawn demo looks awesome (Score:2)
Thanks a lot, Nvidia! (Score:2, Funny)
Article At HardOCP.com (Score:3, Informative)
Water cooling needed a killer app... (Score:2)
Cooler similar to Abit OTES (Score:2)
"Officially Launched" (Score:5, Funny)
Dear Timothy,
1. Do you understand what the word 'launch' means?
2. Are you aware it is not yet February 2003?
Re:"Officially Launched" (Score:2)
Sharky Extreme Article (Score:2, Informative)
http://www.sharkyextreme.com/hardware/videocards/
My dog ate my sig.
Still far off (Score:2)
I'm telling people who are prone to buying me gifts to go for the Geforce 4 Ti4200 128MB, which is about $150 right now. The Radeon 8500 is nearly as good if you're not stuck on NVidia like I am, and the 128MB version is under $100.
And for those of you who haven't seen it yet, here's the NVidia promo video [nvidia.com], which has taken a lot of criticism.
always half the story. (Score:3, Insightful)
But has nVidia done anything towards improving 2D and multimedia performance yet?
The difference between the Radeons and the GF4s when it comes to watching DVDs, using TV-out, or just plain desktop computing is night and day.
The nVidia offerings always seem plagued with washed-out colors, shimmering refresh rates, albeit not nearly as bad as the 3DFX offerings. ATI cards have always been as good as it gets.
Sure, I do a lot of gaming, but not all of it is in 3D. I also watch movies, write code, surf the net, etc., etc. nVidia never pays attention to any of that, and neither do any of the review sites.
Video card != 3D Accelerator alone, IMO.
Re:always half the story. (Score:2)
DAMN YOU ALL TO HELL! (Score:2)
It's the announcement of the announcement, for Christ's sake! Can't you wait until there are at least some benchmarks, so I can read my nForce review in peace?!
what gives?? (Score:2)
Most ATX power supplies for the past 2 years have had the special "video card" power leads and connector... WHY THE HELL don't the card makers use it? It's there, tie-wrapped up and stuck to the top of most anyone's case to keep it out of the way, because no one has been using it.
Granted, using a standard FDD connector is easier and cheaper, but why specify it and never use it?
My prediction (Score:5, Funny)
I predict that we'll soon be buying big metal graphics controller boxes from nVidia complete with heavy duty power supplies and massive cooling capacity. After you get it home, you'll open up your graphics adapter and insert a little motherboard and CPU into an option slot to complete your computer system.
Exciting (Score:5, Funny)
1) Who needs all that power anyway? I'm running Windows XP just fine here on my 486SX/33!
2) Who cares if it's fast? It uses up too much power and has a *fan* on it. God forbid a computer have a fan on it! It sucks because it's not fanless like my Mac!
3) Sure it might be fast, but I bet it isn't as *efficient* as a G4!
4) NVIDIA sucks because its drivers are closed source.
Did I forget anything? Anyway, I couldn't care less what the lamers think. This is a genuinely cool piece of hardware. There are a few things that make it so:
1) 500 MHz! That's half a gigahertz! A very large jump in clock-speed here, much more so than the usual 33 MHz pussy-footing the industry (particularly Intel!) is guilty of.
2) Compressed-memory access. Ah, computational power exceeds memory bandwidth to the point that it's more efficient just to compress the data before sending it over the bus... The 16 GB/sec memory bandwidth (which is also quite a big jump from existing machines) is made even more impressive by a lossless compression that can achieve 4:1 ratios. This is very helpful for multisample AA graphics, because it reduces the memory bandwidth hit to just the pixels that occupy the edges of polygons rather than every pixel in the scene.
3) Fully floating point pixel pipelines. Carmack was asking for 64-bit floating-point pixel pipelines a while ago. While this doesn't quite get there (it's 32-bit floating point), it is a major step, and it makes life a lot easier for game developers.
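To illustrate point 2 above (the lossless compression): here's a toy C++ sketch of why the savings land on polygon edges with multisample AA. It's my own back-of-the-envelope illustration, not NVIDIA's actual hardware scheme - the real thing presumably works on tiles of the framebuffer, not single pixels.

    #include <array>
    #include <cstddef>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Toy model of lossless color compression for 4x multisample AA.
    // Interior pixels: all four subsamples hit the same polygon, so they
    // hold identical colors -> transfer one color plus a flag.
    // Edge pixels: subsamples straddle two polygons -> transfer all four.
    struct Pixel {
        std::array<std::uint32_t, 4> samples; // packed RGBA8, one per subsample
    };

    bool isUniform(const Pixel& p) {
        return p.samples[0] == p.samples[1] &&
               p.samples[0] == p.samples[2] &&
               p.samples[0] == p.samples[3];
    }

    // Bandwidth cost in 32-bit words: 1 if compressible, 4 if not.
    std::size_t bandwidthWords(const std::vector<Pixel>& fb) {
        std::size_t words = 0;
        for (const Pixel& p : fb)
            words += isUniform(p) ? 1 : 4;
        return words;
    }

    int main() {
        // Pretend 95% of pixels are polygon interiors and 5% are edges.
        std::vector<Pixel> fb(100000);
        for (std::size_t i = 0; i < fb.size(); ++i) {
            std::uint32_t c = 0xFF8040FFu;
            fb[i] = (i % 20 == 0) ? Pixel{{c, c ^ 1u, c, c}}  // edge: samples differ
                                  : Pixel{{c, c, c, c}};      // interior: identical
        }
        std::printf("uncompressed: %zu words\n", fb.size() * 4);
        std::printf("compressed  : %zu words\n", bandwidthWords(fb));
    }

Since only a small fraction of pixels in a typical frame sit on polygon edges, most of the 4x color-bandwidth hit from multisampling simply evaporates.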
Overall, this card is definitely in the cards for me.
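And point 3 matters more than it sounds. A trivial sketch (my own toy numbers, nothing to do with any real driver or API) of what happens when intermediate results get squashed back into 8 bits between passes instead of staying in float:

    #include <cstdint>
    #include <cstdio>

    int main() {
        std::uint8_t fixed8 = 200;     // channel value stored in 8 bits between passes
        float        fp32   = 200.0f;  // same value carried in 32-bit float

        // Darken by 1% per pass, 100 passes.
        for (int i = 0; i < 100; ++i) {
            fixed8 = static_cast<std::uint8_t>(fixed8 * 0.99f); // fraction thrown away each pass
            fp32  *= 0.99f;                                     // fraction kept
        }

        std::printf("8-bit pipeline : %d\n", static_cast<int>(fixed8)); // ends up around 50
        std::printf("float pipeline : %.1f\n", fp32);                   // ~73.2, the true 200 * 0.99^100
    }

Stack that kind of rounding error across dozens of blended passes per pixel and you get exactly the banding that full floating-point pipelines are supposed to kill.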
Re:Exciting (Score:3, Insightful)
I couldn't care less whether their drivers are closed or open. I just wish they'd make them stable! The Nvidia drivers have crashed my machine 3 times in the last 6 months. That's unacceptable.
GeforceFX Launch Games (Score:4, Informative)
How in the hell do I use that? (Score:3, Insightful)
Re:How in the hell do I use that? (Score:2)
It is upgrade time... (Score:4, Interesting)
It's funny - practically my entire workstation (P4 2.2 GHz, 256MB DDR400, 80GB HD, etc.) has been upgraded component by component, yet my video card has remained static. Not that I'm complaining, because I can run pretty much every game out there at (what I consider to be) fairly decent speeds. Take Age of Mythology as an example: it's more than fast enough. Unreal Tournament 2003 is a tad different, as I have to turn down some of the graphics, but it's still fine for the 'average' game. Plus, my Xbox and PS2 are there for my gaming needs.
Now, does the theory of diminishing marginal utility apply to video cards, or is it the opposite? How much more powerful can video cards get before we won't even 'notice' (at least in the loose sense) any difference when playing games? The Radeon 9700 Pro (with a fast CPU) can run practically every game on the market at max details at most resolutions. Well, so can the GeForce FX 5800. Sure, it may be 30-50% faster, but the utility gained for current games is definitely marginal.
Since I've held out for 2 generations of video cards, for me it's definitely time to upgrade. Though it's not really because my video card is too 'slow' - I suppose it's an issue of just gloating to my friends!
Moreover, in terms of approaching cinematic rendering, nVidia is definitely taking the right steps. They are quickly approaching "Final Fantasy" levels of output quality. Nonetheless, they'll still need to add quite a bit of horsepower to be able to do it all in real time.
Re:It is upgrade time... (Score:3, Insightful)
I'll agree with that, but now that the two top dogs are both ready for DirectX9, it's time for them to stop adding proprietary extensions and to compete on speed and price.
Ultimately, creeping featuritis is good for no one: not for the manufacturers, who have to figure out a way to top each other; not for the consumers, who spend top dollar on cards that get obsoleted by superior technology; and most importantly, not for the game companies, who can't make money with products that only work on bleeding-edge tech. Fine, the GeForceFX allows 65,536 maximum instructions per vertex, but what if the gamer "only" has a Radeon 9700 Pro? They're limited to 1024 max instructions. What if they have a GeForce3? They're out of the loop altogether. That's why, despite all the advancements we've seen lately, games are only now coming out that list T&L-accelerated cards as a requirement. Programmers (excepting id Software, who are in the business of selling their engine more than actually programming "games") aren't going to use the most advanced features until they can be reasonably sure a large segment of the buying public won't be shut out. So please, nVidia and ATI, slow down on the features, let's lock into what we have now (much as AMD and Intel have pretty much locked their feature sets), and let's get these cards down to a price level where one doesn't have to take out a second mortgage to afford them. They're only toys, after all.
Oh no (Score:2, Interesting)
Unless they can trim that extra fat off the board I'll stick with ATI's offerings.
Anandtech says it all. (Score:4, Informative)
Myself, I had a GF3 Ti500; I upgraded to a GF4 4600, but it wasn't much faster, so I returned it. Then a couple of games came out (Battlefield 1942, Unreal Tournament 2003) that really needed some graphics horsepower. So I bought the ATI 9700. Amazing. I can run older games with 6x AA perfectly, and newer games run at 60 FPS with 2x AA enabled. The card also works fine with the CVS version of XFree86 (or VESA mode for the older 4.2.1). Also, I can output to TV at 1024x768 and have it mirror my monitor - great when playing some multiplayer games or watching some DivX/SVCDs. The ATI 9700 is a very nice product, and I found some great forums at Rage3d [rage3d.com] for questions and updated beta drivers (like the new DX 9.0 drivers and DX 9.0 demos).
China Syndrome (Score:4, Funny)
nvidias gffx funfacts (Score:2)
I like this one: Can render >100 Jurassic Park dinosaurs at 100 frames per second.
powerful, yeah.
Anyone else disgusted with NVIDIA / NV30 Launch? (Score:2, Insightful)
I'm disgusted with the overabundance of hype with this launch. That's what this launch is. Of course there's no real substance because there's no shipping product!
And maybe it's not just NVIDIA. A lot of companies hype their products when they launch. Gee, even if the launch is three months away. But what really gets me though is the AMOUNT of pure meaningless crap that is spewing from the websites I've seen.
Tell me how it's going to benefit the consumer, by:
1. Comparing the numbers like the "instructions," "constants," and "registers" that this new chip allows. These kinds of numbers mean nothing to the consumer. If nothing else NVIDIA should be pitching this crap to developers.
2. Posting some really pretty pictures of things supposedly rendered with this card. Let me tell you why this is so ridiculous.
I did a little test [hardocp.com]. This is what you were supposed to get with your GeForce 3 (according to the picture in a HardOCP preview). Guess what: no games even LOOK like that yet, and even if one did, you couldn't play it on a GeForce 3 at acceptable frame rates! Sigh. Things are just getting worse.
3. Real performance. I really can't believe that Anandtech posted frame rate numbers from Doom 3 that were supplied by NVIDIA. Data from an alpha game, supplied by the card's manufacturer? Yet no tests were shown of any other game, be it current or old. That is just ridiculous.
Maybe it's not realistic to do this since the card is not even in production yet. Yet NVIDIA chooses to 'announce' their card anyway, in the same fashion they have in the past (when the product was usually actually available). Right. It's a very clever game NVIDIA is playing: announce this new product and attempt to hurt sales of the competitor's product in the hope that the consumer waits for this new, over-hyped and untested product. We've seen this before with the GeForce 3, and we're seeing it again on a larger scale, and I'm sick of it.
OK, so please flame me up the arse for bitching about the current state of deception going on in the industry. Yeah, lots of companies do it (though I think NVIDIA is the worst), yet people just eat this shit up! What's the point of going to different web sites when they're all supplied with the same incessant crap that NVIDIA created? I don't want to hear that it's just "the way things are," because I'm saying things shouldn't be this way.
Thanks for reading.
All that copper looks expensive (Score:4, Funny)
Regarding those comments about the cooling system not having a filter, this is a pre-production model. Give it some time, it will have to use a filter to keep the small space between the copper fins free of dust.
Hey Bob, while you're out at Murray's Automotive, get me a new oil filter model number P3160 for a Saturn SL2 dual overhead cam and FX160 filter for my NVidia graphics card, 128MB DDR2 RAM, and be sure to read the serial number information. My FX card is post 4375XXX, so it doesn't need a finotany rod or a muffler bearing.
Nvidia picked up a 3Dfx trait... (Score:2)
The result seems to be bloody huge cards! I think they need to concentrate on finding ways to keep these cards SENSIBLY cool - not bolting on huge copper coolers, which expand into a 2nd PCI slot, just to keep the GPU cool.
Its crazy I tells ya!
Turbo power, the problem (Score:3, Insightful)
From a developer perspective, we're headed for a shader fight between NVidia's Cg, OpenGL 2.0 shader languages (shader assembler, ISL, and Quartz Extreme) and Microsoft's HLSL. It's not enough to have shader languages; they have to be supported in the content creation tools, so the artists can see what they're doing. This will take a while.
Developers need to buy this thing, but everybody else can wait a year.
Will FX’s deeper shaders ever get used? (Score:3, Interesting)
"However, two questions remain - will developers use the extra shader capabilities over R300 and will shaders of the full length of GeForce FX actually be sensible to run in real-time? Undoubtedly there will be some developers who will choose to go for as much as the hardware will allow, but if the past is any indication then it will likely be the API specifications that will be the leveller and many developers may just opt to code for the base VS/PS2.0 DirectX9 specifications."
Haven't the R300 and NV30 just established D3D's vanilla pixel/vertex shader 2.0 as the LCD for mainstream gaming development? Will all that 2.0+ hotness of the new FX actually end up never getting used? What say ye, developers?
Hell, I'm still waiting for something (anything) DX9 to push my 2-month-old 9700 Pro.
Ex-3DFX Engineers Strike Back (Score:4, Funny)
It's amazing!
The specs for this board should include a noise dampener to counter the hoover that they have strapped to its circuit board.
The ex-3DFX engineers that NVidia acquired somehow managed to brainwash the NVidia guys into releasing a gigantic monster of a board whose impracticality and ungainliness only the Voodoo 5000 can rival.
Those 3DFX guys have had their revenge.
Buh bye, SGI (Score:3, Interesting)
I believe that current Nvidia Ti4600s have 128MB (256?) of memory, so I hope that a professional-level version of this new card might scale to the half gig we need.
Additionally, the SGI is 12 bits per color channel, which is a bummer since the interface it is simulating is 16-bit monochrome. Sure, you can try to do tricks, but from a quick glance over the FX's specs, I see 32 bits per channel, which would be very nice.
With this FX card, a reasonably configured AMD Clawhammer system, and the scalability and preemption work that's going into the 2.6/3.0 Linux kernels, we might be able to move off SGI within the next year or two, saving taxpayers on the order of $40-80k or more per system. A lot of development is already done on Linux, but it sure would be nice to move over fully.
Questionable Name (Score:5, Interesting)
Re:Questionable Name: Superstitious? (Score:3, Insightful)
Remember that company called 3DFX, whose "last" card was the Voodoo5? Then a powerhouse called Nvidia took over as the high-end "King of the Hill".
Funny how that works, eh?
I mean, it's not like Nvidia has anything to worry about with ATI taking the performance crown.
Quit moaning about the fan... (Score:4, Informative)
"NVIDIA has hinted at offering another version of the GeForce FX at a lower clock speed that would only occupy a single slot cutout, but we will have to wait until the product line is announced before we can find out what the differences will be. Our initial guess would indicate that a simple reduction in clock speed would be enough to go with a more conventional cooling setup."
And:
"The other issue that users may have is noise, luckily NVIDIA has taken steps to make sure that the GeForce FX is one of the most quiet running cards they've ever produced. Borrowing technology from their mobile parts and combining it with the FX Flow cooling system, NVIDIA is able to dynamically reduce the speed of the fan based on the graphical needs of the system. When sitting in a 2D situation the card will scale back the clock speed of parts of the 3D pipeline that aren't in use, thus allowing the fan to spin much slower. As soon as you start using the GPU for games or any other 3D intensive applications, the clock speeds up as does the fan. The idea is that if you're gaming you're not as concerned with noise as when you are typing in Word."
Link: http://anandtech.com/video/showdoc.html?i=1749&p=6 [anandtech.com]
Flip-chip technology (Score:3, Interesting)
Apple Leaked Documents! (Score:3, Funny)
Official MacOSX 10.2.7 Patch schedule
Because many new GPUs are reaching a stage where they are faster than our G4s, code has been added to swap the GPU into a CPU and the CPU (G4) into a GPU. We anticipate a 15-30% boost in Photoshop.
Re:Cooling system (Score:2)
Re:Cooling system (Score:3, Interesting)
Re:Cooling system (Score:5, Informative)
Instead of using a filter simply buy either:
1: A can of compressed air every now and then (expensive, but easy and reliable)
or
2: A small air compressor (however, this can get much more expensive in the short term, especially considering you need not only a compressor but also a hose, fittings, an air chuck and, most importantly, a dryer (aka dehumidifier) - so unless you have a lot of stuff that needs cleaning and you live in a place that makes it necessary fairly often, you should probably stick with #1)
I must say though, what a cooling system! I don't know about everybody else, but I used to have a nice voodoo 3500 that would get so hot that you could burn yourself on it, I was always worried about that thing.... I finally rigged up a cooling system for it (yeah I know, buy one.... but it's more fun to make it out of old parts
Re:Cooling system (Score:2)
Re:Here's the no advertisement version (Score:4, Informative)
Re:Here's the no advertisement version (Score:2, Insightful)
There's thousands of people hammering their servers, costing them money for bandwidth and power, and all you can think about is bypassing their MAIN SOURCE OF REVENUE, because it inconveniences you? That's great.
Way to go mods, +5 for stealing advertising revenues.
Re:Here's the no advertisement version (Score:2)
People do not want advertisements - print, radio, TV or internet. Furthermore, nobody needs advertisements. Companies need to advertise to compete with other companies.
Furthermore, I cannot think of one industry where the generation of revenue from advertising has not affected that industry in a negative way. Can you?
Re:Here's the no advertisement version (Score:2)
I'm sure that even YOU go to the bathroom or grab drinks during the commercials on TV... well, shame on you for not holding it, or watching the commercials THEN going to the bathroom during the show!
It's pointless to yell at people for doing something you probably do yourself.
Re:Here's the no advertisement version (Score:2)
First of all - this post assumes that the fundamental reason for the internet is Advertising Revenue - instead of information sharing.
And you get modded up insightful?!
It's funny how things can shift so subtly yet so significantly.
On the one hand, they need revenue in order to stay in business and provide us with the reviews we want to see... on the other hand, the internet is about sharing information without a bias from marketing.
Yet now we see the two are so dependent on each other that we even get people who are upset over the bypassing of advertiser-sponsored information in favor of just the raw information we were after in the first place.
WTF has happened to our perception of the way the internet should be?!
Just because it's the way it is doesn't mean it's the way it should be.
Re:WTF? (Score:2, Informative)
Re:WTF? (Score:2)
Re:Okay, this is getting crazy... (Score:2)
Re:Okay, this is getting crazy... (Score:2)
If this were a product aimed at video edit suites or database systems, you'd be correct.
Re:Crazy World (Score:5, Insightful)
I agree. We're not getting huge, usable leaps in computing capabilities, we're getting continual, incremental improvements. Even these incremental improvements are not coming for free, we're getting them at the cost of increased power consumption, and millions of people throwing away motherboards and video cards every few years. And the incremental nature of it all keeps developers back a couple of generations. It's just barely getting to the point where you can realistically ignore everyone who doesn't have hardware T&L, several years after the introduction of the GeForce 2. But this is still a questionable choice, as a large number of PCs from Dell and Gateway still ship with generic video chipsets that don't have hardware support for T&L. Doom 3, which isn't even on the release radar yet (2003? 2004?), is the first game that's going to require the pixel shaders of the GeForce 3 and beyond. No other developer is going out on such a limb, as cool as shaders may be.
I'd love to see a quantum leap in desktop PC capability that isn't a one-to-one trade of MIPS for wattage. It's very possible, but we're running down this bizarre path where everyone gets all excited about a 9% increase in raw clockspeed (which translates into maybe 4% in benchmarks), even though it increases power consumption by 9% or more.
I'm at the point where I'd be willing to chuck the historic trappings of desktop PCs--x86, UNIX-like operating systems, C++, gcc, etc--for something simpler and cooler running, whose blatant wrongness doesn't eat away at your soul every time you use it. The whole Windows vs. Linux nonsense is a complete red herring in that regard.
Re:Crazy World (Score:2)
Re:Crazy World (Score:3, Insightful)
2.5 years, to be precise. The GF2 was released in May 2000. I wound up having to buy one the second day it was out, so I remember (my old V2 setup wouldn't work in the new system).
Doom 3, which isn't even on the release radar yet (2003? 2004?), is the first game that's going to require the pixel shaders of the GeForce 3 and beyond
Doom3 is allegedly scheduled for Christmas of 2003. I'd be surprised if they missed that, but id software is usually more focused on getting it done right than on time, so who knows.
As for the features - by that time everyone will be going out on the same limb. As usual, the D3 engine will be licensed by many people and all those games will require the same level of hardware. D3 will take advantage of most of the features present in the GF4/GFFx as well, so now we're back to the games being only a year behind the hardware.
I'd love to see a quantum leap in desktop PC capability that isn't a one-to-one trade of MIPS for wattage
Well, I have no idea what the power consumption of the GF Fx is, but it's not a 1:1 trade of speed to MHz - the GF Fx runs at a 500 MHz core, which is roughly a 40% improvement over the Ti4600. For that speed improvement you get (allegedly) up to 400% of the speed. Not bad.
Realistically, though, you've got to be kidding. Science and technology rarely deal with sudden massive jumps in capability or performance. It's all building blocks. If you want a sudden massive jump then you have to skip a few iterations.
Did I mention that I'm still using the aforementioned GF2? Yes, I'm looking to upgrade right now and I do expect a considerable leap in capability and performance.
I'm at the point where I'd be willing to chuck the historic trappings of desktop PCs
So vote with your wallet and stop buying stuff you don't need. The only blatant wrongness is in buying crap you don't need and then whining about it being evil.
Re:Best Value? (Score:2)
Re:Making nVidia work for you (Score:3, Interesting)
I've done this for the GeForce series up until now, sticking with my Riva TNT until the GeForce 2 came out and then keeping my GF2 until I could afford a Radeon 9000 (which is a GF4 equivalent). I've always been happy with my affordable graphics performance (my last three cards have been less than $100 apiece).