Hardware

Nvidia GeForce FX (NV30) Officially Launched

egarland writes "Tom's Hardware has a new article previewing the GeForce FX chip and discussing its architecture: 0.13-micron process, 16 GB/s of memory bandwidth, a 128-bit DDR2 memory interface, 125M transistors, and support for 8x FSAA. Sounds like an interesting chip. They stuck with a 128-bit memory bus, so ATI's R300 still has more memory bandwidth (19.8 GB/s), but NVidia has new lossless memory compression, so we will have to wait for benchmarks to see if NVidia comes out a winner here. The reference card also sports a massive new cooling system which is worth a look." Readers Oliver Wendell and JavaTenor add links to additional stories at The Register and at AnandTech.
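A quick back-of-the-envelope check of the bandwidth figures in the summary. The 500 MHz DDR2 and 310 MHz DDR memory clocks used below are the commonly reported figures for the two cards and are assumptions here, not taken from the article.

```python
# Peak memory bandwidth = (bus width in bytes) x (memory clock) x (transfers per clock).
# The memory clocks below are commonly reported figures, assumed for illustration.

def bandwidth_gb_per_s(bus_width_bits: int, mem_clock_mhz: float,
                       transfers_per_clock: int = 2) -> float:
    """Peak bandwidth in GB/s for a DDR-style memory bus."""
    return (bus_width_bits / 8) * mem_clock_mhz * 1e6 * transfers_per_clock / 1e9

print(f"GeForce FX (128-bit, 500 MHz DDR2): {bandwidth_gb_per_s(128, 500):.1f} GB/s")  # ~16.0
print(f"Radeon 9700 Pro (256-bit, 310 MHz): {bandwidth_gb_per_s(256, 310):.1f} GB/s")  # ~19.8
```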
  • ...poor Tom's Hardware, we knew him well. :-(
  • That's nice ... (Score:3, Interesting)

    by Anonymous Coward on Monday November 18, 2002 @04:27PM (#4700403)

    That's nice. Now maybe NVidia will find the time to FIX THEIR FUCKING DRIVERS. Christ, they're becoming the new Diamond when it comes to shitty software.
    • Re:That's nice ... (Score:2, Insightful)

      by Anonymous Coward

      Why is this flamebait? He's dead on - the Detonator drivers have started sucking lately. Flamebait would be "NV IS POOP ATI RULES", or claims that were actually false - but Detonator really has been getting worse over the past few releases.
  • Doom III (Score:2, Interesting)

    by vasqzr ( 619165 )

    Anyone know how it works with Doom III?

    Not like Tom's would post benchmarks, but maybe "someone" has tried it
    • Re:Doom III (Score:3, Informative)

      by L0rdJedi ( 65690 )
      Actually, Anandtech has a benchmark that was provided by Nvidia showing a 40% increase over ATI's 9700.
    • Re:Doom III (Score:5, Informative)

      by mmacdona86 ( 524915 ) on Monday November 18, 2002 @04:31PM (#4700454)
      AnandTech's [anandtech.com] coverage includes an nVidia-supplied benchmark that shows the NV30 beating the 4600 by 2.5x in Doom 3 (and the Radeon 9700 by about 40%). Of course, no one knows under what circumstances these benchmarks were obtained. I don't think any "independent" benchmarks will be available for awhile.
  • by NetNinja ( 469346 ) on Monday November 18, 2002 @04:28PM (#4700410)
    Time to buy a Ti4600 :)
    • by Camulus ( 578128 ) on Monday November 18, 2002 @04:30PM (#4700439) Journal
      Wait till February if you are going to do that. They haven't even released test samples yet. They have just finalized the design.
    • pricegrabber.com has the Ti4600 128 meg cards on sale for $236... I can't wait till X-mas!!

      "but honey, if I get this, then you can put the 64 meg in your slot"!

    • I've read that nVidia has stopped GeForce4 Ti4600 production and is only selling the GeForce4 Ti4200 GPU.

      In short, better get that Ti4600 card very soon, because they could be gone in a matter of months.
    • by swordboy ( 472941 ) on Monday November 18, 2002 @04:52PM (#4700686) Journal
      Time to buy a Ti4600 :)

      Not necessarily...

      nVidia likes to announce things well in advance of shipment in order to convince people to wait. This is perfect timing to keep those gamers from scooping up 9700s for the Christmas season.

      Note that nVidia announced the nForce 2 way back in July [nvidia.com] and you still can't buy one.

      With business practices like that, I like to take my dollar to the competition. ATI is very good about keeping products hush-hush until they are close to shipment. I wouldn't expect the FX anytime soon.

      So the prices of the 4600s won't be dropping as a result of nVidia announcing something that won't be on shelves until next spring.
      • ATI is very good about keeping products hush-hush until they are close to shipment.

        Bull. ATI has only recently stopped sucking in many areas, and that used to be one of the worst. Ask anyone who had an ATI Rage Pro or another card that very clearly stated OpenGL support on the box, while a visit to the website merely announced upcoming support. For nearly a year it was "soon to be released," until finally support for the card was almost totally dropped.

        Hush hush my ass. ATI has always made some good products with some bad features, and they've always talked a whole lot more shit than they should have been able to get away with.

        In the past year and a half things have been going really well for ATI, but I'm convinced ATI would still be breaking promises if they hadn't bought ArtX.

        I would also like to say I never really thought ATI's older cards sucked; on paper they should have been excellent cards, but crappy drivers almost always seemed to be the limiting factor. I owned a few ATIs, but repeated broken promises drove me to NVIDIA. Yes, ATI currently makes the fastest card, but you know what? I still get plenty of satisfaction out of my current NVIDIA card and I feel no need to replace it quite yet, not even with another NVIDIA card.

        When the time comes to upgrade, I'll look over my options and decide then. But NVIDIA hasn't let me down in the past, and I still haven't forgotten what ATI was like just a very short time ago.
    • Sounds like a good card. I'm assuming it has Linux drivers; otherwise it won't work on my system and I will have to buy a 4600.

      No reason to fear NVidia on that front; they've produced Linux drivers for every card before it even hit the shelves, going back to the GF2 Ultra. But does anyone have any info?
  • Doom III (Score:2, Funny)

    by viper21 ( 16860 )
    How does it run Doom III?

    Well, I can see that it allows blood to drip 2x as fast as my 128 Meg Geforce 4400.

    And, wow! You can totally see the eyelids blur as characters blink!!!

    What great features in this cool cool engine. I think I can even see the blood polygons underneath the characters' pixelated skin!

    Don't even get me started about the quality of reflections in the moving water.

    DAAAMN!

    -S
  • by JavaTenor ( 232983 ) on Monday November 18, 2002 @04:31PM (#4700450)
    NVidia's official Geforce FX site [nvidia.com]

    NVNews has a large group of links to previews [nvnews.net] (scroll down to the "Geforce FX Preview" article)

    Some impressive images from the release demos [nvidia.com]
    • Heh - anyone else see the "NVFairy" on the official site? If the GeForceFX can do that... then I foresee an entirely new market segment opening up in "hardware accelerated" software.

      Kind of adds a whole new meaning to the "Force" in GeForce.

      The GeForceFX - so fast it leaves skid marks in your wallet!
  • Cooling System (Score:4, Interesting)

    by killmenow ( 184444 ) on Monday November 18, 2002 @04:31PM (#4700451)
    I'm not so sure about that cooling system. Why put the intake right next to the output? Seems to me like it'll just be sucking that hot air right back in.

    I'd think it would make more sense to use air inside the case and blow it out the back. With a grill/fan on the front of the PC, you're helping to improve the overall air-flow inside the system instead of just recycling your heat-wash.
    • Who's to say everyone has a fan in the front of their case? I don't bother.

      It's always easier to work within the confines of a self-contained system such as the one they've created than rely on outside factors being just right.
    • I think it's better to do this this way. If I ever get one (ha ha) I'll probably add an exhaust vent and an intake filter. On the other hand, I'm currently running a 4U aluminum rack case (mostly because it was roomy) and it's got a filter on the input fan, so in this box I prefer that everything else simply get air from inside the case. The big fan blows right over the hard drives...
    • ...What ASUS, Gainward, LeadTek, PNY and other nVidia chipset graphics card manufacturers will do in terms of cooling the graphics card for the new GeForce FX 5800 cards.

      Have you seen the cooling systems some of these manufacturers have attempted with their Ti4600 cards?! (eek.) I can just see the enormous monstrosities in terms of cooling systems for GeForce FX cards when the production models come out in late January 2003. It could make CPU coolers look downright conservative in comparison.
    • Re:Cooling System (Score:5, Informative)

      by tbmaddux ( 145207 ) on Monday November 18, 2002 @06:17PM (#4701570) Homepage Journal
      Why put the intake right next to the output? Seems to me like it'll just be sucking that hot air right back in.
      Assuming that you leave enough space behind the PC the card is installed in (that may or may not be a fair assumption), the turbulent jet of air blowing out will penetrate quite a bit farther into the surrounding still air around the PC than the intake is able to draw back in.

      It's similar to how you can't feel the air blowing towards a fan intake as well as you can feel the air blowing out. Try it with a household fan sometime. Orient your hand parallel to the intake/output so that you're not blocking the flow much.

      So, if they can get the cool air from outside, it's a better solution than using the pre-heated air from in the case.

  • Wattage (Score:3, Interesting)

    by haxor.dk ( 463614 ) on Monday November 18, 2002 @04:32PM (#4700459) Homepage
    How many Watts does this monster dissipate?

    I'm just thinking of the power economics of today's 3D accelerators... :/
  • by mikeee ( 137160 )
    Won't compressing that data win bandwidth at the expense of latency? And if it's really lossless, the compression will be worse than useless on some data sets (maybe they can optimize so those are unlikely/invalid ones, I dunno..)
    • I am sure that it does add some latency. However, the GeforceFX should actually have a quicker access time (2.2 ns) than the 9700 (2.9 ns), because it is using DDR-II and, if I remember correctly, the Radeon is still using the first generation. So, yeah, you might be right, but it will still be faster than anything on the market.
    • by mmacdona86 ( 524915 ) on Monday November 18, 2002 @04:43PM (#4700591)
      Compression within graphics boards is very different from other kinds of compression. They aren't really trying to make the amount of data you need to store smaller; they are just interested in making the amount of data you need to shuffle between the chip and the card memory smaller. They also know that in some circumstances (multi-sampling) the data is going to be redundant in very predictable ways. This lets them take some shortcuts that give good average compression ratios, losslessly, with very low latency. The risk of very bad cases is small--people aren't going to run games where everything looks like TV snow--and the worst-case penalty isn't too bad.
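A minimal sketch of the idea in the comment above: when multisampling, a fully covered pixel's subsamples are identical, so they can be sent as one color plus a small tag and reconstructed exactly. The tag scheme below is purely illustrative, not NVIDIA's actual on-chip format.

```python
# Illustrative lossless multisample compression: pixels not crossed by a polygon
# edge have identical subsamples, so one color stands in for all of them.
# This is NOT the real hardware format, just the general idea.

def compress_pixel(samples):
    """samples: list of 4 RGBA tuples for one pixel."""
    if all(s == samples[0] for s in samples):
        return ("uniform", samples[0])        # ~4:1 for interior pixels
    return ("raw", tuple(samples))            # edge pixel: keep every sample

def decompress_pixel(block):
    kind, payload = block
    return [payload] * 4 if kind == "uniform" else list(payload)

interior = [(200, 30, 30, 255)] * 4
edge = [(200, 30, 30, 255)] * 2 + [(0, 0, 0, 255)] * 2
for pixel in (interior, edge):
    assert decompress_pixel(compress_pixel(pixel)) == pixel   # lossless round trip
```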
  • How many watts? (Score:3, Interesting)

    by SClitheroe ( 132403 ) on Monday November 18, 2002 @04:32PM (#4700467) Homepage
    So how many watts is this GPU drawing, to require an active cooling system that major? It seems that the latest GPUs from both major manufacturers are favoring a brute-force approach to performance rather than improving their architecture. I wonder what implications this will have for power supplies in your average PC - are we getting to the point where a fast P4 or Athlon system is going to require a 600 watt or larger power supply to be adequately stable?

    I also would love to hear how loud this video card is... blowers are generally pretty noisy.
    • It seems that the latest GPUs from both major manufacturers are favoring a brute-force approach to performance rather than improving their architecture.

      Did you read the article? The NV30 is a completely different design than previous GeForces.

      • Yes, I realize it's a totally new design, but you'd think that the chips would be getting more power efficient as the technology advances. The thing is maybe 2-3 times faster than the previous GeForce models, but it needs a lot more power, as evidenced by the blower and off-board power connector. Other boards, like the Kyro based ones, seemed to get impressive performance for the power they consumed by using new or different techniques for rendering pixels.
    • Re:How many watts? (Score:3, Informative)

      by be-fan ( 61476 )
      I doubt you know enough about GPU architecture to make that sort of bullshit comment. Graphics is a very simple, very parallelizable problem when you get down to it. What matters (assuming good drivers and adequate memory bandwidth, which isn't always the case) is clock_speed * pixel_pipelines. This has been the case since the Riva 128! Improving the architecture means adding more pixel pipelines (not always useful if the developer can't keep that many pipelines busy) or upping the clock speed. Most operations in a modern GPU already take one cycle, so it's not like they're just pushing along inefficient architectures.
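The clock-times-pipelines rule of thumb from the comment above, as a quick calculation. The clock speeds and pipeline counts below are commonly quoted figures for two shipping cards and are assumptions for illustration; real throughput still depends on drivers and memory bandwidth, as the comment notes.

```python
# Theoretical pixel fill rate = core clock x number of pixel pipelines.
# Figures below are commonly quoted specs, used only to illustrate the rule of thumb.

def fill_rate_mpixels_per_s(core_clock_mhz: float, pixel_pipelines: int) -> float:
    return core_clock_mhz * pixel_pipelines

print(f"GeForce4 Ti 4600 (300 MHz, 4 pipes): {fill_rate_mpixels_per_s(300, 4):.0f} Mpixels/s")
print(f"Radeon 9700 Pro  (325 MHz, 8 pipes): {fill_rate_mpixels_per_s(325, 8):.0f} Mpixels/s")
```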
  • woot. (Score:5, Funny)

    by grub ( 11606 ) <slashdot@grub.net> on Monday November 18, 2002 @04:35PM (#4700491) Homepage Journal

    There I was with my Beowulf cluster of GeForceFX(NV30) cards..
    The duct tape glistened in the weak 40 watts of light in my parents' basement. "g1bb0r m3 T-Fl0p5!" I screamed but it was not to be. There was no joy in Mudville, the mighty cluster had blown a fuse.

  • by grub ( 11606 ) <slashdot@grub.net> on Monday November 18, 2002 @04:37PM (#4700525) Homepage Journal

    I could hook that thing up to my ductwork and save a fortune on natural gas this winter.

    • With a good amount of hose you could clear your yard of any leaf litter. Also, by plugging the hose into the intake side, with a small inline filter, you could have a central vacuum system for your home.

      I definitely want good graphics, but the cooling problems these new cards bring with them are just getting ridiculous.
    • Re: (Score:3, Funny)

      Comment removed based on user account deletion
  • by scotay ( 195240 ) on Monday November 18, 2002 @04:39PM (#4700546)
    Damn, Nvidia, why couldn't you have this thing ready for fall?

    I've been searching for years for a leaf blower that could run Doom III at acceptable frame rates.

  • cooling excess... (Score:5, Insightful)

    by sapgau ( 413511 ) on Monday November 18, 2002 @04:40PM (#4700553) Journal
    This board is clearly out of spec... since when do I need to free up two slots to add a graphics card?

    Obviously inserting it won't be easy; expect many breakage and damage returns.
    • by stratjakt ( 596332 ) on Monday November 18, 2002 @05:21PM (#4700965) Journal
      Most enthusiasts know to leave the PCI slot next to the AGP slot free as it is, for at least two reasons:

      1) Improved airflow to the vid card.

      2) That first PCI slot often shares an IRQ with the AGP slot - uncool, performance-wise.

      So for the gamers the card is targeted at, it's business as usual.

      For everyone else, I'm sure it'll be implemented with a more 'normal' cooler.

      If a 1.3GHz Tualatin P3 and a 1.8GHz P4 can run a low-profile cooling setup in a 1U rack, so can this.

      Or they could place the GPU back on the 'top' of the card so that heat can rise off it and out of the case, equip it with a more conventional GF4 style sink/fan, and there ya go.

      Also note, that this is an optimized, hopped up reference board for Tom, and not something we'll ever be buying. It's like a concept car at a car show.

      I've been burned enough with Tom's special 'reviewer edition' hardware ad-hype pieces. Wait for the real thing.

      • Re:cooling excess... (Score:3, Informative)

        by Zathrus ( 232140 )
        Or they could place the GPU back on the 'top' of the card so that heat can rise off it and out of the case, equip it with a more conventional GF4 style sink/fan, and there ya go.

        Can't do that -- there's not enough clearance between the AGP slot and the CPU slot or other MB components to put in a HS/fan, much less this monstrosity.

        Heck, I bet the heatsink on the back renders it incompatible with some motherboards because there are large caps too close to the AGP slot.
    • Re:cooling excess... (Score:5, Interesting)

      by Zathrus ( 232140 ) on Monday November 18, 2002 @05:46PM (#4701287) Homepage
      This board is clearly out of spec...

      Which spec? Would you care to give references? While the heatpipe/blower is indeed massive, I see nothing to indicate that it does not comply with the ATX 2.03 spec.

      since when do I need to free up two slots to add a graphics card?

      Well, with the Voodoo2 I had to clear up 3 - the main video card and 2 more for the dual V2 setup.

      And who uses all their slots anyway? Excepting micro-ATX systems like Shuttle's, how many people actually have an AGP card and 4-5 PCI cards? Oh, sure, there will be some here since this is /., but most people have video, sound, and network. And nowadays you can do without the network card and perhaps the sound - it's called the magic of integration.

      Another poster made some good comments about why you should leave the PCI slot next to your video empty anyway.

      Oh, and would you like to take a guess at how many current cards prevent use of the adjoining PCI slot because of the normal fan/heatsinks? Most of the high-end Ti4600 designs fall into this category.

      Obviously inserting it won't be easy; expect many breakage and damage returns

      Doubt it. About the only problem with inserting it will be the mass - it's going to be rather ungainly compared to a normal card. The distance between slots is spec'd, so actually lining it up is a non-issue. And since it's not actually plugging into the PCI slot, alignment isn't a problem there either.

      Of course, if this whole thing scares you, or makes too much noise (which it probably will - sigh), then don't buy it. There will be a slower version available that has a more normal profile. I still wouldn't recommend utilizing the PCI slot next to it though.
  • by Anonymous Coward on Monday November 18, 2002 @04:42PM (#4700584)
    NVIDIA has a few more shots of that Fairy:
    1 [nvidia.com]
    2 [nvidia.com]
    3 [nvidia.com]
  • Now that they have a video card that has more impressive specs than my PC I have to upgrade or be made fun of by my rich friends.
  • by AskedRelic ( 620439 ) on Monday November 18, 2002 @04:45PM (#4700614) Journal
    Another preview at HardOCP here. [hardocp.com]
  • And now it has one. With the noise that card's air cooler is sure to generate, perhaps this is the card that will spur DIY types to implement water cooling and make it commonplace. Once it's commonplace, it should become cheaper (one would hope anyway)...
  • Abit's OTES line of GeForce4 cards has coolers similar to the NV30 reference board linked in the post. Abit OTES link: here [abit-usa.com].
  • by nakaduct ( 43954 ) on Monday November 18, 2002 @04:48PM (#4700652)
    Release Date: February 2003


    Dear Timothy,

    1. Do you understand what the word 'launch' means?
    2. Are you aware it is not yet February 2003?

  • by Tidan ( 541596 )
    Here's another one by Sharky Extreme:
    http://www.sharkyextreme.com/hardware/videocards/article.php/1502451

    My dog ate my sig.

  • You won't see the GeforceFX in stores until next February, and then it will probably be around $360 according to NVidia. The Radeon 9700 came out a couple of months ago at about $400, and the mid-range version won't be out until next month at under $200. So the mid-range GeforceFX will probably be out some time next summer.

    I'm telling people who are prone to buying me gifts to go for the Geforce 4 Ti4200 128MB, which is about $150 right now. The Radeon 8500 is nearly as good if you're not stuck on NVidia like I am, and the 128MB version is under $100.

    And for those of you who haven't seen it yet, here's the NVidia promo video [nvidia.com], which has taken a lot of criticism.
  • by stratjakt ( 596332 ) on Monday November 18, 2002 @04:55PM (#4700728) Journal
    3D graphics are fine and good, I do play enough games to want some polygon-smashing horsepower.

    But has nVidia done anything towards improving 2D and multimedia performance yet?

    The difference between the Radeons and the GF4s when it comes to watching DVDs, using TV-out, or just plain desktop computing is night and day.

    The nVidia offerings always seem plagued with washed-out colors and shimmering refresh rates, albeit not nearly as bad as the 3DFX offerings. ATI cards have always been as good as it gets.

    Sure, I do a lot of gaming, but not all of it is in 3D. I also watch movies, write code, surf the net, etc. nVidia never pays attention to any of that, and neither do any of the review sites.

    Video card != 3D Accelerator alone, IMO.
  • I was trying to look for benchmarks on my new nForce board, and half-way through the review, what do I get? Slashdotted! That's what!

    It's the announcement of the announcement, for Christ's sake! Can't you wait until there are at least some benchmarks, so I can read my nForce review in peace?!
  • Just like ATI's Radeon 9700 Pro, the GeForce FX will require an HDD/FDD power connector to operate. If you fail to connect a power cable, the card will still work, just at a lower speed, and it will display an error on your screen.

    Most ATX power supplies for the past two years have had the special "video card" power leads and connector... WHY THE HELL don't the card makers use this? It's there; it's tie-wrapped up and stuck to the top of most anyone's case to keep it out of the way because no one has been using it.

    Granted, using a standard FDD connector is easier and cheaper, but why did they specify it and never use it?
  • by Waffle Iron ( 339739 ) on Monday November 18, 2002 @05:11PM (#4700876)
    There seems to be a trend lately of graphics adapters kludging ever bigger chips and heatsinks onto a PCI card. Motherboards seem to get smaller and more integrated.

    I predict that we'll soon be buying big metal graphics controller boxes from nVidia complete with heavy duty power supplies and massive cooling capacity. After you get it home, you'll open up your graphics adapter and insert a little motherboard and CPU into an option slot to complete your computer system.

  • Exciting (Score:5, Funny)

    by be-fan ( 61476 ) on Monday November 18, 2002 @05:13PM (#4700891)
    New hardware mentioned on Slashdot. Now it's time for all the lamers to come up with the following posts:

    1) Who needs all that power anyway? I'm running Windows XP just fine here on my 486SX/33!

    2) Who cares if it's fast? It uses up too much power and has a *fan* on it. God forbid a computer have a fan on it! It sucks because it's not fan-less like my Mac!

    3) Sure it might be fast, but I bet it isn't as *efficient* as a G4!

    4) NVIDIA sucks because its drivers are closed source.

    Did I forget anything? Anyway, I couldn't care less what the lamers think. This is a genuinely cool piece of hardware. There are a few things that make it so:

    1) 500 MHz! That's half a gigahertz! A very large jump in clock-speed here, much more so than the usual 33 MHz pussy-footing the industry (particularly Intel!) is guilty of.

    2) Compressed-memory access. Ah, computational power exceeds memory bandwidth to the point that it's more efficient just to compress the data before sending it over the bus... The 16 GB/sec memory bandwidth (which is also quite a big jump from existing machines) is made even more impressive by a lossless compression that can achieve 4:1 ratios. This is very helpful for multisample AA graphics, because it reduces the memory bandwidth hit to just the pixels that occupy the edges of polygons rather than every pixel in the scene.

    3) Fully floating-point pixel pipelines. Carmack was asking for 64-bit floating-point pipelines a while ago. While this doesn't quite get there (it's 32-bit floating point), it is a major step, and it makes life a lot easier for game developers. (A small illustration follows below.)
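A tiny, generic illustration of the point in item 3 (not tied to any particular GPU): intermediate rounding in an 8-bit fixed-point pixel format throws away dim detail that a floating-point channel preserves. The scale factor is a power of two so the float path is exact.

```python
# Why floating-point pixel pipelines matter: an intermediate result quantized to
# 8-bit integer steps loses low-intensity detail; a float channel keeps it.
# Generic illustration, not specific to the NV30.

def darken_then_brighten_8bit(value: int, factor: float) -> int:
    dark = int(value * factor)              # forced back onto integer steps
    return min(255, int(dark / factor))

def darken_then_brighten_float(value: float, factor: float) -> float:
    return (value * factor) / factor        # exact here: factor is a power of two

dim = 10                                    # a dim 8-bit color value
print(darken_then_brighten_8bit(dim, 0.0625))     # 0    -- detail destroyed
print(darken_then_brighten_float(10.0, 0.0625))   # 10.0 -- preserved
```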

    Overall, this card is definitely in the cards for me :) Maybe along with a dual Opteron machine. And before you scream excess, have you checked Pricewatch lately? I remember paying $3300 for a single-processor PII-300 with 64MB of RAM and a Riva 128 in January of 1998. If the Opterons don't cost that much more than the high-end Athlons today, I could put together this machine for significantly less than that!
    • Re:Exciting (Score:3, Insightful)

      by ivan256 ( 17499 )
      4) NVIDIA sucks because its drivers are closed source.

      I couldn't care less if their drivers were closed or open. I just wish they'd make them stable! The Nvidia drivers have crashed my machine 3 times in the last 6 months. That's unacceptable.
  • by DeadBugs ( 546475 ) on Monday November 18, 2002 @05:15PM (#4700911) Homepage
    NVidia [nvidia.com] has a list of "Launch Games" for the GeforceFX: Command & Conquer: Generals, Unreal II, Rallisport Challenge, Sea Dogs II & Splinter Cell. Screen shots and some movies are included.
  • by Longinus ( 601448 ) on Monday November 18, 2002 @05:22PM (#4700980) Homepage
    Am I the only one with my AGP slot as the first slot at the top of my motherboard? That means there's no open slot on the back of my case for that fan to stick out of. The only way I can see a contraption like that working is if it was taking up two PCI slots, which of course it doesn't... Any ideas?
  • by Fulg0re- ( 119573 ) on Monday November 18, 2002 @05:23PM (#4700997)
    Well, I've had my GeForce2 for almost 2 years now, and with this announcement of the GeForce FX, it's finally a sign to upgrade.

    It's funny: practically my entire workstation (P4 2.2GHz, 256MB DDR400, 80GB HD, etc.) has been upgraded in terms of components; however, my video card has remained static. Not that I'm complaining, because I can run pretty much every game out there at (what I consider to be) fairly decent speeds. Take Age of Mythology as an example. It's more than fast enough. Unreal Tournament 2003 is a tad different, as I have to turn down some of the graphics, but it's still fine for the 'average' game. Plus, my Xbox and PS2 are there for my gaming needs :)

    Now, does the theory of diminishing marginal utility apply to video cards, or is it the opposite? How much more powerful can video cards get before we won't even 'notice' (at least in the loose sense) any difference when playing games? The Radeon 9700 Pro (with a fast CPU) can run practically every game on the market at max details at most resolutions. Well, so can the GeForce FX 5800. Sure, it may be 30-50% faster, but the utility gained for current games is definitely marginal.

    Since I've held out for two generations of video cards, for me it's definitely the time to upgrade. Though it's not really because my video card is too 'slow'; I suppose it's more an issue of gloating to my friends!

    Moreover, in terms of approaching cinematic rendering, nVidia is definitely heading in the right direction. They are quickly approaching the level of "Final Fantasy" in terms of quality of output. Nonetheless, they'll still need to add quite a bit of horsepower to be able to do it all in real time.
    • it's definitely the time to upgrade

      I'll agree with that, but now that the two top dogs are both ready for DirectX9, it's time for them to stop adding proprietary extensions and to compete on speed and price.

      Ultimately, creeping featuritis is good for no one: not for the manufacturers, who have to figure out a way to top each other; not for the consumers, who spend top dollar on cards that get obsoleted by superior technology; and, most importantly, not for the game companies, who can't make money with products that only work on bleeding-edge tech. Fine, the GeForce FX allows 65536 maximum instructions per vertex, but what if the gamer "only" has a Radeon 9700 Pro? They're limited to 1024 max instructions. What if they have a GeForce3? They're out of the loop altogether. That's why, despite all the advancements we've seen lately, games are just now coming out that list T&L-accelerated cards as a requirement. Programmers (excepting id Software, who are in the business of selling their engine more than actually programming "games") aren't going to use the most advanced features until they can be reasonably sure a large segment of the buying public won't be shut out. So please, nVidia and ATI, slow down on the features, let's lock into what we have now (much as AMD and Intel have pretty much locked their feature set), and let's get these cards down to a price level where one doesn't have to take out a second mortgage to afford them. They're only toys, after all.
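A hedged sketch of the "code to the base spec" argument above: gate the shader path on a queried per-card limit rather than assuming one vendor's maximum. The capability table and the query helper are hypothetical stand-ins for whatever the real Direct3D caps or OpenGL program limits report; the 65536 and 1024 figures are the ones cited in the comment, and the 256-instruction baseline is only an illustrative value.

```python
# Sketch: pick a shader path from a queried per-card limit instead of hard-coding
# to the most capable card. The table and helper below are hypothetical, for
# illustration only; a real engine would read the driver's reported caps.

HYPOTHETICAL_VERTEX_LIMITS = {
    "GeForce FX": 65536,        # figure cited in the comment above
    "Radeon 9700 Pro": 1024,    # figure cited in the comment above
    "GeForce3": 128,            # assumed short, VS 1.1-class limit for a pre-DX9 part
}

BASELINE_INSTRUCTIONS = 256     # illustrative "base spec" target, not an exact number

def query_max_vertex_instructions(card: str) -> int:
    return HYPOTHETICAL_VERTEX_LIMITS.get(card, 0)

def pick_shader_path(card: str) -> str:
    limit = query_max_vertex_instructions(card)
    if limit >= BASELINE_INSTRUCTIONS:
        return "baseline VS/PS 2.0 path"     # what most of the market can run
    if limit > 0:
        return "reduced shader path"
    return "fixed-function fallback"

for card in ("GeForce FX", "Radeon 9700 Pro", "GeForce3", "unknown card"):
    print(f"{card}: {pick_shader_path(card)}")
```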
  • Oh no (Score:2, Interesting)

    by lewp ( 95638 )
    This goofy two-slot setup reminds me way too much of what 3dfx started doing when they couldn't keep up with a "normal" board. We all know what happened next...

    Unless they can trim that extra fat off the board I'll stick with ATI's offerings.
  • by BrookHarty ( 9119 ) on Monday November 18, 2002 @05:24PM (#4701007) Journal
    So there you have it; the elusive NV30 has surfaced in the form of GeForce FX. ATI has won the first round with the Radeon 9700 Pro, what will be most interesting will be what ATI has up their sleeves when the GeForce FX hits the shelves in February.

    Myself, I had a GF3 Ti500; I upgraded to a GF4 Ti4600, but it wasn't much faster, so I returned it. Then a couple of games came out (Battlefield 1942, Unreal 2003) that really needed some graphics horsepower. So I bought the ATI 9700. Amazing. I can run older games with 6x AA perfectly, and newer games run at 60 FPS with 2x AA enabled. The card also works fine with the CVS version of XFree86 (or VESA mode for the older 4.2.1). Also, I can output to TV at 1024x768 and have it mirror my monitor - great when playing some multiplayer games or playing some DivX/SVCDs. The ATI 9700 is a very nice product, and I found some great forums at Rage3d [rage3d.com] for questions and updated beta drivers (like the new DX 9.0 drivers and DX 9.0 demos).
  • by binaryDigit ( 557647 ) on Monday November 18, 2002 @05:25PM (#4701018)
    My goodness, can you imagine a "workstation" running one of these nVidia cards with dual Itanic processors? Heck, if you got a university to run this configuration, you could bring Enron back from the brink. I see 20amp fuses in many homes going "POP" right now.
  • here [nvidia.com]

    I like this one. Can render >100 Jurassic Park dinosaurs at 100 frames per second.

    powerful, yeah.

  • Damn, I'm just gonna come out and say it (and risk major flames):
    I'm disgusted with the overabundance of hype with this launch. That's what this launch is. Of course there's no real substance because there's no shipping product!
    And maybe it's not just NVIDIA. A lot of companies hype their products when they launch. Gee, even if the launch is three months away. But what really gets me though is the AMOUNT of pure meaningless crap that is spewing from the websites I've seen.
    Tell me how it's going to benefit the consumer, by:

    1. Comparing numbers like the "instructions," "constants," and "registers" that this new chip allows. These kinds of numbers mean nothing to the consumer. If nothing else, NVIDIA should be pitching this crap to developers.
    2. Posting some really pretty pictures of things supposedly rendered with this card. Let me tell you why this is so ridiculous.
    I did a little test [hardocp.com]. This is what you were supposed to get with your Geforce 3 (according to the picture on a HardOCP preview). Guess what: no games even LOOK like that yet, and even if one existed, you couldn't play it on a Geforce 3 at acceptable frame rates! Sigh. Things are just getting worse.
    3. Real performance. I really can't believe that Anandtech posted frame rate numbers from Doom 3 that were supplied by NVIDIA. Data from an alpha game, supplied by the card's manufacturer? Yet no tests were shown of any other game, current or old. That is just ridiculous.

    Maybe it's not realistic to do this since the card is not even in production yet. Yet NVIDIA chooses to 'announce' their card anyway, in the same fashion they have done in the past (usually when the product is available). Right. It's a very clever game NVIDIA is playing; announce this new product and attempt to hurt sales of their competitor's product in the hope that the consumer waits for this new, overly-hyped and untested product. We've seen this before with the Geforce 3 and we're seeing it again on a larger scale, and I'm sick of it.

    ok, so please flame me up the arse for bitching about the current state of deception that's going on in the industry. Yeah, lots of companies do it (while I think NVIDIA is the worst), yet people just eat this shit up! What's the point of going to different web sites when they're all supplied with the same incessant crap that NVIDIA created? I don't want to hear that it's just "the way things are" because I'm saying that they shouldn't be this way.
    Thanks for reading.
  • by alchemist68 ( 550641 ) on Monday November 18, 2002 @05:33PM (#4701144)
    That new graphics card sure looks pretty and EXPENSIVE with all that copper. This will certainly add to the cost of that product. I wonder what percent by weight of the entire product is copper, seeing that copper is a commodity metal.

    Regarding those comments about the cooling system not having a filter, this is a pre-production model. Give it some time, it will have to use a filter to keep the small space between the copper fins free of dust.

    Hey Bob, while you're out at Murray's Automotive, get me a new oil filter model number P3160 for a Saturn SL2 dual overhead cam and FX160 filter for my NVidia graphics card, 128MB DDR2 RAM, and be sure to read the serial number information. My FX card is post 4375XXX, so it doesn't need a finotany rod or a muffler bearing.
  • Well, many people were interested to see what elements of 3Dfx might show up in Nvidia's latest design...

    The result seems to be bloody huge cards! I think they need to concentrate on finding ways to keep these cards SENSIBLY cool - not bolting on huge copper coolers, which expand into a 2nd PCI slot, just to keep the GPU cool.

    Its crazy I tells ya!
  • by Animats ( 122034 ) on Monday November 18, 2002 @06:01PM (#4701425) Homepage
    This is impressive, but it may exceed the heat and power consumption acceptable in a consumer product. Especially with power supplies out there from those slimeballs who forge UL certifications. Remember the article about power supplies catching fire when loaded up just to their rated load?

    From a developer perspective, we're headed for a shader fight between NVidia's Cg, OpenGL 2.0 shader languages (shader assembler, ISL, and Quartz Extreme) and Microsoft's HLSL. It's not enough to have shader languages; they have to be supported in the content creation tools, so the artists can see what they're doing. This will take a while.

    Developers need to buy this thing, but everybody else can wait a year.

  • by scotay ( 195240 ) on Monday November 18, 2002 @06:09PM (#4701503)
    From Beyond 3d:

    "However, two questions remain - will developers use the extra shader capabilities over R300 and will shaders of the full length of GeForce FX actually be sensible to run in real-time? Undoubtedly there will be some developers who will choose to go for as much as the hardware will allow, but if the past is any indication then it will likely be the API specifications that will be the leveller and many developers may just opt to code for the base VS/PS2.0 DirectX9 specifications."

    Haven't the R300 and NV30 just established D3D's vanilla pixel/vertex shader 2.0 as the LCD for mainstream gaming development? Will all that 2.0+ hotness of the new FX actually end up never getting used? What say ye, developers?

    Hell, I'm still waiting for something (anything) DX9 to push my 2-month-old 9700 Pro.
  • by bryanbrunton ( 262081 ) on Monday November 18, 2002 @06:21PM (#4701607)

    It's amazing!

    The specs for this board should include a noise dampener to counter the hoover that they have strapped to its circuit board.

    The ex-3DFX engineers that NVidia acquired somehow managed to brainwash the NVidia guys into releasing a gigantic monster of a board that rivals the Voodoo 5000 in impracticality and ungainliness.

    Those 3DFX guys have had their revenge.

  • Buh bye, SGI (Score:3, Interesting)

    by Anonymous Coward on Monday November 18, 2002 @07:01PM (#4701961)
    Right now I'm on a project where we are reluctantly (well, I'm reluctant: others are quite happy) using SGIs: we just dropped mid-five figures, and will probably come close to six before we're done (on this machine, we have about another $500k or so worth already). A lot of this is because of SGI's graphics pipe: we're doing some convolution and other stuff where we use pretty much all of the 512MB of texture memory that we have.

    I believe that current Nvidia Ti4600s have 128MB (256?) of memory, so I hope that a professional-level version of this new card might scale to the half gig we need.

    Additionally, the SGI is 12 bits per color channel, which is a bummer since the interface it is simulating is 16-bit monochrome. Sure, you can try to do tricks, but from a quick glance over the FX's specs, I see 32 bits per channel, which would be very nice.

    With this FX card, a reasonably set-up AMD Clawhammer system, and the scalability and preemption work that's going into the 2.6/3.0 Linux kernels, we might be able to move off SGI within the next year or two, thus saving taxpayers on the order of $40-80k or more per system. A lot of development is already done on Linux, but it sure would be nice to move over fully.
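On the 12-bit versus 32-bit channel point above, a quick sanity check (generic arithmetic, not tied to either machine's actual pipeline): a 12-bit channel has too few codes to hold a 16-bit monochrome signal, while a 32-bit float channel represents every 16-bit integer level exactly.

```python
# A 12-bit channel (4096 codes) must collapse distinct levels of a 16-bit
# monochrome signal (65536 levels); a 32-bit float represents every integer
# up to 2**24 exactly, so 16-bit data survives unchanged.
import struct

levels_16bit = 2 ** 16
levels_12bit = 2 ** 12
print(levels_16bit // levels_12bit, "source levels per 12-bit code")     # 16

def roundtrips_through_float32(value: int) -> bool:
    return struct.unpack("f", struct.pack("f", float(value)))[0] == value

print(all(roundtrips_through_float32(v) for v in range(levels_16bit)))   # True
```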

  • Questionable Name (Score:5, Interesting)

    by Galahad2 ( 517736 ) on Monday November 18, 2002 @07:29PM (#4702168) Homepage
    Why didn't they name it the GeForce5? That sounds so much cooler than FX. FX doesn't sound powerful at all, especially when their low-end chip is called the "MX." And pronouncing the two isn't all that different, either. Which sounds faster: Radeon 9700 Pro or GeForce FX? Sheesh.
    • Why didn't they name it the GeForce5?

      Remember that company called 3DFX, whose "last" card was the Voodoo5? Then a powerhouse called Nvidia took over as the high-end "King of the Hill".

      Funny how that works, eh?

      I mean it is not like Nvidia has anything to worry about with ATI taking the performance cro ...errr...hey wait a minute...

  • by Grandal ( 216720 ) on Monday November 18, 2002 @07:44PM (#4702297) Homepage
    It will be on their enthusiast-level card, but it looks like there will be a version for you mainstreamers too:

    "NVIDIA has hinted at offering another version of the GeForce FX at a lower clock speed that would only occupy a single slot cutout, but we will have to wait until the product line is announced before we can find out what the differences will be. Our initial guess would indicate that a simple reduction in clock speed would be enough to go with a more conventional cooling setup."

    And:

    "The other issue that users may have is noise, luckily NVIDIA has taken steps to make sure that the GeForce FX is one of the most quiet running cards they've ever produced. Borrowing technology from their mobile parts and combining it with the FX Flow cooling system, NVIDIA is able to dynamically reduce the speed of the fan based on the graphical needs of the system. When sitting in a 2D situation the card will scale back the clock speed of parts of the 3D pipeline that aren't in use, thus allowing the fan to spin much slower. As soon as you start using the GPU for games or any other 3D intensive applications, the clock speeds up as does the fan. The idea is that if you're gaming you're not as concerned with noise as when you are typing in Word."

    Link: http://anandtech.com/video/showdoc.html?i=1749&p=6 [anandtech.com]
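A minimal sketch of the clock-and-fan scaling behavior the quoted preview describes; all thresholds, clock speeds, and fan duty cycles below are invented for illustration, since NVIDIA's actual control logic isn't described here.

```python
# Sketch of load-based clock/fan scaling as described in the quoted preview:
# idle 2D work runs the chip (and fan) slowly, 3D load ramps both up.
# Every number below is an invented illustration value.

def scale_for_load(gpu_load_percent: float) -> dict:
    if gpu_load_percent < 10:                       # 2D desktop / idle
        return {"core_mhz": 250, "fan_duty": 0.30}
    if gpu_load_percent < 60:                       # light 3D work
        return {"core_mhz": 400, "fan_duty": 0.60}
    return {"core_mhz": 500, "fan_duty": 1.00}      # full 3D load: max clock, max fan

for load in (2, 35, 95):
    print(f"{load:3d}% load -> {scale_for_load(load)}")
```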

  • Flip-chip technology (Score:3, Interesting)

    by doug363 ( 256267 ) on Tuesday November 19, 2002 @05:34AM (#4704582)
    I'm surprised no one has commented on NVidia's change to flip-chip technology yet. It's the first time I've seen it used in consumer computer technology. Instead of having small legs like surface-mount chips, the chip has blobs of solder underneath it, and the solder bonds to the PCB when the chip is pressed against the board during manufacturing. It's important because it lowers the capacitance of the external pins, which means the chip can interface with the outside world at higher clock rates. It's a significant shift in packaging technology.
  • by Isldeur ( 125133 ) on Tuesday November 19, 2002 @06:51AM (#4704793)
    I just found these leaked from a "reliable" source!! :)

    Official MacOSX 10.2.7 Patch schedule

    Because many new GPUs are reaching a stage where they are faster than our G4s, code has been added to swap the GPU into a CPU and the CPU(G4) into a GPU. We anticipate a 15-30% boost in Photoshop.
