Hardware

Nvidia Geforce 4 (NV25) Information

msolnik writes: "nV News has a brief article about the long-awaited NV25-based video adapters. These graphics processors have capabilities similar to the XGPU's, and are a lot more powerful than the GeForce3 Ti500. Since they are manufactured on a 0.13-micron process, they will probably be clocked at very high levels."
This discussion has been archived. No new comments can be posted.


  • Tiny Little Item (Score:5, Informative)

    by 1alpha7 ( 192745 ) on Sunday November 25, 2001 @09:03PM (#2611546) Homepage

    In case you miss it 3/4 down the page:

    NV25 Information

    I was browsing nVidia's forum over @ Fools, and there was a link to Reactor Critical. Here's what they have to say about NV25.

    The long-awaited NV25-based adapters. This graphics processor has capabilities similar to the XGPU's and is a lot more powerful than the GeForce3 Ti500. Since it is manufactured on a 0.13-micron process, it has a good chance of being clocked at very high levels. The GPU comes in January/February 2002, while professional boards should be available in the second quarter.

    ELSA is going to launch two boards based on the NV25GL processor. Both support two LCD monitors, though we do not know whether there are two integrated TMDS transmitters or only one, with the second being external.

    NV25 at 275 MHz with 128 MB DDR SDRAM @ 250 MHz.
    NV25 at 300 MHz with 128 MB DDR SDRAM @ 330 MHz.

    So, this is what a high-end NV25 part *might* look like...

    * Rumoured 6 Pixel pipelines
    * Core freq: 300 MHz.
    * Memory: 660 MHz. (eff) ~ 10.5 GB/sec BW, assuming they stay with 128-bit data paths.
    * Supports TwinView
    * Supports (finally) Hardware iDCT
    * More powerful T&L unit, to include a second Vertex Shader
    * Can't find the link, but there's a rumour stating that we can expect a Voodoo5 5500-esque anti-aliasing feature. The presumption is that the NV25 will bring a Rotated-Grid AA implementation to the table.
    * .13u Manufacturing process
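    The 10.5 GB/sec figure in the list above follows directly from the rumoured memory clock and bus width. A quick back-of-the-envelope sketch (assuming, as the post does, a 128-bit data path and 1 GB = 10^9 bytes):

    ```python
    def memory_bandwidth_gb_s(effective_mhz: float, bus_bits: int = 128) -> float:
        """Peak theoretical memory bandwidth in GB/s.

        effective_mhz: effective (DDR-doubled) memory clock in MHz.
        bus_bits: memory bus width in bits; 128-bit is the assumption here.
        """
        bytes_per_transfer = bus_bits / 8  # 128 bits = 16 bytes per transfer
        return effective_mhz * 1e6 * bytes_per_transfer / 1e9

    # 660 MHz effective on a 128-bit bus:
    print(memory_bandwidth_gb_s(660))  # 10.56 -- matches the ~10.5 GB/sec claim
    ```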

    It really does sound like a pretty amazing chip. I would be willing to bet we'll be hearing a lot more in the way of rumours as the New Year approaches.

  • Re:3dfx... (Score:4, Informative)

    by fault0 ( 514452 ) on Sunday November 25, 2001 @09:10PM (#2611563) Homepage Journal
    Yeah, apparently they will:

    from the article:

    ".. A Voodoo5 5500-esque Anti-Aliasing feature. The presumption is that the NV25 will bring a Rotated-Grid AA implementation.."

    This is probably to compete with ATI's SmoothVision FSAA implementation, which is really quite slick. However, 3dfx was rumored to have had really advanced FSAA implementations planned for the future Voodoo5 6000/6500 series. Perhaps the NV25 will include that.
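    For those wondering why rotating the sample grid matters: with an ordered 2x2 grid, the four samples share only two distinct x and two distinct y positions, so near-vertical and near-horizontal edges get poor coverage. Rotating the grid gives every sample a unique x and y. A minimal sketch (the 26.6-degree angle, roughly arctan(1/2), is illustrative, not a confirmed NV25 detail):

    ```python
    import math

    def ordered_grid_4x():
        """2x2 ordered-grid sample offsets within a pixel (centre at 0.5, 0.5)."""
        return [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

    def rotated_grid_4x(angle_deg: float = 26.6):
        """The same 2x2 grid rotated about the pixel centre.

        After rotation each sample has a distinct x and a distinct y
        coordinate, so nearly axis-aligned edges cross more sample
        rows/columns per pixel and get smoother gradients.
        """
        a = math.radians(angle_deg)
        out = []
        for x, y in ordered_grid_4x():
            dx, dy = x - 0.5, y - 0.5
            out.append((0.5 + dx * math.cos(a) - dy * math.sin(a),
                        0.5 + dx * math.sin(a) + dy * math.cos(a)))
        return out

    # Ordered grid: 2 distinct x positions, 2 distinct y positions.
    # Rotated grid: 4 distinct x positions, 4 distinct y positions.
    ```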
  • Re:3dfx... (Score:2, Informative)

    by SquierStrat ( 42516 ) on Sunday November 25, 2001 @09:21PM (#2611595) Homepage
    ATI is nowhere near filling that void, sir! Take a look at their drivers. Quite simply, they suck, particularly the Linux drivers. My apologies, but NVIDIA is the only company with decent (speed-wise) Linux drivers on the market. I can't say the same for stability, though I personally have zero stability problems due to the drivers, although others have said otherwise. ATI's Windows drivers, however, are cripplingly unstable (and have been for many years too). Not to mention, the card/driver combination is just behind NVIDIA in speed and only gets ahead in FSAA benchmarks, which personally I couldn't give a rip about. I want a card that runs stable and fast in Linux AND Windows (shudder...), not one that runs stable in one OS and fairly fast in one, but hey, the image quality is good. If that were the case, I'd have used ATI a long time ago.
  • by hound3000 ( 238628 ) on Sunday November 25, 2001 @09:29PM (#2611614) Journal
  • by Anonymous Coward on Sunday November 25, 2001 @09:40PM (#2611633)
    According to the Inquirer [theinquirer.net], NVIDIA is having problems with the foundry that supplies its chips.
  • by Anonymous Coward on Sunday November 25, 2001 @10:04PM (#2611682)
    Unreal Warfare, the new licensable engine from the people who brought you UT and Unreal, brings an Athlon 1.4 GHz / GeForce3 to its knees. Sure, it is a game engine still in flux, with content that may need to be optimized.

    Developers will continually find ways to make these boards work. The game production pipeline is getting slower, that's all.

  • Re:3dfx... (Score:3, Informative)

    by mz001b ( 122709 ) on Sunday November 25, 2001 @10:06PM (#2611687)
    NVIDIA's drivers under Linux are pretty much as fast as they are under Windows. They are not open source, but they work very well. OpenGL apps are fast under Linux with them.
  • Re:3dfx... (Score:2, Informative)

    by jason_watkins ( 310756 ) on Sunday November 25, 2001 @10:32PM (#2611749)
    negative

    3dfx was trying to ski uphill with a set of gear that had been obsolete for some time. nVidia was no longer worried about them, but rather about ATI and about competing for OEM deals. Buying 3dfx just gave them protection against someone else buying 3dfx's IP and starting patent litigation against them.

    And the fact of the matter is, you don't know what 3dfx was working on, whether it was "killer routines" or not. Gigapixel's technology may have breathed some extra life into the T-buffer, but it's hardware transform and programmability that are the foreseeable future, two things 3dfx was not making much progress towards.

    Look, I liked 3dfx as well. I replaced my Rendition with a Voodoo when there were exactly three games that ran on it. However, I don't let my fondness for 3dfx pull the wool over my eyes and convince me that they weren't doomed, and that they would have come back and done great things if only nasty nvidia hadn't bought them.

    Remember, 3dfx chose to sell.

    What will be the next big innovations? If someone can figure out how to get realistic sizes of embedded DRAM, that'll be nice for fill rate. Some people might try hierarchical z-buffering, but that hasn't been a win for a while on most architectures. As vertex processing throughput approaches fill-rate throughput, I expect to see things move to randomized splat sampling, since it appears to be the best way to get logarithmic complexity versus the number of primitives. I've not seen any other way to get realistic rendering times on datasets involving billions of polygons or more.
  • Re:LCD Tangent (Score:3, Informative)

    by Jeffrey Baker ( 6191 ) on Sunday November 25, 2001 @10:42PM (#2611774)
    Yeah, you might say that ViewSonic's LCDs are not up to scratch. Do a side-by-side comparison between them and a Samsung screen. I don't know who makes the actual LCD panel, but Samsung's monitors look vastly better than ViewSonic's.

    Of course, you'll pay real money for the Samsung, but I don't know anyone else selling a 24" LCD monitor these days.

  • by Vagary ( 21383 ) <jawarrenNO@SPAMgmail.com> on Sunday November 25, 2001 @11:34PM (#2611897) Journal

    The problem (as Hideo Kojima says in this interview [slashdot.org]) is that each of those pores will have to be designed. So as detail increases, so does game development cost.

    Games won't be able to keep up with graphics cards until designs scale above the latest hardware. Some kind of fractal / organic method seems the only way to go.
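    The "fractal / organic" suggestion above can be made concrete with a classic trick: midpoint displacement, where an artist authors only a coarse shape and a procedural pass adds the fine detail, so asset cost doesn't grow with the hardware's polygon budget. A minimal 1D sketch (the function name and parameters are illustrative, not from any actual engine):

    ```python
    import random

    def midpoint_displace(points, roughness=0.5, levels=4, seed=42):
        """Refine a coarse polyline by repeatedly inserting displaced midpoints.

        Each level doubles the detail: every segment gains a midpoint
        nudged vertically by a random amount, and the displacement
        amplitude halves each level so large features dominate.
        """
        rng = random.Random(seed)
        amp = roughness
        for _ in range(levels):
            refined = [points[0]]
            for a, b in zip(points, points[1:]):
                mid = ((a[0] + b[0]) / 2,
                       (a[1] + b[1]) / 2 + rng.uniform(-amp, amp))
                refined += [mid, b]
            points = refined
            amp *= 0.5
        return points

    coarse = [(0.0, 0.0), (1.0, 0.0)]        # the artist-authored outline
    detailed = midpoint_displace(coarse)     # 2 control points become 17
    ```

    The same idea extends to 2D (the diamond-square algorithm for terrain), which is one way content could scale with hardware without scaling artist hours.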

  • Re:Video capture (Score:3, Informative)

    by ukyoCE ( 106879 ) on Monday November 26, 2001 @12:17AM (#2612029) Journal
    You have to go for the Asus Deluxe line of cards (they make the same cards in multiple versions; Deluxe adds a crappy TV tuner and digital VCR) or for an ATI All-in-Wonder. The ATI has by far the best multimedia suite; its TV-guide-like recording of shows and on-board MPEG-2 encoding kick ass. The program is still a little slow and buggy, but incredible nonetheless.
    Honestly though, I say save $50-100 on the video card and buy the TV card separately. That way you (a) save money on the video card and (b) don't need to keep buying the $50-100 extra every year, since the capture card will still work just as well.
    As for Linux compatibility, I've heard mixed reports about both the Asus Deluxe and the ATI All-in-Wonders, so you'll have to search around online and take a guess.
