The New Nvidia 6800 Ultra DDL Graphics Card 217

Dr. q00p writes "Since Apple doesn't offer much information on the new Nvidia 6800 Ultra DDL graphics card (and NVIDIA even less), which is required to drive the new 30-inch Cinema HD Display, readers of Slashdot might be interested in a reply from Ujesh Desai, Nvidia's General Manager of Desktop GPUs, to a series of questions from Accelerate Your Mac."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Wednesday July 21, 2004 @09:39AM (#9758981)
    Q & A with Nvidia on the Mac
    Nvidia 6800 Ultra DDL Graphics card
    Posted: 7/20/2004

    Shortly after Apple announced the Mac Nvidia 6800 Ultra DDL card for the PowerMac G5s (which is required to drive the 30in Cinema Display), I sent a series of questions to a contact at Nvidia on the card. Yesterday I received the reply from Ujesh Desai, Nvidia's General Manager of Desktop GPUs. Although some questions didn't get as complete an answer as I'd hoped (often because Apple controls OEM Mac Nvidia products), I appreciate his taking the time to reply.

    * How does the NVIDIA GeForce 6800 Ultra DDL card for the Mac differ from the PC version (i.e. Does the PC version have dual link DVI?)

    The GeForce 6800 Ultra DDL card was designed specifically for the Mac to provide two dual-link outputs to support Apple's displays.

    * Does the Apple version of the GeForce 6800 Ultra GPU run at the same core/memory clock as the PC version?

    The Apple cards run at 400/550, just like the GeForce 6800 Ultra GPU on the PC.
    (Note: Some vendor's 6800 cards are clocked higher than the standard/reference design.)

    * The GeForce 6800 Ultra for the PC has two Molex power connectors - does the Mac version source all the power from the G5's AGP Pro slot? (or does it have an aux power connector?)

    There is an on-board power connector on the graphics card and the motherboard to provide power, so there is no need for an external power connector from the power supply.
    (Although the only Mac 6800 photos I've seen are tiny, it appears there's a stub connector on the card that, I suspect, taps the ADC DC power connector on the motherboard (usually 28V or 24V, normally used for ADC display power) and regulates it down to provide additional power for the 6800 card. That eliminates the need for the aux. Molex power supply connector(s) that the PC/standard 6800 card versions have.)

    * Does the GeForce 6800 Ultra DDL have a low-noise fan?

    Yes, the GeForce 6800 Ultra DDL runs very quietly.

    * Will there ever be a control panel with 3D/GL/FSAA controls for the NVIDIA cards on the Mac platform? (ATI's retail Radeon cards (and OEM models with the 3rd party patch) have a '3D/GL overrides' feature - which is seen as a big plus by many end users.)

    Apple provides all the drivers for NVIDIA-based add-in cards. We supply them with the source code and they provide the final driver.

    * Regarding the previous question - if there's no chance of an Apple supplied NVIDIA card control panel (for advanced features/FSAA, etc.) - if a 3rd party wanted to do this, can NVIDIA provide some assistance?

    Apple is our customer, so if this is something that they requested, then we would support it.

    * There's been talk of previous NVIDIA cards taking a bigger than expected performance hit from using some types of shaders (on the Mac) - is this a concern with the GeForce 6800 Ultra DDL?

    GeForce 6 Series GPUs have the most advanced shading engines on the market. Compared to previous generation parts, the vertex and pixel shader engines on GeForce 6 Series GPUs have been completely redesigned from the ground-up. Pixel Shader performance is 4 to 8 times faster. Vertex shader performance is twice as fast. Performance numbers have been outstanding.

    * Will there be updated/new drivers for the GeForce 6800 Ultra?

    Yes. Apple provides all the drivers for NVIDIA-based add-in cards. We supply them with the source code and they provide the final driver. Apple will control the release schedules for drivers that provide even more performance, features and image quality enhancements.

    * Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?

    GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.

    (Note: There's no Mac 6800 perf
  • Re:Set up (Score:3, Informative)

    by angrist ( 787928 ) on Wednesday July 21, 2004 @09:42AM (#9759004)
    Nope, they should each show up as a single monitor.
    The "dual-link" label is misleading; it's merely an extension of the DVI standard (like DVI-I, DVI-A, etc.) to allow for more data.
  • Promises promises (Score:1, Informative)

    by Anonymous Coward on Wednesday July 21, 2004 @09:50AM (#9759076)
    We always give significant performance increases after we have leveled out the stability of the new architecture. GeForce 6 should continue that trend.

    They made the same promises regarding the NV30/NV35 series and the shader performance NEVER approached the shader performance of the R300 series. Even Carmack was talking potential scheduling efficiencies during the NV30 launch that never materialized.

    ATI may have similar problems, as the R500 and later are going to a pooled-ALU approach where software scheduling becomes paramount to delivering on the performance of the hardware.
  • Re:Set up (Score:2, Informative)

    by lordDallan ( 685707 ) on Wednesday July 21, 2004 @09:59AM (#9759151)
    Actually - The issue is that there is too much data to drive the screen over one cable connection/channel (don't know the right technical term) - so there are two DVI connectors for each screen (four on the card).

    Only the 30 inch display requires the two connections per screen - so this card is really only for the 30 inch.

    IANAE - so I have no idea if the card could ever be hacked to drive four displays - but that would be pretty cool.

  • Tom's Hardware (Score:5, Informative)

    by pajamacore ( 613970 ) on Wednesday July 21, 2004 @10:00AM (#9759154)
    There was actually a really great, informative article about the 6800 on Tom's Hardware [tomshardware.com] a few weeks ago.

    "NVIDIA has seemingly pulled out all stops in an attempt to deliver cutting edge graphics with its GeForce 6800 Ultra. After gamers for too long have had to be content with mere incremental improvements to graphics performance, NVIDIA's new card delivers a performance jump not seen for a long time. The device is also solidly engineered as well as insanely fast."
  • Re:Set up (Score:2, Informative)

    by angrist ( 787928 ) on Wednesday July 21, 2004 @10:08AM (#9759226)
    You're wrong and you're right.

    Yes, the issue is data throughput. DVI-D doesn't support high enough resolution.

    But, the 30 inch display only needs ONE connector.
    DVI-Dual Link is just a protocol/standard that allows that one connector to send twice the data of DVI-D. Think double density.

    So... one card, two DVI-Dual Link Connectors, one display (including 30 inch) per connector.
  • Re:It's costs... (Score:5, Informative)

    by Quobobo ( 709437 ) on Wednesday July 21, 2004 @10:09AM (#9759227)
    Argh. No, it's not. There's 2 (two) dual-link DVI ports, each of which can drive 1 (one) 30 inch monitor. Take a look at the pictures from WWDC where they had a G5 driving two of those monitors.
  • by badriram ( 699489 ) on Wednesday July 21, 2004 @10:09AM (#9759231)
    I would look into the Matrox Parhelia series of cards. They are designed for high-end use in DV, CAD, GIS, etc.
  • by Synesthesiatic ( 679680 ) on Wednesday July 21, 2004 @10:17AM (#9759301) Homepage
    A Mac-specific ROM is required for full Open Firmware support. Apparently a card will work without an OF ROM but won't be plug and play [apple.com]. That's pretty important for a Macintosh.

    Since Sun uses OF as well, I wonder if the same card could be used for Macs and Sun workstations.

  • by mfago ( 514801 ) on Wednesday July 21, 2004 @10:18AM (#9759318)
    why can't we buy and use "PC" video cards? What is it that makes vendors have to build EPROMs differently for the Mac vs. Windows machines for the exact same card otherwise?

    Because x86 stores data backwards (the big/little endian thing) as compared to almost every other processor, including the PowerPC.

    Thus the card firmware needs to be different...
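    The byte-order difference described above is easy to see for yourself. Here's a minimal Python sketch (a generic illustration of endianness, not anything specific to video-card firmware):

    ```python
    import struct

    # The same 32-bit value laid out in the two byte orders.
    value = 0x12345678
    little = struct.pack("<I", value)  # x86 (little-endian) order
    big = struct.pack(">I", value)     # PowerPC (big-endian) order

    print(little.hex())  # 78563412 - least significant byte first
    print(big.hex())     # 12345678 - most significant byte first
    ```

    Same number, opposite byte layouts in memory - which is why binary blobs built for one convention can't simply be read by the other.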

  • Re:Flamebait... (Score:3, Informative)

    by Have Blue ( 616 ) on Wednesday July 21, 2004 @10:28AM (#9759407) Homepage
    I don't know why I'm replying to this but...

    It's not just for gaming. Mac OS X's GUI can be accelerated by the GPU. 10.4 will also ship with video- and image-processing libraries that use the GPU.

    And even if you don't care about gaming at all, this is the only card on any platform that supports the 30" cinema display, so if you want one of those you need the card anyway.
  • Re:Flamebait... (Score:2, Informative)

    by javax ( 598925 ) on Wednesday July 21, 2004 @10:28AM (#9759408)
    I've got two words for you: Core Image [apple.com]
  • Re:Flamebait... (Score:2, Informative)

    by moofus ( 55880 ) on Wednesday July 21, 2004 @10:45AM (#9759552)
    2 words: WoW
  • Radeon X800 Series (Score:1, Informative)

    by Anonymous Coward on Wednesday July 21, 2004 @10:49AM (#9759585)
    For the Mac, dunno.

    But Radeon X800 XT certainly matches (wins some, loses some) the GeForce 6800 Ultra in PC land. Reviews confirming this are in abundance.

    But like I said, dunno about Macs and driving 30" screens through two dual-link DVI ports... Maybe not. I follow the developments in 3D hardware, and there haven't been any rumours or info about such a Radeon card (from ATI, Apple, or anybody else).

    It would need four TMDS transmitters on board. Then again, the Evans & Sutherland four-way R300 card has eight ;-)
  • by addaon ( 41825 ) <addaon+slashdot.gmail@com> on Wednesday July 21, 2004 @11:06AM (#9759761)
    PCI and AGP are both specified to be little endian regardless of platform.
  • by shawnce ( 146129 ) on Wednesday July 21, 2004 @11:28AM (#9759961) Homepage
    The 30" monitors from Apple have a resolution that cannot be fed by a single-link DVI connection. So they use dual-link DVI. Both single-link and dual-link are part of the DVI 1.0 standard, nothing Apple specific about them.

    The difference between single-link and dual-link is how many of the pins in the connector are used for transmitting data; in a nutshell, 12 pins for the former and 24 pins for the latter.

    Apple is using DVI-D (digital only) DVI connectors with a dual-link pin out for the 30" display. So one dual-link DVI-D connection is capable of driving one 30" display. The 6800 adapter used for these displays provides two dual-link DVI-D outputs, so one adapter can drive two 30" displays.

    As a reference...

    DVI connector type summary [ddwg.org]
    DVI 1.0 specification (PDF) [unc.edu]
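    To put rough numbers on why single-link isn't enough for the 30" panel, here's a back-of-the-envelope calculation in Python. The 165 MHz single-link pixel-clock cap comes from the DVI 1.0 spec, and the 30" display's native resolution is 2560x1600; the ~9% blanking overhead is an assumed ballpark, not an exact timing:

    ```python
    SINGLE_LINK_MHZ = 165.0  # max TMDS pixel clock per DVI 1.0 link

    def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.09):
        """Approximate pixel clock: active pixels padded for blanking intervals."""
        active = width * height * refresh_hz
        return active * (1 + blanking_overhead) / 1e6

    clk = pixel_clock_mhz(2560, 1600, 60)
    print(f"~{clk:.0f} MHz needed")                      # roughly 268 MHz
    print("single link ok:", clk <= SINGLE_LINK_MHZ)     # False
    print("dual link ok:", clk <= 2 * SINGLE_LINK_MHZ)   # True
    ```

    2560x1600 at 60 Hz needs well over 165 MHz, so one link can't carry it, but two links (330 MHz combined) can, which is exactly why the 30" display is dual-link.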
  • by Anonymous Coward on Wednesday July 21, 2004 @11:44AM (#9760114)
    You could, with a Voodoo (3/4/5?) card for example, put the PC version in a PC, boot, flash the EPROM with a Mac version, shut down the PC, remove the Voodoo card, put it in a Mac, and it works. It was the same for GeForce 2 and 3. Just wait for someone to release a Mac ROM version and flash it on your PC in DOS/Windows.
  • by shawnce ( 146129 ) on Wednesday July 21, 2004 @01:11PM (#9761184) Homepage
    Hmm, not fully sure... dual-link works by sending even and odd pixels for a given color channel down two different links. Single-link transmits even and odd pixels over a single link.

    Basically DVI defines 6 signal pairs for pixel data, in single-link 3 of the 6 are used, one for each color channel (RGB). In dual-link even pixels go down one bank of 3 while odds go down the other bank of 3.

    From what I can see the channel definition for connections is the same for single-link and dual-link. So in theory it could work if the adapter could toggle from sending just even pixels to sending even and odd pixels on the first set of links.
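    The even/odd split can be sketched in a few lines (a toy model of the pixel interleaving only, not actual TMDS signaling):

    ```python
    def split_dual_link(scanline):
        """Dual-link: even-indexed pixels go down link A, odd-indexed down link B."""
        return scanline[0::2], scanline[1::2]

    def merge_links(link_a, link_b):
        """Receiver re-interleaves the two links back into one scanline."""
        out = []
        for a, b in zip(link_a, link_b):
            out.extend([a, b])
        if len(link_a) > len(link_b):  # odd pixel count: link A carries one extra
            out.append(link_a[-1])
        return out

    pixels = list(range(8))
    a, b = split_dual_link(pixels)
    print(a)  # [0, 2, 4, 6]
    print(b)  # [1, 3, 5, 7]
    assert merge_links(a, b) == pixels
    ```

    In single-link mode only link A would be driven, carrying every pixel in sequence; dual-link halves the per-link data rate by splitting the stream this way.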
  • by shawnce ( 146129 ) on Wednesday July 21, 2004 @02:39PM (#9762129) Homepage
    A single dual-link DVI-I port can drive any DVI-I monitor, so the answer to the grandparent is Yes, as long as you get the ADC -> DVI adapter.

    Actually this isn't true or we wouldn't have dual-link for example. It depends on the resolution and refresh rate, basically the bandwidth needs of the display.

    Also...
    DVI-I = connector carrying both a digital and analog signal
    DVI-D = connector carrying just a digital signal
    DVI-A = connector carrying just analog (extra to DVI specification)

    For the digital aspect of DVI connections you can have either single-link or dual-link (supported by either DVI-I or DVI-D connectors, at least fully connected ones). So don't confuse DVI-D as implying dual-link, it just implies a digital only connector.

    Apple's new displays use DVI-D connectors (at least that is what I recall seeing) with the 20" and 23" screens using single-link and the 30" using dual-link. The older displays used ADC connectors (basically single-link DVI with pin out for usb and display power). Apple's DVI to ADC converter has a DVI-D connector on it (looked at the one under my desk). Apple doesn't provide a ADC to DVI converter but third parties do (also ones for ADC->VGA).

    DVI -> ADC converter converts a DVI output to an ADC output (what you need to drive an ADC-only monitor if your adapter doesn't sport ADC)

    ADC -> DVI converter converts a ADC output to a DVI output

    I am not sure if the adapter in question sports DVI-I or DVI-D outputs (traditionally I believe adapters have DVI-I, at least high end ones). You can plug a DVI-D cable into either a DVI-D or DVI-I output. Also having DVI-I outputs allows the easy split out of VGA if needed.
  • by istewart ( 463887 ) on Wednesday July 21, 2004 @07:38PM (#9765227)
    This is quite possible with any number of cards. I have a blue-and-white G3 and my original video card purchase for it was a PC Radeon 7000 PCI. Unlike the then-current Mac version, it lacked an extra DVI port. However, it had 64MB DDR as compared to 32MB and cost less than half as much. Currently, I'm using a Radeon 9100 PCI with a hacked Mac ROM courtesy of this guy. [mrspyonline.com] Overall a much better purchase than ATI's upcoming 9200 PCI for Mac.
