The New Nvidia 6800 Ultra DDL Graphics Card

Dr. q00p writes "Since Apple doesn't offer much information on the new Nvidia 6800 Ultra DDL graphics card (and NVIDIA even less), which is required to drive the new 30-inch Cinema HD Display, Slashdot readers might be interested in a reply from Ujesh Desai, Nvidia's General Manager of Desktop GPUs, to a series of questions from Accelerate Your Mac."
  • by sebi ( 152185 ) on Wednesday July 21, 2004 @08:32AM (#9758927)
    Reading that "interview" I can almost see the lawyer going over every answer and neutering it before it went out. Either that or Mr Desai is the most boring and lifeless fellow in the history of electronics.
    • Worthless read (Score:4, Insightful)

      by Achoi77 ( 669484 ) on Wednesday July 21, 2004 @08:46AM (#9759037)
      Heh. After reading that interview I get the feeling that this guy doesn't know anything about the product he is selling. Generic one-liner answers that dance around the questions with emphasis on market speak. Here's an excerpt:

      GeForce 6 Series GPUs have the most advanced shading engines on the market. Compared to previous generation parts, the vertex and pixel shader engines on GeForce 6 Series GPUs have been completely redesigned from the ground-up. Pixel Shader performance is 4 to 8 times faster. Vertex shader performance is twice as fast. Performance numbers have been outstanding.

      Absolut (tm) Garbage!! Here's another, this time with the question:

      * Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?

      GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.

      Talk about trash!! A simple NO would have sufficed. Looks like he's made the most of his Business-for-dummies Manual. Man, why am I so angry over this?

      • Re:Worthless read (Score:3, Insightful)

        by foidulus ( 743482 ) *
        Talk about trash!! A simple NO would have sufficed. Looks like he's made the most of his Business-for-dummies Manual. Man, why am I so angry over this?
        Probably because he gets paid much more for spouting bs than any of us do for real work....
      • You're angry because you have difficulty reading English. Go calm down. He says it's much faster.
    • by Anonymous Coward
      The interviewer is not any better. I was particularly annoyed at this exchange:

      Mr Desai:
      * We are the first chip with an onboard video processor

      Interviewer's commentary:
      (Note: Some previous ATI cards like their AIW models have shipped with the "Rage Theater" video encoder/decoder chip on the card. It was first intro'd in 1998, and revised I'm sure since then. Of course the current generation of GPUs have more advanced features.)

      Now, how exactly is that comment relevant? Mr Desai claimed theirs was
      • Clearly not, as the interviewer asked if nVidia would be introducing any retail cards for the Mac. nVidia doesn't make cards, but GPUs, as Mr. Dull-ai explained.

        • Actually, Mike Breeden (the "interviewer") did not ask if nVidia itself will be introducing retail Mac cards. He is asking for Mr. Desai's opinion on whether graphics card manufacturers have any interest in introducing nVidia retail cards to the Mac market.

          In fact, Mr. Breeden foresaw that Mr. Desai would probably answer with the standard "We don't make cards. We make GPUs." answer, so he flatly asked the question, parenthetically. Read the actual quoted passage again:

          * This is an old question, but

    • by Anonymous Coward on Wednesday July 21, 2004 @08:54AM (#9759113)
      Mr Desai is the most boring and lifeless fellow in the history of electronics.

      And that's saying a lot.
  • by Gentoo Fan ( 643403 ) on Wednesday July 21, 2004 @08:35AM (#9758948) Homepage
    Also, I liked this:

    * Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?

    GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.


    Translated: We'll release some actual numbers when we sell more of these mini-space heaters.
  • by _PimpDaddy7_ ( 415866 ) on Wednesday July 21, 2004 @08:35AM (#9758955)
    more than some PCs. Amazing!

    From the site:
    "The combination of a GeForce 6800 Ultra with a dual processor Power Mac G5 driving two 30-inch Apple Cinema HD Displays is the definitive tool for the creative professional."

    Yes because I need 2 30" screens to watch Carrie Ann Moss on one screen and Natalie Portman on the other :)
  • by nighty5 ( 615965 ) on Wednesday July 21, 2004 @08:37AM (#9758965)
    It won't replace my S3 - 1 meg

    Never..

    Never......

    Never !!!!
  • As expected, they are much faster than previous generation products from ATI

    That's basically like saying, "Hey, this new souped-up Mustang is much faster than a 1992 Taurus!"

    I mean, it had better be a whole hell of a lot faster than the old cards, given the huge premium you're paying right now.
  • All I know is that the 6800 won't fit in my iMac, or (soon to arrive) PowerBook ..... Damn you nVidia!

    Slightly off topic, but has anyone seen a way to upgrade the video card on an iMac (the lamp type), even if it means needing a new case?
  • by Anonymous Coward on Wednesday July 21, 2004 @08:39AM (#9758981)
    Q & A with Nvidia on the Mac
    Nvidia 6800 Ultra DDL Graphics card
    Posted: 7/20/2004

    Shortly after Apple announced the Mac Nvidia 6800 Ultra DDL card for the PowerMac G5s (which is required to drive the 30in Cinema Display), I sent a series of questions to a contact at Nvidia on the card. Yesterday I received the reply from Ujesh Desai, Nvidia's General Manager of Desktop GPUs. Although some questions didn't get as complete an answer as I hoped (often because Apple controls OEM Mac Nvidia products), I appreciate his taking the time to reply.

    * How does the NVIDIA GeForce 6800 Ultra DDL card for the Mac differ from the PC version (i.e. Does the PC version have dual link DVI?)

    The GeForce 6800 Ultra DDL card was designed specifically for the Mac to provide two dual-link outputs to support Apple's displays.

    * Does the Apple version of the GeForce 6800 Ultra GPU run at the same core/memory clock as the PC version?

    The Apple cards run at 400/550, just like the GeForce 6800 Ultra GPU on the PC.
    (Note: Some vendors' 6800 cards are clocked higher than the standard/reference design.)

    * The GeForce 6800 Ultra for the PC has two Molex power connectors - does the Mac version source all the power from the G5's AGP Pro slot? (or does it have an aux power connector?)

    There is an on-board power connector on the graphics card and the motherboard to provide power, so there is no need for an external power connector from the power supply.
    (Although the only Mac 6800 photos I've seen are tiny, it appears there's a stub connector on the card that, I suspect, taps the motherboard's ADC power connector (usually 24V or 28V DC, normally used to power ADC displays) to provide additional, regulated-down power for the 6800 card. That eliminates the need for the auxiliary Molex power-supply connector(s) that PC/standard 6800 cards have.)

    * Does the GeForce 6800 Ultra DDL have a low-noise fan?

    Yes, the GeForce 6800 Ultra DDL runs very quiet.

    * Will there ever be a control panel with 3D/GL/FSAA controls for the NVIDIA cards on the Mac platform? (ATI's retail Radeon cards (and OEM models with the 3rd party patch) have a '3D/GL overrides' feature - which is seen as a big plus by many end users.)

    Apple provides all the drivers for NVIDIA-based add-in cards. We supply them with the source code and they provide the final driver.

    * Regarding the previous question - if there's no chance of an Apple supplied NVIDIA card control panel (for advanced features/FSAA, etc.) - if a 3rd party wanted to do this, can NVIDIA provide some assistance?

    Apple is our customer, so if this is something that they requested, then we would support it.

    * There's been talk of previous NVIDIA cards taking a bigger than expected performance hit from using some types of shaders (on the Mac) - is this a concern with the GeForce 6800 Ultra DDL?

    GeForce 6 Series GPUs have the most advanced shading engines on the market. Compared to previous generation parts, the vertex and pixel shader engines on GeForce 6 Series GPUs have been completely redesigned from the ground-up. Pixel Shader performance is 4 to 8 times faster. Vertex shader performance is twice as fast. Performance numbers have been outstanding.

    * Will there be updated/new drivers for the GeForce 6800 Ultra?

    Yes. Apple provides all the drivers for NVIDIA-based add-in cards. We supply them with the source code and they provide the final driver. Apple will control the release schedules for drivers that provide even more performance, features and image quality enhancements.

    * Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?

    GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.

    (Note: There's no Mac 6800 perf
  • Not that it's in any livable price range; but two 30" displays connected to a card like that running Mac OS X, can you imagine that?
    The beauty, the beauty... ;-)
    • probably still comes in at less than a comparable Sun or SGI workstation without a display =)
    • It's $10,047.00 at the Apple Store to get a dual 2.5GHz G5 + the card + two 30" displays.

      That's not so bad. I remember when a Macintosh II or one of the early PowerBooks cost about that much.
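
      For what it's worth, that total is consistent with Apple's 2004 list prices as I recall them (treat these as assumptions, not gospel): $2,999 for the dual 2.5GHz G5 and $3,299 per 30" display, which would leave about $450 for the 6800 build-to-order option:

          2,999 + 450 + (2 x 3,299) = 10,047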

      I plan to buy one 30" display and run with that and my existing 23" display (assuming it can be done, but I think it can). I can always upgrade later to the dual setup, but to be honest I'm sure it will be awesome no matter what.

      D
  • by hcdejong ( 561314 ) <hobbes@x[ ]et.nl ['msn' in gap]> on Wednesday July 21, 2004 @08:47AM (#9759045)
    For n=1 to 12 Q: Blah[n] A: 42! Next n
  • by Erwos ( 553607 ) on Wednesday July 21, 2004 @08:48AM (#9759056)
    The 6800 DDL is just a 6800 that supports the new ADC. Apple releases the drivers, don't bitch at us if you don't like the drivers. No, we're not going to tell you about our contract with Apple. The X800 sucks.

    Much faster to read, no PR speak to deal with.

    -Erwos
    • If you meant... (Score:5, Insightful)

      by daveschroeder ( 516195 ) * on Wednesday July 21, 2004 @09:17AM (#9759305)
      ..."the new ACD", as in "the new 30" Apple Cinema Display", ok.

      But if you actually meant ADC, or "Apple Display Connector", that is no longer used. With the new line of displays, Apple has (thankfully) gone back to standard DVI for the displays and for their future OEM video cards.
    • I don't mean to be a pedant, but

      ADC == Apple Display Connector

      ACD == Apple Cinema Display

      The new 6800 supports the latter, not the former (Dual DVI is the connection standard for the newest Apple systems).
  • by for_usenet ( 550217 ) on Wednesday July 21, 2004 @08:50AM (#9759075)
    I'm a heavy Mac user, and I read this site pretty much on a daily basis, as the guy responsible for it puts up a LOT of decent Mac hardware and software info. But this has got to be one of the most UNinformative, useless things he's posted. I know there's a desire for info about this card - but shouldn't we wait till some more detailed specs are released, or till someone has actual silicon so benchmarks can be run?

    Yet another example of "no news" being news ... As many other people have said, "Nothing to see here. Move along !!"
  • Tom's Hardware (Score:5, Informative)

    by pajamacore ( 613970 ) on Wednesday July 21, 2004 @09:00AM (#9759154)
    There was actually a really great, informative article about the 6800 on Tom's Hardware [tomshardware.com] a few weeks ago.

    "NVIDIA has seemingly pulled out all stops in an attempt to deliver cutting edge graphics with its GeForce 6800 Ultra. After gamers for too long have had to be content with mere incremental improvements to graphics performance, NVIDIA's new card delivers a performance jump not seen for a long time. The device is also solidly engineered as well as insanely fast."
  • by DeckerEgo ( 520694 ) on Wednesday July 21, 2004 @09:02AM (#9759174) Homepage
    My brain kept thinking that they were talking about the old Motorola 68000-series chips that Apple used nine years ago... not a GPU marketed as "6800"... I got so confused...

    Wait - I sold those things nine years ago!?!? Damn I'm old.
  • Can someone explain to me why a Mac would need such a powerful gaming card?!

  • by gsfprez ( 27403 ) * on Wednesday July 21, 2004 @09:07AM (#9759216)
    why can't we buy and use "PC" video cards? What is it that makes vendors have to build the EPROMs differently (are they even different?) for Mac vs. Windows machines, for the exact same card otherwise?

    It reduces our choices and makes a $100 card cost $400.
    • by Synesthesiatic ( 679680 ) on Wednesday July 21, 2004 @09:17AM (#9759301) Homepage
      A Mac-specific ROM is required for full Open Firmware support. Apparently a card will work without an OF ROM but won't be plug and play [apple.com]. That's pretty important for a Macintosh.

      Since Sun uses OF as well, I wonder if the same card could be used for Macs and Sun workstations.

    • why can't we buy and use "PC" video cards? What is it that makes vendors have to build the EPROMs differently (are they even different?) for Mac vs. Windows machines, for the exact same card otherwise?

      Because x86 stores data backwards (the big/little endian thing) as compared to almost every other processor, including the PowerPC.

      Thus the card firmware needs to be different...
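
      A minimal C sketch of the byte-order point itself; it shows what "stores data backwards" means in memory, though whether that's the whole story on why Mac firmware differs is the claim above, not something this proves:

        #include <stdio.h>
        #include <stdint.h>

        /* Print the bytes of a 32-bit value in memory order.
         * Little-endian x86 prints:  78 56 34 12
         * Big-endian PowerPC prints: 12 34 56 78 */
        int main(void)
        {
            uint32_t value = 0x12345678;
            const unsigned char *bytes = (const unsigned char *)&value;

            for (unsigned i = 0; i < sizeof value; i++)
                printf("%02x ", bytes[i]);
            printf("\n");
            return 0;
        }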

      • Yeah, but Mac users are gouged on the price for what is just different firmware (hence the many guides to flashing "PC" video cards for use in Macs). It makes no sense to charge so much more for simply having different data on the card.
        • I'm a Mac user myself.

          That being said, it makes sense to charge more for different firmware: the development cost has to be spread across fewer buyers. Remember, someone still has to develop the firmware, and that person needs to be paid just like you and me.

          I don't mind paying a few extra bucks for Apple-compatible stuff because I appreciate the extra effort that goes into supporting it.

          More to the point, I normally would have no need to replace my video card anyway, except I really, reall
      • Come on, calling little-endian "backwards" is flamebait...


        Actually there is a justification for little-endian, just like there is one for the British driving on the left. Casting a value from 16 bits down to 8 bits (and similar narrowing) is automatic on little-endian machines, because the low-order byte keeps the same address. In the old days of limited memory this was an advantage. (As for driving on the left side of the road, it came from horse riding: one mounts a horse from the left side.)
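
        A small C illustration of that point; the type-punning is for demonstration only (strict C would use memcpy):

          #include <stdio.h>
          #include <stdint.h>

          /* On a little-endian machine the low-order part of a value
           * lives at the value's own address, so a narrowing read needs
           * no address arithmetic; on a big-endian machine the pointer
           * would first have to be offset to the other end of the value. */
          int main(void)
          {
              uint32_t wide  = 0x0000BEEF;
              uint16_t low16 = *(uint16_t *)&wide; /* 0xBEEF on little-endian */
              uint8_t  low8  = *(uint8_t  *)&wide; /* 0xEF   on little-endian */

              printf("low16=0x%04X low8=0x%02X\n", (unsigned)low16, (unsigned)low8);
              return 0;
          }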

        • Another way of doing the same thing, and one that I think would be interesting to try out, is to use "big endian" storage (which is more natural for people, or at least people who read strings from left to right and numbers with the most significant digits on the left), but with the address of a value being its last (least significant) byte.

          One problem with either scheme, that isn't shared by big endian references, is that accessing a field using the wrong sized reference is more immediately obvious in more c

      • by addaon ( 41825 ) <addaon+slashdotNO@SPAMgmail.com> on Wednesday July 21, 2004 @10:06AM (#9759761)
        PCI and AGP are both specified to be little endian regardless of platform.
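
        A hedged sketch of what that means for a driver on a big-endian host; the function names here are invented for illustration, not a real driver API:

          #include <stdint.h>

          /* PCI/AGP register data is little-endian by spec, so a driver
           * on a big-endian CPU (e.g. PowerPC) byte-swaps each 32-bit
           * value; on little-endian x86 the swap would be a no-op. */
          static uint32_t swap32(uint32_t x)
          {
              return (x >> 24)
                   | ((x >> 8) & 0x0000FF00u)
                   | ((x << 8) & 0x00FF0000u)
                   | (x << 24);
          }

          /* Read a little-endian 32-bit register into host (big-endian)
           * order. The plain dereference stands in for the platform's
           * real MMIO read routine. */
          uint32_t pci_read32_be_host(const volatile uint32_t *reg)
          {
              return swap32(*reg);
          }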
  • by saddino ( 183491 ) on Wednesday July 21, 2004 @09:12AM (#9759259)
    My answers were designed specifically to provide little information, so there is no need for criticism. The site provided questions and I supply them with answers, if more details are requested, then I would support it. Compared to previous generation interviews, I redesigned my answers from the ground up and I think my word count was outstanding. Yes, Apple provides the answers sometimes. We supply them with talking points and let our quotes speak for themselves. The guys at ATI do a good job of squeezing out interesting information during their interviews, but our answers have a lot more headroom. Other differences include:
    • I support my pants with suspenders and they do not.
    • I speak marketing-speak fluently, and they don't.
    • I am the first one to make my points using bullets.

    I answer questions with no add-ins of emotion. There is no technical reason why I would answer otherwise.

    Sincerely,
    Ujesh Desai
  • by shawnce ( 146129 ) on Wednesday July 21, 2004 @10:28AM (#9759961) Homepage
    The 30" monitors from Apple have a resolution that cannot be fed by a single-link DVI connection. So they use dual-link DVI. Both single-link and dual-link are part of the DVI 1.0 standard, nothing Apple specific about them.

    The difference between single-link and dual-link is how many of the pins in the connector are used for transmitting data: in a nutshell, 12 pins for the former and 24 for the latter.

    Apple is using DVI-D (digital only) DVI connectors with a dual-link pin out for the 30" display. So one dual-link DVI-D connection is capable of driving one 30" display. The 6800 adapter used for these displays provides two dual-link DVI-D outputs, so one adapter can drive two 30" displays.
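
    As a quick sanity check of why dual-link is needed (165 MHz is the single-link pixel-clock cap in the DVI 1.0 spec referenced below; blanking intervals are ignored here, so the real requirement is even higher):

      #include <stdio.h>

      /* A single DVI link tops out at a 165 MHz pixel clock, but
       * 2560x1600 at 60 Hz needs more than that even before blanking
       * overhead is added, hence the dual-link requirement. */
      int main(void)
      {
          const double width = 2560, height = 1600, refresh_hz = 60;
          const double single_link_cap_mhz = 165.0;

          double min_clock_mhz = width * height * refresh_hz / 1e6;

          printf("needs at least %.1f MHz; single-link cap is %.0f MHz\n",
                 min_clock_mhz, single_link_cap_mhz);
          return 0;
      }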

    As a reference...

    DVI connector type summary [ddwg.org]
    DVI 1.0 specification (PDF) [unc.edu]
    • Since you seem to know what you're talking about ... does this mean I can drive any DVI monitor with this card, not just the 30"?

      I'm curious because I have a 23" Cinema HD Display and would like to drive it alongside the 30" when I buy it. Don't want to waste the old technology, don't you know.

      Can I do this, assuming that I get an ADC to DVI adapter for the Cinema Display?

      Thanks!

      D
      • Hmm, not fully sure... dual-link works by sending the even and odd pixels for a given color channel down two different links, while single-link transmits all pixels over one link.

        Basically, DVI defines 6 signal pairs for pixel data; in single-link, 3 of the 6 are used, one for each color channel (RGB). In dual-link, even pixels go down one bank of 3 while odds go down the other bank of 3 (see the sketch at the end of this comment).

        From what I can see the channel definition for connections is the same for single-link and dual-link. So in theory it
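
        A toy C model of that even/odd split (the buffer layout and names are invented for illustration; real TMDS hardware serializes this per color channel):

          #include <stdint.h>
          #include <stddef.h>

          /* Even-numbered pixels of a scanline go to one bank of three
           * RGB channels (link A), odd-numbered pixels to the other
           * (link B). */
          void split_scanline(const uint32_t *scanline, size_t width,
                              uint32_t *link_a, uint32_t *link_b)
          {
              for (size_t i = 0; i < width; i++) {
                  if (i % 2 == 0)
                      link_a[i / 2] = scanline[i]; /* even -> link A */
                  else
                      link_b[i / 2] = scanline[i]; /* odd  -> link B */
              }
          }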
      • This is Apple we're talking about. It just works.
  • The same card is available (albeit in limited quantities right now, for developers) for Windows machines!
