The New Nvidia 6800 Ultra DDL Graphics Card
Dr. q00p writes "Since Apple doesn't offer much information on the new Nvidia 6800 Ultra DDL graphics card (and NVIDIA even less), which is required to drive the new 30-inch Cinema HD Display, readers of Slashdot might be interested in reading a reply from Ujesh Desai, Nvidia's General Manager of Desktop GPUs, to a series of questions from Accelerate Your Mac."
That was interesting (Score:5, Interesting)
Worthless read (Score:4, Insightful)
GeForce 6 Series GPUs have the most advanced shading engines on the market. Compared to previous generation parts, the vertex and pixel shader engines on GeForce 6 Series GPUs have been completely redesigned from the ground-up. Pixel Shader performance is 4 to 8 times faster. Vertex shader performance is twice as fast. Performance numbers have been outstanding.
Absolut (tm) Garbage!! Here's another, this time with the question:
* Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?
GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.
Talk about trash!! A simple NO would have sufficed. Looks like he's made the most of his Business-for-dummies Manual. Man, why am I so angry over this?
Re:Worthless read (Score:3, Insightful)
Probably because he gets paid much more for spouting bs than any of us do for real work....
Re:Worthless read (Score:2)
Re:Worthless read (Score:2)
Re:Worthless read (Score:5, Insightful)
Re:Worthless read (Score:4, Insightful)
Re:Worthless read (Score:2)
Re:Worthless read (Score:2)
Single-link DVI doesn't support the Apple 30-inch display's resolution.
And VGA? Feh. Nobody's going to want the distortion/poor quality of VGA on a $3300 monitor.
Re:That was interesting (Score:2, Insightful)
Mr Desai:
* We are the first chip with an onboard video processor
Interviewer's commentary:
(Note: Some previous ATI cards like their AIW models have shipped with the "Rage Theater" video encoder/decoder chip on the card. It was first intro'd in 1998, and revised I'm sure since then. Of course the current generation of GPUs have more advanced features.)
Now, how exactly is that comment relevant? Mr Desai claimed theirs was the first chip with an onboard video processor, not the first card to ship with a separate video chip alongside the GPU.
Re:That was interesting (Score:2, Interesting)
Re:That was interesting (Score:2)
Actually, Mike Breeden (the "interviewer") did not ask if nVidia itself will be introducing retail Mac cards. He is asking for Mr. Desai's opinion on whether graphics card manufacturers have any interest in introducing nVidia retail cards to the Mac market.
In fact, Mr. Breeden foresaw that Mr. Desai would probably answer with the standard "We don't make cards. We make GPUs." answer, so he flatly asked the question, parenthetically. Read the actual quoted passage again:
Re:That was interesting (Score:4, Funny)
And that's saying a lot.
Re:That was interesting (Score:2)
If I were the boss of a marketing guy (or gal), and he was out there showing genuine enthusiasm for the product, I'd be happy as a clam because I know that results in sales.
If you like the guy selling something, you'll buy. That's the foundation of all kinds of sales and it works just the same here.
Although in all fairness, this interview isn't going to prevent me from buying the card and 30" display anyway
D
Man that card is HUGE! (Score:5, Funny)
* Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?
GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.
Translated: We'll release some actual numbers when we sell more of these mini-space heaters.
It's costs... (Score:5, Funny)
From the site:
"The combination of a GeForce 6800 Ultra with a dual processor Power Mac G5 driving two 30-inch Apple Cinema HD Displays is the definitive tool for the creative professional."
Yes, because I need two 30" screens to watch Carrie-Anne Moss on one screen and Natalie Portman on the other.
Re:It's costs... (Score:2, Funny)
Or something...
Re:It's costs... (Score:5, Informative)
you can't replace me (Score:5, Funny)
Never..
Never......
Never !!!!
Re:you can't replace me (Score:5, Funny)
Re:you can't replace me (Score:2)
Re:you can't replace me (Score:2)
Don't worry, Andrew Wiles is working on a fix. Should be available in about ten years.
Re:you can't replace me (Score:2)
Re:you can't replace me (Score:2)
Re:you can't replace me (Score:2)
1. What's a Playstation?
2. I never owned a Playstation, you insensitive clod!
3. Playstations used S3 Virge chips?
=D
Re:you can't replace me (Score:2)
Wow, what useless responses... (Score:2, Interesting)
That's basically like saying, "Hey, this new souped-up Mustang is much faster than a 1992 Taurus!"
I mean, it had better be a whole hell of a lot faster than the old cards for the huge premium you are paying right now.
Re:Wow, what useless responses... (Score:4, Insightful)
* Does the GeForce 6800 Ultra DDL have a low-noise fan?
Yes, the GeForce 6800 Ultra DDL runs very quiet.
I think this was the only question he was capable of answering.
Cram that thing into an iMac?! (Score:1)
Slightly off topic, but has anyone seen a way to upgrade the video card on an iMac (the lamp type), even if it means needing a new case?
Re:Cram that thing into an iMac?! (Score:2)
Article Text: Im AC 'cause i dont want the karma (Score:3, Informative)
Nvidia 6800 Ultra DDL Graphics card
Posted: 7/20/2004
Shortly after Apple announced the Mac Nvidia 6800 Ultra DDL card for the PowerMac G5s (which is required to drive the 30in Cinema Display), I sent a series of questions to a contact at Nvidia on the card. Yesterday I received the reply from Ujesh Desai, Nvidia's General Manager of Desktop GPUs. Although some questions didn't get as complete an answer as I hoped (often due to the fact Apple controls OEM Mac Nvidia products), I appreciate his taking the time to reply.
* How does the NVIDIA GeForce 6800 Ultra DDL card for the Mac differ from the PC version (i.e. Does the PC version have dual link DVI?)
The GeForce 6800 Ultra DDL card was designed specifically for the Mac to provide two dual-link outputs to support Apple's displays.
* Does the Apple version of the GeForce 6800 Ultra GPU run at the same core/memory clock as the PC version?
The Apple cards run at 400/550, just like the GeForce 6800 Ultra GPU on the PC.
(Note: Some vendors' 6800 cards are clocked higher than the standard/reference design.)
* The GeForce 6800 Ultra for the PC has two Molex power connectors - does the Mac version source all the power from the G5's AGP Pro slot? (or does it have an aux power connector?)
There is an on-board power connector on the graphics card and the motherboard to provide power, so there is no need for an external power connector from the power supply.
(although the only Mac 6800 photos I've seen are tiny, it appears there's a stub connector on the card that (I suspect) uses the ADC (28V or 24V usually) DC power connector on the motherboard that's normally used for ADC display power to provide additional power (regulated down) for the 6800 card. That eliminates the need for Aux. (Molex) P.S. connector(s) like the PC/standard 6800 card versions have.)
* Does the GeForce 6800 Ultra DDL have a low-noise fan?
Yes, the GeForce 6800 Ultra DDL runs very quiet.
* Will there ever be a control panel with 3D/GL/FSAA controls for the NVIDIA cards on the Mac platform? (ATI's retail Radeon cards (and OEM models with the 3rd party patch) have a '3D/GL overrides' feature - which is seen as a big plus by many end users.)
Apple provides all the drivers for NVIDIA-based add-in cards. We supply them with the source code and they provide the final driver.
* Regarding the previous question - if there's no chance of an Apple supplied NVIDIA card control panel (for advanced features/FSAA, etc.) - if a 3rd party wanted to do this, can NVIDIA provide some assistance?
Apple is our customer, so if this is something that they requested, then we would support it.
* There's been talk of previous NVIDIA cards taking a bigger than expected performance hit from using some types of shaders (on the Mac) - is this a concern with the GeForce 6800 Ultra DDL?
GeForce 6 Series GPUs have the most advanced shading engines on the market. Compared to previous generation parts, the vertex and pixel shader engines on GeForce 6 Series GPUs have been completely redesigned from the ground-up. Pixel Shader performance is 4 to 8 times faster. Vertex shader performance is twice as fast. Performance numbers have been outstanding.
* Will there be updated/new drivers for the GeForce 6800 Ultra?
Yes. Apple provides all the drivers for NVIDIA-based add-in cards. We supply them with the source code and they provide the final driver. Apple will control the release schedules for drivers that provide even more performance, features and image quality enhancements.
* Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?
GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.
(Note: There's no Mac 6800 perf
Re:Article Text: Im AC 'cause i dont want the karm (Score:2)
A dream... (Score:1)
The beauty, the beauty...
Re:A dream... (Score:2)
Re:A dream... (Score:2)
That's not so bad. I remember when a Macintosh II or one of the early PowerBooks cost about that much.
I plan to buy one 30" display and run with that and my existing 23" display (assuming it can be done, but I think it can). I can always upgrade later to the dual setup, but to be honest I'm sure it will be awesome no matter what.
D
Re:A dream... (Score:2)
There's only one AGP slot in PowerMacs, thus if one display took both of the card's DVI ports, it would be impossible to have two 30" monitors hooked up to a single G5. At WWDC, Apple demoed dual 30" configurations on a stock G5 tower. Thus, by contradiction, we conclude that you're wrong. QED.
Re:A dream... (Score:2)
I was personally expecting to see 4 DVI ports on one of these cards, as they do take up one of the PCI slots as well. I was somewhat disappointed when I took a peek behind and saw it set up with the normal Apple elegance when they could have taken the easy route I was expecting.
Re:A dream... (Score:2)
It has two dual-link DVI ports on it to allow for such a feat.
Mmmmmm, two 30" monitors at 4.1 million pixels each. More screen real estate than any one person should be allowed.
Interview in a nutshell (Score:5, Funny)
Re:Interview in a nutshell (Score:2)
Re:Interview in a nutshell (Score:2)
Article Summary: (Score:5, Funny)
Much faster to read, no PR speak to deal with.
-Erwos
If you meant... (Score:5, Insightful)
But if you actually meant ADC, or "Apple Display Connector", that is no longer used. With the new line of displays, Apple has (thankfully) gone back to standard DVI for the displays and for their future OEM video cards.
Re:If you meant... (Score:2)
Re:Article Summary: (Score:2)
ADC == Apple Display Connector
ACD == Apple Cinema Display
The new 6800 supports the latter, not the former (Dual DVI is the connection standard for the newest Apple systems).
Re:Article Summary: (Score:2, Funny)
usually good, but ... (Score:4, Insightful)
Yet another example of "no news" being news
Tom's Hardware (Score:5, Informative)
"NVIDIA has seemingly pulled out all stops in an attempt to deliver cutting edge graphics with its GeForce 6800 Ultra. After gamers for too long have had to be content with mere incremental improvements to graphics performance, NVIDIA new card delivers a performance jump not seen for a long time. The device is also solidly engineered as well as insanely fast."
Re:Tom's Hardware (Score:2)
Wait... not a Motorola 6800... an NVidia 6800.... (Score:3, Funny)
Wait - I sold those things nine years ago!?!? Damn I'm old.
Re:Wait... not a Motorola 6800... an NVidia 6800.. (Score:2)
I had those chips powering my Amigas: a 7.14 MHz 68000 in my A500, a 14 MHz 68020 in my 1200, and a 50 MHz 68030/68882 on the Blizzard board for my 1200.
damn, I feel old too now
dave
Flamebait... (Score:2, Funny)
Re:Flamebait... (Score:3, Informative)
It's not just for gaming. Mac OS X's GUI can be accelerated by the GPU. 10.4 will also ship with video- and image-processing libraries that use the GPU.
And even if you don't care about gaming at all, this is the only card on any platform that supports the 30" cinema display, so if you want one of those you need the card anyway.
Re:Flamebait... (Score:3, Interesting)
The new 3DLabs Realizm cards have a DDL connector. I wonder if that means they can support the display.
Re:Flamebait... (Score:2)
By a quirk of fate, I bought a used PowerBook G4/400 about a month ago and it's actually pretty zippy running Panther. So it looks like they've optimized their OS to the point where it will live happily on slower hardware, which definitely wasn't true a few years ago.
Re:Flamebait... (Score:2, Informative)
Re:Flamebait... (Score:2, Informative)
Re:Flamebait... (Score:2)
Doom3... coming in Decem^H^H^H^H^HJuly 2005... (Score:2)
Though, UT2004 runs quite well on a Mac. As do Call of Duty and BF 1942. Halo is a bit slow.
This at 1920x1200 resolution (23" Cinema Display)... played at WWDC on a 2x2.0 G5 with a Radeon 9800. Frame rates at that res were pretty consistent 70 FPS, never dropping below 40. So it's not ALL bad.
Though if you're BUYING a Mac specifically to be a gaming machine, I might not advise that.
Chess.app is now OpenGL, rev up that GPU! (Score:2)
Yessir, the 10.3 version of Chess now has a true OpenGL-based 3D board. You can view from any angle and can even adjust the textures! Hooray!
For those not familiar with Chess.app, it is the (open-source) bundled game for Mac OS X. Until the most recent version, it had been largely unchanged from its original form in the NeXTSTEP OS.
(BTW, the Apple Puzzle game, which has been
Re:Flamebait... (Score:3, Informative)
So mac users can happily play games like Command & Conquer: Generals, Doom 3, Shrek 2, Battlefield 1942, Delta Force: Black Hawk Down, Space Colony, Unreal Tournament 2004, Warcraft 3, Call of Duty, etc etc... [apple.com]
Mac user's number 1 hardware question (Score:4, Interesting)
It reduces our choices and makes $100 cost $400.
Re:Mac user's number 1 hardware question (Score:5, Informative)
Since Sun uses Open Firmware as well, I wonder if the same card could be used for Macs and Sun workstations.
Re:Mac user's number 1 hardware question (Score:2)
Re:Mac user's number 1 hardware question (Score:2)
Not anymore. IBM removed one of the instructions particularly useful for endian-agnosticism from the PPC970 (G5). It's why VPC7 has been so delayed; MS has to figure out a workaround.
Re:Mac user's number 1 hardware question (Score:3, Informative)
Because x86 stores data backwards (the big/little-endian thing) compared to almost every other processor, including the PowerPC.
Thus, the card firmware needs to be different...
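For anyone who hasn't run into the big/little-endian thing before, here's a minimal C sketch (my own illustration, not anything from the article) of what "stores data backwards" means in practice:

/* How the same 32-bit value lays out in memory on little-endian x86
 * versus big-endian PowerPC. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t value = 0x0A0B0C0D;
    unsigned char *bytes = (unsigned char *)&value;

    /* On x86 (little-endian) this prints:      0d 0c 0b 0a
     * On a G5/PowerPC (big-endian) it prints:  0a 0b 0c 0d */
    for (int i = 0; i < 4; i++)
        printf("%02x ", bytes[i]);
    printf("\n");
    return 0;
}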
Re:Mac user's number 1 hardware question (Score:3, Interesting)
Re:Mac user's number 1 hardware question (Score:2)
That being said, it makes sense that you would charge more for different firmware, because the development cost has to be split among fewer buyers of the cards. Remember, someone still has to develop the firmware, and that person needs to be paid just like you and me.
I don't mind paying a few extra bucks for Apple-compatible stuff because I appreciate the extra effort that goes into supporting it.
More to the point, I normally would have no need to replace my video card anyway, except I really, reall
little-endian is the "right way" (TM) (Score:2)
Actually, there is a justification for little-endian, just like there is one for the British driving on the left. Casting values from 16 bits to 8 bits and vice versa on little-endian machines is automatic, because the low-order byte lives at the same address as the full value. In the old days of limited memory this was an advantage. (As for driving on the left side of the road, it came from horse riding: one mounts a horse from the left side.)
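A little C sketch of that point (my own illustration, assuming a little-endian machine; the union is just a convenient way to overlay the types):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    union {
        uint32_t u32;
        uint16_t u16;
        uint8_t  u8;
    } v;

    v.u32 = 0x00001234;

    /* On a little-endian machine the 16- and 8-bit members overlay the
     * LOW-order bytes of the 32-bit value, so the "cast" costs nothing:
     * u16 == 0x1234, u8 == 0x34.
     * On a big-endian machine the same overlay would read the HIGH-order
     * bytes (0x0000 and 0x00) instead. */
    printf("u16 = 0x%04x, u8 = 0x%02x\n", v.u16, v.u8);
    return 0;
}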
Re:little-endian is the "right way" (TM) (Score:2)
Another way of doing the same thing, and one that I think would be interesting to try out, is to use "big endian" storage (which is more natural for people, or at least people who read strings from left to right and read numbers with the most significant digits on the left), but the address of a value is the last (least significant) byte.
One problem with either scheme, that isn't shared by big endian references, is that accessing a field using the wrong sized reference is more immediately obvious in more c
Re:Mac user's number 1 hardware question (Score:4, Informative)
Re:Mac user's number 1 hardware question (Score:2)
Nice attempt to distort reality.
Re:Mac user's number 1 hardware question (Score:2)
Re:Mac user's number 1 hardware question (Score:2)
Suddenly, your point looks pretty puny.
Re:Mac user's number 1 hardware question (Score:3, Informative)
A note from the author (Score:5, Funny)
I answer questions with no add-ins of emotion. There is no technical reason why I would answer otherwise.
Sincerely,
Ujesh Desai
Seems to be a lot of confusion over dual-link DVI (Score:5, Informative)
The difference between single-link and dual-link is how many of the pins in the connector are used for transmitting data: in a nutshell, 12 pins for the former and 24 pins for the latter.
Apple is using DVI-D (digital only) DVI connectors with a dual-link pin out for the 30" display. So one dual-link DVI-D connection is capable of driving one 30" display. The 6800 adapter used for these displays provides two dual-link DVI-D outputs, so one adapter can drive two 30" displays.
As a reference...
DVI connector type summary [ddwg.org]
DVI 1.0 specification (PDF) [unc.edu]
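If you want to see roughly why 2560x1600 blows past a single link, here's a back-of-the-envelope C sketch. The 165 MHz single-link limit comes from the DVI 1.0 spec; the blanking overhead is an assumed round figure for illustration, not Apple's exact timing:

#include <stdio.h>

int main(void)
{
    const double h_active = 2560.0, v_active = 1600.0, refresh_hz = 60.0;
    const double blanking_overhead = 1.12;  /* assumed ~12% (reduced blanking) */
    const double single_link_mhz   = 165.0; /* DVI 1.0 single-link TMDS limit */

    double pixel_clock_mhz =
        h_active * v_active * refresh_hz * blanking_overhead / 1e6;

    /* Prints roughly 275 MHz, well past what one link can carry. */
    printf("required pixel clock: ~%.0f MHz\n", pixel_clock_mhz);
    printf("fits on one link: %s\n",
           pixel_clock_mhz <= single_link_mhz ? "yes" : "no (dual-link needed)");
    return 0;
}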
Re:Seems to be a lot of confusion over dual-link D (Score:2)
I'm curious because I have a 23" Cinema HD Display and would like to drive it alongside the 30" when I buy it. Don't want to waste the old technology, don't you know.
Can I do this, assuming that I get an ADC-to-DVI adapter for the Cinema Display?
Thanks!
D
Re:Seems to be a lot of confusion over dual-link D (Score:3, Informative)
Basically, DVI defines 6 signal pairs for pixel data; in single-link, 3 of the 6 are used, one for each color channel (RGB). In dual-link, even pixels go down one bank of 3 while odd pixels go down the other bank of 3.
From what I can see the channel definition for connections is the same for single-link and dual-link. So in theory it
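A toy C sketch of that even/odd split (purely illustrative, nothing like real driver or transmitter code):

/* Dual-link DVI sends even-numbered pixels on one bank of three TMDS
 * pairs and odd-numbered pixels on the other, halving the pixel rate
 * each link has to carry. */
#include <stdio.h>

#define WIDTH 8  /* tiny scanline for illustration */

int main(void)
{
    int scanline[WIDTH], link0[WIDTH / 2], link1[WIDTH / 2];

    for (int x = 0; x < WIDTH; x++)
        scanline[x] = x;                /* stand-in for pixel data */

    for (int x = 0; x < WIDTH; x++) {
        if (x % 2 == 0)
            link0[x / 2] = scanline[x]; /* even pixels -> link 0 */
        else
            link1[x / 2] = scanline[x]; /* odd pixels  -> link 1 */
    }

    printf("link0:");
    for (int i = 0; i < WIDTH / 2; i++) printf(" %d", link0[i]);
    printf("\nlink1:");
    for (int i = 0; i < WIDTH / 2; i++) printf(" %d", link1[i]);
    printf("\n");
    return 0;
}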
Re:Seems to be a lot of confusion over dual-link D (Score:2)
It would require that the adapter understand that the connected display is single-link and hence not do the even/odd splitting. I don't know if the adapter supports that or not. Pin-out-wise, it should work if the adapter does the right thing.
Re:Seems to be a lot of confusion over dual-link D (Score:2)
I cannot find confirmation of that in any docs; ideally it should, but...
Re:Seems to be a lot of confusion over dual-link D (Score:2)
If it says it conforms to the DVI standard, then yes, it must support single-link devices properly. It doesn't necessarily have to support any particular resolutions, but it seems unlikely they'd deliberately cripple it.
The DVI spec says that if a particular resolution CAN be done on a single-link, the adapter MUST use single-link. It can only shift up to dual-link if the bandwidth is too high at the chosen resolution, refresh rate and pixel color depth.
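In code terms, the rule above amounts to something like this (my own hypothetical helper, not a real driver API; the pixel clocks are approximate figures for the two Apple panels at 60 Hz):

#include <stdio.h>

#define SINGLE_LINK_LIMIT_MHZ 165.0  /* DVI 1.0 single-link TMDS limit */

/* Stay on single-link whenever the mode fits; only shift to dual-link
 * when the required pixel clock exceeds the single-link limit. */
static const char *link_mode(double pixel_clock_mhz)
{
    return pixel_clock_mhz > SINGLE_LINK_LIMIT_MHZ ? "dual-link" : "single-link";
}

int main(void)
{
    printf("23\" 1920x1200: %s\n", link_mode(154.0)); /* fits on a single link */
    printf("30\" 2560x1600: %s\n", link_mode(268.0)); /* needs dual-link */
    return 0;
}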
Re:Seems to be a lot of confusion over dual-link D (Score:3, Informative)
Actually, this isn't true, or we wouldn't have dual-link in the first place. It depends on the resolution and refresh rate, basically the bandwidth needs of the display.
Also...
DVI-I = connector carrying both a digital and analog signal
DVI-D = connector carrying just a digital signal
DVI-A = connector carrying just analog (extra to DVI specification)
For the digital aspect
Re:Seems to be a lot of confusion over dual-link D (Score:2)
Re:Seems to be a lot of confusion over dual-link D (Score:2)
Re:Seems to be a lot of confusion over dual-link D (Score:2)
Why is this an Apple story? (Score:2)
Set up (Score:2)
Are these Cinema Displays essentially a dual-monitor-in-one setup (from the computer's POV, that is)?
(YFI, BTW)
Re:Set up (Score:3, Informative)
The "dual-link" label is misleading, it's mearly an update to the DVI standard (like DVI-I, DVI-A, etc) too allow for more data.
Re:Set up (Score:2, Informative)
Only the 30 inch display requires the two connections per screen - so this card is really only for the 30 inch.
IANAE - so I have no idea if the card could ever be hacked to drive four displays - but that would be pretty cool.
Re:Set up (Score:2, Informative)
Yes, the issue is data throughput. Single-link DVI doesn't support a high enough resolution.
But, the 30 inch display only needs ONE connector.
DVI-Dual Link is just a protocol/standard that allows that one connector to send twice the data of single-link DVI-D. Think double density.
So... one card, two DVI-Dual Link Connectors, one display (including 30 inch) per connector.
Re:Set up (Score:2, Interesting)
Re:Set up (Score:2)
Re:Set up (Score:2)
Re:Dual-Link DVI for PC? (Score:2, Informative)
Re:Apple is dying: Sell stock now. (Score:2, Funny)
I thought the release date for the OSX version of Doom III was still up in the air...
Re:Annoying marketing regression (Score:2, Insightful)
Re:Annoying marketing regression (Score:2)
Wow, that's sure like the CRT monitors with their normally over 1" less viewing space than advertised (17" CRTs have 15.9" viewable, for example).
Re:PC??? (Score:2)
Out the door, the display + video card upgrade is almost $4k plus tax.
D
Re:Apple Fanatics are retarded (Score:2)