NVIDIA Performance On Linux, Solaris, & Vista

AtomBOB suggests a Phoronix review comparing the performance of a Quadro graphics card on Windows Vista Ultimate, Solaris Express Developer, and Ubuntu Linux. The card used was an NVIDIA Quadro FX 1700, a mid-range workstation part, and the cross-platform benchmark was SPECViewPerf 9.0 from SPEC. Quoting Phoronix: "Using the Quadro FX1700 512MB and the latest display drivers, Windows Vista wasn't the decisive winner, but the loser... Ubuntu 8.04 Alpha 5 with the 169.12 driver had overall produced the fastest results within SPECViewPerf. In only three benchmarks had Solaris Express Developer 1/08 outpaced Ubuntu Linux, but with two of these tests the results were almost identical."
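
SPECViewPerf reports each viewset's score as a weighted geometric mean of the frame rates from its individual tests. A minimal sketch of that scoring arithmetic in C; the frame rates and weights below are hypothetical, not figures from the article:

    #include <math.h>
    #include <stdio.h>

    /* Weighted geometric mean: exp(sum(w_i * ln(x_i)) / sum(w_i)). */
    static double weighted_geomean(const double *fps, const double *w, int n)
    {
        double num = 0.0, den = 0.0;
        for (int i = 0; i < n; i++) {
            num += w[i] * log(fps[i]);
            den += w[i];
        }
        return exp(num / den);
    }

    int main(void)
    {
        /* Hypothetical per-test frame rates and weights. */
        double fps[] = { 24.1, 31.7, 18.4 };
        double w[]   = { 0.40, 0.35, 0.25 };
        printf("composite: %.2f\n", weighted_geomean(fps, w, 3));
        return 0; /* build: cc geomean.c -lm */
    }
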
  • by moreati ( 119629 ) <alex@moreati.org.uk> on Sunday March 09, 2008 @08:27PM (#22695840) Homepage
    I've wondered this for a while. What is the difference between the gaming cards and the workstation cards from Nvidia and ATI? Do they just have better DACs? Certified driver support for business apps? Or is the GPU itself somehow different?

    Alex
    • by Anonymous Coward on Sunday March 09, 2008 @08:39PM (#22695896)
      The difference between the Quadros and the consumer cards used to come down to hardware OpenGL overlay support, if I remember right.
    • by sxeraverx ( 962068 ) on Sunday March 09, 2008 @08:41PM (#22695922)
      They have different priorities. Gaming cards try to keep the framerate up by degrading image quality (not showing every single texture, for example) if need be, while cards for CAD and the like lower the framerate to show every detail requested of them.
      • by LoRdTAW ( 99712 ) on Monday March 10, 2008 @12:46AM (#22697272)
        Gaming cards try to keep the framerate up by degrading image (not showing every single texture, e.g.), if need be

        That's called culling, and it is implemented in software, not hardware (see the sketch at the end of this comment).

        If I remember correctly, there was a simple hack posted on Tom's Hardware a while back for converting a Radeon to a FireGL. You simply solder an SMT resistor to a certain trace on the chip package, and it pulls a line low. That line signals the BIOS to report the card as a Radeon or a FireGL. So in essence the Radeon and the FireGL are the EXACT SAME CARD! The only difference is that the FireGL drivers look for a Radeon reporting itself as a FireGL. This keeps production simple and even keeps the video card BIOS versions the same.

        The FireGL and Quadro cards come with drivers optimized for specific 3D programs like AutoCAD, Maya, 3ds Max, LightWave, etc. There is a drop-down box that lets you select the program you're using, and it loads the finely tuned driver profile for that program.
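
        For reference, the CPU-side culling described above amounts to a visibility test run before geometry is ever submitted to the card. A minimal bounding-sphere frustum check in C; the Plane and Sphere types are hypothetical, and the plane normals are assumed to point into the frustum:

            #include <stdbool.h>

            /* Plane in the form ax + by + cz + d = 0. */
            typedef struct { float a, b, c, d; } Plane;
            typedef struct { float x, y, z, radius; } Sphere;

            /* Software culling: reject an object on the CPU before it
               is ever sent to the GPU. */
            static bool sphere_in_frustum(const Plane planes[6], Sphere s)
            {
                for (int i = 0; i < 6; i++) {
                    float dist = planes[i].a * s.x + planes[i].b * s.y +
                                 planes[i].c * s.z + planes[i].d;
                    if (dist < -s.radius)
                        return false; /* fully outside one plane: cull */
                }
                return true; /* inside or intersecting: draw it */
            }
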
      • by maz2331 ( 1104901 ) on Monday March 10, 2008 @02:48AM (#22697736)
        The Quadro boards allow OpenGL stereoscopic images to be displayed in a window, and the non-Quadro boards do not. If you want really good 3D, you need a Quadro.

        I use them for my stereoscopic video work with either a pair of shutter glasses or 3D HMD goggles, and can run a live 3D viewfinder to compose the scene, align cameras, etc. (a quad-buffered sketch follows below).
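
        The windowed stereo described here rides on quad-buffered OpenGL: a separate back buffer per eye. A minimal render-loop sketch, assuming a stereo-capable pixel format/visual has already been negotiated; draw_scene() and the eye separation are hypothetical application code:

            #include <GL/gl.h>

            #define EYE_SEPARATION 0.065f /* metres; hypothetical value */
            extern void draw_scene(float eye_offset); /* hypothetical */

            void render_stereo_frame(void)
            {
                glDrawBuffer(GL_BACK_LEFT); /* left-eye back buffer */
                glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
                draw_scene(-EYE_SEPARATION / 2.0f);

                glDrawBuffer(GL_BACK_RIGHT); /* right-eye back buffer */
                glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
                draw_scene(+EYE_SEPARATION / 2.0f);

                /* swap via GLX/WGL as usual; both eyes flip together */
            }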

      • What sort of rubbish is that? Even if it would speed things up (which it wouldn't, not really), there's no way a card could figure out which textures to "hide".

        The difference is partly in the capabilities, e.g. pro cards can do two-sided lighting (sketched below), and partly in the drivers. Drivers for "pro" cards are more conservative (not always the very latest release), do more validation of input data, and are therefore a little bit slower.

        PS: The difference in features is completely artificial, I've "added" two sided lig
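
        For what it's worth, the two-sided lighting mentioned above is a single light-model switch in fixed-function OpenGL; a minimal sketch:

            #include <GL/gl.h>

            /* Light back faces with mirrored normals instead of
               leaving them black. */
            void enable_two_sided_lighting(void)
            {
                glEnable(GL_LIGHTING);
                glEnable(GL_LIGHT0);
                glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE);
            }
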
    • by alex4u2nv ( 869827 ) on Sunday March 09, 2008 @10:28PM (#22696574) Homepage
      I had the very same question, and this article from Nvidia turned out to be very enlightening.
      Quadro vs FX -- http://www.nvidia.com/object/quadro_geforce.html [nvidia.com]

      According to the article, there are some major differences between the two architectures, with features implemented either at the hardware layer (Quadro) or at the driver layer.

      • Of course, it doesn't help that they've started using the Quadro name for business laptops. As far as I know, chips like the Quadro NVS 110M are far closer to gaming parts than to workstation parts.
  • Surprised.. (Score:4, Interesting)

    by LingNoi ( 1066278 ) on Sunday March 09, 2008 @08:28PM (#22695842)
    I am surprised by this, as I would have thought Nvidia would have put more effort into their Vista driver, with Linux drivers being mostly on the back burner. I am assuming it is because their Linux driver is old code (which we all know contains fewer bugs than new code) whereas the Vista driver is written from scratch?

    Either way I think this shows the awesomeness of Ubuntu and Linux. ^_^
    • Why be surprised? (Score:5, Insightful)

      by EmbeddedJanitor ( 597831 ) on Sunday March 09, 2008 @08:40PM (#22695900)
      It isn't just the code that impacts performance, but the driver architecture too.

      Vista has a new driver architecture and it is going to take some time for MS to improve the graphics subsystem performance. It will also take NVidia a while to optimise their code for Vista.

      Even then, the Vista architecture might just have some inherent issues that are hard to code around.

    • by bcmm ( 768152 )
      It's been known to Linux gamers for a while that games that run on both Windows and Linux will generally perform better on Linux, often by 10-15% in FPS, at least on NVIDIA hardware.
    • Re: (Score:3, Interesting)

      by Svartalf ( 2997 )
      It has a little less to do with them putting effort into the driver and more to do with the interrupt handling model and how OpenGL ties into the OS as a whole.

      And you'd be assuming wrong. Neither NVidia nor AMD has old or differing code, from what I understand, for EITHER OpenGL API layer.
    • I'd have been interested to see where WinXP would have stacked up against the others.
      • I'd have been interested to see where WinXP would have stacked up against the others.
        Whoops, clicked submit too early. I was also going to add that since the codebase for the Linux and WinXP drivers is nearly identical (according to Nvidia), it seems like it would have been more of an OS-dependent benchmark.
        • According to this comment [slashdot.org] they are the same, but if you look at my comment you'll notice that the graph says XP is around 2-10% better than Vista.
    • "I am surprised by this as I would have thought Nvidia would have put more effort into their Vista driver with Linux drivers being mostly on the back burner."

      Nvidia is putting a lot of effort into their Vista drivers. The problem is that Windows Vista just plain sucks ass, and there's nothing Nvidia can do about that. They're probably thinking what most other people (including Microsoft, more than likely) are thinking... write Vista off as another WinME-type loser, and wait for the next Windows OS, which pr
    • Re: (Score:3, Insightful)

      by JohnBailey ( 1092697 )

      I am surprised by this, as I would have thought Nvidia would have put more effort into their Vista driver, with Linux drivers being mostly on the back burner. I am assuming it is because their Linux driver is old code (which we all know contains fewer bugs than new code) whereas the Vista driver is written from scratch? Either way I think this shows the awesomeness of Ubuntu and Linux. ^_^

      Except these are workstation graphics cards. And Windows is the one on the back burner. The CGI industry has been using Unix variants for years, and more recently many are moving to Linux for cost considerations.

  • The past few drivers had been getting better and better, but this one broke about half my 3D apps.
    The graphics start OK, but when I make any inputs (keyboard or mouse), whatever it is crashes.

    This is on an HP Pavilion AMD Turion 64 running 64-bit Debian Testing.
    • by Curtman ( 556920 )

      The past few drivers had been getting better and better, but this one broke about half my 3D apps.

      Same here. I can use Maya for 5 or 10 minutes, and then X goes nuts. I can move the mouse, but can't click on or type anything. I have to ssh in and kill the X process.

      I got an nVidia card to make Maya easier to work with. Time to end this experiment, I think.
      • Re: (Score:3, Informative)

        by darkjedi521 ( 744526 )
        If you're running Maya, you should be running the drivers/distro that Autodesk blesses. Last I checked, that was 2-3-year-old drivers on RHEL 4/SLES 9/Fedora Core 5. I run the blessed packages for a small animation studio and only have problems when people run their systems out of memory (8GB RAM should be enough for anybody). http://usa.autodesk.com/adsk/servlet/item?siteID=123112&id=9683256 [autodesk.com] has the list of blessed stuff.
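
        If you need to confirm what a box is actually running before checking it against the qualified list, the proprietary NVIDIA driver on Linux reports the loaded version under procfs. A minimal sketch, assuming the proprietary driver is installed:

            #include <stdio.h>

            /* Print the loaded NVIDIA kernel module version. */
            int main(void)
            {
                FILE *f = fopen("/proc/driver/nvidia/version", "r");
                char line[256];

                if (!f) {
                    perror("nvidia driver not loaded?");
                    return 1;
                }
                while (fgets(line, sizeof line, f))
                    fputs(line, stdout);
                fclose(f);
                return 0;
            }
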
  • OpenGL? (Score:3, Interesting)

    by LingNoi ( 1066278 ) on Sunday March 09, 2008 @08:33PM (#22695868)
    This is a serious question: I heard a while back that Vista had done something to make OpenGL slower.

    Could Vista's bad performance be due to its nerfing of OpenGL in order to get developers to pick DirectX?
    • Re:OpenGL? (Score:4, Interesting)

      by zappepcs ( 820751 ) on Sunday March 09, 2008 @08:41PM (#22695918) Journal
      While I tend to agree with you, it would be stupid on the part of MS to hobble OpenGL because it will only make Windows look sucky. The news-for-nerds crowd on the Internet (not just /.) will ensure that *ANY* Linux drivers get face time with the masses, and hobbling that experience would be a marketing blunder on the scale of the Sony rootkit, without so many of the legal problems.

      One thing that I like: recently it is not a case of Linux and Solaris having to be as good as MS, but a case of hmmm, let's just see which performs better, without the a priori conclusion that everyone has to keep up with MS.

      I think that very soon, if not now, we can start thinking of MS as an angel with a tarnished halo, if I can put it so gently?

      We are slowly moving into an era of REAL competition, where all OSs compete for the leading edge and the masses wait each quarter for news of who is winning, rather than nobody really caring since no other OS is as good as MS. At that point, I think you can clearly and safely declare a win for F/OSS. A battle won, if not the war.
      • by WK2 ( 1072560 )

        it would be stupid on the part of MS to hobble OpenGL because it will only make Windows look sucky.

        That's never stopped them before.

      • by Detritus ( 11846 )
        It may be stupid, but I read that Microsoft has deprecated OpenGL in favor of their own graphics technology. They supported it when they were trying to move people from Unix workstations to NT/2000/XP. With that accomplished, it's time to cut off OpenGL's air supply.
    • Re:OpenGL? (Score:5, Informative)

      by glob ( 23034 ) on Sunday March 09, 2008 @08:59PM (#22696000) Homepage Journal
      http://www.opengl.org/pipeline/article/vol003_9/ [opengl.org]

      "Some have suggested that OpenGL performance on Windows Vista is poor compared to Windows XP. This is not the case."
      • Yet this graph [67.15.50.109] from that page shows Vista performance to be slightly worse, and if you compare that graph to the data from this article, then Windows XP must really suck as well.
    • by Svartalf ( 2997 )
      Considering that the workstation app crowd are, in many cases, still using immediate-mode OpenGL calls (see the sketch after this comment)...

      Do you honestly think that this is going to make them change things?

      It has more to do with its interrupt handling, etc. than anything else. Vista doesn't do so hot, even with DirectX, because it's been rewritten in a few ways that don't help them any.
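
      For context, the immediate-mode calls mentioned above push one vertex through the driver per call, which is exactly where per-call OS overhead bites. A minimal contrast with a batched vertex-array submission; the triangle data is hypothetical:

          #include <GL/gl.h>

          #define NVERTS 3
          static const float verts[NVERTS][3] = {
              { 0.0f, 1.0f, 0.0f }, { -1.0f, -1.0f, 0.0f }, { 1.0f, -1.0f, 0.0f }
          };

          /* Immediate mode: one driver call per vertex. */
          void draw_immediate(void)
          {
              glBegin(GL_TRIANGLES);
              for (int i = 0; i < NVERTS; i++)
                  glVertex3fv(verts[i]);
              glEnd();
          }

          /* Vertex arrays: the same triangle in one batched call. */
          void draw_batched(void)
          {
              glEnableClientState(GL_VERTEX_ARRAY);
              glVertexPointer(3, GL_FLOAT, 0, verts);
              glDrawArrays(GL_TRIANGLES, 0, NVERTS);
              glDisableClientState(GL_VERTEX_ARRAY);
          }
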
    • As I understand it, what Microsoft did was add an OpenGL implementation as a wrapper for DirectX. There are two things that I understand about this wrapper, though I don't know for sure (a context-query sketch follows this comment):

      First, it's overridden by any driver that chooses to implement OpenGL by itself.

      And second, it's used for Aero -- the theory being that you can't have two 3D APIs controlling the hardware at once, so if you have a windowed GL app, it'll use the wrapper, whereas a fullscreen GL app will run normally. This is kind of like me run
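
      One way to see whether a context landed on Microsoft's generic implementation or on a vendor ICD is simply to ask it. A minimal sketch; it requires a current OpenGL context, and Microsoft's software path identifies itself as "GDI Generic":

          #include <stdio.h>
          #include <GL/gl.h>

          /* Call with an OpenGL context current. */
          void print_gl_implementation(void)
          {
              printf("vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
              printf("renderer: %s\n", (const char *)glGetString(GL_RENDERER));
              printf("version:  %s\n", (const char *)glGetString(GL_VERSION));
          }
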
  • I thought the whole deal with Vista is that it has a new driver model. Thus, it's going to be some time before drivers can really be completely optimized for it.
    • by figleaf ( 672550 ) on Sunday March 09, 2008 @09:13PM (#22696072) Homepage
      It's an OpenGL test. The performance difference between Nvidia's OpenGL and DirectX implementations has always been very large -- even in Windows XP.
      • Re: (Score:3, Interesting)

        Not really; OpenGL and DirectX have always been more than competitive.

        Also, OpenGL technically benefits MORE from the new WDDM in Vista because of the RAM allocation system and GPU scheduling, as the OS handles all these details for OpenGL and OpenGL applications.

        The ICD still has to be optimized to pass through and work with the new Vista WDDM model, so from when Vista was first released to now, just like with DirectX, OpenGL on current drivers is considerably faster than the horrid RTM drivers from both NVidia an
    • by Ilgaz ( 86384 ) *
      It could be the same deal on OS X Leopard too. Leopard (10.5) is still not up to Tiger (10.4) levels on OpenGL performance. Of course we have another issue: we can't bug NVidia and ATI; they tell us "Apple does the drivers". These are the same guys who sell exactly the same chip at a 30-40% higher price, even after the Intel switch.

      Of course, Tiger is at its .11 point release while Leopard is just at .2, along with lots and lots of changes to the kernel. Leopard is more like Vista 64-bit, while thanks to Apple, no end user ha
  • No XP? (Score:2, Insightful)

    If you're a Quadro user, your OS choice will surely be based on the software available for whatever particular professional application you are using the card for. As a sound designer, for me that would be XP. I don't think many professionals are ready to jump to Vista quite yet, so I'm surprised that they have not included it. We are, after all, looking for stability.
  • What?! Windows did not have the best NVIDIA performance?!

    This is a new one. No, really. Usually NVIDIA makes their Windows drivers their best drivers, and Linux is supported as an afterthought, because they can make a few percentage points more in sales this way, and because it discourages reverse engineering of their hardware, since those who would take the time and effort to do so won't, on account of there being a working solution.

    In other words, I am surprised that although Windows Vista has been such a mess

    • Usually NVIDIA makes their Windows drivers their best drivers, and Linux is supported as an afterthought because they can make a few percentage points more in sales this way...

      Not really. Nvidia's drivers are designed incredibly well. They were designed from the ground up to abstract all of the rendering code, such that porting to a different platform is a simple matter of writing a shim [wikipedia.org] to connect the driver engine to a specific OS's API. So, with the exception of the shim, the codebase is almost identical (an illustrative sketch follows below).
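
      An OS-abstraction shim of the kind described usually boils down to a table of platform entry points that the shared core calls through. An illustrative sketch with hypothetical names, not NVIDIA's actual interface:

          #include <stddef.h>

          /* Hypothetical shim: each OS port fills in this table; the
             shared core never calls the OS directly. */
          typedef struct os_shim {
              void *(*alloc_pages)(size_t bytes);
              void  (*free_pages)(void *p, size_t bytes);
              void  (*register_irq)(int irq, void (*handler)(void *), void *ctx);
              void  (*log)(const char *msg);
          } os_shim;

          static const os_shim *os; /* set once at init */

          void core_driver_init(const os_shim *platform)
          {
              os = platform;
              os->log("core driver up"); /* same core source on every OS */
          }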

    • I wouldn't be surprised if Nvidia pays developers to make Quadro cards run fast on Unix and Linux in particular. Many purchasers of that hardware want it.
  • I thought the nVidia Linux drivers didn't get enough performance out of the cards to do good video framerates on Linux, or good alpha blending for compositing "picture in picture" or GUI overlays on top of the images.

    Maybe that's only on the latest (higher-end) models of cards, which actually have the performance to do TV. How come those frequently complained-about driver limits don't appear in these benchmarks?
    • For any normal use - including accelerated video - the NVidia Linux drivers are solid and have been for years.

      There may be edge cases where they have worse performance than other drivers, but not in any area that I've personally seen using the drivers.

    • You're almost certainly thinking of ATI, whose drivers for Linux have historically been of very poor quality (though I understand they're working to fix these issues, and in fact may already have done so). This would be why Nvidia cards have generally been the 3D accelerator of choice for Linux users for many years now.
      • Sorry, I should say not just the 3D accelerator, but also the accelerated video and MPEG2 decoding card of choice. For example, the GeForce FX 5200 (or newer) has been the recommended card for MythTV for years.
    • Re: (Score:3, Insightful)

      by batkiwi ( 137781 )
      This is an OpenGL test. Nvidia's Linux drivers for OpenGL have been really fast for a long time now. In fact, they've confirmed that they use the same driver code for Windows and Linux, just with a different API exposed.

      What you're talking about is that the video acceleration APIs (PureVideo) are not exposed on Linux. This is still the case, and annoying.
    • Re: (Score:3, Informative)

      by dbIII ( 701233 )
      There has been hardware rescaling to TV modes on their cards for a few years, so you'll find even the cheapest models with TV-out do a good job. Other features have improved a lot in the Linux and other drivers - look at the README on the nvidia download site for the long list and how to turn some on or off.
    • What is a "good video framerate"? Video is 30 fps. No faster, no slower. (The budget arithmetic is sketched after this comment.)

      The FX 5200 is able to display SD (standard definition) video with no problems. Of course, cards of this class do not have HD encoders.

      The 6000 and up series is able to do HD (high definition) video with no problems. I am using a 7300 (AGP 4x bus interface) to do 1080i display (the machine I am typing on, which happens to be my PVR). I am not sure if the card will drive 1080p, but that isn't a "mode" that my TV will do.

      The card/driver d
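
      The arithmetic behind "video is 30 fps" is just a per-frame time budget; a minimal sketch, where the decode time is a hypothetical measurement:

          #include <stdio.h>

          int main(void)
          {
              const double fps = 30.0;
              const double budget_ms = 1000.0 / fps; /* ~33.3 ms/frame */
              double decode_ms = 21.0; /* hypothetical measured time */

              printf("budget %.1f ms, decode %.1f ms -> %s\n",
                     budget_ms, decode_ms,
                     decode_ms <= budget_ms ? "keeps up" : "drops frames");
              return 0;
          }
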
  • One more step... (Score:2, Insightful)

    by mebrahim ( 1247876 )
    One more step towards Desktop Linux. But we need some real games to use these 3D capabilities!
    • Desktop Linux distros like Ubuntu have worked well for a long time now. They aren't something always on the horizon. You're not going to wake up one morning and realize that this year is the year of Linux on the desktop, and see mass conversions.

      And FWIW, the best Quadro performance isn't going to make a difference unless you're doing high-performance rendering or some similar task (and you actually have a Quadro card).
  • by skeptictank ( 841287 ) on Sunday March 09, 2008 @09:55PM (#22696370)
    I have it on good authority that the next Windows Driver Model will run Crysis on 3 SLI 8800GTs and render it in 8-bit color at 640x480 resolution at over 50 FPS! So take that you Linux/Unix hippy beatnik freaks!
  • "Using the chessboard, the retarded monkey wasn't the decisive winner, but the loser ... The college-level physics student overall produced the best results."

    Who's actually surprised by this? Bueller? Bueller?
