
NVIDIA's Lead Scientist Interviewed

rtt writes "bit-tech.net has up an interview with NVIDIA's chief scientist, David Kirk, about the PlayStation 3, next-generation architectures and what to expect in PC gaming. From the article: 'We're going to see the next generation of shader-based games. At the first generation, we saw people using a shader to emulate the hardware pipeline, and finding "Hey - this really is programmable". After that, they tried to do a few things with more lights, using perhaps eight instead of ten. Then they started to write material shaders, and they made great cloth and metal effects that we saw. People are now starting to change the lighting model, and are exploring the things that they can do with that.'"
  • Ha ha, lights. (Score:4, Insightful)

    by robyannetta ( 820243 ) * on Thursday July 14, 2005 @09:32AM (#13062313) Homepage
    Who cares how many lights the chipsets can emulate when the games themselves still suck?
    • A mod point, a mod point, my root password for a mod point. I'm back playing smaller games that are addictive: www.happypenguin.org [happypenguin.org]
    • Re:Ha ha, lights. (Score:4, Insightful)

      by paulsgre ( 890463 ) on Thursday July 14, 2005 @09:55AM (#13062493)
      The worst part is that rendering 10 lights instead of two means five programmers instead of one. Rising costs of development and the demand for ever more glorified tech demos are demeaning the art form and preventing its widespread recognition as such. The potential creative geniuses of our time will be turned away from games as a medium, or the next Stravinsky may end up coding 5 more shaders for the reflection in a visor instead of writing the algorithm that rocks the interactive world like the next "Rite of Spring".
    • Gee, how did I know someone would say that?

      A kinda unfair blanket statement, don't you think? Sure, a lot of game makers are going to focus more on graphical tricks and less on gameplay, but that's almost always been the case, regardless of the level of technology. It's unfair to those who can balance good gameplay and graphics without compromising either. And yes, they do still exist, and there are plenty of them.
    • Re:Ha ha, lights. (Score:4, Insightful)

      by mccalli ( 323026 ) on Thursday July 14, 2005 @10:05AM (#13062572) Homepage
      He's at nVidia - he's describing his job, and gameplay isn't it. Lack of gameplay is an accusation to be thrown at the software houses, not at nVidia.

      Cheers,
      Ian

    • There are only... FOUR... lights...
    • If that were true, then why do they sell so well? Why doesn't good gameplay sell games? We're spoiled. I admit it: after seeing what a game *could* look like, you want games that *do* look like that. I once thought VGA graphics and 386 speeds rocked too. But if I ever go back there, it sucks. Big time. I play old games and wish they'd been done in ultra-super-extra-something-something resolution. It is for nostalgia that I play them, even if the gameplay is good.

      Besides, I don't think the games "suck" as such. The r
      • Re:Ha ha, lights. (Score:4, Interesting)

        by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Thursday July 14, 2005 @11:40AM (#13063441) Homepage
        ### I once thought VGA graphics and 386 speeds rocked too. But if I ever go back there, it sucks.

        A few months ago I played XCom: UFO for the first time ever, so no nostalgia involved, and surprise, surprise: it didn't suck. It was simply one of the best games I have played in the last few years, even by today's standards. An interesting side note is that XCom has completely destroyable terrain. Sure, it's all just 2D tile graphics, but destroyable terrain is something that almost no 3D game these days has gotten right.

        I don't mind if graphics are good, but quite often the better graphics actually limit the gameplay in harmful ways (no destroyable terrain, no huge outdoor scenarios, etc.).
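Not from the comment above, but to illustrate why destructible terrain is cheap in a tile-based 2D game: the map is just a grid of tile states, so blowing a hole in a wall is a small, bounded update of that grid. The grid size, hit points and blast rule in this C sketch are made up for illustration; nothing here is taken from XCom itself.

/* Minimal sketch of destructible terrain on a 2D tile map.  Every value
 * here (grid size, hit points, blast rule) is a made-up assumption. */
#include <stdio.h>
#include <stdlib.h>

#define W 10
#define H 6

static int hp[H][W];   /* 0 = open ground / rubble, >0 = standing terrain */

static void blast(int cx, int cy, int radius, int damage)
{
    /* Damage every standing tile within a Manhattan-distance radius. */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (abs(x - cx) + abs(y - cy) <= radius && hp[y][x] > 0) {
                hp[y][x] -= damage;
                if (hp[y][x] < 0) hp[y][x] = 0;
            }
}

static void draw(void)
{
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++)
            putchar(hp[y][x] > 0 ? '#' : '.');
        putchar('\n');
    }
    putchar('\n');
}

int main(void)
{
    /* Start with a solid wall across the middle of the map. */
    for (int x = 0; x < W; x++)
        hp[2][x] = hp[3][x] = 2;

    draw();
    blast(4, 3, 2, 2);   /* a grenade punches a hole straight through it */
    draw();
    return 0;
}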
    • Who cares how many lights the chipsets can emulate when the games themselves still suck?

      So, please enlighten us on what you'd like to see then? Maybe we'd even get a discussion going? You'd like to see Civilization IV? Or what?? You're apparently not pleased with FPS games since the market is choking on them?
    • I've been playing games for approximately 20 years, including classics like "Elite", "Lemmings", "Space Quest", "Monkey Island", etc. As a matter of fact, I like to collect older games, often sold at low prices. Have you ever tried going back (instead of just remembering) and actually playing these games now? There has been considerable progress which we like to forget. I fondly remember finishing Eye of the Beholder I/II/III, for example. Now, compare this with Baldur's Gate I/II and Neverwinter Nights (o
      • Yes, I have. Lemmings is still fun. The original Prince of Persia was still awesome (although I admittedly did not play the new one much, so I won't compare). I thought Baldur's Gate and Morrowind were awful. NWN had potential, but the default campaign sucked and the mod community took too long to get running.

        I find realism and graphics to be inversely proportional to the game experience. Realism isn't fun. A game should only try to be real if the point of the game is to emulate the real world - a historic
    • 1995: Walk around shooting monsters.


      2005: Walk around shooting monsters with more realistic-looking clothing.

  • by Anonymous Coward on Thursday July 14, 2005 @09:33AM (#13062318)
    "After that, they tried to do a few things with more lights, using perhaps eight instead of ten. "

    I wish I had more money. Like 50 bucks instead of 100 bucks.
    • by Anonymous Coward
      I wish I had more money. Like 50 bucks instead of 100 bucks.

      Greetings! May I interest you in the myriad of financial services I offer?
    • I have to assume that was an honest mixup, either on the part of the interviewer or just absent-mindedness on the part of the engineer.

      In my experience, OpenGL (and presumably DirectX, since the two are just APIs for the same hardware, but I could be wrong) defaults to a max of eight lights. So, using shaders to emulate 10 or more lights would make sense.
      • For basic bump-mapping effects with OpenGL shaders you can have any number of single diffuse point or directional light sources, with optional gloss maps and specular lighting. You can skip using the OpenGL lightsource state and just store the light source positions/directions as texture data instead.

        But when you want to use hardware-assisted shadow mapping, the number of light sources is limited by the number of free texture units. Since one texture unit is used for the base detail with transparency, and
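For reference, a tiny C sketch of the fixed-function limit the comments above are talking about. GL_MAX_LIGHTS is the classic OpenGL query; the spec requires a minimum of 8, which is also what most drivers report, and programmable shaders sidestep the limit entirely by passing light data as uniforms or textures instead. Using GLUT here is just an assumption to get a current context so the query is legal.

/* Query the classic fixed-function OpenGL light limit (GL_MAX_LIGHTS).
 * GLUT is used only to create a context; that choice is an assumption. */
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    GLint max_lights = 0;

    glutInit(&argc, argv);
    glutCreateWindow("light-limit query");   /* makes a GL context current */

    glGetIntegerv(GL_MAX_LIGHTS, &max_lights);
    printf("fixed-function light limit: %d\n", (int)max_lights);
    return 0;
}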
  • by Chordonblue ( 585047 ) on Thursday July 14, 2005 @09:35AM (#13062331) Journal
    If the XBOX 360 gets a 6 month jump on Sony, the results by the time the PS3 launches will be obvious. Sony's hardware may be more powerful in some respects, but the amount of work that needs to be done by the programmers is daunting.

    While actual code is being written on the 360 side, my guess is the coders on the PS3 side are doing what this article suggests - feeling out the hardware. It means that a lot of the development environment is unfinished or at least unkempt. You've got a lot of power there, but learning to wield it is going to take quite some time - ESPECIALLY with the Cell processor.

    • The real-life picture is, however, exactly the opposite. On the XBox you will have to re-design your game to use 3 threads(!) (not 2, not 4) to get predictable, fluid parallel performance. This is *very* difficult to do (a debugging nightmare). Game (and other) developers are very much used to a single thread. Sony came up with a better idea: the Cell chip has parallel vector units that will be used by low-level libs (well tested and stable). Libs will be provided both by Sony and later by engine companies themselves. Game progr
      • Except that I don't trust Sony to come through with well documented, well written libraries - especially if the past is any indication. I think the 'reality' is that it will take time for all this to come together. A long time.

      • What makes you think the same thing can't be done on the 360? Game developers have been gathering and using libraries for years.
      • The Cell processor still has two general-purpose threads. As for the SPEs - sure, you can use them in libs and stuff, but that means you'll be limited by Amdahl's law. If, let's say, half of your CPU resources were consumed by library stuff, and you achieved an infinite speedup on this half, you'd only have a program that's twice as fast.

        Cell is, in fact, more difficult to program than the Xbox, because the worker threads have a different instruction set than the main threads. IBM was, at some point, prom
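The Amdahl's-law point in the comment above, written out as arithmetic: if a fraction p of the work is offloaded and that part is sped up by a factor s, the overall speedup is 1 / ((1 - p) + p / s). The C sketch below plugs in the comment's hypothetical numbers; p = 0.5 with an effectively infinite s still tops out at 2x.

/* Amdahl's law: speedup = 1 / ((1 - p) + p / s), where p is the fraction of
 * work accelerated and s is the speedup of that fraction.  The p = 0.5 case
 * is the hypothetical number from the comment above. */
#include <stdio.h>

static double amdahl(double p, double s)
{
    return 1.0 / ((1.0 - p) + p / s);
}

int main(void)
{
    printf("p=0.5, s=10   -> %.2fx\n", amdahl(0.5, 10.0));
    printf("p=0.5, s=1000 -> %.2fx\n", amdahl(0.5, 1000.0));
    printf("p=0.5, s=1e9  -> %.2fx\n", amdahl(0.5, 1e9));   /* ~2x ceiling */
    return 0;
}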

      • These people must be Monty Python fans:

        First, shalt thou take out the holy pin. Then shalt thou count to three. No more, no less. *Three* shall be the number of the counting, and the number of the counting shall be three. *Four* shalt thou not count, and neither count thou two, excepting that thou then goest on to three. Five is RIGHT OUT. Once the number three, being the third number be reached, then lobbest thou thy Holy Hand Grenade towards thy foe, who, being naughty in my sight, shall snuff

    • by mcc ( 14761 )
      In order to see what will happen in this upcoming generation, we simply have to look at the last generation, when the Dreamcast came out nearly a year before the PS2; and the Dreamcast was released with a programmer-friendly focus, while the PS2 (at and around launch) had an unclearly documented and byzantine architecture which required writing large chunks of assembly code to get many basic things done. And so what happened?

      Of course, the Playstation 2 failed horribly and Sega went on to dominate the mark
      • But my point is that it took a while - and let's not forget Sega's history. The Dreamcast happened right on the heels of the Saturn, a dismally designed failure.

        The PS2 succeeded in spite of itself because game companies had more faith in Sony's ability to stay afloat and support (however poorly) its machine for the long haul. That's a success of marketing more than anything.

        RTFA and you'll see that the programmers are 'playing' with the new shader tech - it IS different in that it offers so much that co
  • opinion? (Score:2, Interesting)

    by kc0re ( 739168 )
    I am still of the opinion that Doom 3 was the finest lit and rendered game to date. I believe that Doom 3 will change the face of games.

    The other game that did a lot with lighting was Splinter Cell. I'd like to hear others' opinions...
    • Re:opinion? (Score:5, Funny)

      by Anonymous Coward on Thursday July 14, 2005 @09:42AM (#13062380)
      Doom3 is not a game. It's a slightly interactive lighting simulator.
    • Re:opinion? (Score:3, Interesting)

      My opinion, you're trolling... but I can't resist...

      Doom 3 had some decent static lights in it. But they screwed up soooo much with the light that mattered - the flashlight. I don't mean not being able to hold the flashlight and gun at the same time. I mean that the flashlight was technically poorly implemented. For starters, the realism was killed for me immediately by the fact that I could look through the SIDE of the light beam, and the wall I was looking at was illuminated even though the flashli
    • It will indeed change the face of games, à la sticking a Porsche body kit on a Lada
    • Re:opinion? (Score:4, Funny)

      by Psiren ( 6145 ) on Thursday July 14, 2005 @09:56AM (#13062508)
      I am still of the opinion that Doom 3 was the finest lit and rendered game to date.

      Which bit? The dark bit at the start, the very dark bit in the middle, or the super dark bit at the end? While there were a few glimpses of very nicely rendered scenes, for the most part it was just too dark to see anything. Plus the game was crap, but that's another matter.
    • Doom 3 had excellent lighting tech, but did virtually nothing with it coz the levels were all dark.

      Deus Ex 2, on the other hand, was rubbish but had a use for good lighting - seeing bad guys round corners because of the shadows they cast, etc. Similar tech, less good looking, rubbish game, but I reckon that's where this kind of thing is taking us :)
  • by Wills ( 242929 ) on Thursday July 14, 2005 @09:45AM (#13062415)
    What I would like is for nVidia (and ATI) to start making lower power consumption a big goal for their new products. Can't we leave the era of 100-110 watts being the norm for new graphics cards such as the GeForce 7800 GTX?
    • IMHO, low power consumption might never make sense for nVidia or ATI. Unlike CPUs, cutting-edge GPUs are primarily targeted towards the avid gamer, who's playing his/her game on a desktop. Given this usage model, do you think that ATI or nVidia would refactor their entire design strategy to put power consumption ahead of performance? I don't think so, unless we're talking about mobile GPUs.

      For both these companies, their technology leadership is currently defined by the performance of their top-end graphi
    • You need to look at these cards for what they are now, not what video cards used to be.

      Today's video cards have much higher transistor counts than the processors of the systems they go into.

      A standard P4 is around 60 million transistors; the Extreme Edition, with all its built-in cache, is 180 million.

      A 6800 series is 220+ million. The X800 is 160+ million.

      A 7800 is over 300 million.

      What you really have in a video card is a computer within your computer, complete with its requisite power and cooling requirements.
    • by Chirs ( 87576 ) on Thursday July 14, 2005 @10:37AM (#13062885)
      Actually, they already are considering it. The 7800GTX has 50% more transistors than the 6800 Ultra, but runs cooler.

      Basically they're shutting off portions of the chip when not in use to cut down on power consumption.

      This is mentioned briefly at http://www.bit-tech.net/news/2005/07/07/g70_clock_speed/ [bit-tech.net]
      and also at http://www.hardocp.com/article.html?art=Nzg0LDI= [hardocp.com]
    • Top of the line cards? What are you smoking? If they can lower power draw, they'll just use it to increase performance. It's the same as Intel/AMD's flagship space heaters. But it all trickles down to those of us who don't need the bleeding edge of performance, too. However, because they can sell low-power chips for laptops at a premium, expect to pay extra for a card that runs very cool. Personally, I'm quite happy with a mainstream card: not a huge power drain, but still too much to put into a laptop. For a desktop, the value is clearly best there.

      Kj
    • I wouldn't worry about this kind of optimization, really; won't they run into trouble if they don't try to keep power consumption and heat down these days? Or maybe not now, but pretty soon...
  • by Zobeid ( 314469 ) on Thursday July 14, 2005 @09:45AM (#13062417)
    Here's the most important word that didn't appear anywhere in that article: OpenGL
    • What is there to mention? When talking about the unified shader model, I got the impression that this guy's focus is more on NVIDIA's hardware design and not so much on how its features are exposed via its APIs. If you want someone from NVIDIA to talk about OpenGL, you probably want an interview with Mark Kilgard.
  • ATI interview (Score:3, Interesting)

    by AngryScot ( 795131 ) on Thursday July 14, 2005 @09:47AM (#13062433)
    They also had an interview with Richard Huddy from ATI [bit-tech.net] a little while back.
  • Isn't that overstating the job title a little bit? Engineer sure, but scientist? It's not like increasing the number of bump maps is going to lead to cold fusion or the cure for cancer.
      Sure, it's plausible: engineers do production work, and scientists do research.

      It's fathomable that this fellow does research for Nvidia, i.e. researching new ways to increase performance, etc...
    • Re:Scientist? (Score:3, Insightful)

      by CynicalGuy ( 866115 )
      Have you ever read any of the proceedings from SIGGRAPH? Yes, people do get their Ph.D.s in that stuff.
    • Re:Scientist? (Score:4, Insightful)

      by i7dude ( 473077 ) on Thursday July 14, 2005 @10:18AM (#13062697)
      among other things, designing next-generation graphics cards is a serious exercise in computer architecture, vlsi design, and algorithm development; these people aren't just system integrators or product engineers...next-generation stuff has to come from somewhere other than a reference design...these people are absolutely scientists.

      you don't need a beaker and a lab coat to be considered a scientist.

      dude.
      • is a serious exercise in computer architecture, vlsi design, and algorithm development;

        Architecture ... design ... development. All words about the creation or design of something new, which is engineering or applied science.

        "Science", used alone, means the use of the scientific method to discover new information about the nature of reality. (Or, in the case of mathematics or computer science, the nature of abstract logical contrstucts ... which makes it debatable depending on your precise defini

    • Yeah seriously!

      It's not like designing systems and algorithms that can render a scene with millions of polygons, accurate shadows, bump maps and specular shading takes any special knowledge... What's next?! John Carmack being called a rocket scientist!?
  • It seems all development effort goes into 3D gaming and no brains into vanilla PC requirements. Why is it impossible to find a reasonably priced, fanless graphics card with two DVI connectors? Why can't I have dual-head graphics with hardware video acceleration/overlay on either monitor? Why don't Nvidia and ATI at least take care that the non-3D features of their cards are fully supported under Linux and X11? Yes, Matrox's cards come close, but even their vintage G550 requires buggy binary X11 drivers.
  • by suitepotato ( 863945 ) on Thursday July 14, 2005 @10:01AM (#13062542)
    First, as others have noted, games still tend to suck overall so who cares how beautiful the graphics are? Beautiful crap is still crap.

    Second, now that GPUs are competitive with CPUs for heat generation and electrical energy waste, are we giving up altogether on efficiency and just consigning ourselves to needing ever better coolers, paying more electrical costs, etc., just to play some beautiful crap?

    Not me. Gone are the days of being able to stick all these game machines, DVD players, media PCs, etc. in a small enclosed space of an entertainment center. Now I'll have to place my TV near to a window and buy a standalone air conditioner so I can pipe the hot air flow out and cool all my stuff to keep it from immolating my living room.

    I don't think so. If we're going to use up so much horsepower for this, we might as well at least get someone to use it as the power source for a lava lamp. That might be more fun to watch than Doom 3.
  • With the PS3 + disk drive being a Linux machine, are we STILL going to be stuck with closed-source, binary-only kernel modules, or will NVIDIA actually start to make good drivers?

    (of course, I already know the answer.)
  • Anybody else notice the large number of times the word "whilst" was used? I thought my Firefox translation plugin was accidentally set to English->MiddleEnglish. It's like the author just got done cramming for an exam on Shakespeare and feels compelled to write the same way, but then gives up halfway through sentences and goes back to regular English:

    Whilst their relationship with Microsoft has become publicly tenuous, what about NVIDIA's relationship with their new console partner?

  • by mcc ( 14761 )
    And, uh... just curious... is this any different at all from how things work with ATI chips right now? It doesn't really sound like it...
  • Where the heck are the FreeBSD/amd64 drivers for NVidia?

    At least 2D would be good for starters. And no, do not reach for the "Reply" link below to point me to the open source driver (nv) -- it does not support secondary heads...

"More software projects have gone awry for lack of calendar time than for all other causes combined." -- Fred Brooks, Jr., _The Mythical Man Month_

Working...