
 




NVIDIA Shows New Doom Demo On GeForce GTX 1080 (hothardware.com)

MojoKid shares a video showing the upcoming Doom game running on NVIDIA's new GeForce GTX 1080 graphics card using the Vulkan API, quoting this report from HotHardware: At a private briefing with NVIDIA, representatives from id Software came out on stage to show off the upcoming game...the first public demonstration of the game using both NVIDIA's new flagship and the next-gen API, which is a low-overhead, cross-platform graphics and compute API akin to DirectX 12 and AMD's Mantle. In the initial part of the demo, the game runs smoothly, but its frame rate is capped at 60 frames per second. A few minutes in, however, at about the 0:53 mark...the rep from id says, "We're going to uncap the framerate and see what Vulkan and Pascal can do."

With the framerate cap removed, the framerate jumps into triple digit territory and bounces between 120 and 170 frames per second, give or take. Note that the game was running on a projector at a resolution of 1080p with all in-game image quality options set to their maximum values. The game is very reminiscent of previous Doom titles and the action is non-stop.
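To put those frame rates in perspective, frames per second converts directly to the time budget the GPU has per frame (t = 1000 / fps). A quick illustrative sketch (not from the demo, just arithmetic):

```python
def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds available per frame."""
    return 1000.0 / fps

# The capped demo ran at 60 fps; uncapped it bounced between 120 and 170 fps.
for fps in (60, 120, 170):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

So going from the 60 fps cap to 170 fps means the card is rendering each maxed-out 1080p frame in under 6 ms instead of the 16.7 ms the cap allowed.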

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • The system requirements for the new Doom are ridiculous.

    • It'll run fine on an FX-8320 and a $150 GPU. The only reason you can't run it on an i3 is that they're dual-core and the game actually uses the cores. I could play Doom 3 on $500 worth of hardware. Or just play it on the PS4 for $300. And give it a year after launch, once it's been optimized, and I'll bet it'll run on my 4-year-old 5800K and 2-year-old GTX 660.
      • by Khyber ( 864651 )

        "I could play doom 3 on $500 worth of hardware."

        You're doing it wrong. You can run D3 maxed the fuck out on a P4, 1GB RAM, and a midrange 512MB GeForce 5 or 6.

        You'd barely break past $100.

        • Wow, those are crazy high specs for Doom 2; you could run that on a 75 MHz Pentium, and you could probably even find one for free on Craigslist.

      • It'll run fine on an FX-8320

        But not on an i5-750.

  • Great video. (Score:4, Insightful)

    by LMariachi ( 86077 ) on Sunday May 08, 2016 @10:44PM (#52073179) Journal

    That high-framerate max-everything 1080p footage sure looked impressive shot through someone's phone camera. Nvidia couldn't have provided actual video capture?

    • The video capturing software would have interfered with the benchmarking. It almost certainly would have eaten up quite a few of those FPS, so if you want to demonstrate raw power, you can't be running video capturing software at the same time.

      • by GNious ( 953874 )

        ya, an HDMI splitter + high-end capture-card would have interfered with the rendering....

      • You know streamers have solved this by using a second PC to capture the video output from the first PC before it's output to a monitor, right? The second PC doesn't even need to be particularly powerful in comparison so sometimes you'll see some pretty old systems being used as 'capture devices'. So really there isn't much excuse for not capturing the feed themselves, other than just not wanting to.

        • You know GPU vendors have solved that by putting a hardware encoder on the die? ;)

          • Even with a hardware encoder as part of the GPU, streaming or recording still takes some system resources and so reduces performance somewhat. Hence the comments before mine. Streamers use dedicated 'streaming PCs' to reduce this performance hit to nil and optimize the output they can stream. It also lets them do overlays, effects, and other visual alterations with no performance hit.

    • Re:Great video. (Score:5, Insightful)

      by dave420 ( 699308 ) on Monday May 09, 2016 @06:18AM (#52074289)

      It is HotHardware's own video, embedded in HotHardware's article, posted to Slashdot by HotHardware's editor-in-chief (MojoKid). So it has nothing to do with nvidia and everything to do with HotHardware.

  • by Gravis Zero ( 934156 ) on Sunday May 08, 2016 @11:04PM (#52073239)

    one of the big blockers for gaming via WINE has always been DirectX, specifically translating DirectX Graphics to OpenGL. Now with the Vulkan API, we'll be able to implement the various DirectX API versions and OpenGL versions in a completely portable way as function calls to RISC-V GPU code. The only thing left is for someone to make open source firmware that implements the Vulkan API and we'll finally have a truly open source video card.

    As for non-gaming: looking at how our desktops are rendered, we should implement a minimalistic window rendering API on top of Vulkan that UI libraries can build upon. This reduces the number of layers involved in rendering and can solve the accelerated-vs-software-only problem via an LLVM implementation that runs RISC-V code. At the same time, a desktop API that lets you choose a target GPU could forward calls from a remote system to your local system, so that forwarded windows are actually rendered locally; that would vastly reduce bandwidth as well as enable the total integration of multiple desktops.

    Vulkan is the rendering API that Linux has needed all along.

    • by armanox ( 826486 )

      Somehow I disagree - marketing has always been the area that makes all the difference.

    • by Anonymous Coward

      one of the big blockers for gaming via WINE has always been DirectX, specifically translating DirectX Graphics to OpenGL.

      One of the big blockers I have seen was always the reliance on unspecified behavior. For example SimCity 2000 relied on specific allocation behavior and when that changed in newer Windows versions Microsoft introduced a fall back mode so SimCity would still work. Another example I personally encountered with Wine is Serious Sam the Second Encounter. It would always blindly take the first entry in a list of available graphics configurations, since that worked on any supported Windows version. On Wine the fir

    • One of my fears with Vulkan is that it destroys the progress on application isolation that was inspired by Android and other platforms and that is happening on the desktop now as well; look at Wayland as an example. One of the red flags was hearing the developers of WebGL say that a web version of Vulkan won't be reasonable because of the missing ability to confine applications.

      So yes I like low level, but please don't make isolating applications impossible.

      • I too worry about security, which is why I think you shouldn't let just any program load SPIR-V GPU code. However, isolation still occurs but is dependent on the implementation of the Vulkan API. On the other hand, the desktop rendering scheme I describe is no more dangerous than current rendering systems.

  • Does Doom even make a good tech demo anymore?
    I mean, can't pretty much every card do "dark, tight, enclosed spaces, with high-contrast shadows" in their sleep?

    Really, the cutting edge in video presentation has to be high-detail textures with complex curves in great numbers, massive numbers of moving figures, dynamic lighting in outdoor environments, and long sightlines; it's always a question of how far out you're rendering at high detail.

    Promising gameplay "like the old Dooms" - I *loved* Doom, Doom2 with

  • by Swampash ( 1131503 ) on Monday May 09, 2016 @12:08AM (#52073419)

    in the sense that it has DOOM in the title, maybe.

    • by Khyber ( 864651 )

      It's a rip-off of Brutal Doom with double-jump and CoD regen-health crap.

      I'll stick with real Doom 2. Latest Zandronum mod - Complex Doom Invasion. Pretty fucking slick.

      • CoD regen health crap.

        Bullshit.

        "A combat system known as "push forward combat" is featured, which discourages players from taking cover behind obstacles or resting to regain health. Players instead collect health and armour pick-ups scattered throughout levels, or kill enemies to regain health."

        ( https://en.wikipedia.org/wiki/... [wikipedia.org] )

      • ...CoD regen health crap.

        I suspect you weren't paying close attention to the video.

        The player healed by picking up medkits and little blue orbs.

        The enemies seemed to be dropping a lot of health, but there was never any regeneration.

        • by Khyber ( 864651 )

          I suspect you didn't pay attention to the game's leaked design document.

          I tend to follow those before I follow a video, because history has shown time and time again that what they advertise to you on video is quite often not what you get (Spore, anyone?).

          • How about: did you play the demo? Because I did, and again you are just saying things that have no bearing on reality, when all of reality is yelling that you are wrong.

      • by dywolf ( 2673597 )

        REGEN!!?!??!
        BLASPHEMY!!

        Med kits!
        And Blue bottle thingies, laying around.
        That's what it needs.

        • It doesn't use regenerating health.

          • by dywolf ( 2673597 )

            oh thank Romero.
            gave me a panic attack.

          • by Khyber ( 864651 )

            It better not, but that's what's listed in the leaked design document, so you tell me.

            • You are wrong.

              there how's that?

            • They've been very clear that it doesn't use regenerating health in video interviews, they don't like it, they went for pickups instead, the alphas and betas didn't use regenerating health, and the single player footage that's been on twitch and youtube just days before release has clearly shown it's not in use too.

      • Hmm, in the video the player was clearly picking up health, and when the display showed "Low Health" it did not increase even after standing still for 10 seconds.

        Also, in regards to jumping, the video showed a low-gravity type of jumping response, but in no way a double jump.

        Although this is what I gathered from the video, I also played the demo, and it had neither of these.

    • Which version of Doom is that? Doom 3 was not able to handle things like the "guy" does in this video...
    • in the sense that it has DOOM in the title, maybe.

      Or in the sense that it's yet another first person shooter played in exactly the same way as Doom 1 (run around buildings shooting monsters). The biggest difference is that the visuals are far better. Aside from that, though, it's just like every other iteration of Doom.

  • Video hardware achieves high framerate when gameplay takes place indoors in a single room with some platforms and a handful of monsters.

  • by Michael Woodhams ( 112247 ) on Monday May 09, 2016 @12:41AM (#52073567) Journal

    I like my computers very quiet, so my rule of thumb (sometimes violated) is buy the best GPU available which is passively cooled and needs no extra power connector.

    I only found one page about the GTX 1050 or GTX 1040 [linustechtips.com]. This gives expected release date 2016Q3. However they don't give power consumption (critical for my purposes - I'd be looking for a maximum of about 60W) nor do the numbers they quote give me much idea of how much faster it will be than (say) a GTX 750, which so far as I know is the current best quiet GPU.

    • With a bit more research:
      Compared to 1080 [videocardz.com], 1050 has 2/5 as many cores and about 5/8 the clock speed. 1080 has a thermal design power of 180W. I don't remember if power is proportional to clock, but if it is, 1050 should have about 1/4 the power draw of a 1080, which puts it at 45W; that won't require a power connector and is easy to passively cool, but possibly passive cards won't be available at launch. 1040 would be about 28W (expect fanless to be the norm), and 1060 about 62W (where a power connector might

      • Really you should be caring about 75W, as that is what a PCIe x16 slot can supply. The 750 was 71W IIRC and is still the most powerful non-powered card I've seen. There was nothing in the 900 series that replaced it, so I would guess that this will be coming, as would an update of the 730 for SFF/low-profile uses (25/35W).
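The back-of-the-envelope estimate in the thread above can be written out explicitly. This sketch assumes, as the poster does, that power scales linearly with both core count and clock; real draw is nonlinear (voltage scaling), so treat the result as a rough ballpark, not a spec:

```python
def scaled_tdp(base_tdp_w: float, core_ratio: float, clock_ratio: float) -> float:
    """Naive TDP estimate: assume power scales linearly with cores and clock."""
    return base_tdp_w * core_ratio * clock_ratio

# GTX 1080 TDP is 180W; the rumored 1050 has ~2/5 the cores at ~5/8 the clock.
est_1050_w = scaled_tdp(180, 2 / 5, 5 / 8)
print(f"Estimated GTX 1050 draw: {est_1050_w:.0f} W")

# A PCIe x16 slot can supply 75W, so under this estimate no connector is needed.
print("Needs power connector:", est_1050_w > 75)
```

That yields roughly 45 W, matching the 1/4-of-180W figure in the comment and leaving comfortable headroom under the 75 W slot limit.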

  • Given what kind of a stunned sloth the demo ran on my machine that's either:

    a) impressive.
    b) indication that they optimised it and ran it on a half-decent machine.

    I can't say I'm that impressed. 1000 fps, yeah, THAT'S one hell of a piece of hardware and worthy of an article.

    But that an in-production game runs at 60fps vsynced, and 120+ uncapped, at 1080p (which isn't actually that high a resolution, guys; welcome to 1999) on top-of-the-line unreleased hardware? Well, I'd bloody hope so. Or else nothing else w

  • It's a shitty console port and all the graphics decisions are based on what the PS4/XB1 have inside.
    Of course it's going to run into the thousands of fps on a video card that costs as much as the whole console.
    Oh wait. It only got 170. For some reason...

  • ...are there two "news" stories about a new Nvidia card? Didn't y'all just post about the unveiling of the card? Why do we need a follow-up story when someone plays a new game on it? That seems less than newsworthy... I'm not saying that we've got paid articles being posted... but something doesn't smell right...

  • I don't think it's something nVidia is actually working towards, btw, but wouldn't it be nice if the various generations of nVidia GPUs were designed to hit a set of pre-defined targets for thermal output and power consumption?

    Hang on, I get the fact that innovation doesn't come to order, but stay with me a little longer...

    In top end systems these days [those for which things like the 1080 are relevant] the GPU [or plural for SLI configurations] draw the most p
