NVIDIA Shows New Doom Demo On GeForce GTX 1080 (hothardware.com)
MojoKid shares a video showing the upcoming Doom game on NVIDIA's new GeForce GTX 1080 graphics card using the Vulkan API, quoting this report from HotHardware:
At a private briefing with NVIDIA, representatives from id Software came out on stage to show off the upcoming game...the first public demonstration of the game using both NVIDIA's new flagship and the next-gen API, which is a low-overhead, cross-platform graphics and compute API akin to DirectX 12 and AMD's Mantle. In the initial part of the demo, the game is running smoothly, but its frame rate is capped at 60 frames per second. A few minutes in, however, at about the :53 second mark...the rep from id says, "We're going to uncap the framerate and see what Vulkan and Pascal can do".
With the framerate cap removed, the framerate jumps into triple digit territory and bounces between 120 and 170 frames per second, give or take. Note that the game was running on a projector at a resolution of 1080p with all in-game image quality options set to their maximum values. The game is very reminiscent of previous Doom titles and the action is non-stop.
Re: (Score:2)
As far as I'm aware, it's easier to write Vulkan drivers than OpenGL drivers, so it's quite possible, but not for the reason they're stating.
Re:Is it *really* Vulkan? (Score:4, Interesting)
Correct. Vulkan drivers are easier because they do less and contain fewer optimizations; the optimization work is pushed onto the application developer instead. This is both a blessing and a curse:
the blessing is that application developers can gain better performance in some cases, as there is less hardware abstraction and they have more control over the execution of graphics operations.
the curse is that the driver has fewer optimizations built in, and future driver updates can't add game-specific optimizations on the application's behalf.
This approach is helpful where call overhead is significant, and on less-powerful devices such as mobile devices. On much more powerful desktop GPUs the real-world improvements of such an approach are less clear. In actual implementations with AMD Mantle and Apple's Metal on desktop-class GPUs there are mixed performance gains (sometimes faster than OpenGL, but also sometimes slower). Given that the simplicity of the Vulkan driver pushes complexity into the application (more control, but more unavoidable work for application developers too), this simpler-driver approach is actually less useful for most of the developers actually writing the applications. Think of the difference between OpenGL and Vulkan as the difference between using a Linux distro and creating a distro out of parts you choose. You can get better performance if you choose all the parts of your custom distro, but it takes a non-negligible amount of effort, and it is really only beneficial on low-end hardware where every cycle counts, rather than on high-end hardware where you have cycles to burn in some areas and time-to-market matters much, much more.
Vulkan has its good points - but as a desktop OpenGL developer, it is actually a step backward for the kinds of problems I want to solve (a desktop jet-combat flight simulator to be run on mid- to high-end desktop/workstation class GPUs).
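To make the "more control, but more work" trade-off concrete, here is a rough, hypothetical sketch of recording a single draw into a Vulkan command buffer; this is roughly the bookkeeping an OpenGL driver does for you behind one draw call. All handles (pipeline, render pass, framebuffer) are assumed to have been created elsewhere, and error checking is omitted:

// Sketch only: record one clear + one draw into a command buffer.
// Everything passed in is assumed to have been created during setup.
#include <vulkan/vulkan.h>

void recordDraw(VkCommandBuffer cmd, VkRenderPass renderPass,
                VkFramebuffer framebuffer, VkExtent2D extent,
                VkPipeline pipeline)
{
    VkCommandBufferBeginInfo begin{};
    begin.sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO;
    vkBeginCommandBuffer(cmd, &begin);

    VkClearValue clear{};  // clear colour defaults to black
    VkRenderPassBeginInfo rp{};
    rp.sType = VK_STRUCTURE_TYPE_RENDER_PASS_BEGIN_INFO;
    rp.renderPass = renderPass;
    rp.framebuffer = framebuffer;
    rp.renderArea = {{0, 0}, extent};
    rp.clearValueCount = 1;
    rp.pClearValues = &clear;

    vkCmdBeginRenderPass(cmd, &rp, VK_SUBPASS_CONTENTS_INLINE);
    vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_GRAPHICS, pipeline);
    vkCmdDraw(cmd, 3, 1, 0, 0);  // one triangle, vertices generated in the shader
    vkCmdEndRenderPass(cmd);
    vkEndCommandBuffer(cmd);
}

The application also owns synchronisation and submission (fences, semaphores, vkQueueSubmit), which is exactly the work an OpenGL driver hides.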
Re: (Score:2)
The trend in normal software development is to make things simpler by adding more abstraction and more building blocks that accomplish more with less. Graphics programming, however, is moving the other way around. Abstractions are removed and everything is moving towards putting the developer in direct control of the fundamentals. As an amateur OpenGL hobbyist I was shocked when they removed the fixed function pipeline - which is great in principle - but suddenly it was expected of me to write OpenGL shaders to accomplish the most elementary things such as moving or texturing an object.
Re: (Score:2)
> As an amateur OpenGL hobbyist I was shocked when they removed the fixed function pipeline - which is great in principle - but suddenly it was expected of me to write OpenGL shaders to accomplish the most elementary things such as moving or texturing an object.
What?! You're too lazy to write 2 _trivial_ shaders and setup code???
0. You could always ask Reddit's /r/opengl [reddit.com] for help, you know? I hang out there and on /r/gamedev.
1. You write the boilerplate to create/bind a shader ONCE:
* glCreateProgram(),
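The list above is cut off, but a minimal sketch of the create/compile/link boilerplate it refers to might look like this (assuming an OpenGL 3.3 core context and a function loader such as GLAD already initialized; the shader sources and names here are made up for illustration):

// Compile two trivial shaders and link them into a program -- written once.
#include <glad/glad.h>
#include <cstdio>

static const char* kVertexSrc = R"(#version 330 core
layout(location = 0) in vec3 aPos;
uniform mat4 uMVP;                 // replaces the old fixed-function matrices
void main() { gl_Position = uMVP * vec4(aPos, 1.0); }
)";

static const char* kFragmentSrc = R"(#version 330 core
out vec4 fragColor;
void main() { fragColor = vec4(1.0, 0.5, 0.2, 1.0); }
)";

static GLuint compile(GLenum type, const char* src)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, nullptr);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
        std::fprintf(stderr, "shader compile failed: %s\n", log);
    }
    return shader;
}

GLuint buildProgram()
{
    GLuint vs = compile(GL_VERTEX_SHADER, kVertexSrc);
    GLuint fs = compile(GL_FRAGMENT_SHADER, kFragmentSrc);

    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);

    glDeleteShader(vs);            // the linked program keeps what it needs
    glDeleteShader(fs);
    return prog;                   // later: glUseProgram(prog);
}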
Re: (Score:2)
My C++ code _with_ comments is ~400 Lines of code.
This.
And you are seriously calling me too lazy to do that?
Of course I can figure all of that out and just do it once. But that's not the point. Previously I could quickly hack an OpenGL application together in an hour. Since OpenGL 3 I would have to invest several days in figuring out how the new programmable shader pipeline works, with all the intricacies involved, learn a new programming language, and then I would have to actually type those 400 lines of code and debug all of my newbie errors, all of which...
Re: (Score:2)
> Shaders are conceptually more difficult than the fixed function pipeline. They are more flexible, but more difficult to think about.
Not really if you understand the rendering pipeline:
* For each vertex, run the vertex shader.
* For each pixel in the primitive, run the fragment shader.
I'd highly recommend you watch:
* http://simonschreibt.de/gat/re... [simonschreibt.de]
* http://etodd.github.io/shaders... [github.io]
Because it sounds like you still lack an understanding of the fundamentals.
> When I want to quickly throw together a visua
Re: (Score:1)
Precisely. IMHO the guys now running OpenGL have confused its purpose - they are more worried about it being a mobile driver implementation than about the application API it was intended to be. This is a failure of design on the part of Khronos, and they have doubled down on this mistake with Vulkan. They prefer to push complexity to application developers rather than to those writing the mobile device drivers (who are paid to implement this stuff).
I have pointed out on Slashdot that Vulkan makes applic...
Re: (Score:2)
Given that the simplicity of the Vulkan driver pushes complexity into the application (more control, but more unavoidable work for application developers too), this simpler-driver approach is actually less useful for most of the developers actually writing the applications.
Since most developers use a pre-written game engine, presumably they already have the optimizations baked in before they start?
Re: (Score:1)
Some people do make this argument. However, if an 'Application Programming Interface' is so close to a 'driver interface' that application developers can't produce efficient and robust applications in a timely manner (time-to-market is THE important commercial metric), then that indicates the API designers got it wrong. Adding lipstick to the pig simply introduces inefficiencies that Vulkan was supposed to remove.
Then we have the real-world benchmarks that show the promised performance...
Re: (Score:1)
> AMD Mantle and Apple's Metal on desktop-class GPUs there are mixed performance gains (sometimes faster than OpenGL, but also sometimes slower)
If it's slower, it's because of a bad implementation: either badly written drivers or a badly written application.
For me, I welcome Vulkan. I don't have to tie myself in knots anymore trying to optimise draw calls into batches. The only thing OpenGL has that is better than Vulkan is compatibility.
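As a rough illustration of the draw-call batching the parent is talking about, in OpenGL you might collapse a per-object draw loop into a single instanced call. This is a sketch with made-up names (drawSceneNaive, drawSceneBatched); it assumes the VAO, buffers and shader program are already set up and bound, and that per-instance transforms come from an instanced attribute or a buffer in the batched case:

#include <glad/glad.h>
#include <vector>

// Naive version: N draw calls and N uniform updates -- lots of driver overhead.
void drawSceneNaive(GLuint vao, GLint mvpLocation,
                    const std::vector<const float*>& mvpMatrices,
                    GLsizei vertexCount)
{
    glBindVertexArray(vao);
    for (const float* mvp : mvpMatrices) {
        glUniformMatrix4fv(mvpLocation, 1, GL_FALSE, mvp);
        glDrawArrays(GL_TRIANGLES, 0, vertexCount);
    }
}

// Batched version: one call draws every instance; per-object transforms are
// read by the vertex shader from an instanced attribute or a uniform buffer.
void drawSceneBatched(GLuint vao, GLsizei vertexCount, GLsizei objectCount)
{
    glBindVertexArray(vao);
    glDrawArraysInstanced(GL_TRIANGLES, 0, vertexCount, objectCount);
}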
Re: (Score:1)
Thanks for posting. I'm interested in where Vulkan may be useful. Are you working on the desktop, or on mobile? Do you have to spend time writing Vulkan code and different execution paths for each of the GPU architectures you deal with? As a desktop application developer I don't have to do this, and the same OpenGL code runs on Mac OS X, Linux, Windows and Android (I'm using JOGL, because the multi-core CPU is only about 10% utilized on four cores; and shader performance on the GPU is the limitation; c
Re: (Score:1)
You're right, Vulkan isn't for you, because in 2016 developers with aims similar to yours shouldn't be writing any code that interfaces directly with a graphics API, ever. That is a separate job done by the engine developers (think id, UE4, Unity), who are going to take the time to write a properly optimized implementation and present to you an interface suitable to whatever level of abstraction the engine is targeted at.
Re:Is it *really* Vulkan? (Score:4, Informative)
Correction: Doom 3 might have been OpenGL.
Doom was most certainly not.
In fact, it wasn't "anything" but register/memory poking, I imagine.
A sad state of affairs that a sequel's sequel is regarded as the definitive version after only 20 years.
Re: (Score:2)
There's not much difference; everything is a clone of Algol or a clone of a clone.
Except Lisp.
Re:"Pascal can do" (Score:5, Funny)
Because Pascal is a lot safer than, say, C++: the compiler won't let you shoot yourself in the foot, rather than leaving you bleeding to death because the EMTs can't find you in a heap of 8192 bitwise copies all pointing at each other saying, "That's me, over there."
Re: (Score:3)
The EMTs just have to attach you to gdb and then press Ctrl-C. Then your state is saved and you can be safely brought to the clinic without any haste.
This trick will win me the nobel prize, I've invented something much better than cryostasis!
Re: (Score:1)
Pascal has pointers too. And recent versions are object oriented just like C++. There are a few minor differences in the way parameters are passed to subroutines, the way arrays are treated, and the existence of functions within functions, but other than that, is there really that much difference between them when it comes to safety?
I admit it's been almost 20 years since I last wrote something in Pascal, but I don't remember it being that different from C/C++. I had plenty of crashes due to null pointers w
Re: (Score:2)
Long time to reply, I know, but thought it best to present my complete lack of bona fides and say I've never programmed in either language, but I am familiar with this old joke [york.ac.uk]
Re: (Score:2)
To be fair, FreeBASIC is better than Java in several performance-related aspects.
Re: (Score:2)
Yes, they really used Oberon
sloppy (Score:1)
The system requirements for the new Doom are ridiculous.
Re: (Score:1)
It's because he doesn't know what he is talking about. Sure, you could render this on a system a mile away and then stream the output to your home computer, but it would still need to be able to render the 1080p (or higher) signal, and the latency would be crazy high (5-10 seconds?).
They're not that bad at all (Score:3)
Re: (Score:3)
"I could play doom 3 on $500 worth of hardware."
You're doing it wrong. You can run D3 maxed the fuck out on a P4, 1GB RAM, and a midrange 512MB GeForce 5 or 6.
You'd barely break past $100.
Re: (Score:2)
Wow, those are crazy high specs for Doom 2; you could run that on a Pentium 1 at 75 MHz, and you could probably even find one for free on Craigslist.
Re: (Score:2)
But not on an i5-750.
Re: (Score:2)
Those aren't the announced specs.
Re: (Score:2)
Take another look. The recommended specs are different from the ones you posted.
Great video. (Score:4, Insightful)
That high-framerate max-everything 1080p footage sure looked impressive shot through someone's phone camera. Nvidia couldn't have provided actual video capture?
Re: (Score:1)
The video capturing software would have interfered with the benchmarking. It almost certainly would have eaten up quite a few of those FPS, so if you want to demonstrate raw power, you can't be running video capturing software at the same time.
Re: (Score:2)
ya, an HDMI splitter + high-end capture-card would have interfered with the rendering....
Re: (Score:2)
You know streamers have solved this by using a second PC to capture the video output from the first PC before it's output to a monitor, right? The second PC doesn't even need to be particularly powerful in comparison so sometimes you'll see some pretty old systems being used as 'capture devices'. So really there isn't much excuse for not capturing the feed themselves, other than just not wanting to.
Re: (Score:2)
You know GPU vendors have solved that by putting a hardware encoder on the die? ;)
Re: (Score:2)
Even with a hardware encoder as part of the GPU's processing ability, streaming or recording still takes system resources and so reduces performance somewhat. Hence the comments before mine. Streamers use 'streaming PCs' to reduce this performance hit to nil and optimize the output they can stream. It also lets them do overlays, effects, and other visual alterations with no performance hit as well.
Re:Great video. (Score:5, Insightful)
It is HotHardware's own video, embedded in HotHardware's article, posted to Slashdot by HotHardware's editor-in-chief (MojoKid). So it has nothing to do with nvidia and everything to do with HotHardware.
Re: (Score:2)
So there was a high-quality source available and HotHardware decided to use a phone instead?
Vulkan API could bring DirectX to Linux (Score:5, Informative)
one of the big blockers for gaming via WINE has always been DirectX, specifically translating DirectX Graphics to OpenGL. Now with the Vulkan API, we'll be able to implement the various DirectX API versions and OpenGL versions in a completely portable way as function calls to RISC-V GPU code. The only thing left is for someone to make open source firmware that implements the Vulkan API and we'll finally have a truly open source video card.
as for non-gaming, looking over how our desktops are rendered, we should implement a minimalistic window rendering API using the Vulkan API that UI libraries can build upon. this reduces the number of layers involved in rendering and can solve the accelerated vs software only problem via the LLVM implementation that runs RISC-V code. at the same time, the desktop API allowing you to choose a target GPU could forward calls from the remote system to your local system so that the forwarded windows are actually rendered locally which would vastly reduce the bandwidth as well as enable the total integration of multiple desktops.
Vulkan is the rendering API that Linux has needed all along.
Re: (Score:2)
Somehow I disagree - marketing has always been the area that makes all the difference.
Re: (Score:1)
one of the big blockers for gaming via WINE has always been DirectX, specifically translating DirectX Graphics to OpenGL.
One of the big blockers I have seen was always the reliance on unspecified behavior. For example, SimCity 2000 relied on specific allocation behavior, and when that changed in newer Windows versions Microsoft introduced a fallback mode so SimCity would still work. Another example I personally encountered with Wine is Serious Sam: The Second Encounter. It would always blindly take the first entry in a list of available graphics configurations, since that worked on any supported Windows version. On Wine the first...
Re: (Score:2)
One of the fears that I have with Vulkan is that it destroys the progress on application isolation that was inspired by Android and other platforms and is now happening on the desktop as well; look at Wayland as an example. One of the red flags was hearing the developers of WebGL say that a web version of Vulkan won't be reasonable because of the missing ability to confine the applications.
So yes I like low level, but please don't make isolating applications impossible.
Re: (Score:2)
i too worry about security, which is why i think you shouldn't let just any program load SPIR-V GPU code. however, isolation still occurs but is dependent on the implementation of the Vulkan API. on the other hand, the desktop rendering scheme i describe is no more dangerous than current rendering systems.
tech demo (Score:1)
Does Doom even make a good tech demo anymore?
I mean, can't pretty much every card do "dark, tight, enclosed spaces, with high-contrast shadows" in their sleep?
Really, the cutting edge in video presentation has to be high-texture details with complex curves in great numbers, massive numbers of moving figures and dynamic lighting in outdoor environments, as well as sightlines - it's always a question of how far you're rendering high details.
Promising gameplay "like the old Dooms" - I *loved* Doom, Doom2 with
"very reminiscent of previous Doom titles" (Score:5, Insightful)
in the sense that it has DOOM in the title, maybe.
Re: (Score:1)
It's a rip-off of Brutal doom with double-jump and CoD regen health crap.
I'll stick with real Doom 2. Latest Zandronum mod - Complex Doom Invasion. Pretty fucking slick.
Re: (Score:3)
CoD regen health crap.
Bullshit.
"A combat system known as "push forward combat" is featured, which discourages players from taking cover behind obstacles or resting to regain health. Players instead collect health and armour pick-ups scattered throughout levels, or kill enemies to regain health."
( https://en.wikipedia.org/wiki/... [wikipedia.org] )
Re: (Score:2)
...CoD regen health crap.
I suspect you weren't paying close attention to the video.
The player healed by picking up medkits and little blue orbs.
The enemies seemed to be dropping a lot of health, but there was never any regeneration.
Re: (Score:2)
I suspect you didn't pay attention to the game's leaked design document.
I tend to follow those before I follow a video, because history has shown time and time again that what they advertise to you on video is quite often not what you get (Spore, anyone?)
Re: (Score:1)
How about: did you play the demo? Because I did, and again you are just saying things that have no bearing in reality, when all of reality is yelling that you are wrong.
Re: (Score:2)
REGEN!!?!??!
BLASPHEMY!!
Med kits!
And Blue bottle thingies, laying around.
That's what it needs.
Re: (Score:2)
It doesn't use regenerating health.
Re: (Score:2)
oh thank Romero.
gave me a panic attack.
Re: (Score:1)
It better not, but that's what's listed in the leaked design document, so you tell me.
Re: (Score:1)
You are wrong.
There, how's that?
Re: (Score:2)
They've been very clear in video interviews that it doesn't use regenerating health; they don't like it and went for pickups instead. The alphas and betas didn't use regenerating health, and the single-player footage on Twitch and YouTube just days before release has clearly shown it's not in use either.
Re: (Score:1)
Hmm, in the video the player was clearly picking up health, and when the display was showing "Low Health" it did not increase even after standing still for 10 seconds.
Also, in regards to jumping, the video showed a low-gravity type of jumping response, but in no way a double jump.
Although this is what I gathered from the video, I also played the demo and it had neither of these.
Re: (Score:2)
in the sense that it has DOOM in the title, maybe.
Or in the sense that it's yet another first person shooter played in exactly the same way as Doom 1 (run around buildings shooting monsters). The biggest difference is that the visuals are far better. Aside from that, though, it's just like every other iteration of Doom.
Summary (Score:2)
Video hardware achieves high framerate when gameplay takes place indoors in a single room with some platforms and a handful of monsters.
1050 and 1040 (Score:3)
I like my computers very quiet, so my rule of thumb (sometimes violated) is buy the best GPU available which is passively cooled and needs no extra power connector.
I only found one page about the GTX 1050 or GTX 1040 [linustechtips.com]. It gives an expected release date of 2016Q3. However, they don't give power consumption (critical for my purposes - I'd be looking for a maximum of about 60W), nor do the numbers they quote give me much idea of how much faster it will be than (say) a GTX 750, which so far as I know is the current best quiet GPU.
Re: (Score:2)
With a bit more research:
Compared to the 1080 [videocardz.com], the 1050 has 2/5 as many cores and about 5/8 the clock speed. The 1080 has a design thermal power of 180W. I don't remember if power is proportional to clock, but if it is, the 1050 should have about 1/4 the power draw of a 1080 (2/5 x 5/8 = 1/4), which puts it at 45W. That won't require a power connector and is easy to passively cool, but possibly passive cards won't be available at launch. A 1040 would be about 28W (expect fanless to be the norm), and a 1060 about 62W (where a power connector might...
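Just to write out the arithmetic behind that estimate (the linear-with-clock scaling is the parent poster's assumption, not a measured fact, and the core/clock ratios are rumored figures):

// Back-of-the-envelope TDP estimate from the rumored ratios above.
#include <cstdio>

int main()
{
    const double tdp1080    = 180.0;      // announced GTX 1080 TDP, watts
    const double coreRatio  = 2.0 / 5.0;  // rumored 1050 cores vs 1080
    const double clockRatio = 5.0 / 8.0;  // rumored 1050 clock vs 1080

    const double estimate = tdp1080 * coreRatio * clockRatio;  // 2/5 * 5/8 = 1/4
    std::printf("estimated GTX 1050 power: ~%.0f W\n", estimate);  // ~45 W
    return 0;
}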
Re: (Score:2)
Really you should be caring about 75W, as that is what a PCIe x16 slot can supply. The 750 was 71W IIRC and is still the most powerful non-powered card I've seen. There was nothing in the 900 series which replaced this, so I would guess that this will be coming, as would an update of the 730 for SFF/low-profile uses (25/35W).
Not impressive. (Score:2)
Given that the demo ran like a stunned sloth on my machine, that's either:
a) impressive.
b) indication that they optimised it and ran it on a half-decent machine.
I can't say I'm that impressed. 1000 fps, yeah, THAT'S one hell of a piece of hardware and worthy of an article.
But that an in-production game runs at 60fps vsynced, and 120 without, at 1080p (which isn't actually that high a resolution, guys, welcome to 1999) on the top-of-the-line unreleased hardware? Well, I'd bloody hope so. Or else nothing else w...
Of course it gets 170 fps (Score:1)
It's a shitty console port and all the graphics decisions are based on what the PS4/XB1 have inside.
Of course it's going to run into the thousands of fps on a video card that costs as much as the whole console.
Oh wait. It only got 170. For some reason...
Re: (Score:2)
Umm, if it's like any new flagship video card, it will cost almost twice as much as a playbox.
Why exactly... (Score:1)
...are there two "news" stories about a new Nvidia card? Didn't y'all just post about the unveiling of the card? Why do we need a follow-up story when someone plays a new game on it? That seems less than newsworthy... I'm not saying that we've got paid articles being posted... but something doesn't smell right...
Hitting The Sweet Spot (Score:2)
It isn't something that nVidia is actually working towards (as far as I can tell, btw), but wouldn't it be nice if the various generations of nVidia GPUs were designed to hit a set of pre-defined targets for thermal output and power consumption?
Hang on, I get the fact that innovation doesn't come to order, but stay with me a little longer...
In top end systems these days [those for which things like the 1080 are relevant] the GPU [or plural for SLI configurations] draw the most p
Re: (Score:2)
I'm not buying one until it's over 9k.
Re: (Score:2)
I'm waiting for the GeForce 640K. That should surely be enough for anybody.
Re: (Score:2)
In this particular case of resolution... that would be way overkill.
Re: (Score:3)
Considering how the 980 Ti performs at 4K vs 1080p, I'm not surprised they didn't show anything at 4K.
The 1080 and AMD's Polaris are not the 4K parts you're waiting for.
Re: (Score:2)
Where is your evidence for this?
My experience is that everything has stopped at 1920x1080 as these are the panels required for TVs, it's very difficult to get 16x10 aspect ratio displays anywhere. With the larger number of 4k displays (also 16x9 aspect) I don't know where you think these x1200 monitors are coming from.
Re: (Score:2)
Exactly my thought. So the latest and greatest is able to run 1080p at 120Hz or more, which is nice. But can it do 2160p at 60Hz? Probably not, because that would mean pushing four times as many pixels per frame (roughly twice the pixel throughput of 1080p at 120Hz). It may also struggle to provide decent VR (2x 1080p at 90Hz or more).
Re: They were too chicken to show 4k? (Score:2)
Re: (Score:2)
A higher refresh rate is always a benefit even if your hardware is a bit slow, old or low end. You get lower latency and less tearing, and when you do get tearing it's less severe. So if you're after gaming performance, a 1080p 144Hz screen is great. For the really high end there's 2560x1440 at 144Hz; if you buy a 2160p 60Hz screen you're a sucker (or maybe not, as it may actually be cheaper).
What sucks is that the market is so oriented toward consolidation and high volumes that the options are few for high-refresh monitors. So you can't...
Re: (Score:2)
And make your cellphone battery last more.
Re: (Score:2)
Naturally it's the most power-hungry, compute-hungry issue. If you want really good lighting or, god forbid, shadows, it might need one hour per frame to render, whereas we'd rather have it take 10 ms.
Id went to more static lighting with Rage. Although even Doom 3 lacked dynamic lighting for every imp fire shot and plasma gun shot.