
GPUs Keep Getting Faster, But Your Eyes Can't Tell 291

Posted by timothy
from the at-least-my-eyes-can't dept.
itwbennett writes "This brings to mind an earlier Slashdot discussion about whether we've hit the limit on screen resolution improvements on handheld devices. But this time, the question revolves around ever-faster graphics processing units (GPUs) and the resolution limits of desktop monitors. ITworld's Andy Patrizio frames the problem like this: 'Desktop monitors (I'm not talking laptops except for the high-end laptops) tend to vary in size from 20 to 24 inches for mainstream/standard monitors, and 27 to 30 inches for the high end. One thing they all have in common is the resolution. They have pretty much standardized on 1920x1080. That's because 1920x1080 is the resolution for HDTV, and it fits 20 to 24-inch monitors well. Here's the thing: at that resolution, these new GPUs are so powerful you get no major, appreciable gain over the older generation.' Or as Chris Angelini, editorial director for Tom's Hardware Guide, put it, 'The current high-end of GPUs gives you as much as you'd need for an enjoyable experience. Beyond that and it's not like you will get nothing, it's just that you will notice less benefit.'"

Comments Filter:
  • Now (Score:3, Interesting)

    by Zeroblitzt (871307) on Thursday October 31, 2013 @05:28PM (#45294655) Homepage
    Make it draw less power!
  • Assumptions (Score:5, Interesting)

    by RogWilco (2467114) on Thursday October 31, 2013 @05:35PM (#45294727)
    That statement makes the rash assumption that GPUs will somehow continue to grow in speed and complexity while everything around them remains static. What about stereoscopic displays, which would double the number of pixels rendered for the equivalent of a 2D image? What about HMDs like the forthcoming Oculus Rift, which over time will need to keep pushing the boundaries of higher-resolution displays? Who on earth thinks the display industry is saying, "welp, that's it! We've hit 1080p! We can all go home now, there's nothing left to do!"? 1080p on a 24-inch display is nowhere close to the maximum PPI we can perceive at a normal desktop viewing distance, so why is that the boundary? Why are 24" displays the end? Yes, improving technology has diminishing returns. That's nothing groundbreaking, but using it to suggest that we have peaked in terms of usable GPU performance is just downright silly.
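The PPI point above can be checked with a little arithmetic. A minimal sketch, assuming the common 1-arcminute rule of thumb for 20/20 visual acuity and a 24-inch desktop viewing distance (both assumptions, not figures from the article):

```python
import math

# Pixel density of a 1920x1080 panel with a 24-inch diagonal
diag_px = math.hypot(1920, 1080)   # ~2202.9 pixels along the diagonal
ppi = diag_px / 24                 # ~92 PPI

# Finest density a 20/20 eye (~1 arcminute per pixel) can resolve
# at a 24-inch viewing distance
one_arcmin = math.radians(1 / 60)
eye_limit_ppi = 1 / (24 * math.tan(one_arcmin))   # ~143 PPI

print(round(ppi), round(eye_limit_ppi))  # 92 143
```

By this estimate a 24-inch 1080p panel (~92 PPI) sits well below the ~143 PPI the eye can resolve at arm's length, which is exactly the commenter's point about 1080p not being a perceptual ceiling.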
  • Re:Now (Score:5, Interesting)

    by afidel (530433) on Thursday October 31, 2013 @05:38PM (#45294763)

    They are, you can get very playable framerates @1080p using a nearly passively cooled card (the next shrink will probably make it possible using a completely passive card). Hell, my new gaming rig draws under 100W while playing most games, my previous rig used over 100W just for the graphics card.

  • by Anonymous Coward on Thursday October 31, 2013 @05:39PM (#45294775)

    "There is considerable debate over what is the limit of the human eye when it comes to frame rate; some say 24, others say 30,"

    That's what has been studied and discussed as the lower limit for tricking people into perceiving motion. I believe there are other studies, using pilots as test subjects, in which they could spot an object displayed for between 1/270th and 1/300th of a second. In addition, there's another study suggesting that our brain (and perhaps eyes) can be trained by watching movies/TV to relax and accept lower frame rates, such as 24, as fluid. Different careers can have an impact, as we are exposed to different things visually.

    Additionally frame latency can continue to be driven down (with diminishing returns) with higher performing cards even if the frame rate stays constant.
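The diminishing returns the comment mentions are easy to see in raw numbers. A quick sketch (the frame rates are chosen arbitrarily for illustration):

```python
# Frame time in milliseconds at each frame rate: each doubling of fps
# shaves off only half as many milliseconds as the previous doubling.
for fps in (30, 60, 120, 240):
    print(f"{fps:3d} fps -> {1000 / fps:5.1f} ms per frame")
```

Going from 30 to 60 fps cuts ~16.7 ms off each frame; going from 120 to 240 fps cuts only ~4.2 ms, even though the card is doing twice the work.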

  • by Jane Q. Public (1010737) on Thursday October 31, 2013 @05:57PM (#45294933)
    There is absolutely no reason to have 1080p as a "standard" max resolution. 5 years ago I got a nice Princeton 24", 1920 x 1200 monitor at a good price. And I expected resolution to keep going up from there, as it always had before. Imagine my surprise when 1920 x 1200 monitors became harder to find, as manufacturers settled on the lower "standard" of 1920 x 1080 and seemed to be refusing to budge.

    It's great and all that a 1080p monitor will handle 1080p video. BUT... when it does, there is no room for video controls, or anything else, because it's in "full screen" mode, which has limitations. I can play the same video on my monitor, using VLC, and still have room for the controls and other information, always on-screen.

    Now certain forces seem to want us to "upgrade" to 4k, which uses an outrageous amount of memory and hard drive space, super high bandwidth cables, and is more resolution than the eye can discern anyway unless the screen is absolutely huge AND around 10 feet away.
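The memory and bandwidth complaint checks out as a back-of-the-envelope calculation. A sketch, assuming uncompressed 24-bit color at 60 Hz (assumptions for illustration; real video is compressed):

```python
# Uncompressed 4K (3840x2160, 24-bit color) frame size and link bandwidth
width, height, bytes_per_px, fps = 3840, 2160, 3, 60

frame_bytes = width * height * bytes_per_px    # ~24.9 MB per frame
gbits_per_s = frame_bytes * fps * 8 / 1e9      # ~11.9 Gbit/s uncompressed

print(f"{frame_bytes / 1e6:.1f} MB/frame, {gbits_per_s:.1f} Gbit/s")
```

Roughly 12 Gbit/s for an uncompressed 60 Hz feed is why 4K pushed new cable standards, which is the "super high bandwidth cables" the commenter is grumbling about.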

    Whatever happened to the gradual, step-wise progress we used to see? I would not in the least mind having a 26" or 27", 2560 x 1440 monitor on my desk. That should have been the next reasonable step up in monitor resolution... but try to find one from a major manufacturer! There are some on eBay, mostly from no-names, and most of them are far more expensive than they should be. They should be cheaper than my 24" monitor from 5 years ago. But they aren't. Everything else in the computer field is still getting better and cheaper at the same time. But not monitors. Why?
  • by im_thatoneguy (819432) on Thursday October 31, 2013 @11:01PM (#45297051)

    I do some work on the side for a hardware ray-tracing company, and you're mostly right. Shameless plug: http://caustic.com/ [caustic.com] And speaking as a VFX artist, ray tracing is way easier. When you aren't cheating everything, it becomes much simpler to get to "realistic". Global illumination also goes a long way to help. I can take a game asset with textures, geometry, normal maps, etc., render it with a ray-tracing engine, and it looks dramatically better.

    The problem with current technology is that there is something of a divide in performance. Present ray tracing technology is about 5x too slow to match a good rasterized game. You could deliver 10-20 fps at decent resolution with ray tracing but wouldn't get any noticeable benefit. To really get that silky smooth GI you need another 20-30x faster or so (even with a dedicated ray tracing chip).
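The 20-30x gap mentioned above can be turned into a rough timeline. A hedged sketch, assuming throughput doubles every 18 months (the classic Moore's-law pace, which real GPUs only approximate, and an assumption on my part rather than the commenter's):

```python
import math

# Years needed to close a throughput gap if performance
# doubles every `doubling_years` years
def years_to_speedup(factor, doubling_years=1.5):
    return math.log2(factor) * doubling_years

print(f"{years_to_speedup(20):.1f} to {years_to_speedup(30):.1f} years")  # 6.5 to 7.4
```

Process scaling alone would take roughly 6.5 to 7.4 years to deliver 20-30x, so the 3-4 year estimate below implicitly banks on dedicated ray-tracing hardware improving faster than general GPU scaling.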

    The challenge then is to improve ray-tracing chips fast enough to catch up to GPUs. I think in 3-4 years you'll see a number of games delivering exceptional ray-traced images, rivaling film renders in real time. But 3-4 years, in spite of this author's nonsense, is a long time in GPU technology. In the last 3-4 years we've seen tessellation and the first instances of GI and dynamic light reflections. These make a huge difference. They're total hacks, but game developers can't sit still, and as much of a pain as the hacks are, they work. It would be more of a pain to rewrite their engines from scratch to take advantage of a whole new rendering pipeline.

    The other challenge is that the reason many films look so good is because of 2D cheats in the composite. If you look at a raw render out of Arnold, Brazil, Renderman or Vray it's not really like what shows up on screen. There is a lot of sweetening, a lot of one-off lighting tricks in post and in the render which only look great from the one angle. Games have to look good from every angle. I don't know that they'll ever achieve that. They'll look more photographic but the ultra polish of a film comes from lighting TDs, cinematographers and compositors all working in tandem to polish a shot for days or weeks. If things just looked good from every direction all the time--the VFX work on a feature film would be dramatically reduced. So in that regard game developers are going to have it way harder than film people. Not only does it have to render at 60 fps... but you can't cheat detail. You have to make it look good from 300 yards as you drive your car down main street... all the way to jumping out and walking up to 2" away and reading the headline.
