
GPUs Keep Getting Faster, But Your Eyes Can't Tell

Posted by timothy
from the at-least-my-eyes-can't dept.
itwbennett writes "This brings to mind an earlier Slashdot discussion about whether we've hit the limit on screen resolution improvements on handheld devices. But this time, the question revolves around ever-faster graphics processing units (GPUs) and the resolution limits of desktop monitors. ITworld's Andy Patrizio frames the problem like this: 'Desktop monitors (I'm not talking laptops except for the high-end laptops) tend to vary in size from 20 to 24 inches for mainstream/standard monitors, and 27 to 30 inches for the high end. One thing they all have in common is the resolution. They have pretty much standardized on 1920x1080. That's because 1920x1080 is the resolution for HDTV, and it fits 20 to 24-inch monitors well. Here's the thing: at that resolution, these new GPUs are so powerful you get no major, appreciable gain over the older generation.' Or as Chris Angelini, editorial director for Tom's Hardware Guide, put it, 'The current high-end of GPUs gives you as much as you'd need for an enjoyable experience. Beyond that and it's not like you will get nothing, it's just that you will notice less benefit.'"
This discussion has been archived. No new comments can be posted.

  • Totally wrong (Score:5, Informative)

    by brennz (715237) on Thursday October 31, 2013 @05:30PM (#45294671)

    In cutting edge games, FPS still suffers even at low resolutions.

Many users are going to multi-monitor setups to increase their viewable area, and even cutting-edge graphics cards cannot handle gaming across three 1920x1080 displays in taxing games or applications (e.g. Crysis).

  • by exomondo (1725132) on Thursday October 31, 2013 @05:49PM (#45294867)
    These are often marketed as GPGPU products, nVidia's Tesla for example, rather than taking a bunch of Geforces and putting them together.
  • by Anonymous Coward on Thursday October 31, 2013 @05:56PM (#45294927)

Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

    Even ignoring that, the guy is a fucking idiot.
He seems to be confused about the function of a GPU: they are doing far more than simply pushing pixels onto the screen. Wake up, buddy, this isn't a VGA card from the '90s. A modern GPU is doing a holy shitload of processing and post-processing on the rendered scene before it ever gets around to showing the results to the user. Seriously, there's a reason our games don't look as awesomely smooth, detailed, and complex as a big-budget animated film: to get that level of detail, yes, on that SAME resolution display, you need a farm of servers crunching the scene data for hours or days. Until I can get that level of quality out of my desktop GPU, there will always be room for VERY noticeable performance improvement.

  • Re:Err, wha? (Score:2, Informative)

    by Anonymous Coward on Thursday October 31, 2013 @06:06PM (#45295001)

    ...which also don't exist according to TFA and TFS:

    Desktop monitors (I'm not talking laptops except for the high-end laptops) tend to vary in size from 20 to 24 inches for mainstream/standard monitors, and 27 to 30 inches for the high end. One thing they all have in common is the resolution. They have pretty much standardized on 1920x1080. That's because 1920x1080 is the resolution for HDTV, and it fits 20 to 24-inch monitors well.

  • by nojayuk (567177) on Thursday October 31, 2013 @06:12PM (#45295043)

    Is Dell enough of a major manufacturer for you? I just got a replacement 27" Dell 2560x1440 monitor delivered today after a big electricity spike blew out my previous Dell 27" monitor a few days ago.

Sure, it costs more than piddly little HD-resolution monitors, but I'm looking at nearly twice the number of pixels as HD; it's a wide-gamut IPS panel with a lot of useful extra functionality (a USB 3.0 hub, for example). Well worth the £550 I dropped on it.

If you are willing to compromise and really want a 24" 1920x1200 monitor, Dell makes them too. The 2412M is affordable; the U2413 has a wider gamut at a higher price. Your choice.

  • by Anaerin (905998) on Thursday October 31, 2013 @06:14PM (#45295067)
And it depends on what part of the eye you're talking about. The cones (the detail-oriented, color-sensitive parts of the eye) respond at around 30Hz. The rods (black-and-white, but higher light sensitivity and faster responding) respond at around 70Hz. This is why CRT monitors were recommended to be set at 72Hz or higher to avoid eyestrain: at 60Hz the cones couldn't see the flickering of the display, but the rods could, and the disparity caused headaches. (You could also see the effect if you looked at a 60Hz monitor through your peripheral vision, where rods dominate; it appears to shimmer.)
  • by rew (6140) on Thursday October 31, 2013 @07:09PM (#45295527) Homepage

    You don't need a GPU at all. A screen is 2Mpixels. Refreshing that about 60 times per second is enough to create the illusion of fluid motion for most humans. So that's only 120Mpixels per second. Any modern CPU can do that!
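    A quick back-of-the-envelope check of those numbers (a sketch; the 1080p and 60Hz figures come straight from the comment above):

    ```python
    # Rough throughput needed just to refresh a 1920x1080 screen at 60Hz,
    # as described above: no 3D work, just pushing pixels.
    width, height, hz = 1920, 1080, 60

    pixels_per_frame = width * height          # ~2.07 million pixels
    pixels_per_second = pixels_per_frame * hz  # ~124M/s, the "120Mpixels" above

    print(pixels_per_frame, pixels_per_second)
    ```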

    Why do you have a GPU? Because it's not enough to just refresh the pixels. You need (for some applications, e.g. gaming) complex 3D calculations to determine which pixels go where. And in complex scenes, it is not known in advance which objects will be visible and which ones (or parts of them) will be obscured by other objects. So instead of doing the complex calculations to determine exactly what part of what object is visible, it has been shown to be faster to just draw all objects, but to check, when drawing each pixel, which object is closer: the already-drawn object or the one currently being drawn.
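    That per-pixel closeness check is the depth-buffer (z-buffer) technique. A minimal sketch of the idea, with a made-up tiny framebuffer and two made-up overlapping rectangles for illustration:

    ```python
    # Depth-buffer sketch: draw every object, but keep a pixel only if the
    # object is closer (smaller z) than whatever was already drawn there.
    WIDTH, HEIGHT = 8, 4
    FAR = float("inf")

    depth = [[FAR] * WIDTH for _ in range(HEIGHT)]  # closest z seen per pixel
    color = [["."] * WIDTH for _ in range(HEIGHT)]  # what ends up on screen

    def draw(pixels, z, name):
        """Rasterize an object: test each covered pixel against the depth buffer."""
        for x, y in pixels:
            if z < depth[y][x]:  # closer than what was drawn before?
                depth[y][x] = z
                color[y][x] = name

    # Two overlapping rectangles; "A" is nearer (smaller z) than "B",
    # so "A" wins in the overlap regardless of draw order.
    draw([(x, y) for x in range(0, 5) for y in range(0, 3)], z=2.0, name="A")
    draw([(x, y) for x in range(3, 8) for y in range(1, 4)], z=5.0, name="B")

    for row in color:
        print("".join(row))
    ```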

  • Re:Your eyes... (Score:5, Informative)

    by faffod (905810) on Thursday October 31, 2013 @08:19PM (#45296225)
    In your scenario the action started at exactly the same time. Then things diverge. The game state changing and reacting is driven by the game refresh rate (which may be independent of the video refresh rate). The latency between the game state change and the visual feedback is directly linked to the video refresh rate.
    As an example, take the same game running on two machines, one at 60Hz and the other at 1Hz. In both cases you press the button at exactly the same time. The game update will process that button press and start a muzzle flash; that takes some period of time that we will assume is equal for both machines (i.e. I won't make the slow-rendering machine also have a slow game update, even if typically the two are tightly coupled). So 1/nth of a second after the button press, both machines are ready to show the muzzle flash. On the first machine you will see the muzzle flash 16.6 milliseconds later. On the 1Hz machine the muzzle flash will appear 1 second later.
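    The arithmetic in that example can be sketched like this (a toy model: the 10ms game-update time is a made-up figure, and the display delay is taken as one full refresh interval in the worst case):

    ```python
    # Worst-case time from button press to visible muzzle flash:
    # one game update plus waiting out one full display refresh interval.
    def press_to_flash_ms(refresh_hz, game_update_ms=10.0):
        frame_ms = 1000.0 / refresh_hz  # one refresh interval in milliseconds
        return game_update_ms + frame_ms

    print(press_to_flash_ms(60))  # ~26.7ms on the 60Hz machine
    print(press_to_flash_ms(1))   # 1010.0ms on the 1Hz machine
    ```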

    Now, my example is a bit extreme (to make it obvious that there is a difference). Do not think that this is irrelevant in real-world cases. I worked on one of the first FPS games to win awards for jump puzzles that were not atrocious. Early on we spent a lot of time testing the game at 30Hz and at 60Hz. If we ran the game at 30 we could effectively double the quality of the graphics, which the art team obviously wanted so that they could do even greater stuff. But after blind testing we found that everyone noticed "something" was better about the game that ran at 60Hz. Reducing the latency between the button press and the jump allowed players to gauge the jump point more accurately. Reducing the latency of the joystick movements allowed the player to guide their landings more accurately.

    One final note: maintaining a consistent frame rate is even more critical; players have to know that when they press "x" they will get the same result.
  • by hairyfeet (841228) <bassbeast1968@gma i l . com> on Thursday October 31, 2013 @09:33PM (#45296645) Journal

    But everyone here is missing the REAL advantages of new GPUs: power consumption and price. My HD4850 uses over 110W and reaches nearly 90C under load; the HD7770 I'm getting for my BDay? It is nearly 60% faster while using less than HALF the power and creating nearly half the heat. And while my HD4850 cost nearly $300 new, the HD7770? Can be had for less than $100.

    Now that CPUs have pretty much maxed out, the big gains will be in GPUs, both on power and price. Sure, the top o' the line will be a giant wallet raper, but those are really ePeens more than anything; the biggest sales will be in the $75-$150 range, and that is where we are seeing some really sweet cards.

  • by Shadow of Eternity (795165) on Friday November 01, 2013 @12:43AM (#45297435)

    First off, you're so wrong it hurts. Until very recently, graphical framerates in the average FPS were relatively insulated from "physics framerates"; in the days of TFC, for example, 100fps didn't make your rockets any more accurate because the SERVER calculated their trajectory.

    Secondly, it's been shown time and time again that humans are perfectly capable of detecting framerates well into the hundreds. Fighter pilots can not only detect a SINGLE frame shown for somewhere around 1/200th of a second but even tell you what enemy jet it was. Gamers are similar: until consolization forced a lower standard onto everyone and covered it up with a lower FOV filled with massive amounts of bloom and blur, performance was judged by the gold standard of a solid 60fps minimum; 30 was choppy and 100 was ideal.
