Displays Graphics Upgrades

GPUs Keep Getting Faster, But Your Eyes Can't Tell (291 comments)

itwbennett writes "This brings to mind an earlier Slashdot discussion about whether we've hit the limit on screen resolution improvements on handheld devices. But this time, the question revolves around ever-faster graphics processing units (GPUs) and the resolution limits of desktop monitors. ITworld's Andy Patrizio frames the problem like this: 'Desktop monitors (I'm not talking laptops except for the high-end laptops) tend to vary in size from 20 to 24 inches for mainstream/standard monitors, and 27 to 30 inches for the high end. One thing they all have in common is the resolution. They have pretty much standardized on 1920x1080. That's because 1920x1080 is the resolution for HDTV, and it fits 20 to 24-inch monitors well. Here's the thing: at that resolution, these new GPUs are so powerful you get no major, appreciable gain over the older generation.' Or as Chris Angelini, editorial director for Tom's Hardware Guide, put it, 'The current high-end of GPUs gives you as much as you'd need for an enjoyable experience. Beyond that and it's not like you will get nothing, it's just that you will notice less benefit.'"
This discussion has been archived. No new comments can be posted.

  • Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

    -- Jim
    Your website could be better. Getting weekly feedback [weeklyfeedback.com] is a good starting point.

    • And perhaps natural language processing, to bring up my own pet interest.
    • by exomondo ( 1725132 ) on Thursday October 31, 2013 @05:49PM (#45294867)
      These are often marketed as GPGPU products, nVidia's Tesla for example, rather than taking a bunch of Geforces and putting them together.
    • by MrHanky ( 141717 )

      I'm not sure I'd call a 27" monitor an area of science, but it does benefit from today's faster GPUs.

      • by geirlk ( 171706 )

        Mine sure does.
        I'm using two non-SLI GPUs, a GTX 670 and an older GTS 460. I use the 460 for PhysX and a couple of older 20" monitors, with a 27" WQHD as the main monitor on the GTX 670. I've also hooked the 670 up to the TV, running 1080p for when I'm in couch-potato mode. I can see a dramatic difference in speed switching between the TV and the main monitor.

        The difference between 1080p and WQHD is remarkable in certain games, e.g. Civilization 5. Most games with an isometric view benefit from it.

    • by Anonymous Coward on Thursday October 31, 2013 @05:56PM (#45294927)

      Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

      Even ignoring that, the guy is a fucking idiot.
      He seems to be confused about the function of a GPU: they are doing far more than simply pushing pixels onto the screen. Wake up, buddy, this isn't a VGA card from the '90s. A modern GPU does a holy shitload of processing and post-processing on the rendered scene before it ever gets around to showing the results to the user. Seriously, man, there's a reason why our games don't look as awesomely smooth and detailed and complex as a big-budget animated film: to get that level of detail, yes, on that SAME resolution display, you need a farm of servers crunching the scene data for hours, days, etc. Until I can get that level of quality out of my desktop GPU, there will always be room for VERY noticeable performance improvement.

      • by Score Whore ( 32328 ) on Thursday October 31, 2013 @06:00PM (#45294961)

        Not to mention that the world hasn't standardized on 1920x1080. I've got half a dozen computers/tablets and the only one that is 1080p is the Surface Pro. The MacBook Pro with Retina display is 2880x1800. Both of my 27" monitors are 2560x1440. I don't have any idea what this dipshit is thinking, but his assumptions are completely wrong.

        • by TechyImmigrant ( 175943 ) on Thursday October 31, 2013 @06:03PM (#45294985) Homepage Journal

          I wouldn't let a 1920x1080 monitor grace my cheap Ikea desk.

          • by Anonymous Coward

            That's why I have my old 1920x1200 panel on an arm off the wall. TECHNICALLY it isn't touching the desk.

        • by icebike ( 68054 )

          I agree, the guy must live in the ghetto, using recycled year 2000 equipment.
          For his crapstation maybe he wouldn't get any benefit from a faster GPU.

          Monitors are getting bigger all the time, and the real estate is welcome when editing mountains of text (as is a monitor that can swivel to portrait). 1920x1080 is not sufficient for a large monitor; I don't have any that are that limited. People aren't limited to one monitor either.

      • by amicusNYCL ( 1538833 ) on Thursday October 31, 2013 @07:28PM (#45295703)

        Seriously, man, there's a reason why our games don't look as awesomely smooth and detailed and complex as a big-budget animated film: to get that level of detail, yes, on that SAME resolution display, you need a farm of servers crunching the scene data for hours, days, etc.

        That reminds me of the Final Fantasy movie from 2001 (The Spirits Within). I remember watching it and being struck by the realism of the characters, especially the individual strands of hair on the female lead. Apparently she had 60,000 strands of hair that were individually animated and rendered, and her model had 400,000 polygons. The Wikipedia article [wikipedia.org] has some interesting details:

        Square accumulated four SGI Origin 2000 series servers, four Onyx2 systems, and 167 Octane workstations for the film's production. The basic film was rendered at a home-made render farm created by Square in Hawaii. It housed 960 Pentium III-933 MHz workstations. Animation was filmed using motion capture technology. 1,327 scenes in total needed to be filmed to animate the digital characters. The film consists of 141,964 frames, with each frame taking an average of 90 minutes to render. By the end of production Square had a total of 15 terabytes of artwork for the film. It is estimated that over the film's four-year production, approximately 200 people working on it put in a combined 120 years of work.

        To your point, this bears repeating:

        with each frame taking an average of 90 minutes to render

        This isn't exactly a GPU pumping out 40 frames per second where it can afford to make several mistakes in each frame. Also to your point, here's another interesting detail:

        Surprisingly for a film loosely based on a video game series, there were never any plans for a game adaptation of the film itself. Sakaguchi indicated the reason for this was lack of powerful gaming hardware at the time, feeling the graphics in any game adaptation would be far too much of a step down from the graphics in the film itself.

      • Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

        Even ignoring that, the guy is a fucking idiot.

        Hear, hear. I sorely miss the old Ars, where new CPU architectures were explained, in detail, by someone who actually knew what they were talking about. Nowadays we have op-ed political ramblings, coverage of every. single. thing. to do with Apple, and some guy drinking the latest fad food and giving us all a daily update on what colour his shit is. (Seriously, that happened.)

    • by Nyder ( 754090 )

      Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

      -- Jim
      Your website could be better. Getting weekly feedback [weeklyfeedback.com] is a good starting point.

      Benefits the Bitcoin Miner malware makers...

    • Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

      Absolutely. I run complete atomistic molecular dynamics simulations of viruses that cause disease in humans (enterovirus simulations around the 3-4 million atom mark). Five years ago I had to use a supercomputer to model 1/12 of a virus particle, and that barely scraped into the nanosecond range. I'm now able to run complete virus simulations on my desktop computer (Tesla C2070 and Quadro 5000) and get 0.1ns/day, or almost 0.2ns/day on my 2U rack (4x Tesla M2090) with two viruses running simultaneously.

      That's using the last generation of nVidia cards (Fermi); I should in theory be able to almost double that with the new Kepler cards, and I will be VERY interested to see how the next architecture (Maxwell?) pans out. I can see a time in the not too distant future when I can model multiple instances of virus-drug interactions on-site here in the lab and get results overnight to compare with our "wet lab" results. I use NAMD for the simulations, which works well with the CUDA cards.

    • Lots of uses. Voxel reconstruction is a good one: the algorithms that run the 3D modes on every medical ultrasound, CAT and MRI scanner. GPUs also brute-force problems in more elementary physics, such as quantum mechanical simulations. Convolution processing is used in image and video filtering: not worth the effort if you are just trying to de-blur a photo, but handy if you want to de-blur a 4K video feed.
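
      As a rough illustration of why convolution is such a natural fit for a GPU, here is a minimal CUDA sketch of a naive 2D convolution over a single-channel float image with a 3x3 kernel (the buffer names and launch configuration are just placeholders, not anyone's production code). Every output pixel is independent, so thousands of threads can run at once:

          // Naive 2D convolution: one thread per output pixel.
          // 'in' and 'out' are W x H single-channel float images; 'k' is a 3x3 kernel.
          __global__ void convolve3x3(const float* in, float* out,
                                      const float* k, int W, int H)
          {
              int x = blockIdx.x * blockDim.x + threadIdx.x;
              int y = blockIdx.y * blockDim.y + threadIdx.y;
              if (x >= W || y >= H) return;

              float acc = 0.0f;
              for (int dy = -1; dy <= 1; ++dy) {
                  for (int dx = -1; dx <= 1; ++dx) {
                      // Clamp to the image border so edge pixels stay defined.
                      int sx = min(max(x + dx, 0), W - 1);
                      int sy = min(max(y + dy, 0), H - 1);
                      acc += in[sy * W + sx] * k[(dy + 1) * 3 + (dx + 1)];
                  }
              }
              out[y * W + x] = acc;
          }

          // Host-side launch: 16x16 thread blocks tiling the whole image.
          // dim3 block(16, 16), grid((W + 15) / 16, (H + 15) / 16);
          // convolve3x3<<<grid, block>>>(d_in, d_out, d_kernel, W, H);

      A real de-blur filter would use a bigger kernel (or an FFT), but the shape of the problem is the same, which is why a 4K video feed is exactly where the extra cores pay off.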

    • by godrik ( 1287354 )

      The number of GPGPU applications is tremendous. Actually, people are starting to just call them accelerators rather than GPUs. From structural biology to image processing, from graph analytics to text mining, from fluid mechanics to energy minimization, there aren't many problem areas that haven't been investigated using GPUs by now.

    • Actually, this is a borderline hilarious/obtuse article in the first place. It implies that increasing a GPU's power is all about resolution, when more and more demanding games require those same increases in GPU power. The article could not be further from the truth.

      So can eyes tell when GPUs get faster? Absolutely. You just need to put it in a context people can understand. In this article [hexus.net] they have a Call of Duty video of the Xbox One side by side with the PS4, and then a video comparing the PC alongside them.

  • Let's not forget (Score:5, Insightful)

    by geekoid ( 135745 ) <dadinportlandNO@SPAMyahoo.com> on Thursday October 31, 2013 @05:26PM (#45294623) Homepage Journal

    they need to handle more stuff happening on the screen.

    • Re:Let's not forget (Score:5, Insightful)

      by infogulch ( 1838658 ) on Thursday October 31, 2013 @05:43PM (#45294815)

      Exactly, the games themselves have been pared down to fewer objects because our older cards couldn't handle it. Now there are new cards and people expect games that can use that horsepower to be available instantly? Sounds unreasonable to me.

      When your graphics card can handle three 4K monitors at 120Hz in 3D while playing a game with fully destructible and interactable environments (not this weak-ass pre-scripted 'destruction' they're hyping in BF4 & Ghosts) the size of New York City, without breaking a sweat, the bank, or the power bill, THEN you can talk about an overabundance of GPU horsepower.

  • Now (Score:3, Interesting)

    by Zeroblitzt ( 871307 ) on Thursday October 31, 2013 @05:28PM (#45294655) Homepage
    Make it draw less power!
    • Re:Now (Score:5, Interesting)

      by afidel ( 530433 ) on Thursday October 31, 2013 @05:38PM (#45294763)

      They are; you can get very playable framerates at 1080p using a nearly passively cooled card (the next shrink will probably make it possible with a completely passive card). Hell, my new gaming rig draws under 100W while playing most games; my previous rig used over 100W just for the graphics card.

      • Hell, my new gaming rig draws under 100W while playing most games

        Impressive; my video card alone requires two additional power connectors on top of what it draws from the motherboard.

        • by Eskarel ( 565631 )

          I had a GTX 260, which had two power connectors and weighed about a tonne; my new card is a 660, uses only one additional cable, and probably weighs less than half what the old one did. Obviously, if you buy the overclocked top-of-the-line ones, power draw isn't dropping all that much, but the upper mid-range cards are now much more efficient than their equivalents from a few years back.

    • by dgatwood ( 11270 )

      Make it draw less power!

      In a modern architecture with proper power management, increasing the speed of the chip often does exactly that. If you graph power consumption over time, the total energy consumed is the area under the curve. Thus, if you make a chip that takes twice as much instantaneous power while it is active, but can do the work in a third as much time, then as long as you properly power down that extra hardware while it is idle, you're using two-thirds as much energy when averaged out over time.
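
      A toy back-of-the-envelope version of that race-to-idle argument, with completely made-up numbers just for illustration:

          #include <cstdio>

          int main() {
              // Hypothetical chip A: 10 W active, takes 3 s to finish the job.
              // Hypothetical chip B: 20 W active, finishes in 1 s, then idles
              // at 0.5 W for the remaining 2 s of the same interval.
              double energyA = 10.0 * 3.0;               // 30 J
              double energyB = 20.0 * 1.0 + 0.5 * 2.0;   // 21 J
              printf("A: %.0f J, B: %.0f J\n", energyA, energyB);
              return 0;
          }

      The faster, hungrier chip still wins on total energy because it spends most of the interval powered down.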

  • by Anonymous Coward

    -Multimonitor gaming
    -3D gaming (120 Hz refresh rate or higher)
    -4K gaming

    Keep 'em coming, and keep 'em affordable!

  • Err, wha? (Score:4, Insightful)

    by Anonymous Coward on Thursday October 31, 2013 @05:29PM (#45294663)

    One thing they all have in common is the resolution.

    So 2560x1440 and 2560x1600 27"s only exist in my imagination?

    • One thing they all have in common is the resolution.

      So 2560x1600 27"s only exist in my imagination?

      um yes... those would be the 30" models...

      • Re: (Score:2, Informative)

        by Anonymous Coward

        ...which also don't exist according to TFA and TFS:

        Desktop monitors (I'm not talking laptops except for the high-end laptops) tend to vary in size from 20 to 24 inches for mainstream/standard monitors, and 27 to 30 inches for the high end. One thing they all have in common is the resolution. They have pretty much standardized on 1920x1080. That's because 1920x1080 is the resolution for HDTV, and it fits 20 to 24-inch monitors well.

    • by Ark42 ( 522144 )

      I guess, since it seems like every time I see somebody with a 2560- or 2880-wide monitor, it's effectively pretending to be 1280 or 1440 anyway...
      "Why does this website go off the edge on my monitor?"
      "I don't know, it looks good for me, I'm using 1920 width, what's your computer set to?"
      "Says 2880"
      "Send me a screenshot"
      Screenshot is basically 1440 pixels blown up to 200%....

    • by Luthair ( 847766 )
      I agree, the 27/30" monitors @ 1920x1080 would hardly be called high-end.
  • Your eyes... (Score:5, Insightful)

    by blahplusplus ( 757119 ) on Thursday October 31, 2013 @05:29PM (#45294665)

    ... can certainly tell. The more onscreen objects there are, the more slowdown there is. This is why I like sites like HardOCP that look at MIN and MAX framerates during a gameplay session. No one cares that a basic non-interactive timedemo gets hundreds of frames a second; they are concerned with the framerate floor while actually playing the game.

    • Re:Your eyes... (Score:5, Insightful)

      by MatthiasF ( 1853064 ) on Thursday October 31, 2013 @05:52PM (#45294883)
      And don't forget about refresh rates. A 60Hz refresh rate might be the standard, but motion looks a lot better at 120Hz on better monitors.

      Higher refresh rates require a more powerful GPU, no matter the resolution.
      • by faffod ( 905810 )
        A higher refresh rate also reduces latency: the faster the game simulation is running, the sooner your button press makes a change in the game world.
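
        As a quick sanity check on the frame-time part of that (this ignores game-simulation and display latency, which stack on top):

            #include <cstdio>

            int main() {
                // Minimum possible wait between "new frame ready" moments.
                const int rates[] = {60, 120, 144};
                for (int hz : rates) {
                    printf("%3d Hz -> %.1f ms per frame\n", hz, 1000.0 / hz);
                }
                return 0;
            }

        Going from 60Hz to 120Hz alone cuts the per-frame wait from about 16.7ms to 8.3ms.
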
    • "The more onscreen objects there are the more slowdown there is." Even when the framerates are fully in order, that one's a kicker: How did the developers ensure that framerates would be adequate on consoles, and on average PCs? By keeping the amount of stuff on screen down. And so we have pop-in, RPGs where a 'city' has maybe 100 people (spread across multiple areas with lots of clutter to occlude sightlines, and various other deviations from either the realistic or the epic, depending on what the occasio
      • I wrote a mod (yes, you can have it if you ask) for UT2K4 that disables cleanup of decals, gibs and corpses. It's actually quite tricky: the code that does that is quite low-level, beyond the reach of UnrealScript, so I had to use some hackery.

        I called it 'Knee Deep in the Dead,' an expression any gamer should recognise. The game handles it perfectly on modern hardware. Not only does it look like a lot of fun, but it also impacts gameplay: you can get a good idea of where the danger zones are by the amount of carnage left lying around.

          • Tell the guy with the laptop that if he can't handle it, he should push his graphics settings down from 'hurt me plenty' to one of the lower difficulty levels.
  • Totally wrong (Score:5, Informative)

    by brennz ( 715237 ) on Thursday October 31, 2013 @05:30PM (#45294671)

    In cutting-edge games, FPS still suffers even at low resolutions.

    Many users are moving to multi-monitor setups to increase their viewing area, and even cutting-edge graphics cards cannot handle taxing games or applications (e.g. Crysis) across a 3 x 1920x1080 display setup.

  • Seriously? (Score:5, Insightful)

    by fragfoo ( 2018548 ) on Thursday October 31, 2013 @05:32PM (#45294693)

    For games, GPUs have to process 3D geometry, lighting, shadows, etc. The number of pixels is not the only factor. This is so lame.

    • by Andy Dodd ( 701 )

      Yup. GPU power these days isn't about final pixel fill rates; we've had more than enough of that for a while (although keep in mind that many GPUs render at 4x the screen resolution or more to support antialiasing). It's about geometry and effects. Most of the focus in new GPU designs is on improving shader throughput.

      Yeah, monitor resolutions aren't changing much - but more GPU horsepower means that you can render a given scene in far more detail.

      Think of it this way: even GPUs from the early 2000s had plenty of raw fill rate for the screens of the day; what keeps growing is the amount of work done per pixel.

      • Eventually we might finally have the power for raytracing.

        Developers would be happy. Raytracing algorithms are really simple: all the complexity in modern graphics comes from having to carefully calculate things like shadows, occlusion and shading, which emerge naturally from the mathematics of raytracing. It just needs a ridiculous amount of processing power to pull off in real time.
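
        To make the "really simple" part concrete, here is a minimal CUDA-flavoured sketch: one ray per pixel, one sphere, one light, Lambertian shading. Everything here (names, scene, layout) is hypothetical toy code rather than a real renderer, but note how a shadow would fall out of simply tracing one more ray toward the light:

            // One thread per pixel: trace a primary ray into a scene of one
            // sphere and one point light.
            struct Vec { float x, y, z; };
            __device__ Vec   vsub(Vec a, Vec b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
            __device__ float vdot(Vec a, Vec b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
            __device__ Vec   vscale(Vec a, float s) { return {a.x*s, a.y*s, a.z*s}; }
            __device__ Vec   vnorm(Vec a) { return vscale(a, rsqrtf(vdot(a, a))); }

            // Distance along a normalised ray to a sphere, or -1 on a miss.
            __device__ float hit(Vec o, Vec d, Vec c, float r) {
                Vec oc = vsub(o, c);
                float b = vdot(oc, d);
                float disc = b*b - (vdot(oc, oc) - r*r);
                return disc < 0.f ? -1.f : -b - sqrtf(disc);
            }

            __global__ void render(float* img, int W, int H) {
                int x = blockIdx.x*blockDim.x + threadIdx.x;
                int y = blockIdx.y*blockDim.y + threadIdx.y;
                if (x >= W || y >= H) return;

                Vec eye    = {0.f, 0.f, -3.f};
                Vec sphere = {0.f, 0.f,  0.f};  float rad = 1.f;
                Vec light  = {5.f, 5.f, -5.f};

                Vec dir = vnorm({(x - W*0.5f)/H, (y - H*0.5f)/H, 1.f});
                float t = hit(eye, dir, sphere, rad);
                float shade = 0.f;
                if (t > 0.f) {
                    Vec p = {eye.x + dir.x*t, eye.y + dir.y*t, eye.z + dir.z*t};
                    Vec n = vnorm(vsub(p, sphere));
                    Vec toLight = vnorm(vsub(light, p));
                    shade = fmaxf(0.f, vdot(n, toLight));  // Lambertian term
                    // Shadows: fire one more ray from p toward the light and
                    // zero out 'shade' if it hits anything. With more objects,
                    // occlusion comes from the exact same hit() test.
                }
                img[y*W + x] = shade;  // greyscale result, one float per pixel
            }

        The catch, as the parent says, is doing this for millions of pixels, many bounces and thousands of objects, sixty times a second.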

      • (although keep in mind that many GPUs render at 4x the screen resolution or more to support antialiasing)

        This is SSAA, and on both AMD and nVidia it's a choice that must be made intentionally these days. Only very early generations of video cards were limited to supporting SSAA. The default on modern cards is always some variant of MSAA, which at its base level only anti-aliases the edges of polygons. MSAA has the advantage that, for the same processing power, the edges can have a lot more samples per pixel taken than they would with SSAA (i.e., MSAA x16 is similar in performance to SSAA x4).

        There was a blip in between...

    • by ddt ( 14627 )

      Agreed. The assertion is totally ridiculous. Smells like Slashdot is getting played by someone who wants to convince the world to buy underpowered GPUs.

    • But it is a very significant factor. Increasing the output resolution (either through pure display resolution, or by supersample antialiasing*) gives you a roughly linear decrease in performance: double the number of pixels, halve your framerate. So if a $200, 3840x2160 monitor were to come out tomorrow, most gamers would be getting about 15fps using the same hardware and settings they have now.

      * SSAA uses larger buffers for everything: color buffers, z-buffers, stencil buffers, etc. The more common MSAA (multisample antialiasing) only takes extra samples around polygon edges, so it is far cheaper.
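
      The "double the pixels, halve the framerate" rule of thumb is just arithmetic. A toy estimate (hypothetical numbers; real games scale less cleanly because not everything is fill-rate bound):

          #include <cstdio>

          int main() {
              const double basePixels = 1920.0 * 1080.0;  // ~2.07 Mpix
              const double uhdPixels  = 3840.0 * 2160.0;  // ~8.29 Mpix, 4x as many
              const double baseFps    = 60.0;             // assumed 1080p framerate
              // Crude model: framerate scales inversely with pixel count.
              printf("Estimated 3840x2160 framerate: %.0f fps\n",
                     baseFps * basePixels / uhdPixels);
              return 0;
          }

      With those assumptions the answer comes out at 15fps, which is where the figure above comes from.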

  • by Alejux ( 2800513 ) on Thursday October 31, 2013 @05:32PM (#45294697)
    8K resolution, 120Hz. 'Nuff said.
  • Assumptions (Score:5, Interesting)

    by RogWilco ( 2467114 ) on Thursday October 31, 2013 @05:35PM (#45294727)
    That statement makes the rash assumption that GPUs will somehow continue to grow in speed and complexity while everything around them remains static. What about stereoscopic displays, which would double the number of pixels to be rendered for the equivalent 2D image? What about HMDs like the forthcoming Oculus Rift, which over time will need to keep pushing the boundaries of higher-resolution displays? Who on earth thinks the display industry is saying "whelp, that's it! we've hit 1080p! we can all go home now, there's nothing left to do!"? 1080p on a 24-inch display is nowhere close to the maximum PPI we can perceive at a normal desktop viewing distance, so why is that the boundary? Why are 24" displays the end? Yes, improving technology has diminishing returns. That's nothing groundbreaking, and using it to somehow suggest that we have peaked in terms of usable GPU performance is just downright silly.
    • Unfortunately desktop display resolutions have been stagnant for nearly 10 years now. (The Apple 30" was released June 2004).

      I was disappointed that the new MacBook Pro has Thunderbolt 2 but does not support 4K displays. I have a Dell 30" on my old MacBook Pro and was looking forward to an upgrade, finally, but no.

      • Stagnant hell. They've regressed. You can pry my 1880x1400 CRTs from my cold dead hands. It took a Korean manufacturer doing international sales to finally break the logjam in the US market on 2560x1600 LCDs, and that only happened last year. Until then, you couldn't get one for less than $1000, and they went for $1200-$1400 for years and years. They still run nearly $800 on NewEgg even today.

        And they're already outclassed. Now you can get this [amazon.com], an UltraHD 3840x2160 display, for less money. The downs

  • by JustNiz ( 692889 ) on Thursday October 31, 2013 @05:36PM (#45294739)

    If you're talking 2D desktop-type computing (surfing, email, writing documents, etc.), the point of this article has already been true for at least a decade.

    If you're talking 3D hardware rendering (most usually gaming), there is no such thing as enough GPU power, as it's also about consistently achieving the highest framerates your monitor can handle while having every eye-candy setting maxed out on the latest AAA games, which are mostly developed to get the most out of current- and next-generation hardware. It's a moving goalpost, on purpose.

  • Get some volunteers, let them play on a machine with an old GPU and a machine with a new one. If they can tell which is which, then apparently our eyes can see the difference. I'd be curious to see the result.

  • by IdeaMan ( 216340 )

    A GPU is no longer just a Graphics Processing Unit; it's a general-purpose DSP usable for tasks with simple logic that must be done in a massively parallel fashion. No, I'm actually not talking about mining bitcoins or specialty stuff; I'm talking about things like physics engines.
    On the other hand, they are still WAY behind the curve measured by the "My screen isn't 4xAA RAYTRACED yet" crowd.

  • by The Cat ( 19816 ) *

    Now maybe we can have gameplay and originality again.

  • by Anonymous Coward

    "There is considerable debate over what is the limit of the human eye when it comes to frame rate; some say 24, others say 30,"

    That's what is studied and discussed as the lower limit to trick people into perceiving motion. I believe there are other studies that used pilots as test subjects, who could spot an object shown for between 1/270th and 1/300th of a second. In addition, there's another study suggesting that our brain (and perhaps our eyes) can be trained by watching movies/TV to be more relaxed about it.

    • by Anaerin ( 905998 ) on Thursday October 31, 2013 @06:14PM (#45295067)
      And it depends on what part of the eye you're talking about. The cones (the detail-oriented, colour-sensitive receptors that dominate central vision) stop noticing flicker at around 30Hz, while the rods (black and white, but more light-sensitive and faster-responding, dominating peripheral vision) can pick up flicker at around 70Hz. This is why CRT monitors were recommended to be set at 72Hz or higher to avoid eyestrain: at 60Hz your central vision couldn't see the flickering of the display, but your peripheral vision could, and the disparity caused headaches. (You could also see the effect if you looked at a 60Hz monitor through your peripheral vision; it appears to shimmer.)
  • I like high framerates and can see the difference, and there are other ways to spend the bandwidth and processing time, like color depth. 24-bit is still quite limiting compared to a 'real life' color gamut. Of course, to be of benefit we need displays capable of a 'real life' gamut, and normalizing even a 30-bit depth on today's monitors is pointless.

    Another place the GPU is (ab)used is antialiasing and post-process effects, which many people like. I dislike antialiasing because it causes me eye strain.

  • Wow, how can something so stupid get chosen as a post? Seriously. Even at 1080p, even high-end GPUs fall below 60fps on the most demanding games out there. People who buy high-end GPUs often do so to pair them up with three 1080p monitors, or a 1440p monitor, or even a 1600p monitor. In fact, these people need to buy two to four of these top-end GPUs to drive that many pixels and triangles.
  • by Jane Q. Public ( 1010737 ) on Thursday October 31, 2013 @05:57PM (#45294933)
    There is absolutely no reason to have 1080p as a "standard" max resolution. Five years ago I got a nice Princeton 24", 1920 x 1200 monitor at a good price, and I expected resolution to keep going up from there, as it always had before. Imagine my surprise when 1920 x 1200 monitors became harder to find, as manufacturers settled on the lower "standard" of 1920 x 1080 and seemed to refuse to budge.

    It's great and all that a 1080p monitor will handle 1080p video. BUT... when it does, there is no room for video controls, or anything else, because it's in "full screen" mode, which has limitations. I can play the same video on my monitor, using VLC, and still have room for the controls and other information, always on-screen.

    Now certain forces seem to want us to "upgrade" to 4k, which uses an outrageous amount of memory and hard drive space, super high bandwidth cables, and is more resolution than the eye can discern anyway unless the screen is absolutely huge AND around 10 feet away.

    Whatever happened to the gradual, step-wise progress we used to see? I would not in the least mind having a 26" or 27", 2560 x 1440 monitor on my desk. That should have been the next reasonable step up in monitor resolution... but try to find one from a major manufacturer! There are some on eBay, mostly from no-names, and most of them are far more expensive than they should be. They should be cheaper than my 24" monitor from five years ago. But they aren't. Everything else in the computer field is still getting better and cheaper at the same time. But not monitors. Why?
    • by nojayuk ( 567177 ) on Thursday October 31, 2013 @06:12PM (#45295043)

      Is Dell enough of a major manufacturer for you? I just got a replacement 27" Dell 2560x1440 monitor delivered today after a big electricity spike blew out my previous Dell 27" monitor a few days ago.

      Sure, it costs more than piddly little HD-resolution monitors, but I'm looking at nearly twice the number of pixels as HD, it's an IPS panel, it's high-gamut, and it has a lot of useful extra functionality (a USB 3.0 hub, for example). Well worth the £550 I dropped on it.

      If you are willing to compromise and really want a 24" 1920x1200 monitor, Dell makes those too: the U2412M is affordable, and the U2413 has a higher gamut at a higher price. Your choice.

    • If there is a widely accepted standard, there is a guaranteed customer base. 4K is the next logical plateau since it is gaining traction as the next broadcast standard (although NHK is pushing for 8K!).

      There are only two suppliers of flat panels, Samsung and Sharp (and Sharp isn't looking good financially). If you ask them to make a common, but not wildly common resolution, it will cost much more.

      In two years, you will be able to buy a 4K monitor for the same price as a 1080p screen, and all new video cards will support it.

    • by sribe ( 304414 )

      I would not in the least mind having a 26" or 27", 2560 x 1440 monitor on my desk. That should have been the next reasonable step up in monitor resolution... but try to find one from a major manufacturer!

      For whatever reason, they seem to have been coming and going for the past year or two. If you don't find them this month, check back next month and you'll find several...

    • by Ichijo ( 607641 )

      [4k] is more resolution than the eye can discern...

      Yes, that's the whole idea behind retina displays.

    • It's the sweet spot on price: the point beyond which component prices start rapidly increasing.

      This is partly an economy-of-scale issue. A lot of HDTVs use that panel resolution (1080p HDTV), which means the panels are manufactured in vast quantities, and that pushes the per-unit cost down.

    • by Kjella ( 173770 )

      1. Volume, volume, volume. Same panel for monitor and TV.
      2. Most software can't adjust DPI properly.
      3. Even on computers, YouTube, Netflix, etc. are huge.

      The trend was already very clear with ATSC, DVB-S2 and Blu-ray all standardizing on 1920x1080 ten years ago; if you didn't realize it five years ago, you must have had your head in the sand. You're the first person I've heard claim it's an advantage to always have distracting controls and other information on the screen; maybe you're just seriously out of step with everyone else.

    • Amen. I find 16:9 to be too cramped, and this is compounded by the fact that a lot of web developers are still making content that assumes we're back in the age of non-widescreen monitors, meaning more scrolling. Or in the case of 16:9 monitors, MORE more scrolling.

      16:10 is a compromise I can live with, and it disappoints me that 1920x1080 has somehow become dominant merely because of a video distribution standard. Don't shackle me in your 16:9 chains.

      Frankly, I'd rather see a move to higher-resolution monitors.

  • Most games are written for the lowest common denominator, that is, game consoles.

    Hopefully PC games will be 'allowed' to improve when the next generation of console becomes standard.

    • by 0123456 ( 636235 )

      Hopefully PC games will be 'allowed' to improve when the next generation of console becomes standard.

      Except the next generation of consoles are basically low to mid-range gaming PCs.

        • Where does the word "except" come into play there? They are about eight years newer than the last gen, and even if the last gen was top-of-the-line and this one is low-to-mid-range, it's still a five-year advance, if we assume that it takes three years for top-of-the-line gaming hardware to go mainstream.

  • by HaeMaker ( 221642 ) on Thursday October 31, 2013 @06:05PM (#45294999) Homepage

    Until I can look at a video game on my screen and at a live-action TV show and not tell the difference, there is room for improvement. Perhaps the improvement needs to come from the game developers, but there is still room, and I do not believe we have hit the pinnacle of GPU performance.

    By the way, 4K will replace 1080p very soon, so the article is doubly moot.

  • Really? GPUs are being used more and more for more than just graphics processing. Many interesting parallel processing problems are being offloaded to GPUs, where they are number-crunched on hundreds of cores much faster than can be done on your main CPU. See http://www.nvidia.com/object/cuda_home_new.html [nvidia.com] for one such set of libraries for Nvidia cards.

    So WHO CARES if you cannot see the difference in what gets displayed. There is a LOT more going on.

  • 1080p on a 27" screen? Those are some pretty big pixels! Actually, my 27" iMac has far higher res than that, though it still looks a little fuzzy after using my "retina" laptop screen. I hope to see these 300-ish PPI values reach 27" screens sometime soon, and that's what these GPUs will need to be fast for. Unlike a TV set, which is viewed from a distance, a monitor is used much closer, so higher res is a very obvious benefit.
  • by rew ( 6140 ) <r.e.wolff@BitWizard.nl> on Thursday October 31, 2013 @07:09PM (#45295527) Homepage

    You don't need a GPU at all. A screen is 2Mpixels. Refreshing that about 60 times per second is enough to create the illusion of fluid motion for most humans. So that's only 120Mpixels per second. Any modern CPU can do that!

    Why do you have a GPU? Because it's not enough to just refresh the pixels. You need (for some applications, e.g. gaming) complex 3D calculations to determine which pixels go where. And in complex scenes it is not known in advance which objects will be visible and which ones (or which parts) will be obscured by other objects. So instead of doing the complex calculations to determine what part of what object is visible, it has been shown to be faster to just draw all objects, but to check, when drawing each pixel, which object is closer: the one already drawn there or the one currently being drawn.
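
    That per-pixel depth check is tiny when written down. A minimal sketch (hypothetical names, one depth value and one colour per pixel, no real rasteriser around it) of the core compare:

        // Keep a fragment only if it is closer than whatever was already
        // drawn at that pixel. A real GPU does this in fixed-function
        // hardware, per triangle, but the comparison at the core is the same.
        __global__ void depthTest(float* depth, float* color,
                                  const float* fragDepth, const float* fragColor,
                                  int numPixels)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= numPixels) return;
            if (fragDepth[i] < depth[i]) {   // new fragment is nearer the eye
                depth[i] = fragDepth[i];
                color[i] = fragColor[i];
            }
        }

    The trade-off is exactly the one described above: you burn fill rate drawing things that end up hidden, in exchange for never having to sort the scene.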

  • I remember when the manufacturers of speakers believed that humans couldn't sense audio information below 30Hz. There are still arguments about whether human beings can discern 100,000 colors or 10 million. (http://hypertextbook.com/facts/2006/JenniferLeong.shtml)

    Oh, and don't forget that people really can't tell the difference between a 128kbps MP3 and one at 320kbps. And those who tell you that vinyl sounds better? Or that there's a difference between audio recorded digitally and audio recorded using an analog process?

  • I've seen side-by-side comparisons at Sony stores. This applies to monitors 4 feet or larger. 4K is 2x in each direction and 4x overall. There is not a lot of true 4K programming out there, however. More movies are being filmed in 4K, but there are no real plans to broadcast at that resolution in the US.
  • I know there are many other posts saying the same thing, but regardless: bullshit. Polygon counts CAN get better, texture resolution CAN get better, detail quality CAN get better, particle counts CAN get better. And that's while keeping the same "classic" rasterizing concept. If you move into the raytracing world, there's plenty of room for improvement.
    • by Shados ( 741919 )

      The issue is that the bottleneck at this point is starting to be not so much the hardware as the designers/graphic artists/animators. Big-name games need an army of graphic artists to properly use current tech... and the market will only absorb so much of a cost increase.

  • CUDA, OpenCL and PhysX would tend to disagree with that statement.

    I'm pretty sure there's still room for improvement in framerate on a six-monitor 4K setup.

    Besides, technology has a tendency to trickle down: F1 racing technology from years ago is used in today's Fords and GMs, my cellphone blows the doors off a 2003 computer, and heck, my dollar-store calculator is probably faster than a 1995 supercomputer.

    Besides, ATI and nVidia fighting it out brings us cheaper and quieter GPUs...

"It's a dog-eat-dog world out there, and I'm wearing Milkbone underware." -- Norm, from _Cheers_

Working...