GPUs Keep Getting Faster, But Your Eyes Can't Tell
itwbennett writes "This brings to mind an earlier Slashdot discussion about whether we've hit the limit on screen resolution improvements on handheld devices. But this time, the question revolves around ever-faster graphics processing units (GPUs) and the resolution limits of desktop monitors. ITworld's Andy Patrizio frames the problem like this: 'Desktop monitors (I'm not talking laptops except for the high-end laptops) tend to vary in size from 20 to 24 inches for mainstream/standard monitors, and 27 to 30 inches for the high end. One thing they all have in common is the resolution. They have pretty much standardized on 1920x1080. That's because 1920x1080 is the resolution for HDTV, and it fits 20 to 24-inch monitors well. Here's the thing: at that resolution, these new GPUs are so powerful you get no major, appreciable gain over the older generation.' Or as Chris Angelini, editorial director for Tom's Hardware Guide, put it, 'The current high-end of GPUs gives you as much as you'd need for an enjoyable experience. Beyond that and it's not like you will get nothing, it's just that you will notice less benefit.'"
There are other applications (Score:2)
Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?
-- Jim
Your website could be better. Getting weekly feedback [weeklyfeedback.com] is a good starting point.
Re: (Score:2)
Re: (Score:3, Insightful)
Err, wrong.
24-30FPS is enough *with proper motion blur*
Without motion blur, you need about 3x that.
Re:There are other applications (Score:5, Insightful)
We only get good enough framerates at 1920x1200 (the One True Resolution) because of a lot of shortcuts. Improved computing power could allow games to make the transition to better lighting models (whatever they call the new ray-tracing stuff) that are both easier for artists/world builders and look better and more natural. It would also be nice to stop thinking of everything in polygons, but there's so much tooling there beyond the GPU (and if you push the poly count high enough it doesn't matter visually).
Re:There are other applications (Score:5, Interesting)
I do some work on the side for a hardware raytracing company, and you're mostly right. Shameless plug: http://caustic.com/ [caustic.com] And speaking as a VFX artist, ray tracing is way easier. When you aren't cheating everything, it becomes much simpler to get to "realistic". Global illumination also goes a long way to help. I can take a game asset with textures, geometry, normal maps, etc., render it with a raytracing engine, and it looks dramatically better.
The problem with current technology is that there is something of a divide in performance. Present ray tracing technology is about 5x too slow to match a good rasterized game. You could deliver 10-20 fps at a decent resolution with ray tracing, but you wouldn't get enough visible benefit to justify the framerate hit. To really get that silky smooth GI you need to be another 20-30x faster or so (even with a dedicated ray tracing chip).
The challenge then is to improve ray tracing chips fast enough to catch up to GPUs. I think in 3-4 years you'll see a number of games which deliver exceptional ray traced images, rivaling film renders in real time. But 3-4 years, in spite of this author's nonsense, is a long time in GPU technology. In the last 3-4 years we've seen tessellation, the first instances of GI and dynamic light reflections. These make a huge difference. They're total hacks, but game developers can't sit still, and as much of a pain as they are--they work. It would be even more of a pain to rewrite their engines from scratch to take advantage of a whole new rendering pipeline.
The other challenge is that the reason many films look so good is because of 2D cheats in the composite. If you look at a raw render out of Arnold, Brazil, Renderman or Vray it's not really like what shows up on screen. There is a lot of sweetening, a lot of one-off lighting tricks in post and in the render which only look great from the one angle. Games have to look good from every angle. I don't know that they'll ever achieve that. They'll look more photographic, but the ultra polish of a film comes from lighting TDs, cinematographers and compositors all working in tandem to polish a shot for days or weeks. If things just looked good from every direction all the time--the VFX work on a feature film would be dramatically reduced. So in that regard game developers are going to have it way harder than film people. Not only does everything have to render at 60 fps... but you can't cheat detail. You have to make it look good from 300 yards as you drive your car down main street... all the way to jumping out and walking up to 2" away and reading the headline.
Comment removed (Score:5, Informative)
Re:There are other applications (Score:5, Informative)
First off, you're so wrong it hurts. Until very recently, graphical framerates in the average FPS were relatively insulated from "physics framerates"; in the days of TFC, for example, 100fps didn't make your rockets any more accurate because the SERVER calculated their trajectory.
Secondly, it's been proven time and time again that humans are perfectly capable of detecting framerates well into the hundreds. Fighter pilots can not only detect a SINGLE frame shown for somewhere around 1/200th of a second but even tell you what enemy jet it was. Gamers are similar: until consolization forced a lower standard onto everyone and covered it up with a lower FOV filled with massive amounts of bloom and blur, performance was judged by the gold standard of a solid 60fps minimum; 30 was choppy and 100 was ideal.
Re:There are other applications (Score:5, Informative)
Re: (Score:2)
I'm not sure I'd call a 27" monitor an area of science, but it does benefit from today's faster GPUs.
Re: (Score:3)
Mine sure does.
Using two non-SLI GPUs, a GTX670 and an older GTS460. I use the 460 for PhysX and a couple of older 20" monitors, and a 27" WQHD as the main monitor on the GTX670. I've also hooked the 670 up to the TV, running 1080p for when I'm in couch potato mode. I can see a dramatic difference in speed changing between the TV and the main monitor.
The difference between 1080p and WQHD is remarkable in certain games, e.g. Civilization 5. Most games with an isometric view benefit from it.
Re:There are other applications (Score:5, Informative)
Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?
Even ignoring that, the guy is a fucking idiot.
He seems to be confused about the function of a GPU -- GPUs are doing far more than simply pushing pixels onto the screen. Wake up, buddy, this isn't a VGA card from the '90s. A modern GPU is doing a holy shitload of processing and post-processing on the rendered scene before it ever gets around to showing the results to the user. Seriously man, there's a reason why our games don't look as awesomely smooth and detailed and complex as a big budget animated film -- it's because in order to get that level of detail, yes on that SAME resolution display, you need a farm of servers crunching the scene data for hours, days, etc. Until I can get that level of quality out of my desktop GPU, there will always be room for VERY noticeable performance improvement.
Re:There are other applications (Score:5, Insightful)
Not to mention that the world hasn't standardized on 1920x1080. I've got half a dozen computers/tablets and the only one that is 1080p is the Surface Pro. The MacBook Pro with Retina Display is 2880x1800. Both of my 27" monitors are 2560x1440. I don't have any idea what this dipshit is thinking, but his assumptions are completely wrong.
Re:There are other applications (Score:5, Funny)
I wouldn't let a 1920x1080 monitor grace my cheap Ikea desk.
Re: (Score:2)
That's why I have my old 1920x1200 panel on an arm off the wall. TECHNICALLY it isn't touching the desk.
Re: (Score:3)
Re: (Score:3)
I agree, the guy must live in the ghetto, using recycled year 2000 equipment.
For his crapstation maybe he wouldn't get any benefit from a faster GPU.
Monitors are getting bigger all the time, and the real estate is welcome when editing mountains of text (as is a monitor that can swivel to portrait). 1920x1080 is not sufficient for a large monitor. I don't have any that are that limited. People aren't limited to one monitor either.
Re:There are other applications (Score:4)
Seriously man, there's a reason why our games don't look as awesomely smooth and detailed and complex as a big budget animated film -- it's because in order to get that level of detail, yes on that SAME resolution display, you need a farm of servers crunching the scene data for hours, days, etc.
That reminds me of the Final Fantasy movie from 2001. I remember watching it and being struck by the realism of the characters, especially the individual strands of hair on the female lead. Apparently she had 60,000 strands of hair that were individually animated and rendered, and her model had 400,000 polygons. The Wikipedia article [wikipedia.org] has some interesting details:
Square accumulated four SGI Origin 2000 series servers, four Onyx2 systems, and 167 Octane workstations for the film's production. The basic film was rendered at a home-made render farm created by Square in Hawaii. It housed 960 Pentium III-933 MHz workstations. Animation was filmed using motion capture technology. 1,327 scenes in total needed to be filmed to animate the digital characters. The film consists of 141,964 frames, with each frame taking an average of 90 minutes to render. By the end of production Square had a total of 15 terabytes of artwork for the film. It is estimated that over the film's four-year production, approximately 200 people working on it put in a combined 120 years of work.
To your point, this bears repeating:
with each frame taking an average of 90 minutes to render
This isn't exactly a GPU pumping out 40 frames per second where it can afford to make several mistakes in each frame. Also to your point, here's another interesting detail:
Surprisingly for a film loosely based on a video game series, there were never any plans for a game adaptation of the film itself. Sakaguchi indicated the reason for this was lack of powerful gaming hardware at the time, feeling the graphics in any game adaptation would be far too much of a step down from the graphics in the film itself.
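Following up on the 90-minutes-per-frame figure quoted above, some rough back-of-the-envelope math (purely illustrative; the frame count, render time and farm size are the Wikipedia numbers, the 40 fps real-time budget comes from the parent comment):

```python
# Rough math based on the figures quoted above.
frames = 141_964                 # total frames in the film
minutes_per_frame = 90           # average offline render time per frame
farm_size = 960                  # Pentium III workstations in the render farm

total_minutes = frames * minutes_per_frame
print(f"Single-machine render time: {total_minutes / 60 / 24 / 365:.1f} years")
print(f"One pass on the 960-node farm: {total_minutes / farm_size / 60 / 24:.1f} days")

# Compare with a real-time GPU budget of ~25 ms per frame at 40 fps:
offline_ms = minutes_per_frame * 60 * 1000
print(f"Offline render is ~{offline_ms / 25:,.0f}x the real-time budget per frame")
```

That last ratio (roughly 216,000x per frame, even on 2001-era hardware) is the gap a desktop GPU is still chasing.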
Re: (Score:3)
Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?
Even ignoring that, the guy is a fucking idiot.
Hear hear. I sorely miss the old ars, where new CPU architectures were explained - in detail - by someone who actually knew what they were talking about. Nowadays we have op-ed political ramblings, coverage of every. single. thing. to do with Apple and some guy drinking the latest fad food and giving us all a daily update on what colour his shit is. (Seriously, that happened.)
Re: (Score:2)
Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?
-- Jim
Your website could be better. Getting weekly feedback [weeklyfeedback.com] is a good starting point.
Benefits the Bitcoin Miner malware makers...
Re: (Score:2)
Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?
Absolutely. I run complete atomistic molecular dynamics simulations of viruses that cause disease in humans (enterovirus simulations around the 3-4 million atom mark). Five years ago I had to use a supercomputer to model 1/12 of a virus particle, which barely scraped into the nanosecond range. I'm now able to run complete virus simulations on my desktop computer (Tesla C2070 and Quadro 5000) and get 0.1ns/day, or on my 2U rack (4x Tesla M2090) with 2 viruses running simultaneously at almost 0.2ns/day. That's using the last generation of nVidia cards (Fermi); I should in theory be able to almost double that with the new Kepler cards. I will be VERY interested to see how the next (Maxwell?) architecture pans out in the future. I can see a time in the not-too-distant future when I can model multiple instances of virus-drug interactions on-site here in the lab and get results overnight that I can compare with our "wet lab" results. I use NAMD for the simulations, which works well with the CUDA cards.
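To put those throughput numbers in perspective, a trivial calculation (the ns/day rates are the ones quoted above; the 100 ns target is just an illustrative trajectory length, not something from the comment):

```python
# Hypothetical time-to-result at the simulation rates quoted above.
target_ns = 100  # illustrative trajectory length
rates_ns_per_day = {
    "desktop (Tesla C2070 + Quadro 5000)": 0.1,
    "2U rack (4x Tesla M2090), per virus": 0.2,
}
for system, rate in rates_ns_per_day.items():
    days = target_ns / rate
    print(f"{system}: {days:.0f} days (~{days / 365:.1f} years) for {target_ns} ns")
```

Which is why a straight 2x from a new GPU generation matters far more here than it does for pushing a 1080p desktop.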
Re: (Score:2)
Lots of uses. Voxel reconstruction is a good one -- the algorithms that run the 3D modes on every medical ultrasound, CAT and MRI scanner. They also brute-force problems in more elementary physics, like quantum mechanical simulations. Convolution processing is used in image and video filtering -- not worth the effort if you are just trying to de-blur a photo, handy if you want to de-blur a 4K video feed.
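A minimal sketch of the kind of convolution work being described, done on the CPU with NumPy/SciPy just to show the math; a GPU runs the same operation across thousands of threads. The frame is random data and the kernel is a small Gaussian blur, both purely illustrative:

```python
import numpy as np
from scipy.signal import fftconvolve  # FFT-based 2D convolution

# One 4K-sized luminance channel and a 9x9 Gaussian kernel (illustrative).
frame = np.random.rand(2160, 3840)
x = np.linspace(-2, 2, 9)
kernel = np.exp(-0.5 * (x[:, None] ** 2 + x[None, :] ** 2))
kernel /= kernel.sum()

blurred = fftconvolve(frame, kernel, mode="same")
print(blurred.shape)  # (2160, 3840)

# Cost scales with pixel count per frame, times 30-60 frames per second for
# video -- which is why a single photo is a CPU job and a 4K feed is a GPU job.
```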
Re: (Score:2)
The number of GPGPU applications is tremendous. Actually, people are starting to just call them accelerators rather than GPUs. From structural biology to image processing, from graph analytics to text mining, from fluid mechanics to energy minimization, there aren't many problems that haven't been investigated using GPUs by now.
Re: (Score:3)
Actually this article is borderline hilarious/obtuse in the first place. It implies that increasing a GPU's power is all about resolution -- when more and more demanding games require those same increases in GPU power. The article could not be further from any truth whatsoever.
So can eyes tell when GPUs get faster? Absolutely. You just need to put it in a context people can understand. In this article [hexus.net] they have a Call of Duty video of the Xbox One side by side with the PS4, and then a video with PC side
Re: (Score:3)
Silly rabbit. GPUs are for mining bitcoins.
Lets not forget (Score:5, Insightful)
they need to handle more stuff happening on the screen.
Re:Lets not forget (Score:5, Insightful)
Exactly: the games themselves have been pared down to fewer objects because our older cards couldn't handle it. Now there are new cards and people expect games that can use that horsepower to be available instantly? Sounds unreasonable to me.
When your graphics card can handle 3x 4K monitors at 120Hz and 3D while playing a game with fully destructible and interactable environments (not this weak-ass pre-scripted 'destruction' they're hyping in BF4 & Ghosts) the size of New York City, without breaking a sweat, the bank, or the power bill, THEN you can talk about an overabundance of GPU horsepower.
Re: (Score:2)
Well, I just wanted an excuse for my 120Hz requirement.
Now (Score:3, Interesting)
Re:Now (Score:5, Interesting)
They are; you can get very playable framerates @1080p using a nearly passively cooled card (the next shrink will probably make it possible with a completely passive card). Hell, my new gaming rig draws under 100W while playing most games; my previous rig used over 100W just for the graphics card.
Re: (Score:2)
Hell, my new gaming rig draws under 100W while playing most games
Impressive; my video card alone requires two additional power connectors on top of what it draws from the motherboard.
Re: (Score:2)
I had a GTX 260 which had two power connectors and weighed about a tonne; my new card is a 660, uses only one additional cable and probably weighs less than half what the old one did. Obviously if you buy the overclocked top-of-the-line ones, power draw isn't reducing all that much, but the upper mid-range cards are now much more efficient than their equivalents from a few years back.
Re: (Score:2)
Based on results in the database it looks like it would score around 7k, I'll gladly give up 25% performance for a machine that runs silent even while gaming =)
Re: (Score:2)
In a modern architecture with proper power management, increasing the speed of the chip often does exactly that. If you graph power consumption over time, the total power consumption is the area under the line. Thus, if you make a chip that takes twice as much instantaneous power while it is active, but can do the work in a third as much time, then as long as you properly power down that extra hardware while it is idle, you're using two-thirds as much power when averaged out over t
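A quick worked example of that area-under-the-line argument (the numbers are made up, only the ratios matter):

```python
# Race-to-idle: energy = power x time, i.e. the area under the power/time curve.
baseline_power_w = 10.0               # slower chip, busy for the whole interval
interval_s = 3.0

fast_power_w = 2 * baseline_power_w   # twice the instantaneous power...
fast_active_s = interval_s / 3        # ...but finished in a third of the time
idle_power_w = 0.0                    # assume ideal power gating once done

baseline_energy_j = baseline_power_w * interval_s
fast_energy_j = fast_power_w * fast_active_s + idle_power_w * (interval_s - fast_active_s)

print(baseline_energy_j, fast_energy_j, fast_energy_j / baseline_energy_j)
# 30.0 20.0 0.666... -> two-thirds the energy over the same interval
```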
we will need those phat GPUs: (Score:2, Insightful)
-Multimonitor gaming
-3D gaming (120 Hz refresh rate or higher)
-4K gaming
keep em coming, and keep em affordable!
Err, wha? (Score:4, Insightful)
One thing they all have in common is the resolution.
So 2560x1440 and 2560x1600 27"s only exist in my imagination?
Re: (Score:2)
One thing they all have in common is the resolution.
So 2560x1600 27"s only exist in my imagination?
um yes... those would be the 30" models...
Re: (Score:2, Informative)
...which also don't exist according to TFA and TFS:
Desktop monitors (I'm not talking laptops except for the high-end laptops) tend to vary in size from 20 to 24 inches for mainstream/standard monitors, and 27 to 30 inches for the high end. One thing they all have in common is the resolution. They have pretty much standardized on 1920x1080. That's because 1920x1080 is the resolution for HDTV, and it fits 20 to 24-inch monitors well.
Re: (Score:3)
I prefer my 2560x1600 screens in 10" form factor [samsung.com]. 27"+ needs to be at least 3840x2160.
I prefer my 3840x2400 screens in 22" form factor [wikipedia.org]. 27"+ needs to be... I dunno, something bigger.
On the bright side, modern 4k displays do have better frame-rate and more convenient inputs than the old beast.
Re: (Score:2)
I guess, since it seems like every time I see somebody with a 2560 or 2880 width monitor, it's effectively pretending to be 1280 and 1440 anyway...
"Why does this website go off the edge on my monitor?"
"I don't know, it looks good for me, I'm using 1920 width, what's your computer set to?"
"Says 2880"
"Send me a screenshot"
Screenshot is basically 1440 pixels blown up to 200%....
Re: (Score:2)
Your eyes... (Score:5, Insightful)
... can certainly tell. The more onscreen objects there are, the more slowdown there is. This is why I like sites like HardOCP that look at MIN and MAX framerates during a gameplay session. No one cares that a basic non-interactive timedemo gets hundreds of frames a second; they are concerned with the framerate floor while actually playing the game.
Re:Your eyes... (Score:5, Insightful)
Higher refresh requires more powerful GPU, no matter the resolution.
Re: (Score:2)
Re:Your eyes... (Score:5, Informative)
As an example, say you have the same game running on two machines, one at 60Hz and the other at 1Hz. In both cases you press the button at exactly the same time. The game update will process that button press and start a muzzle flash; that takes some period of time that we will assume is equal for both machines (i.e. I won't make the slow rendering machine also have a slow game update, even if typically the two are tightly coupled). So 1/nth of a second after the button press both machines are ready to show the muzzle flash. On the first machine you will see the muzzle flash 16.6 milliseconds later. On the 1Hz machine the muzzle flash will appear 1 second later.
Now, my example is a bit extreme (to make it obvious that there is a difference). Do not think that this is irrelevant in real-world cases. I worked on one of the first FPS games to win awards for jump puzzles that were not atrocious. Early on we spent a lot of time testing the game at 30Hz and at 60Hz. If we ran the game at 30 we could effectively double the quality of the graphics, which the art team obviously wanted so that they could do even greater stuff. But after blind testing we found that everyone noticed "something" was better about the game that ran at 60Hz. Reducing the latency between the button press and the jump allowed players to gauge the jump point more accurately. Reducing the latency of the joystick movements allowed the player to guide their landings more accurately.
One final note: maintaining a consistent frame rate is even more critical; players have to know that when they press "x" they will get the same result.
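The latency side of that example is easy to make concrete. A simplified model, assuming the result of a press shows up on the next refresh and ignoring input-device and display lag:

```python
# Worst-case wait from "game is ready to show the muzzle flash" to "it is on screen".
for refresh_hz in (120, 60, 30, 1):
    frame_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz:>3} Hz: up to {frame_ms:7.1f} ms of extra latency")
```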
Re: (Score:2)
Re: (Score:2)
I wrote a mod (Yes, you can have it if you ask) for UT2K4 that disables cleanup of decals, gibs and corpses. It's actually quite tricky - the code that does that is quite low-level, beyond the reach of unrealscript, so I had to use some hackery.
I called it 'Knee Deep in the Dead,' an expression any gamer should recognise. The game handles it perfectly on modern hardware. Not only does it look like a lot of fun, but it also impacts gameplay. You can get a good idea of where the danger zones are by the amount of
Re: (Score:2)
Totally wrong (Score:5, Informative)
In cutting edge games, FPS still suffers even at low resolutions.
Many users are moving to multi-monitor setups to increase their viewable area, and even cutting-edge graphics cards cannot handle gaming across 3x 1920x1080 displays in taxing games or applications (e.g. Crysis).
Re: (Score:2)
Plus, 4K screens are just around the corner. And multi-screen 4K.
Seriously? (Score:5, Insightful)
For games, GPUs have to process 3D geometry, lighting, shadows, etc. The number of pixels is not the only factor. This is so lame.
Re: (Score:2)
Yup. GPU power these days isn't about final pixel fill rates; we've had more than enough of that for a while (although keep in mind that many GPUs render at 4x the screen resolution or more to support antialiasing). It's about geometry and effects. Most of the focus on new GPU designs is on improving shader throughput.
Yeah, monitor resolutions aren't changing much - but more GPU horsepower means that you can render a given scene in far more detail.
Think of it this way - even GPUs from the early 2000s ha
Re: (Score:2)
Eventually we might finally have the power for raytracing.
Developers would be happy. Raytracing algorithms are really simple -- all the complexity in modern graphics comes from having to carefully calculate things like shadows, occlusion and shading that emerge naturally from the mathematics of raytracing. It just needs a ridiculous amount of processing power to pull off in real time.
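To make the "shadows emerge naturally" point concrete, here is a deliberately tiny sketch (one sphere, one point light, an ASCII framebuffer; nothing here corresponds to any real engine's API). Note that the shadow test is literally just a second ray:

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Smallest t > epsilon where origin + t*direction hits the sphere, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    for t in ((-b - math.sqrt(disc)) / (2 * a), (-b + math.sqrt(disc)) / (2 * a)):
        if t > 1e-4:
            return t
    return None

SPHERE_C, SPHERE_R = (0.0, 1.0, 6.0), 1.0
LIGHT = (4.0, 6.0, 2.0)

for j in range(20):                                  # a 40x20 ASCII "screen"
    row = ""
    for i in range(40):
        d = (i / 40 - 0.5, 0.5 - j / 20, 1.0)        # primary ray through this pixel
        t = hit_sphere((0.0, 0.0, 0.0), d, SPHERE_C, SPHERE_R)
        if t is None:
            row += " "
        else:
            hit = tuple(t * k for k in d)            # point on the sphere
            to_light = tuple(l - p for l, p in zip(LIGHT, hit))
            shadowed = hit_sphere(hit, to_light, SPHERE_C, SPHERE_R) is not None
            row += "." if shadowed else "#"          # shadow = the shadow ray hit something
    print(row)
```

Occlusion falls out of the same intersection test; the catch, as the parent says, is doing millions of those tests per frame.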
Re: (Score:2)
(although keep in mind that many GPUs render at 4x the screen resolution or more to support antialiasing)
This is SSAA, and on both AMD and nVidia it's a choice that must be intentionally made these days. Only very early generation video cards were limited to supporting SSAA. The default on modern cards is always some variant of MSAA, which at its base level only anti-aliases the edges of polygons. MSAA has the advantage that, for the same processing power, the edges can have a lot more samples per pixel taken than they would with SSAA (i.e., MSAA x16 is similar in performance to SSAA x4).
There was a blip in betwee
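A very rough way to see why the x16-vs-x4 comparison above works out (a toy cost model that only counts shader invocations; the edge fraction and primitive count are invented numbers, and real cost also includes bandwidth and the resolve step):

```python
# Toy model: SSAA shades every sample; MSAA shades once per pixel per covering
# primitive and only multiplies the cheap coverage/depth samples.
pixels = 1920 * 1080
edge_fraction = 0.10        # assumed share of pixels on polygon edges
prims_per_edge_pixel = 2    # assumed primitives covering an edge pixel

def ssaa_shader_runs(samples):
    return pixels * samples

def msaa_shader_runs(samples):
    # The sample count affects storage and resolve, not shader invocations.
    return pixels * ((1 - edge_fraction) + edge_fraction * prims_per_edge_pixel)

print(f"SSAA x4 : {ssaa_shader_runs(4):>12,.0f} shader runs/frame")
print(f"MSAA x16: {msaa_shader_runs(16):>12,.0f} shader runs/frame")
```

The expensive part (running the pixel shader) barely grows under MSAA, which is why x16 edge quality can land in the same performance ballpark as SSAA x4 once the fixed per-sample costs are added back in.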
Re: (Score:2)
Agreed. The assertion is totally ridiculous. Smells like slashdot is getting played by someone who wants to convince the world to buy underpowered GPUs.
Re: (Score:2)
But it is a very significant factor. Increasing the output resolution (either through pure display, or by supersample antialiasing*) gives you a linear decrease in performance. Double the number of pixels, halve your framerate. So if a $200 3840x2160 monitor were to come out tomorrow, most gamers would be getting about 15fps using the same hardware and settings they use now.
* SSAA uses larger buffers for everything - color buffers, z-buffers, stencil buffers, etc. The more common MSAA (multisample antialia
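Spelling out the scaling claim above (assuming the game is purely pixel-bound, which is the worst case; the 60 fps baseline is illustrative):

```python
# Frame time scales with pixel count when fill rate / shading is the bottleneck.
base_w, base_h, base_fps = 1920, 1080, 60.0
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160), (3 * 1920, 1080)]:
    scale = (w * h) / (base_w * base_h)
    print(f"{w}x{h}: {scale:.2f}x the pixels -> ~{base_fps / scale:.0f} fps")
```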
Oculus Rift at... (Score:3)
Assumptions (Score:5, Interesting)
Re: (Score:2)
I was disappointed that the new MacBook Pro does have Thunderbolt 2 but does not support 4K displays. I have a Dell 30" on my old MacBook Pro and was looking forward to an upgrade, finally, but no.
Re: (Score:2)
Stagnant, hell. They've regressed. You can pry my 1880x1400 CRTs from my cold dead hands. It took a Korean manufacturer doing international sales to finally break the logjam in the US market on 2560x1600 LCDs, and that only happened last year. Until then, you couldn't get one for less than $1000, and they went for $1200-$1400 for years and years. They still run nearly $800 on Newegg even today.
And they're already outclassed. Now you can get this [amazon.com], an UltraHD 3840x2160 display, for less money. The downs
Re: (Score:2)
No news here (Score:3)
If you're talking 2D desktop-type computing (surfing, emails, writing documents etc) the point of this article has already been true for at least a decade.
If you're talking 3D hardware rendering (most usually gaming), there is no such thing as enough GPU power, as it's also about consistently achieving the highest framerates your monitor can handle while having every eye-candy setting maxed out on the latest AAA games, which are mostly developed to get the most out of current and next-generation hardware. It's a moving goalpost on purpose.
That's an easy question to settle (Score:2)
Get some volunteers, let them play on a machine with an old GPU and a machine with a new one. If they can tell which is which, then apparently our eyes can see the difference. I'd be curious to see the result.
Sure! (Score:2)
Re: (Score:2)
If that game gets much more complicated, it's going to need GPU acceleration.
DSP (Score:2)
A GPU is no longer just a Graphics Processing Unit; it's a general-purpose DSP usable for tasks that have simple logic that must be done in a massively parallel fashion. No, I'm actually not talking about mining bitcoins or specialty stuff; I'm talking about things like physics engines.
On the other hand, they are still WAY behind the curve measured by the "My screen isn't 4xAA RAYTRACED yet" crowd.
Re: (Score:2)
Good (Score:2)
Now maybe we can have gameplay and originality again.
Author's poor interpretation of performance (Score:2, Interesting)
"There is considerable debate over what is the limit of the human eye when it comes to frame rate; some say 24, others say 30,"
That's what is studied and discussed as the lower limit to trick people into perceiving motion. I believe there are other studies where they used pilots as test subjects who could spot an object shown for between 1/270th and 1/300th of a second. In addition, there's another study that our brain (and perhaps eyes) can be trained by watching movies/tv to be more relaxe
Re:Author's poor interpretation of performance (Score:5, Informative)
Re: (Score:2)
You've got that reversed: Cones are color sensitive and slower responding. Rods are monochromatic. Reference. [gsu.edu]
Re: (Score:2)
75 Hz? Back when I used a CRT, 85 Hz was my preferred minimum; 60 Hz was ghastly, 75 Hz was bearable for short periods, and 85 Hz was rock solid.
my eyes can (Score:2)
I like high framerates and can see the difference, and there are other ways to spend the bandwidth and processing time, like color depth. 24-bit is still quite limiting compared to the 'real life' color gamut. Of course, in order to be of benefit, we need displays capable of a 'real life' color gamut, and normalizing even a 30-bit depth on today's monitors is pointless.
Another place the GPU is (ab)used is antialiasing and post-process effects, which many like. I dislike antialiasing because it causes me eye str
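For reference, the jump in representable colors per pixel is just arithmetic:

```python
# Distinct colors at common framebuffer depths (bits split evenly across R, G, B).
for bits in (24, 30, 36):
    print(f"{bits}-bit ({bits // 3} bits/channel): {2 ** bits:,} colors")
# 24-bit:        16,777,216
# 30-bit:     1,073,741,824
# 36-bit:    68,719,476,736
```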
C'mon, who let this crap get posted? (Score:2)
It's A Dumb "Standard" (Score:3, Interesting)
It's great and all that a 1080p monitor will handle 1080p video. BUT... when it does, there is no room for video controls, or anything else, because it's in "full screen" mode, which has limitations. I can play the same video on my monitor, using VLC, and still have room for the controls and other information, always on-screen.
Now certain forces seem to want us to "upgrade" to 4K, which uses an outrageous amount of memory and hard drive space, requires super-high-bandwidth cables (rough numbers below), and is more resolution than the eye can discern anyway unless the screen is absolutely huge AND around 10 feet away.
Whatever happened to the gradual, step-wise progress we used to see? I would not in the least mind having a 26" or 27", 2560 x 1440 monitor on my desk. That should have been the next reasonable step up in monitor resolution... but try to find one from a major manufacturer! There are some on eBay, mostly from no-names, and most of them are far more expensive than they should be. They should be cheaper than my 24" monitor from 5 years ago. But they aren't. Everything else in the computer field is still getting better and cheaper at the same time. But not monitors. Why?
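The rough numbers behind the 4K bandwidth/storage complaint above (uncompressed worst case; delivery formats are heavily compressed, and cables also carry blanking and audio, so treat these as ballpark figures):

```python
# Uncompressed 4K60 video, 24 bits per pixel, no chroma subsampling.
w, h, fps, bpp = 3840, 2160, 60, 24
bits_per_second = w * h * fps * bpp
print(f"{bits_per_second / 1e9:.1f} Gbit/s on the cable")              # ~11.9 Gbit/s
print(f"{bits_per_second / 8 / 1e9 * 3600:,.0f} GB per hour on disk")  # ~5,375 GB
```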
Re:It's A Dumb "Standard" (Score:4, Informative)
Is Dell enough of a major manufacturer for you? I just got a replacement 27" Dell 2560x1440 monitor delivered today after a big electricity spike blew out my previous Dell 27" monitor a few days ago.
Sure it costs more than piddly little HD-resolution monitors but I'm looking at nearly twice the number of pixels as HD, it's an IPS panel, high-gamut and with a lot of useful extra functionality (a USB 3.0 hub, for example). Well worth the £550 I dropped on it.
If you are willing to compromise and really want a 24" 1920x1200 monitor Dell make them too. The 2412M is affordable, the U2413 has a higher gamut at a higher price. Your choice.
Two Words: "Volume" and "Suppliers" (Score:2)
If there is a widely accepted standard, there is a guaranteed customer base. 4K is the next logical plateau since it is gaining traction as the next broadcast standard (although NHK is pushing for 8K!).
There are only two suppliers of flat panels, Samsung and Sharp (and Sharp isn't looking good financially). If you ask them to make a common, but not wildly common resolution, it will cost much more.
In two years, you will be able to buy a 4K monitor for the same price as a 1080p screen, and all new video car
Re: (Score:2)
I would not in the least mind having a 26" or 27", 2560 x 1440 monitor on my desk. That should have been the next reasonable step up in monitor resolution... but try to find one from a major manufacturer!
For whatever reason, they seem to have been coming and going for the past year or two. If you don't find them this month, check back next month and you'll find several...
Re: (Score:2)
Yes, that's the whole idea behind retina displays.
Re: (Score:2)
It's the sweet-spot on price. The point beyond which component price starts rapidly increasing.
This is partly due to an economy of scale issue. A lot of HDTVs use that panel size - 1080P HDTV. That means they are manufactured in vast quantities, which pushes the per-unit cost down.
Re: (Score:2)
1. Volume, volume, volume. Same panel for monitor and TV.
2. Most software can't adjust DPI properly.
3. Even on computers, YouTube, Netflix, etc. are huge
The trend was already very clear with ATSC, DVB-S2 and Blu-ray all standardizing on 1920x1080 ten years ago; if you didn't realize it five years ago you must have had your head in the sand. You're the first person I've heard claim it's an advantage to always have distracting controls and other information on the screen; maybe you're just seriously o
Re: (Score:3)
Amen. I find 16:9 to be too cramped, and this is compounded by the fact that a lot of web developers are still making content that assumes we're back in the age of non-widescreen monitors, meaning more scrolling. Or in the case of 16:9 monitors, MORE more scrolling.
16:10 is a compromise I can live with, and it disappoints me that 1920x1080 has somehow become dominant merely because of a video distribution standard. Don't shackle me in your 16:9 chains.
Frankly, I'd rather see a move to higher-resolution m
Console Ports. (Score:2)
Most games are written for the Lowest Common Denominator, that is, Game Consoles.
Hopefully PC games will be 'allowed' to improve when the next generation of console becomes standard.
Re: (Score:2)
Hopefully PC games will be 'allowed' to improve when the next generation of console becomes standard.
Except the next generation of consoles are basically low to mid-range gaming PCs.
Re: (Score:2)
Where does the word "except" come into play there? They are about 8 years newer than the last gen, and even if the last gen was top of the line and this one is low-to-mid range, it's still a 5-year advance, if we assume that it takes 3 years for top-of-the-line gaming hardware to mainstream itself.
Yes, I can see it... (Score:3)
Until I can look at a video game on my screen and a live-action TV show and not tell the difference, there is room for improvement. Perhaps the improvement needs to come from the game developers, but there is still room, and I do not believe we have hit the pinnacle of GPU performance.
By the way, 4K will replace 1080p very soon, so the article is doubly moot.
GPU's are not just for Graphics anymore (Score:2)
Really? GPUs are being used more and more for more than just graphics processing. Many interesting parallel-processing problems are being offloaded to GPUs, where they can be number-crunched on hundreds of cores much faster than on your main CPU. See http://www.nvidia.com/object/cuda_home_new.html [nvidia.com] for one such set of libraries for Nvidia cards.
So WHO CARES if you cannot see the difference in what gets displayed. There is a LOT more going on.
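A minimal sketch of what that offloading looks like, using Numba's CUDA support as one example (an assumption on my part: the comment only points at NVIDIA's CUDA page; this needs an NVIDIA GPU plus the numba package installed):

```python
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)            # one GPU thread per array element
    if i < out.size:
        out[i] = a * x[i] + y[i]

n = 1_000_000
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
saxpy[blocks, threads_per_block](np.float32(2.0), x, y, out)  # arrays are copied to the GPU

print(out[:3], (2.0 * x + y)[:3])  # same numbers, computed across thousands of cores at once
```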
More resolution! (Score:2)
You don't need a GPU. (Score:5, Informative)
You don't need a GPU at all. A screen is 2Mpixels. Refreshing that about 60 times per second is enough to create the illusion of fluid motion for most humans. So that's only 120Mpixels per second. Any modern CPU can do that!
Why do you have a GPU? Because it's not enough to just refresh the pixels. You need (for some applications, e.g. gaming) complex 3D calculations to determine which pixels go where. And in complex scenes, it is not known in advance which objects will be visible and which ones (or which parts) will be obscured by other objects. So instead of doing the complex calculations to determine what part of what object is visible, it has been shown to be faster to just draw all objects, but to check, when drawing each pixel, which object is closer: the already-drawn object or the one currently being drawn.
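That "draw everything, keep the closest pixel" idea is the z-buffer, and a toy software version fits in a few lines (the shapes, sizes and depths here are made up):

```python
W, H = 40, 12
depth = [[float("inf")] * W for _ in range(H)]   # z-buffer: closest depth seen so far
color = [[" "] * W for _ in range(H)]            # framebuffer

def draw(samples, ch):
    """Rasterized (x, y, z) samples; a pixel is kept only if it is closer."""
    for x, y, z in samples:
        if 0 <= x < W and 0 <= y < H and z < depth[y][x]:
            depth[y][x] = z
            color[y][x] = ch

near_square = [(x, y, 2.0) for x in range(5, 20) for y in range(3, 9)]
far_blob    = [(x, y, 5.0) for x in range(W) for y in range(H)
               if (x - 22) ** 2 + (2 * (y - 6)) ** 2 <= 81]

draw(near_square, "#")
draw(far_blob, "o")      # drawn second, but it cannot overwrite the closer square
print("\n".join("".join(row) for row in color))
```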
Re: (Score:3)
What I'm trying to say is: In theory a CPU is fast enough to refresh all pixels within the time of a single frame.
But having a GPU that can do things to the screen while the CPU does other necessary stuff makes sense. It starts with 2D bitblits.
Are we really doing this again? (Score:2)
I remember when the manufacturers of speakers believed that humans couldn't sense audio information below 30Hz. There are still arguments about whether human beings can discern 100,000 colors or 10million. (http://hypertextbook.com/facts/2006/JenniferLeong.shtml)
Oh, and don't forget that people really can't tell the difference between a 128 kbps MP3 and one at 320 kbps. And those who tell you that vinyl sounds better? Or that there's a difference between audio recorded digitally and audio recorded using an
4K is stunning (Score:2)
Bullshit. (Score:2)
Re: (Score:2)
The issue is that the bottleneck is starting to be not so much the hardware as the designers/graphic artists/animators at this point. Big-name games need a whole army of graphic artists to properly use current tech... the market will only absorb so much cost increase.
BS (Score:2)
Cuda, OpenCL and PhysX would tend to disagree about that statement.
I'm pretty sure there's still room for improvement in framerate on a 6x 4K monitor setup.
Besides, technology has a tendency to trickle down: F1 racing technology from years ago is used in today's Fords and GMs, my cellphone blows the doors off a 2003 computer, and heck, my dollar-store calculator is probably faster than a 1995 supercomputer.
Besides, ATI and nVidia fighting it out brings us cheaper and quieter GPUs...
Re: (Score:2)
If you're dropping $300-$1k on a high-end video card, why would you be driving a $150 1080p monitor off it? 2560x1600 monitors are $350-ish with decent IPS panels.
I'm running three 32-inch 2560x1600 panels on my primary desktop and still want more pixels.
Re: 1080p is dildos (Score:2)
1080p @ 120 Hz with V-Sync OFF for multiplayer FPS is the gold standard on my GTX Titan.
Re: (Score:2)
only 60? pff, 120 please..
Re: (Score:2)
Re: (Score:2)
Oooh. Shiny!
Re: (Score:2)
640kHz is a really fast frame rate.