Are We At the Limit of Screen Resolution Improvements?
itwbennett writes "A pair of decisions by Motorola and Ubuntu to settle for 'good enough' when it comes to screen resolution for the Ubuntu Edge and the Moto X raises the question: Have we reached the limit of resolution improvements that people with average vision can actually notice?" Phone vs. laptop vs. big wall-mounted monitor seems an important distinction; the 10-foot view really is different.
No (Score:5, Interesting)
Come back and talk to me again when the average laptop and desktop screen hits a genuinely high pixel density :)
Digital Movie Projection... and "Average People" (Score:4, Interesting)
If you build for the average person, you are doomed to fail, because half of the population is above average. There are also finer details that a person doesn't consciously register. The average person cannot tell the difference between 720p and 1080p in isolation; put the two side by side (with color/contrast/brightness matched) and they will see a difference.
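For what it's worth, here's a back-of-envelope sketch of why that is, assuming the classic rule of thumb that 20/20 vision resolves about one arcminute:

```python
import math

def arcmin_per_pixel(diagonal_in, width_px, height_px, distance_in):
    """Angular size of one pixel, in arcminutes, as seen from distance_in."""
    width_in = diagonal_in * width_px / math.hypot(width_px, height_px)
    pixel_in = width_in / width_px
    return math.degrees(math.atan(pixel_in / distance_in)) * 60

# A 50" TV viewed from 10 feet (120 inches): 720p pixels sit right at the
# classic ~1 arcminute acuity limit, 1080p below it -- which is why the
# difference is so hard to call without a side-by-side comparison.
print(arcmin_per_pixel(50, 1280, 720, 120))   # ~0.98 arcmin per pixel
print(arcmin_per_pixel(50, 1920, 1080, 120))  # ~0.65 arcmin per pixel
```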
Re:900 dpi (Score:5, Interesting)
It's a bit complex, because the retina doesn't really have a static resolution: it integrates information from constant movements, responds nonlinearly to different patterns of photon impacts, and has different sensitivities across different parts. You could put a ballpark number on it, but it's difficult to really sort out what the "resolution of the retina" is.
The details are covered in a paper from a project that tried to build a photon-accurate model of the human eye [michaelfrankdeering.com]; pretty interesting stuff.
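Still, for a ballpark number, here's a quick sketch using the usual one-arcminute rule of thumb for 20/20 acuity (exactly the kind of simplification the parent warns about):

```python
import math

ARCMIN = math.radians(1 / 60)  # one arcminute, the classic 20/20 acuity limit

def retina_ppi(distance_in):
    """PPI at which one pixel subtends one arcminute at the given distance."""
    return 1 / (distance_in * math.tan(ARCMIN))

for d in (12, 24, 120):  # phone, desktop monitor, 10-foot TV
    print(f"{d}in: {retina_ppi(d):.0f} ppi")
# 12in: ~286 ppi, 24in: ~143 ppi, 120in: ~29 ppi
```

Which also makes the summary's point: the density an "average" eye can exhaust depends entirely on how far away the screen is.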
Hasn't stopped manufacturers (Score:4, Interesting)
Have we reached the limit of resolution improvements that people with average vision can actually notice?
Hasn't really slowed the push toward 4K in video production. While it's sometimes handy to have the frame real estate in production, it takes up a crapton more space, requires more power to edit, and is mostly useless to consumers. Even theater projection systems can't resolve much over 2K.
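To put "a crapton more space" in rough numbers, a quick sketch (assuming uncompressed 10-bit 4:2:2 at 24fps, a plausible production format; real codecs and workflows vary):

```python
# Uncompressed 10-bit 4:2:2: luma at full resolution, chroma at half
# horizontal resolution -> 2 samples per pixel -> 20 bits per pixel.
BITS_PER_PIXEL = 20

def gb_per_hour(width, height, fps=24):
    bits = width * height * BITS_PER_PIXEL * fps * 3600
    return bits / 8 / 1e9

print(f"2K: {gb_per_hour(2048, 1080):.0f} GB/hour")  # ~478 GB
print(f"4K: {gb_per_hour(4096, 2160):.0f} GB/hour")  # ~1911 GB, 4x as much
```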
But if the industry doesn't go to 4K, then who will buy new cameras, editing software and storage hardware? And consumers might never upgrade their "old" HDTVs. Think of the children!
Printers and resolution (Score:5, Interesting)
Didn't laser printers show us that 300dpi is still a bit jaggy, while 600dpi is perfectly smooth at arm's length? When screen resolution is around 400dpi, we are probably done.
300dpi didn't cut it for dithered images; 600dpi was close, but not quite enough. The winner was the 1200dpi laser printer.
When you have a grayscale image you want to print on a single-color device, you use dithering to create the illusion of gray shades. A 1-to-1 mapping of pixels to printer dots gives you 2 colors, black and white, and photos look horrible. Double the printer resolution so you have a 2x2 dot array for each pixel and you get 5 distinct shades (zero through four dots inked). Double it again for a 4x4 dot array per pixel and you get 17 shades, enough for dithering across neighboring pixels to fake the rest. So if you want a 300 pixel-per-inch grayscale image to look good, you need a printer resolution of 1200dpi.
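Here's a minimal toy sketch of that cell math (hypothetical code, nothing like a real printer RIP):

```python
# Toy clustered-dot halftone: a 2x2 cell of on/off printer dots can render
# only 2*2 + 1 = 5 distinct shades of gray.
FILL_ORDER = [(0, 0), (1, 1), (1, 0), (0, 1)]  # which dots get inked first

def halftone_cell(gray):
    """Map a gray level (0 = black, 255 = white) to a 2x2 cell of dots."""
    inked = round((255 - gray) * 4 / 255)      # darker gray -> more inked dots
    cell = [[' ', ' '], [' ', ' ']]
    for y, x in FILL_ORDER[:inked]:
        cell[y][x] = '#'
    return cell

# Render a white-to-black gradient; despite 8 input levels, only 5
# distinct cell patterns show up.
cells = [halftone_cell(g) for g in range(255, -1, -32)]
for row in (0, 1):
    print(''.join(''.join(c[row]) for c in cells))
```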
Now, all this changes for RGB displays, since each pixel can already show anywhere from 16 to 256 shades per channel on its own. And less depth per pixel can be compensated for by smaller pixels at higher density.
I remember in the early days of computer graphics, it was believed that 24-bit color (8 bits each for the Red, Green, and Blue channels) was the pinnacle. But once 24-bit color became widely available, we discovered it wasn't enough. After editing in Photoshop, a 24-bit image would often show banding in the sky, due to rounding errors in the math involved. When Adobe added 48-bit color (16 bits per RGB channel), the rounding errors became much less visible. Today cameras capture 8, 12, 14, or 16 bits per RGB channel, and using HDR software we get 96-bit color.
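A small sketch of where that banding comes from (a toy numpy example, not Photoshop's actual pipeline): darken a smooth gradient, undo the edit, quantizing at 8-bit versus 16-bit depth each step, and count how many distinct levels survive.

```python
import numpy as np

sky = np.arange(256, dtype=np.float64)  # a smooth 8-bit gradient, like a sky

def edit(levels):
    """Darken to 30% then undo it, quantizing to `levels` steps each time,
    the way an editor working at a fixed bit depth has to."""
    dark = np.round(sky / 255 * 0.3 * levels)        # first edit, quantized
    back = np.round(dark / 0.3 / levels * 255)       # undo it, back to 8-bit
    return np.clip(back, 0, 255)

print(len(np.unique(edit(255))))    # 8-bit pipeline: ~77 levels -> banding
print(len(np.unique(edit(65535))))  # 16-bit pipeline: 256 levels -> smooth
```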
My point is we have a history of thinking we know where the limit is, but when the technology arrives, we discover we need a little bit more....