Are We At the Limit of Screen Resolution Improvements?
itwbennett writes "A pair of decisions by Motorola and Ubuntu to settle for 'good enough' when it comes to screen resolution for the Ubuntu Edge and the Moto X raises the question: Have we reached the limit of resolution improvements that people with average vision can actually notice?" Phone vs. laptop vs. big wall-mounted monitor seems an important distinction; the 10-foot view really is different.
I have a hard time (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
He probably just needs a retina display. It's got what retinas crave.
Re: (Score:2)
Re: (Score:2)
Whooosh.
already passing it (Score:5, Insightful)
Re: (Score:2)
I feel your pain. I can no longer do any glasses-free browsing on my smartphone without a lot of squinting and resulting headache. I fear that increasing resolution will just tempt younger developers (who have yet to encounter the joys of presbyopia) to design things in even smaller fonts.
Re: (Score:2)
you can use a 2 finger "stretch" gesture to zoom in.
Re: (Score:2)
you can use a 2 finger "stretch" gesture to zoom in.
Only works for some things and some sites...
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
Re: (Score:2)
Re: (Score:3)
Native apps should follow your system font size settings. Complain to the developer if they do not.
Re: (Score:2, Informative)
Re: (Score:3)
Re: already passing it (Score:3)
Yes. Web designers have all decided they're laying out magazines. It's dumb, and that capability should never have been added to HTML.
Re: (Score:3)
Dead Ends (Score:4, Funny)
What we should be/are concentrating on is better reflow and text to speech.
Ask me how I can tell you don't go out in crowds much.
Re: (Score:2)
We're already past the level where I can benefit from higher resolution on phones. I'm over 40 and already have reading glasses, but I'd need to get special phone-only glasses to see any more detail or smaller type.
Indeed, I use 1.5 glasses for reading, and 2.0 glasses for my phone.
Re: (Score:3)
I agree 100%
A friend of mine has designed the world's smallest font: 3x3 pixels for uppercase, which includes 2x2 for lowercase.
On the iPhone 5 at ~326 ppi I can't read it, so it looks like 300 dpi is "good enough" for screens. (Between 600 and 1200 dpi for print.)
The problem is cost: a proper 300 dpi monitor at 24" diagonal (~19" wide by ~15" tall) makes for an effective resolution of 5700 x 4500, well over 4K.
It is going to be quite a while before the economies of scale deliver cheap
Re: (Score:3)
> what value is there to > 24-bit color?
TL;DR: The eye can clearly see more than 256 levels of a primary color.
There are 3 big problems with 24-bpp.
1. Mach Banding (or Gradients)
2. Blending
3. Limited Gamut
10-bit, 12-bit, or even 16-bit per channel provides more headroom for finer gradients.
The problem is exemplified when you do multiple blends. Since most display devices are still only 24-bit the maximum error we want with 8-bit-per-channel is 1/255 = 0.00392156862745. Using 16-bit per channel means
Re: (Score:3)
> 2x2 for lowercase.
>
> Right. That's 16 possible "characters"

Correct.

> with one of those being empty space and 4 of them being single pixels.

Wow, you figured out not every possible combination is -> useful <- all on your own? Here is your sticker.
Condescending sarcasm only works if you're actually making an intelligent point; otherwise you just end up sounding like a jackass. The point of my statement, in case it went over your head, was that there are 26 characters in the English alphabet, and 9 pixel patterns are insufficient to portray them all. Nice try anyway.
It should also be pointed out that the 2x2 lowercase font you're bragging about isn't 2x2. The h is 2x3; n, m, u and v are 3x2; s, t and y are 3x3... and those are just the ones I spotted.
Re: (Score:3)
Not having to zoom the view in and out when doing CAD work and being able to read text fluently without bad kerning/font hinting getting in the way.
Re: (Score:3)
The only part of human vision capable of relatively high resolution is the "sweet spot" that makes up the middlemost +/- 8 degrees of your field of view. The eye's resolution drops off sharply outside that zone, which is part of the reason why, if you fixate on any word on this page, you likely cannot clearly distinguish words more than a few words or lines away in any direction from the point you fixated.
There is no need to have retina resolution across the whole field of view.
Not until Anti-Aliasing isn't a thing (Score:5, Insightful)
Re: (Score:2)
We've reached this point with some devices, but a screen isn't a high enough resolution until Anti-Aliasing isn't needed in any form.
Came here to say the same thing. I'm looking forward to the new 4K monitors finally starting to come out, which may spell the end for AA.
Re: (Score:3, Insightful)
I'm really excited for 4K monitors, but it's going to be a while before really high-quality ones are available that are great for work (color accuracy and reproduction, no weird eye-exhausting problems like a lot of gaming-specific monitors have) as well as great for gaming (responsive, with negligible lag/input delay/ghosting). Even longer before they are around $3,000 (which is about the price at which I'd pull the trigger on at least one of them).
Hopefully, by the time those exist, GPUs will exist that can f
Re: (Score:3)
You're not going to get the response times you want until we go back to electron/phosphor tech instead of physically switching pixels. I get a new Trinitron off eBay periodically, because even the fastest "gaming" screens these days are still so slow compared to a CRT that I can see the blur just from moving around in-game, like a smeared oil painting.
Re: (Score:2)
yeah.
the mentioned devices do it for parts-sourcing and money reasons, and because they don't want to go higher density than what's available on the default configs shipped with the os they're shipping with (android; yes, the edge ships with android... or might ship. but they do state that it will ship with android and only later with ubuntu touch).
that's the usual line anyways: what's cheap enough is good enough, for now. and that "for now" part is what companies like to skip in their shitty materials.
and it's real
Re: (Score:2)
I can't wait for that day.
But knowing gamers, we could have a 1,920,000 x 1,080,000 pixel 15 inch screen and they'd STILL turn on 16xAA.
I actually thought this was the whole reason why Apple is pushing their Retina display - they can get more performance out of their portable GPUs if they can stay away from wasting power on an AA pass.
Re:Not until Anti-Aliasing isn't a thing (Score:5, Informative)
can't seem to edit my previous post. antialiasing has nothing to do with resolution.
antialiasing and font edge smoothing, as people usually mean the terms, have pretty much everything to do with resolution.
if you can't see the individual pixels, and need say a group of 10x10 pixels to see a point on the screen, it becomes meaningless to do any subpixel effects of any kind on those 100 pixels that make up the smallest unit you can actually see.
and slashdot doesn't have an edit functionality btw.
Re:Not until Anti-Aliasing isn't a thing (Score:5, Insightful)
But once the aliasing is invisible to the human eye, anti-aliasing becomes pointless.
Re: (Score:2)
AA was a solution to get around lower-res screens jagging everything up. It was not designed with higher-resolution screens in mind. People were running 800x600 displays when hardware acceleration became a thing at the consumer level.
The N64, notorious for its aggressive AA, regularly had games running at 320x240.
No (Score:5, Interesting)
Come back and talk to me again when the average laptop and desktop screen hits high density PPI :)
Re: (Score:2)
Re:No (Score:5, Insightful)
Phones? Yes (There's not much benefit going past 1280 * 800 )
Tablets? Getting there (Nexus 7 at 1080p, Nexus 10 at 2560 * 1600)
Monitors? NO! Let me put it like this. Most monitors sit somewhere between the previously mentioned phone and tablet resolutions, despite being 2-5 times the size.
Re:No (Score:4, Insightful)
The average smartphone has a 720p screen with a pixel density well above 200 now. In the context of this discussion, why shouldn't an average panel that generally sits within 12-24 inches of your face (desktop or laptop) have the same requirements?
Sure, there exist laptops today that do. But those laptops don't provide you with a lot of choice (both are walled gardens; yeah, yeah, I know you can install other things on them, etc., but that's not the point here).
That said, I know this is coming. We're seeing more and more high resolution ultrabooks/laptops. So when I say come back and talk to me again, it's very likely by the end of the year :).
no (Score:3, Insightful)
I have rather poor vision, having to use different lenses for reading, computer, and distance... and I can still see the difference between 1080i and 4K monitors. A person with 20/20 should be able to benefit from even higher resolution (and I suspect even higher contrast ratios).
We know from testing that a significant part of the female population would notice a higher-bit color space too.
Re: (Score:2)
1080i?
There are no 1080i monitors. The i stands for interlaced, meaning that to halve the data rate when playing back video, every other line of the current frame is skipped and filled in by the next frame.
Monitors are p, which stands for progressive, as in progressive scan from top to bottom. Today the distinction isn't really relevant on non-CRT displays anyway, since it was CRTs that drew the image in scanlines.
FYI, a 1080p display should be 1920 x 1080 square pixels.
Re: (Score:2)
yes I mistyped
as an aside, my "p" monitor can go into "i" mode though
Re: (Score:2)
But many people with less than perfect but better than dismal vision will tend to use their phones with uncorrected vision so they don't have to get out their reading glasses on the move.
Re: (Score:2)
well, some of us old farts use the reading portion of our reading/distance bifocals, which we change to monitor glasses when we get to work. so at work the phone is blurry. soon we'll get to heads-up displays in lightweight glasses that actually zoom and focus, but that don't look like dork-ware (google glass, etc.).
Re: (Score:3)
Re: (Score:2)
no, it is not. the article only addresses phone and laptop distance and resolutions and discusses the debate there.
maybe that reminds us why ACs should stay under the threshold of normal viewers
Re: (Score:2)
In AC's defense:
Phone vs. laptop vs. big wall-mounted monitor seems an important distinction; the 10-foot view really is different.
That was in the slashdot article itself (not the linked article).
I chose to make my post because I thought it needed to be explicitly answered :)
Digital Movie Projection... and "Average People" (Score:4, Interesting)
If you build for the average person, you are doomed to fail. Because 1/2 of the population is above average. Also, there are finer details that a person doesn't fully recognize. The average person cannot tell the difference between 720p and 1080p. However, if you have them side by side (with colors/contrast/brightness matching), they will see a difference.
Re:Digital Movie Projection... and "Average People (Score:4, Informative)
Because 1/2 of the population is above average.
Half the population is above (or below) the median.
Re: (Score:2)
Re: (Score:2)
have vision like 20/15. It doesn't last, and "half" the population isn't anywhere near it.
hrm, I've been contact-lens corrected to 20/15 for the past 28 years.
Re: (Score:2)
Wait, we're talking about digital movie projection, as in machines that will be used to show "Transformers 7: Incomprehensible Jump-Cut Explosiongasm!" and you're worried about it being commercial failure because too many people are above average?
(Oh god, when did I get so old?)
Re:Digital Movie Projection... and "Average People (Score:5, Insightful)
Basic stats fail.
I can't believe there are five posts on here that declare 'average' to be 'mean' and then go on to criticize the GP's lack of statistical knowledge.
I think the very first thing on the very first day of my first statistics class was a discussion of mean, median, and mode, and how all three are referred to as 'average' in common parlance, depending on context.
900 dpi (Score:2)
I remember someone did a test of this when Steve Jobs came out with the "retina" claim. For a young child holding a phone at arm's length, 900 ppi was true "retina" resolution. I think we are likely one doubling short of retina resolution on our higher-resolution devices. Twenty megapixels for a laptop and five megapixels for a phone are probably genuinely the limit.
Right now our hardware isn't fast enough to handle that much resolution so it is still a balancing act.
Re:900 dpi (Score:5, Interesting)
It's a bit complex, because the retina doesn't really have a static resolution: it integrates information from constant movements, responds nonlinearly to different patterns of photon impacts, and has different sensitivities across different parts. You could put a ballpark number on it, but it's difficult to really sort out what the "resolution of the retina" is.
To quote a paper:
Pretty interesting stuff, from a project that tried to build a photon-accurate model of the human eye [michaelfrankdeering.com].
Re: (Score:2)
Printers (Score:3)
Didn't laser printers show us that 300dpi is still a bit jaggy, and 600dpi is perfectly smooth at arm's length? When screen resolution is around 400dpi then we are probably done.
Re: (Score:2)
The article quotes researchers delivering numbers between about 240 dpi and 477 dpi. When 300 dpi laser printers were popular, I remember being able to spot the dots. However, I had to try. Since then 600+ dpi laser printers have taken over the market, and I can't easily spot the dots with the newer high-resolution laser printers.
As such, the observations from both the print and the display researchers are consistent. Somewhere between 200 and 400 dpi the technology becomes "good enough" for many peopl
Re: (Score:2)
My guess is you can easily see the difference between 600 dpi and 2400 dpi print, especially for a photo. Print something on your 600 dpi printer that came from a fashion magazine. Resolution is worse on screens than on paper, but no, the cutoff isn't where you think it is.
Re: (Score:2)
My guess is you can easily see the difference between 600 dpi and 2400 dpi print, especially for a photo. Print something on your 600 dpi printer that came from a fashion magazine. Resolution is worse on screens than on paper, but no, the cutoff isn't where you think it is.
As long as you're talking about dots that are simply on or off, yes. As soon as you start using dots whose size can be modulated, the comparisons get much fuzzier (haha), and of course fewer dots are needed.
Re: (Score:2)
Re: (Score:2)
Printers are 300/600 dpi at 1 bit per channel, colour or mono. Displays have at least 6 bits per channel, and usually 8.
Printers and resolution (Score:5, Interesting)
Didn't laser printers show us that 300dpi is still a bit jaggy, and 600dpi is perfectly smooth at arm's length? When screen resolution is around 400dpi then we are probably done.
300dpi didn't cut it for dithered images; 600dpi was close, but not quite enough. The winner was the 1200dpi laser printer.
When you have a grayscale image you want to print on a single-color device, you use dithering to create the illusion of gray shades. A 1-to-1 mapping of pixels to printer dots gives you 2 colors - black and white. Photos look horrible. Double the printer resolution so you have a 2x2 dot array for each pixel and you have 16 possible shades. Double it again for a 4x4 dot array per pixel and you have 256 possible shades. So if you want a 300 pixel-per-inch gray scale image to look good, you need a printer resolution of 1200dpi.
Now, all this changes for RGB displays, since each pixel channel can already show from 16 to 256 shades. Less depth per pixel might be compensated for by smaller pixels at a higher density.
I remember in the early days of computer graphics, it was believed that 24-bit color (8 bits each for Red, Green, and Blue) was the pinnacle. But once 24-bit color became widely available, we discovered it wasn't enough. When edited in Photoshop, a 24-bit image would often show banding in the sky, due to rounding errors in the math involved. When Adobe added 48-bit color (16 bits per RGB channel), the rounding errors became much less visible. Today cameras capture 8, 12, 14, or 16 bits per RGB channel, and using HDR software we get 96-bit color.
My point is we have a history of thinking we know where the limit is, but when the technology arrives, we discover we need a little bit more....
Re: (Score:2)
Re: (Score:3)
Not quite. In a 2x2 array, the number of black pixels can be 0, 1, 2, 3 or 4, that is 5 different values. In a 4x4 array, you have 17 different values.
In a way, we are both correct. My example counts the maximum number of combinations, while yours groups them by the number of black dots possible. Yes, in a 2x2 array there are six possible arrangements of two black and two white "dots". But those six arrangements may give the appearance of different shades of gray, depending on the surrounding dots.
As an example - a 4x4 array with the two left dots black and the right side white. Imagine what that would look like if the same array is repeated vs
Re: (Score:2)
Yeah, but then they'll just start on refresh rates... followed by 3D?
It's not close to the end... ;)
Re: (Score:3)
Laser printers are bitonal devices and need extra resolution for dithering. That's why there is even a detectable, although slight, difference between 600dpi and 1200dpi.
We've fixed resolution... (Score:2)
how about sorting out readability in bright sunlight and battery life (without losing the gains in the other factors)?
Holograms (Score:2)
If you want to make high resolution true holograms, you'll need to square the pixel density (intuitively, each pixel has to contain an entire high resolution image which would correspond to the hologram as seen though a pinhole placed on this pixel).
Bring on the 1M PPI displays.
Re: (Score:2)
that's overengineering. you only need to provide a position-dependent view for each eyeball in the room. so: number of viewers x 2 x 2D pixel resolution.
Re: (Score:2)
I agree, it gets you 98% there, but an eyeball isn't a pinhole either.
Re: (Score:3)
the rods and cones of the eye lie on a surface; we only need concern ourselves with paths that terminate on that surface, and they can originate from a surface.
Re: (Score:2)
Re: (Score:2)
seems like... (Score:4, Informative)
Re: (Score:2)
Smallest pixel (Score:3)
What I'd rather have on my phone (Score:2)
For my phone, screen resolution is good enough(tm). Screen power consumption is in drastic need of improvement. It's consistently the biggest drain on the battery.
Almost there... (Score:2)
If we are getting 1080p on 5" phones you hold 10" from your eyes, I want similar resolution on my 30" desktop that I sit 20" from.
Maybe my math is wrong, but 2x the distance should require 1/2 the pixel density, while 6x the size would put it somewhere around 6000x3000 on my desktop, I think. I am happy with 2560x1600, but it could use 4x the pixels I guess.
I am happy with 52" 1080p in my den at 8' but 4k would be better...
I have been craving more pixels since I found I could make my 486/33 run some games in XGA mode,
Hasn't stopped manufacturers (Score:4, Interesting)
Have we reached the limit of resolution improvements that people with average vision can actually notice?
Hasn't really slowed the push toward 4K in video production. While it's sometimes handy to have the frame real estate in production, it takes up a crapton more space, requires more power to edit and it's mostly useless to consumers. Even theater projection systems can't resolve much over 2K.
But if the industry doesn't go to 4K, then who will buy new cameras, editing software and storage hardware? And consumers might never upgrade their "old" HDTVs. Think of the children!
Re: (Score:2)
requires more power to edit and it's mostly useless to consumers.
who cares about consumers? Give me a 16K video sensor and then I can zoom or re-crop the video in post and still deliver a nice 4k product. It's simply a matter of the cost of hardware.
Re: (Score:2)
The video sensor has nothing to do with it. The size and quality of the lens is what determines image quality and resolution. Image sensors far outdo the lenses they're paired with right now.
Re: (Score:2)
Except when I move close... (Score:2)
I agree "at normal viewing distances" I don't have perfect vision, but when I want to see a detail, guess what I do? I zoom in, and move closer. This is where high resolution on those devices becomes important. Not at the standard "laboratory condition" distances, but when I want to inspect something closer.
Am I abnormal in this?
Human eye (Score:4, Informative)
Wikipedia says:
Angular resolution: about 4 arcminutes, or approximately 0.07°
Field of view (FOV): simultaneous visual perception in an area of about 160° × 175°.
So that's about 2200 x 2400 if the screen is at the correct distance. Further away and you need less resolution. Closer and you won't see the whole image.
One Megapixel Per Fist (Score:3)
VR (Score:2)
Devices like the Oculus Rift need resolution to go way higher. I once calculated (perhaps completely incorrectly) that an 11K display was the threshold of "retina" for the Rift, although I'd imagine 8K would be close enough. This is a 5-7" display we're talking about here.
Re: (Score:2)
I was about to say the same. I own a development kit, and the pixels are really visible at today's 720p on a 7-inch panel. People call this the "screen-door effect" (the space between the pixels resembles the wires in a screen door). For this tech to be as crisp as we all would like, what you suggest sounds about right. Even though they are showing off the HD prototype (1080p), they are very careful not to say the problem is gone with that model. And even if the pixels dissipated, it's in VR that you
um no (Score:2)
Multiple Screens? (Score:2)
Intel vs. AMD MHz race all over again (Score:2)
This is the MHz war all over again. After a certain point, 99% of users stop caring because it's "good enough."
I once ran a test on my Note 2 with different screen brightness and lighting conditions, with different people. I asked them to guess the screen resolution between (a) 720p and (b) 1080p, playing a movie in 720p (the actual screen resolution).
The ones who said they could tell the difference actually got it wrong, claiming it was 1080p.
Most just didn't care one way or the other.
Let the engineers lea
Resolution Independent GUIs Needed (Score:2)
Until we move to resolution independent GUIs and software I really don't care about insanely high res screens.
Projectors need as much resolution as you have wal (Score:4, Insightful)
No (Score:2)
I mean, I'm not sure I'd ever have any reason to care personally about a laptop screen better than 1920x1200... but on the other hand, I can't actually *buy* a laptop screen with 1920x1200, so no, we clearly aren't, until I can (again).
3D and beyond (Score:4, Informative)
Moore's law has allowed us to double display densities nearly as fast as CPUs and memory have been improving.
The addition of a simple lenticular array or image mask can turn any LCD into a glasses-free 3D display.
An additional increase in resolution will then turn this into a multiview display.
A bit more resolution and a micro lens array can then create a light field display.
Beyond that is digital holography.
It's all fairly cut and dried; standards are already falling into place to accommodate and stream this level of video, and even to capture live video like this.
So any software developer who assumes we've hit the limit will look as foolish as Bill Gates saying no one would ever need more than 640K of memory.
http://videotechnology.blogspot.com/search?q=Lenticular [blogspot.com]
http://videotechnology.blogspot.com/search/label/3D [blogspot.com]
http://videotechnology.blogspot.com/search?q=Multiview [blogspot.com]
http://videotechnology.blogspot.com/search/label/Digital%20Holography [blogspot.com]
Re: (Score:2)
That might seem a laughable statement now, because more memory (as well as faster processors and other advances) allowed for computing applications we couldn't foresee at the time, but the limitation on displays isn't "what are we using it for" but "what can the human eye see." Perhaps we'll wind up implanting devices in our eyes to increase our eyes' resolution limit (and somehow get around the fact that our brains might not be able to deal with ultra-HD reality), but short of that there's a hard
Re: (Score:3)
But you always have the "my screen resolution is better than yours" crowd, who will fall in droves for the device with the better specs, so you can bet device makers will keep designing and building resolutions that you and I can't ever hope to see.
But one should be careful to note that the issue here is pixels per inch, not overall resolution. 720p might be overkill on a 2" screen, but it might be way too low for the latest movie theater screen. Even at the best PPI you can see, the next frontier will be
Re: (Score:3)
My point being, even if you can see it, having more resolution is not necessarily a good thing.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Change the settings on your TV and/or BD player. Usually it's some kind of strange over-scan or post-processing mode with a