Are We At the Limit of Screen Resolution Improvements? 414

itwbennett writes "A pair of decisions by Motorola and Ubuntu to settle for 'good enough' when it comes to screen resolution for the Ubuntu Edge and the Moto X raises the question: Have we reached the limit of resolution improvements that people with average vision can actually notice?" Phone vs. laptop vs. big wall-mounted monitor seems an important distinction; the 10-foot view really is different.
This discussion has been archived. No new comments can be posted.

  • by vikingpower ( 768921 ) on Thursday August 01, 2013 @11:06AM (#44447397) Homepage Journal
    reading TFA...
  • already passing it (Score:5, Insightful)

    by tverbeek ( 457094 ) on Thursday August 01, 2013 @11:06AM (#44447401) Homepage
    We're already past the level where I can benefit from higher resolution on phones. I'm over 40 and already have reading glasses, but I'd need to get special phone-only glasses to see any more detail or smaller type.
    • by Anonymous Coward

      I feel your pain. I can no longer do any glasses-free browsing on my smartphone without a lot of squinting and resulting headache. I fear that increasing resolution will just tempt younger developers (who have yet to encounter the joys of presbyopia) to design things in even smaller fonts.

      • by Nadaka ( 224565 )

        you can use a 2 finger "stretch" gesture to zoom in.

        • you can use a 2 finger "stretch" gesture to zoom in.

          Only works for some things and some sites...

        • Doesn't seem to work with my Motorola V750. How about a 1-finger gesture?
      • Re: (Score:2, Informative)

        by Anonymous Coward
        Resolution. Is. Not. Font. Size.
        • by Dins ( 2538550 )
          Of course it isn't. But if a web site is designed for a 1080p monitor and the font size is not adjusted upwards when someone's viewing it on a 1080p smart phone, the type is damned small...
    • You shouldn't have to. What we should be (and are) concentrating on is better reflow and text-to-speech. Higher resolution should be a benefit, as text becomes less blocky, making shape recognition easier. Just because resolutions are higher doesn't mean you should have smaller text if you don't want it. With so many different-sized devices, you should be able to load and manipulate content on demand. So if you don't want images because of connection or space constraints, that's your choice. Images should also be vect
    • by djbckr ( 673156 )

      We're already past the level where I can benefit from higher resolution on phones. I'm over 40 and already have reading glasses, but I'd need to get special phone-only glasses to see any more detail or smaller type.

      Indeed, I use 1.5 glasses for reading, and 2.0 glasses for my phone.

    • I agree 100%

      A friend of mine has designed the world's smallest font: 3x3 pixels for upper case, which includes 2x2 for lowercase.

      On the iPhone 5 with ~326 ppi I can't read it, so it looks like 300 dpi is "good enough" for screens. (Between 600 and 1200 dpi for print.)

      The problem is the cost: a proper 300 dpi monitor at 24" diagonal (~19" wide by ~15" tall) makes for an effective resolution of 5700 x 4500, well over 4K.

      It is going to be quite a while before the economies of scale deliver cheap 300 dpi desktop displays.
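
      A rough Python sketch of the panel arithmetic above (the ~19" x ~15" dimensions are the commenter's figures for a 24" diagonal, not a standard panel size):

          # Effective resolution of a hypothetical 300 dpi desktop panel.
          width_in, height_in, dpi = 19.0, 15.0, 300
          pixels_wide = round(width_in * dpi)    # 5700
          pixels_high = round(height_in * dpi)   # 4500
          print(pixels_wide, "x", pixels_high)   # well beyond 3840 x 2160 (4K UHD)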

  • by earlzdotnet ( 2788729 ) on Thursday August 01, 2013 @11:07AM (#44447405)
    We've reached this point with some devices, but a screen isn't a high enough resolution until Anti-Aliasing isn't needed in any form.
    • by Tynin ( 634655 )

      We've reached this point with some devices, but a screen isn't a high enough resolution until Anti-Aliasing isn't needed in any form.

      Came here to say the same thing. I'm looking forward to the new 4K monitors finally starting to come out, which may spell the end for AA.

      • Re: (Score:3, Insightful)

        by Seumas ( 6865 )

        I'm really excited for 4K monitors, but it's going to be a while before really high quality ones that are great for work (color accuracy and reproduction, no weird problems exhausting your eyes like a lot of gaming-specific monitors) as well as great for gaming (responsive, negligible lag/input-delay/ghosting) are available. Even longer before they are around $3,000 (which is about the price at which I'd pull the trigger on at least one of them).

        Hopefully, by the time those exist, GPUs will exist that can f

        • You're not going to get the response times you want until we go back to electron/phosphor tech instead of physically moving pixels. I get a new Trinitron off eBay periodically, because even the fastest "gaming" screens these days are still so slow compared to a CRT that I can see the blur just from moving around in-game, like a smeared oil painting.

    • by gl4ss ( 559668 )

      yeah.

      the mentioned devices do it for parts-sourcing and money reasons - and not wanting to go to a higher density than what's available on the default configs of the OS they're shipping with (android - yes, the edge ships with android... or might ship, but they do state that it will ship with android and then later with the ubububutouch).

      that's the usual line anyways, what's cheap enough is good enough - for now. and that for now part is what companies like to skip in their shitty materials.

      and it's real

    • I can't wait for that day.

      But knowing gamers, we could have a 1,920,000 x 1,080,000 pixel 15 inch screen and they'd STILL turn on 16xAA.

      I actually thought this was the whole reason why Apple is pushing their Retina display - they can get more performance out of their portable GPUs if they can stay away from wasting power on an AA pass.

  • No (Score:5, Interesting)

    by wangmaster ( 760932 ) on Thursday August 01, 2013 @11:07AM (#44447409)

    Come back and talk to me again when the average laptop and desktop screen hits high density PPI :)

    • What, isn't 1366x768 good enough for everybody? Ugh.
    • Re:No (Score:5, Insightful)

      by Andrio ( 2580551 ) on Thursday August 01, 2013 @11:31AM (#44447729)

      Phones? Yes (there's not much benefit going past 1280 x 800)

      Tablets? Getting there (Nexus 7 at 1080p, Nexus 10 at 2560 x 1600)

      Monitors? NO! Let me put it like this. Most monitors sit somewhere between the previously mentioned phone and tablet resolutions, despite being 2-5 times the size.

  • no (Score:3, Insightful)

    by iggymanz ( 596061 ) on Thursday August 01, 2013 @11:07AM (#44447417)

    I have rather poor vision, having to use different lenses for reading, computer, and distance... and I can still see the difference between 1080i and 4K monitors. A person with 20/20 should be able to benefit from even higher resolution (and, I suspect, even higher contrast ratios).

    We know from testing that a significant part of the female population would notice a higher-bit color space too.

    • by azav ( 469988 )

      1080i?

      There are no 1080i monitors. The i stands for interlaced, which means that, to halve the data rate of video playback, every other line of the current frame is skipped and filled in by the next field.

      Monitors are p, which stands for progressive scan, as in drawing the image top to bottom. Today this isn't really relevant on non-CRT displays either, since it was CRTs that used scanlines to display the image.

      FYI, a 1080p display should be 1920 x 1080 square pixels.

    • by sjames ( 1099 )

      But many people with less than perfect but better than dismal vision will tend to use their phones with uncorrected vision so they don't have to get out their reading glasses on the move.

        • well, some of us old farts use the reading portion of our reading/distance bifocals, which we change to monitor glasses when we get to work. so at work the phone is blurry. soon we'll get to heads-up displays in lightweight glasses that actually zoom and focus, but that don't look like dork-ware (google glass, etc).

    • Pretty near everyone would notice a higher color space. Not because we can distinguish between two colors one bit apart, but because there are some colors that just aren't displayable on a current RGB screen. Oranges are a common example.
  • by jellomizer ( 103300 ) on Thursday August 01, 2013 @11:08AM (#44447419)

    If you build for the average person, you are doomed to fail, because half of the population is above average. Also, there are finer details that a person doesn't consciously recognize: the average person cannot tell the difference between 720p and 1080p, but if you put them side by side (with colors/contrast/brightness matched), they will see a difference.

    • by Anonymous Coward on Thursday August 01, 2013 @11:30AM (#44447715)

      Because 1/2 of the population is above average.

      Half the population is above (or below) the median.

    • You're assuming a normal distribution. If it's a logarithmic distribution, which I'd put more money on, then you're wrong. Only a small number of people can see better than average (20/20). Many see far worse, and some, like myself for a time, have vision like 20/15. It doesn't last, and "half" the population isn't anywhere near it.
      • have vision like 20/15. It doesn't last, and "half" the population isn't anywhere near it.

        hrm, I've been contact-lens corrected to 20/15 for the past 28 years.

    • Wait, we're talking about digital movie projection, as in machines that will be used to show "Transformers 7: Incomprehensible Jump-Cut Explosiongasm!" and you're worried about it being commercial failure because too many people are above average?

      ...

      (Oh god, when did I get so old?)

  • I remember someone did a test of this when Steve Jobs came out with the "retina" claim. For a young child holding a phone at arm's length, 900 ppi was really "retina" resolution. I think we are likely one doubling short of retina resolution on our higher-resolution devices. 20 megapixels for a laptop and 5 megapixels for a phone are probably genuinely the limit.

    Right now our hardware isn't fast enough to handle that much resolution so it is still a balancing act.

    • Re:900 dpi (Score:5, Interesting)

      by Trepidity ( 597 ) <[delirium-slashdot] [at] [hackish.org]> on Thursday August 01, 2013 @11:20AM (#44447583)

      It's a bit complex, because the retina doesn't really have a static resolution: it integrates information from constant movements, responds nonlinearly to different patterns of photon impacts, and has different sensitivities across different parts. You could put a ballpark number on it, but it's difficult to really sort out what the "resolution of the retina" is.

      To quote a paper:

      Many would say that new display technologies, 9 megapixel panels and projectors for example, are coming ever closer to “matching the resolution of the human eye”, but how does one measure this, and in what areas are current displays and rendering techniques still lacking? [...] The resolution perceived by the eye involves both spatial and temporal derivatives of the scene; even if the image is not moving, the eye is (“drifts”), but previous attempts to characterize the resolution requirements of the human eye generally have not taken this into account. Thus our photon model explicitly simulates the image effects of drifts via motion blur techniques; we believe that this effect when combined with the spatial derivatives of receptive fields is a necessary component of building a deep quantitative model of the eye’s ability to perceive resolution in display devices.

      Pretty interesting stuff, from a project that tried to build a photon-accurate model of the human eye [michaelfrankdeering.com].

  • by Princeofcups ( 150855 ) <john@princeofcups.com> on Thursday August 01, 2013 @11:14AM (#44447509) Homepage

    Didn't laser printers show us that 300dpi is still a bit jaggy, and 600dpi is perfectly smooth at arm's length? When screen resolution is around 400dpi then we are probably done.

    • The article quotes researchers delivering numbers between about 240 dpi and 477 dpi. When 300 dpi laser printers were popular, I remember being able to spot the dots. However, I had to try. Since then 600+ dpi laser printers have taken over the market, and I can't easily spot the dots with the newer high-resolution laser printers.

      As such, the observations from both the print and the display researchers are consistent. Somewhere between 200 and 400 dpi the technology becomes "good enough" for many people.

      • by jbolden ( 176878 )

        My guess is you can easily see the difference between 600 dpi and 2400 dpi print, especially for a photo. Print something on your 600 dpi printer that came from a fashion magazine. Resolution is worse on screens than on paper, but no, the cutoff isn't where you think it is.

        • by sribe ( 304414 )

          My guess is you can easily see the difference between 600 dpi and 2400 dpi print, especially for a photo. Print something on your 600 dpi printer that came from a fashion magazine. Resolution is worse on screens than on paper, but no, the cutoff isn't where you think it is.

          As long as you're talking about dots that are simply on or off, yes. As soon as you start using dots whose size can be modulated, the comparisons get much fuzzier (haha), and of course fewer dots are needed.

      • Comment removed based on user account deletion
    • Printers are 300/600 dpi in 2 bit colour, or 2 bit mono. Displays have at least 6 bit colour, and usually 8 bit.

    • by MasterOfGoingFaster ( 922862 ) on Thursday August 01, 2013 @12:05PM (#44448157) Homepage

      Didn't laser printers show us that 300dpi is still a bit jaggy, and 600dpi is perfectly smooth at arm's length? When screen resolution is around 400dpi then we are probably done.

      300dpi didn't cut it for dithered images - 600dpi was close, but not quite enough. The winner was the 1200dpi laser printers.

      When you have a grayscale image you want to print on a single-color device, you use dithering to create the illusion of gray shades. A 1-to-1 mapping of pixels to printer dots gives you 2 colors - black and white. Photos look horrible. Double the printer resolution so you have a 2x2 dot array for each pixel and you have 16 possible shades. Double it again for a 4x4 dot array per pixel and you have 256 possible shades. So if you want a 300 pixel-per-inch grayscale image to look good, you need a printer resolution of 1200dpi (see the sketch at the end of this comment for the cell arithmetic).

      Now, all this changes for RGB displays, since each pixel can be from 16 to 256 shades each. But less depth per pixel might be compensated for by smaller pixels and a higher density.

      I remember in the early days of computer graphics, it was believed that 24-bit color (8 bits each for Red, Green and Blue) was the pinnacle. But once 24-bit color became widely available, we discovered it wasn't enough. When edited in Photoshop, a 24-bit image would often show banding in the sky, due to rounding errors in the math involved. When Adobe added 48-bit color (16 bits per RGB channel), the rounding errors became much less visible. Today cameras capture 8, 12, 14 or 16 bits per RGB channel, and using HDR software we get 96-bit color.

      My point is we have a history of thinking we know where the limit is, but when the technology arrives, we discover we need a little bit more....
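
      As a hedged aside on the halftone-cell arithmetic a few paragraphs up: an n x n cell of on/off dots can form 2^(n*n) distinct patterns but only n*n + 1 distinct darkness levels, so the exact shade counts work out a bit differently, though the broader point (dithering needs several printer dots per image pixel) still holds. A minimal Python sketch:

          # Halftone cells: an n x n block of on/off printer dots per image pixel.
          for n in (1, 2, 4):
              patterns = 2 ** (n * n)   # every on/off combination of the dots
              levels = n * n + 1        # number of dots inked: 0 .. n*n
              print(f"{n}x{n} cell: {patterns} patterns, {levels} gray levels")
          # 1x1: 2 patterns / 2 levels, 2x2: 16 / 5, 4x4: 65536 / 17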

      • Comment removed based on user account deletion
    • Yeah, but they will just start on refresh rates then... followed by 3D?

      It is not close to the end.. ;)

    • Laser printers are bitonal devices and need extra resolution for dithering. That's why there is even a detectable, although slight, difference between 600dpi and 1200dpi.

  • how about sorting out readability in bright sunlight and battery life (without losing the gains in the other factors)?

  • If you want to make high-resolution true holograms, you'll need to square the pixel density (intuitively, each pixel has to contain an entire high-resolution image, corresponding to the hologram as seen through a pinhole placed at that pixel).

    Bring on the 1M PPI displays.

    • that's overengineering. you only need to provide a position-dependent view for each eyeball in the room. so number of viewers x 2 x 2D pixel resolution.

      • I agree, it gets you 98% there, but an eyeball isn't a pinhole either.

        • the rods and cones of the eye are on a surface; we only need to concern ourselves with light paths that terminate on that surface, and those can originate from a surface

        • It's pretty darn close. Although I think your point should be that this assumes there is ONE observation point and ONE observer. Often this is not true, because there may be multiple people. Also, a standard 3D representation using 2 x 2D images assumes a fixed distance between the eyes, while the actual distance varies between people and with the orientation of their head. Plenty of room for innovation here.
    • Comment removed based on user account deletion
  • seems like... (Score:4, Informative)

    by buddyglass ( 925859 ) on Thursday August 01, 2013 @11:19AM (#44447577)
    It's a matter of PPI and typical viewing distance. Phones are often held about a foot from your face. Computer monitors are usually two or three feet away from your face. TVs are significantly further away. Greater distance = eye is more tolerant of lower PPI. That's why the iPhone 5 is ~326 PPI, a Macbook Pro with Retina is ~220 PPI, an Apple 27" Thunderbolt Display is ~110 PPI and a 65" 1080p TV is ~35 PPI.
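
    A small Python sketch of that relationship, assuming the common ~1 arcminute acuity figure and some illustrative viewing distances (the distances are assumptions, not measurements from the comment above): the PPI at which a single pixel subtends one arcminute falls off linearly with distance, which lands in the same ballpark as the figures quoted.

        import math

        # PPI at which one pixel subtends ~1 arcminute at distance d (inches):
        # pixel pitch p satisfies p / d = tan(1 arcminute).
        def one_arcmin_ppi(d_in):
            return 1.0 / (d_in * math.tan(math.radians(1 / 60)))

        for device, d in [("phone", 12), ("laptop", 18), ("desktop", 30), ("TV", 96)]:
            print(f"{device:8s} at {d:3d} in: ~{one_arcmin_ppi(d):.0f} ppi")
        # phone ~286, laptop ~191, desktop ~115, TV ~36
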
  • by jones_supa ( 887896 ) on Thursday August 01, 2013 @11:23AM (#44447631)
    What is the size of the smallest pixel that can currently be made using LCD technology?
  • For my phone, screen resolution is good enough(tm). Screen power consumption is in drastic need of improvement. It's consistently the biggest drain on the battery.

  • If we are getting 1080p on 5" phones you hold 10" from your eyes, I want similar resolution on my 30" desktop that I sit 20" from.

    Maybe my math is wrong, but 2x the distance should require 1/2 the pixel density. But 6x the size would be something around 6000x3000 on my desktop, I think. I am happy with 2560x1600, but it could use 4x the pixels I guess.

    I am happy with 52" 1080p in my den at 8' but 4k would be better...

    I have been craving more pixels since I found I could make my 486/33 run some games in XGA mode,
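
    A rough Python check of the scaling in the first paragraph of this comment (a sketch only; it assumes a 5" 1080p phone at 10 inches, a 30" 16:10 desktop panel at 20 inches, and that matching the angular pixel size is all that matters):

        import math

        phone_w, phone_h, phone_diag, phone_dist = 1920, 1080, 5.0, 10.0
        phone_ppi = math.hypot(phone_w, phone_h) / phone_diag      # ~440 ppi

        # Twice the viewing distance needs half the pixel density for the
        # same angular pixel size; apply that density to a 30" 16:10 panel.
        desk_dist, desk_diag = 20.0, 30.0
        desk_ppi = phone_ppi * (phone_dist / desk_dist)            # ~220 ppi
        desk_w = desk_diag * 16 / math.hypot(16, 10)               # ~25.4 in
        desk_h = desk_diag * 10 / math.hypot(16, 10)               # ~15.9 in
        print(f"~{desk_ppi * desk_w:.0f} x {desk_ppi * desk_h:.0f}")  # ~5600 x 3500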

  • by HangingChad ( 677530 ) on Thursday August 01, 2013 @11:35AM (#44447791) Homepage

    Have we reached the limit of resolution improvements that people with average vision can actually notice?

    Hasn't really slowed the push toward 4K in video production. While it's sometimes handy to have the frame real estate in production, it takes up a crapton more space, requires more power to edit and it's mostly useless to consumers. Even theater projection systems can't resolve much over 2K.

    But if the industry doesn't go to 4K, then who will buy new cameras, editing software and storage hardware? And consumers might never upgrade their "old" HDTVs. Think of the children!

    • requires more power to edit and it's mostly useless to consumers.

      who cares about consumers? Give me a 16K video sensor and then I can zoom or re-crop the video in post and still deliver a nice 4k product. It's simply a matter of the cost of hardware.

      • The video sensor has nothing to do with it. The size and quality of the lens is what determines image quality and resolution. Image sensors far outdo the lenses they're paired with right now.

      • Well, manufacturers who want to make money probably do. Consumers barely care about DVD vs. Blu-ray.
  • I agree "at normal viewing distances" I don't have perfect vision, but when I want to see a detail, guess what I do? I zoom in, and move closer. This is where high resolution on those devices becomes important. Not at the standard "laboratory condition" distances, but when I want to inspect something closer.
    Am I abnormal in this?

  • Human eye (Score:4, Informative)

    by the eric conspiracy ( 20178 ) on Thursday August 01, 2013 @11:40AM (#44447853)

    Wikipedia says:

    Angular resolution: about 4 arcminutes, or approximately 0.07°

    Field of view (FOV): simultaneous visual perception in an area of about 160° × 175°.

    So that's about 2200 x 2400 if the screen is at the correct distance. Further away and you need less resolution. Closer and you won't see the whole image.
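
    Sketching that division in Python, using the angular-resolution and FOV figures quoted above:

        acuity_deg = 4 / 60          # 4 arcminutes, as quoted
        fov_deg = (160, 175)         # field of view, as quoted
        print([round(a / acuity_deg) for a in fov_deg])   # [2400, 2625]
        # same ballpark as the rough 2200 x 2400 figure above
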

    • I once did the back-of-napkin calculations to make a scale-independent metric. Astronomers know that if you hold your fist at arm's length, your fist occludes roughly ten arc degrees in whatever direction you measure across your fist. My search found that someone's 20/20 eyes can generally resolve details to about 1 arc minute (didn't read Wikipedia's rationale). If that much screen area contains one megapixel or more, then the screen is well within the definition of a "Retina" display (at the given viewing distance).
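
      A back-of-napkin Python version of that calculation, assuming the 10-degree fist and 1-arcminute figures from the comment:

          fist_deg, acuity_arcmin = 10, 1
          pixels_per_side = fist_deg * 60 // acuity_arcmin    # 600
          print(pixels_per_side ** 2)                         # 360000
          # so one megapixel behind a fist-sized patch comfortably clears the bar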
  • by Guspaz ( 556486 )

    Devices like the Oculus Rift need resolution to go way higher. I once calculated (perhaps completely incorrectly) that an 11K display was the threshold of "retina" for the Rift, although I'd imagine 8K would be close enough. This is a 5-7" display we're talking about here.

    • by Knutsi ( 959723 )

      I was about to say the same. I own a development kit, and the pixels are really visible at today's 720p in a 7-inch panel. People call this the "screen-door effect" (as the space between the pixels resembles the wires in a screen door). For this tech to be as crisp as we all would like, what you suggest sounds about right. Even though they are showing off the HD prototype (1080p), they are very careful not to say the problem is gone with that model. And even if the pixels disappeared, it's in VR that you

  • Everyone at my work over 40 can't see a damn thing, so their 1920x1080 monitors are set to 1280x800. That awful scaling ratio results in blurry crap which is almost as difficult to read. If the resolution was 10x higher, the blur would be a lot less noticeable, because the scaling could be spread over many more physical pixels. So no, they should keep increasing it.
  • Multiple screens can turn data into information when applied.
  • This is the MHz war all over again. After a certain point, 99% of users stop caring because it's "good enough."

    I once ran a test on my Note 2 with different screen brightness and lighting conditions, with different people. I asked them to guess the screen resolution between (a) 720p and (b) 1080p, playing a movie in 720p (the actual screen resolution).

    The ones who said they could tell the difference actually got it wrong, claiming it was 1080p.

    Most just didn't care one way or the other.

    Let the engineers lea

  • Until we move to resolution independent GUIs and software I really don't care about insanely high res screens.

  • by Quantus347 ( 1220456 ) on Thursday August 01, 2013 @12:19PM (#44448367)
    I have a friend who is a huge fan of using a projector as his primary display. When you take even a high-end resolution and project it out to 12 feet across, there is no such thing as too much resolution.
  • by neminem ( 561346 )

    I mean, I'm not sure I'd ever have any reason to care personally about a laptop screen better than 1920x1200... but on the other hand, I can't actually *buy* a laptop screen with 1920x1200, so no, we clearly aren't, until I can (again).

  • 3D and beyond (Score:4, Informative)

    by John Sokol ( 109591 ) on Thursday August 01, 2013 @01:51PM (#44449827) Homepage Journal

    Moore's law has allowed us to double display densities nearly as fast as CPUs and memory have been improving.

    The addition of a simple lenticular lens or image mask can turn any LCD into a glasses-free 3D display.
    An additional increase in resolution will then turn this into a multiview display.

    A bit more resolution and a micro lens array can then create a light field display.
    Beyond that is digital holography.

    It's all fairly cut and dried; standards are already falling into place to accommodate and stream this level of video, and even to capture live video like this.

    So any software developer who assumes we've hit the limit will look as foolish as Bill Gates saying no one would ever need more than 640K of memory.

    http://videotechnology.blogspot.com/search?q=Lenticular [blogspot.com]
    http://videotechnology.blogspot.com/search/label/3D [blogspot.com]
    http://videotechnology.blogspot.com/search?q=Multiview [blogspot.com]
    http://videotechnology.blogspot.com/search/label/Digital%20Holography [blogspot.com]
