
Input Lag, Or Why Faster Isn't Always Better

mr_sifter writes "LCD monitor manufacturers have constantly pushed panel response times down with a technique called 'overdrive,' which increases the voltage to force the liquid crystals to change color states faster. Sadly, there are side effects such as input lag and inverse ghosting associated with this — although the manufacturers themselves are very quiet about the subject. This feature (with video) looks at the problem in detail. The upshot: you may want to test drive very carefully any display boasting single-digit millisecond pixel response times."
  • Common knowledge (Score:3, Informative)

    by Rotaluclac ( 561178 ) on Friday February 06, 2009 @02:28PM (#26755561) Homepage

    I really thought this was common knowledge.

    When I bought my Eizo LCD last summer, the first thing I did was read around. These issues came up immediately.

    Long story short: Prad [] was my friend.


  • by dotancohen ( 1015143 ) on Friday February 06, 2009 @02:40PM (#26755749) Homepage

    Then go with a large brand name, and get a common model. One of the advantages of buying in meatspace is that there is _less_ selection, so you only have the common (and supposedly mainstream tech) models to look at.

    Are these differences that anyone but hardcore gamers could notice? I do notice when LCD monitors look green / yellow or when they have low viewing angles, but the whole 6/7/8 bit and response time thing: is it noticeable?

  • by Who Is The Drizzle ( 1470385 ) on Friday February 06, 2009 @02:43PM (#26755795)
No, plasmas have near-instantaneous response times that are pretty much identical to what you get on a CRT. The issue you get with a plasma is called "phosphor lag," where the three colors don't quite line up perfectly, giving you a trailing image of the colors. It's especially noticeable on high-contrast edges or when things are moving really quickly. It can be especially noticeable in gaming, but at least IMO it's a much less annoying artifact than the ghosting, smearing, and horrible motion resolution you get with LCDs (and yes, they are present even on 120hz LCDs, before someone brings that up).
  • by Anonymous Coward on Friday February 06, 2009 @02:45PM (#26755823)
Even humans who are finely in tune with this sort of thing can't detect changes under about 10 ms.
  • by ShadowRangerRIT ( 1301549 ) on Friday February 06, 2009 @02:47PM (#26755847)
You exaggerate the effect of latency grotesquely. Getting less than 1 ms of latency is not necessary; humans can't perceive or react fast enough for that to make a huge difference. Given that the human eye has a rough "frame rate" of 25 fps, providing input more quickly than roughly double that (50 fps) will simply mean that roughly half the information is lost. 50 fps translates to about 20 ms between frames. 5-10 ms of latency on an LCD (typical for computer monitors) is still sufficient to convey "real time" information to a player, particularly given that mean human response time to visual imagery is roughly 180-200 ms. Humans simply don't perceive time quickly enough for that to matter.
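The arithmetic in this comment reduces to a few lines; a quick sketch (the 5 ms panel latency and 190 ms reaction time are just the example figures given above):

```python
def frame_interval_ms(fps):
    """Time between successive frames, in milliseconds."""
    return 1000.0 / fps

# 50 fps -> 20 ms between frames, as the comment states.
print(frame_interval_ms(50))  # 20.0

# A 5 ms panel latency is tiny next to a ~190 ms human reaction time:
panel_latency_ms = 5.0
reaction_time_ms = 190.0
print(panel_latency_ms / reaction_time_ms)  # roughly 0.026, i.e. ~2.6%
```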
  • by Cowclops ( 630818 ) on Friday February 06, 2009 @02:55PM (#26755951)
    I knew somebody would make some gross misstatement like "The human eye only sees at 25 fps anyway"

    And for that, here is the obligatory link to []

    In short, the shortest flash a human eye can see depends on a lot of things. These factors are explained thoroughly on that web site. The tl;dr version is this: The human eye can discern A LOT MORE than 25 fps.
  • by Anonymous Coward on Friday February 06, 2009 @03:02PM (#26756051)

    The tl;dr version is this: The human eye can discern A LOT MORE than 25 fps.

    It can discern that, yes. That's easy. You see a 30fps movie, you see a 60fps movie, the latter is noticeably smoother. The question is if it matters. That is, if the human eye needs to discern 100fps, or if going that much higher in terms of a monitor or video game is just bragging and/or l33t graphic card wankery.

  • by Mprx ( 82435 ) on Friday February 06, 2009 @03:09PM (#26756137)
The refresh rate needed to avoid flicker on a display with an impulse light-output characteristic is unrelated to the frame rate needed for perfectly realistic motion quality. Note, however, that non-flickering sample-and-hold displays such as LCDs will produce lower motion quality than impulse-response displays of the same refresh rate, because of temporal smearing (see [] for an explanation).
  • by omnichad ( 1198475 ) on Friday February 06, 2009 @03:14PM (#26756225) Homepage
But CRTs go black on those pixels between refreshes. That's strobing that LCDs don't even come close to. An LCD is at full brightness 100% of the time, and the pixels only change color when they're told to.
  • Re:same old... (Score:5, Informative)

    by ivan256 ( 17499 ) on Friday February 06, 2009 @03:22PM (#26756359)

CD-ROMs don't. They use "Zoned CAV". It's much cheaper and easier to make a drive spin at a constant angular velocity. Unfortunately, that results in higher data rates at the outer edge of the disc, so drives split the disc into zones: the disc is spun faster for a zone closer to the center of the disc.

    Older CD-ROM drives used straight constant-angular-velocity, and would advertise the fastest data rate (which was at the outer edge of the disc).

    The only time a modern CD drive will spin with constant linear velocity is when it's playing back audio in real-time. And even then, many players buffer now, so they use the Zone CAV method anyway.
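A rough sketch of why constant angular velocity reads faster at the rim (the radii are approximate CD program-area figures, not from the comment):

```python
import math

def linear_speed_mm_s(rpm, radius_mm):
    """Tangential speed of the track passing under the read head."""
    return 2 * math.pi * radius_mm * (rpm / 60.0)

# At a fixed RPM, the outer edge of a CD's program area (~58 mm radius)
# passes the head ~2.4x faster than the inner edge (~24 mm radius),
# which is why pure CAV drives advertised their outer-edge data rate.
ratio = linear_speed_mm_s(5000, 58) / linear_speed_mm_s(5000, 24)
print(ratio)  # ~2.42
```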

  • by Anonymous Coward on Friday February 06, 2009 @04:06PM (#26756903)

This is true: a 10ms response time is 100 frames per second. The displays touting 4-6ms are quoting an unrealistic grey-to-grey metric; a 6ms average for any subpixel to stabilize at 50% doesn't directly translate into frames per second. The digital signal the display syncs to is typically 60hz anyway. That means the best the display could do is about 16.7ms of latency.
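The figures in this comment are simple arithmetic; a quick sketch:

```python
def max_fps_for_response(response_ms):
    """Upper bound on frames per second if every pixel needs
    `response_ms` to finish its transition."""
    return 1000.0 / response_ms

print(max_fps_for_response(10))  # 100.0 fps for a 10 ms response time
print(1000.0 / 60)               # ~16.7 ms between frames of a 60 Hz signal
```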

  • by Anonymous Coward on Friday February 06, 2009 @05:51PM (#26758261)
    [H]ardOCP has a display section on their forums that's fairly decent. There are usually a couple of anal retentive posters that go through a lot of displays so they can give some decent feedback on various models.

    There aren't any perfect monitors, btw, so you'll basically be sifting through reviews looking for displays that have acceptable flaws at a price you're willing to pay.

    Here's a quick overview:

    IPS/S-IPS: 8-bit displays with best color, very little if any ghosting, very little color shifting, highest cost (if you can even find them)

    PVA/MVA/S-PVA: I think these are 8-bit panels as well, but most have color shifting when viewed dead on. On larger panels this means the left side of the panel can look different than the right side if you sit close. Mid range cost.

    TN: 6-bit panels with lowest price. Quality ranges from junk to fairly decent. Most panels use this technology and most people are happy with them.

One last bit of fun: "panel lotteries." Many manufacturers will make an LCD model that uses a good panel, then switch it out for a different one later in its life without changing its model number. It happens fairly often, so keep it in mind.
  • by karnal ( 22275 ) on Friday February 06, 2009 @06:02PM (#26758415)

Actually, the issue here is probably more due to the fact that movies are shot at 24 frames per second. 24 doesn't divide into 60 evenly, so there will be times when the scene repeats more in one set of refreshes than another. See the wiki entry on Telecine, notably telecine judder: []

With a 120hz refresh, 24 goes into 120 evenly, so you won't see any choppiness.
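The cadence difference is easy to see if you count how many refreshes each film frame occupies (a quick sketch, not from the comment):

```python
def pulldown_cadence(film_fps, refresh_hz, n_frames=6):
    """Refreshes occupied by each of the first n_frames film frames,
    assigning each refresh to the film frame current at that instant."""
    cadence, shown = [], 0
    for i in range(1, n_frames + 1):
        # refreshes elapsed by the end of film frame i
        total = int(i * refresh_hz / film_fps)
        cadence.append(total - shown)
        shown = total
    return cadence

print(pulldown_cadence(24, 60))   # [2, 3, 2, 3, 2, 3] -> uneven 3:2 "judder"
print(pulldown_cadence(24, 120))  # [5, 5, 5, 5, 5, 5] -> perfectly even
```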

  • by A Friendly Troll ( 1017492 ) on Friday February 06, 2009 @06:07PM (#26758485)

    Er, isn't more brightness and gamut a good thing for pictures that INTEND those qualities? There are always the brightness and saturation knobs for you to turn down if need be.
    A display which has a higher gamut will always be able to adjust to a lower gamut, while the reverse is not true. Same thing with brightness.

    Sadly, it doesn't quite work that way.

    The DTP standard is calibration to 120 or 140 cd/m2, depending on the lighting. On some monitors, that's impossible to achieve; even that value is too high for dim environments. Right now I'm using a CRT which is - subjectively speaking, as I don't have a colorimeter - around 70 cd/m2, and I find it very comfortable as the only light in my room is an incandescent 60W bulb.

    With some backlights, getting a low level of brightness is extremely hard, so monitor manufacturers resort to a really nasty trick: panel blocking. It basically means software control of brightness (crushing dynamic range), and tends to lead to very, very poor black levels and contrast because the backlight is too powerful and bleeds through the panel itself. It might help if you think of this as an audio system: you have an amp dialed all the way up to eleven, so you have to use your sound card's volume control to lower the amplitude of sound waves. The hiss and static when nothing is playing will still be present, and instead of having all 16 bits of possible amplitude values, you have artificially decreased it by a couple of orders of magnitude and got a very small dynamic range (that would be the contrast for our monitor).
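The "panel blocking" point can be illustrated with a toy quantization example (hypothetical figures, not the comment author's):

```python
def distinct_levels_after_dimming(bits, factor):
    """Count the distinct digital output levels left after scaling
    brightness down in software ("panel blocking")."""
    levels = range(2 ** bits)
    return len({round(v * factor) for v in levels})

print(distinct_levels_after_dimming(8, 1.0))  # 256: full dynamic range
print(distinct_levels_after_dimming(8, 0.5))  # 129: nearly half the levels gone
```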

    In a similar fashion, you cannot adjust a wide gamut monitor to standard sRGB gamut without losing dynamic range and without software emulation. Gamut is a hardware property of the backlight. For an example, see a comment I've recently left in another story; I couldn't think of anything better but it should be simple enough to understand: []

    The wide gamut problems cannot be solved until we have at least 10-bit visual content paths. The software needs to work with 10+ bits per colour, the graphics card needs to output extra levels, and the monitor needs to know how to display them. Until all that happens (and it's not happening any time soon; think at least five years from now), the best thing you can get is an imperfect software emulation, either through the monitor's built-in DSP, or some code in the operating system. The only thing that does work is a fully colour-managed environment (Photoshop), but the amount of crap you have to go through to make your images look good to everyone is mindboggling.

    Gamut and colour management are incredibly complex topics, with whole books written about them. I won't pretend I understand everything, but I know enough to understand that wide gamut had been brought upon us by marketing droids instead of engineers.

  • by Moraelin ( 679338 ) on Friday February 06, 2009 @08:00PM (#26759765) Journal

    I've got a LCD panel with 5 ms latency and I don't notice problems when gaming. If you're quick enough to say anything over 1 ms is too slow, you're a pretty hardcore (and quick) gamer. And if you're that good, you're probably best served by a pro setup anyway, not low-level consumer grade shit. But I'm not as twitch quick as I used to be, and my gamertag certainly isn't "Fatal1ty," so 5 ms seems fine to me.

1. You seem to assume that there actually is some kind of pro gamer gear. All the "pro" LCDs are actually pro as in graphics-artist pro, and usually have the slowest response times of them all. It's "pro" as in "it'll look like that when printed too" (and maybe they'll throw in calibration hardware and software, 10 bits per colour component instead of 8 if it's a several-thousand-dollar model, an LED backlight, etc.), not as in "it'll display the image in 1ms". It's mostly static images that get displayed on those.

The very panel that goes into one already works against you. The fastest ones are TN+Film, but those tend to use 6 bits per component with dithering instead of 8, have shitty viewing angles (often to the extent that you can see a slight difference between the centre and the corners just because the line from the pixel to your eye falls differently), and, at least according to the "+Film" part, create more non-homogeneity too. The most accurate ones are VA panels (as in, MVA or PVA), but those are also the slowest by far. Guess which goes into a "pro" level display for graphics professionals? Right.

2. If you have reflexes that fast and actually live or die by shooting 1ms earlier, most TFTs have an extra problem: most first buffer the whole image, then scale/display it, because it's the easiest way to deal with scaling an image of a different resolution. Unfortunately, they do it even when you use their native resolution.

I.e., what you see on the screen is actually what the monitor received 1 to 3 frames in the past. At, say, 60 fps, on some models you can actually see the image as it was received 50ms ago. I.e., the difference between 1ms and 5ms of panel latency is entirely the wrong bottleneck to optimize there.
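Frame buffering dwarfs panel response time, as a quick calculation shows:

```python
def buffering_lag_ms(frames_buffered, refresh_hz=60):
    """Latency added by buffering whole frames before display."""
    return frames_buffered * 1000.0 / refresh_hz

print(buffering_lag_ms(1))  # ~16.7 ms: one buffered frame at 60 Hz
print(buffering_lag_ms(3))  # 50.0 ms: matches the figure in the comment
```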

    (Since you mentioned Fatal1ty, last I've heard he used a CRT, btw.)

Better models in this aspect are starting to appear, but it took a while and they're still few and far between. Mostly because it's not one of the numbers dangled in front of the fashion victims, so there was very little incentive to do anything at all about it.

    3. The numbers you get told are by and large... well, not lies, but the standard was written by the vendors for their benefit not yours. E.g., a 5ms display if it's measured black-to-white-to-black can be actually faster than a 1ms grey-to-grey with massive overdrive, and produce less ghosting.

    The short and skinny was that the black-to-white-to-black standard was already a lie by itself, and only used because it was the smallest number you can measure without overdrive. The standard as defined by the vendors lets them ignore the first and last 10% of the moving from colour A to colour B. Even that ought to give you cause for thought: that number didn't say "it will reach colour B in time X" but merely "it will get within 10% of colour B within time X". A 10% error is piss-poor on the logarithmic scale of the eye. And it lets them ignore the long asymptotic rest of the curve. But in a transition from black to white or back they can ignore more of the long tail than in a grey to grey transition, according to their own bogus standard, so that's why everyone quoted that.

    This all changed when someone invented overdrive. The idea here basically is that you can accelerate faster and overshoot the finishing line if you want to. The measured time still is "in how much time you can get within 10% of the finishing line." It doesn't matter that then you overshoot by 50% and spend even more time coming back asymptotically from the _other_ side. But you can't do much overdrive o
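A toy first-order model shows why the "within 10%" standard rewards overdrive (the time constant, drive levels, and step size are made up for illustration):

```python
import math

def time_to_within_10pct(start, target, drive, tau_ms, step_ms=0.1):
    """First time a first-order pixel response, driven toward `drive`,
    comes within 10% of the start->target swing. The clock stops at the
    first entry into the band, just as the vendor standard does -- any
    later overshoot past `target` is simply not counted."""
    swing = abs(target - start)
    t, v = 0.0, start
    while abs(v - target) > 0.10 * swing:
        t += step_ms
        v = drive + (start - drive) * math.exp(-t / tau_ms)
    return t

plain = time_to_within_10pct(0, 100, drive=100, tau_ms=5)    # no overdrive
boosted = time_to_within_10pct(0, 100, drive=150, tau_ms=5)  # 50% overdrive
print(plain, boosted)  # the overdriven pixel "arrives" far sooner on paper
```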

  • by NitroWolf ( 72977 ) on Friday February 06, 2009 @09:23PM (#26760487)

    This is one of the reasons why I refuse to buy LCDs for gaming, both on my desktop and for consoles. Other factors include refresh rates, variable resolution, and numerous quality problems (dead or stuck pixels, color reproduction, viewing angle, brightness uniformity, etc).

    Given a choice, nobody would prefer to play on a laggy ISP, so it's really awful that manufacturers don't inform about multiple-frame image processing delays on 60hz monitors.

    CRT technology is so mature and LCD so comparatively half baked that I'm totally revolted by the general consensus to throw out completely superior performance in favor of smaller form factor (it's not like they're moved often).

    I spent months last year looking for a flat panel to buy that I would want to game on, and came up empty handed, so I simply abstain.

    I'm currently using a ViewSonic P220f from a friend after my 8 year old Sony GDM f500r was recently retired, both 21". My consoles are on a 34" Sony WEGA KV-34HS510.

    When my tubes finally give out in a few years, I'll be looking for something far better than LCDs to replace them with.

This tired old refrain almost wears me out. With 5 minutes of Googling, you can find an awesome LCD. I'm a hardcore gamer - I spent some time researching monitors and ended up with several different ones, and they all work great. I've never had a dead pixel issue with any quality monitor I've purchased. I've never had a DOA. Lag on a quality LCD is immaterial to gaming, as is ghosting. If you are experiencing these issues, you have a shitty LCD. My current monitors are a pair of 30" Dells. Somehow, I always manage to score at the top of the scoreboard in COD4, TF2, etc...

    If you can't find a flat panel that you want to game on, you aren't looking very hard. There are plenty out there. I recommend the Planar 2611W for a 26" monitor. These Dells are exceptionally nice - in fact, I had no idea 2560x1600 made that big a difference in enjoyment when playing a game. I will never go back to 1920x1200. Show me a CRT that does that resolution at 30" that isn't several thousand dollars (Do they even exist?).

    So yeah - the tired BS about LCDs being unfit for gaming is just that - BS. I will gladly play you any day you want and we'll see who is superior. I'll have the "handicap" of the "laggy" LCD to go against your awesome skillz and CRT response.

  • by MightyYar ( 622222 ) on Friday February 06, 2009 @10:15PM (#26760893)

    You could browse around here [], but honestly I didn't have a hard time... but it was like 2 years ago now. The important thing is to search for an IPS or (P)VA panel and stay away from the TN stuff... those are the ones that change the most when you change angle. This site lets you know what kind of panel a monitor has. [] I ended up settling on an Acer AL2051W with an P-MVA panel. It is significantly better than a TN screen for viewing angle, but isn't as good as an IPS in that regard. Also it has a glossy screen which drives some people nuts... me too, sometimes!

    Here's an awesome rundown at anandtech [].

    Some links for you:
    Dells get a mention []
    Some discussion about the $$$ Apple monitors []

    Whatever you do, don't give in and buy cheap :) My wife (who only does office stuff) has a cheap TN panel and honestly, it hurts me to look at it even for web browsing :)

  • by skreeech ( 221390 ) on Friday February 06, 2009 @10:48PM (#26761105)

    In addition to the other reply to this parent.

I have not read of a flat-panel monitor that will accept a 120hz source. With motion interpolation turned off, a 120hz LCD should look exactly like a 60hz one but be easier on the eyes.

    With a CRT you could play Quake, CS, or whatever at 120fps while with a 120hz lcd the video card will only get polled at 60hz.

  • by X0563511 ( 793323 ) on Saturday February 07, 2009 @01:02AM (#26761787) Homepage Journal

    No, shot at 24 FPS but processed to and played back at some screwed up ratio.

Frames alternate between two and three fields (2:3 pulldown), I believe.

    This assumes you watch an NTSC signal. The numbers are different for PAL.

  • by mathew7 ( 863867 ) on Monday February 09, 2009 @04:08AM (#26780685)

Actually, the grey-to-grey measurement is correct, and it's not to 50%: it's the time for a requested shade (not black or white) to change to another requested shade. On TN matrices, changes from pure black to pure white (or the reverse) happen very fast compared to changes between two shades of grey, so vendors quote grey-to-grey as a "worst-case" timing.
    See []; although it's an old article (2004), it's still good reading material.
    Quote: "Measurements suggest that the response time is the smallest when the pixel's state (color) is transitioning from black to white."
