Measuring Input Latency In Console Games

The Digital Foundry blog has an article about measuring an important but often nebulous aspect of console gameplay: input lag. Using a video camera and a custom input monitor made by console modder Ben Heck, and after calibrating for display lag, they tested a variety of games to an accuracy of one video frame in order to determine the latency between pressing a button and seeing its effect on the screen. Quoting: "If a proven methodology can be put into place, games reviewers can better inform their readers, but more importantly developers can benefit in helping to eliminate unwanted lag from their code. ... It's fair to say that players today have become conditioned to what the truly hardcore PC gamers would consider to be almost unacceptably high levels of latency to the point where cloud gaming services such as OnLive and Gaikai rely heavily upon it. The average videogame runs at 30fps, and appears to have an average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms. Assuming that the most ultra-PC gaming set-up has a latency less than one third of that, this is good news for cloud gaming in that there's a good 80ms or so window for game video to be transmitted from client to server."
  • by TheSambassador ( 1134253 ) on Sunday September 06, 2009 @01:50PM (#29332723)
    Only in the ports that the PC gets from the consoles (or even ones that happen to be released on both systems) do I notice the horrible latency. It's awful in Oblivion, Fallout 3, Bioshock, and plenty of others. Part of it has to do with V-Sync, but turning that off doesn't eliminate all of it. I can't believe that 133ms is the norm. I've grown up a PC gamer, and that's definitely one of the top reasons I *hate* console FPS games.
    • Re: (Score:3, Interesting)

      by Kral_Blbec ( 1201285 )
      Side note about Oblivion and Fallout 3: I think the delay is intentional, to make it feel like someone is actually moving and to make them more RPG-like. They aren't supposed to be twitchfests. Many FPSes have the character move faster than is humanly possible: turning, running, switching weapons, etc.
    • Interesting, and I thought that was just my setup getting old (on both Bioshock and Fallout 3)... I'm used to 0-latency gaming (never got into consoles other than Mario Kart 64 and Mario Tennis), so that was a bit of a shock...

    • I grew up a PC gamer too, and that's why I'm utterly desensitized to input lag. In other words, slogging through Counter-Strike with a dial-up modem builds character. Maybe younger gamers growing up with the rise of online consoles will be even further conditioned to latency.
    • Re: (Score:3, Insightful)

      by KulSeran ( 1432707 )

      1) Input is often sampled only once per frame. That is why quake at 120fps feels more responsive, the time between you pressing a button and the game noticing you pressed the button is reduced.

      2) Input and actions are often determined on a per-frame basis. Meaning the fastest delay you can get is a single frame. Consoles tend to have games that run at a target frame rate (30, 24, 60) that determines how much visual flavor the game can have (60hz leaves less time to draw and update stuff than 30hz). So, at
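
      A minimal sketch of the arithmetic behind points 1 and 2 (my own illustration, with made-up target frame rates): when input is only sampled once per frame, the frame time itself is a hard floor on how quickly a press can even be noticed.

      #include <cstdio>

      // Sketch: a press that lands just after the per-frame input sample has to
      // wait up to a whole frame before the game notices it, before any
      // rendering or display delay is added on top.
      int main() {
          const double rates[] = { 30.0, 60.0, 120.0 };   // hypothetical targets
          for (double fps : rates) {
              double frame_ms = 1000.0 / fps;
              std::printf("%5.0f fps: 0..%.1f ms sampling delay (avg %.1f ms)\n",
                          fps, frame_ms, frame_ms / 2.0);
          }
      }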

    • Feature, not bug.

      Humans do not move infinitely quickly. We cannot always carry out the actions we want instantly. In fact, I would rather have input lag, with a pretty animation on screen showing *why* things are lagging, than have everything magically happen instantly.

      • by KDR_11k ( 778916 )

        When it comes to things like aiming, my arm introduces the realistic delay all by itself. Delaying it further just causes confusion, because your physical motion is over while the in-game action keeps going for a bit.

      • C'mon, that's silly.

        If you're supposed to be controlling the character, then the "natural" lag should be all that I have to deal with. We don't need the character emulating input lag when my own real life body already takes care of it.

        And if you were confused, I'm talking about mouse lag. If it takes a little bit to accelerate to a speed when moving, that's fine and to be expected. If it takes time to draw my sword I will, of course, accept that. However, if moving my mouse a little bit to the lef
        • If you're supposed to be controlling the character, then the "natural" lag should be all that I have to deal with. We don't need the character emulating input lag when my own real life body already takes care of it.
          But your real life body *doesn't* take care of it. Your finger moves all of 3mm; that takes very little energy and very little time. At least when compared to the energy/time required to, e.g., swing a sword all around your body and smack it into an enemy.

          The lag between me deciding to move my n

          • I feel like you didn't read the entire post, and it seems like you're missing the point.

            I clearly stated that I was talking about mouse lag. Your examples are clearly things that are expected... when I press a button to swing my sword, it should take time for my character to swing the sword.

            However, there shouldn't be a delay (as little delay as hardware will allow) between the time you press the button and the time that your character STARTS to swing the sword.

            "Realistic" is NOT the same as "
            • Correct; my point is that this study isn't distinguishing between "realistic" and "laggy". My feeling with Oblivion was actually that it was fairly accurate about how long things should take. I actually felt the combat system was one of the more fluid I've ever seen.

        • As a side note: yes, games lacking animations suck. Actually, in Oblivion the completely static jump animation pisses me off more than the strafe non-animation. But that's a separate discussion.

    • Have you tried GTA4? It's a nightmare. Some missions are next to impossible because of it. And then of course there's a bug where the lag goes up to 2 seconds. *And now* add the stuttering of a crappy engine adaptation on anything less than a quad-core CPU.

    • As far as Half-Life 1-based games are concerned, input that registers server-side changes drastically based on the client FPS.

      http://www.fortress-forever.com/fpsreport/ [fortress-forever.com] for a detailed analysis of the situation of forcing fps_max in settings. Scroll down to the very bottom for the tl;dr graph.

      I used to force mine to 101 (like every noob recommends) before I read this, and there was a noticeable increase in speed when I lowered it to 50. So much that it's become impossible to shoot the autoshotty (its

    • This study is completely bogus. Take a look at the Call of Duty video. They begin counting the frames when the first of the trigger LEDs lights, but the third trigger does not light until their frame count reaches 3. The gun fires at the 7th frame. Is this a valid test? Are you telling me that Call of Duty will fire your gun the second the trigger button is even slightly depressed? Moreover, by their own admission, the study did not take into account the delay induced by the monitor itself, which they

    • I like V-Sync. I find the tearing distracting.

      But if you use V-Sync, you really need to disable all buffering. Triple buffering + V-Sync will murder your input times. I remember back when I had a crappy video card and my FPS dropped to 15, the input lag would spike into the hundreds of milliseconds.

      I used nHancer [nhancer.com] to disable all buffering and pre-rendering, and now I'm good. I do use vsync because my eyes notice pixel anomalies, and constantly focusing on tearing is worse than a tiny bit of input lag. I play

  • Reality check (Score:5, Interesting)

    by girlintraining ( 1395911 ) on Sunday September 06, 2009 @01:51PM (#29332731)

    ...average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms.

    Consider that until very recently all displays had an inherent lag of about 70ms -- and that new [LCD] technology has pushed that higher. But we're only considering half the equation: the average human response time to auditory or visual input is 160-220ms. This increases as we age. We are also part of this system, and we're a helluva lot more lagged than our technology is.

    I want an upgrade.

    • The average human response time for auditory or visual input is 160-220ms

      But that doesn't have anything to do with how much lag we can detect

      • But that doesn't have anything to do with how much lag we can detect

        You're saying we can't measure the time from when a person receives an input until there's a neurological response?!

        • Because we have no way of knowing if the person has detected the input until there is a neurological response, yes, I'm saying we can't. Please stop playing with big words that you don't understand.
          • by Spaham ( 634471 )

            Of course we can, by controlling how and when the input is emitted!
            Or, if we record the activity, we can trace the many steps the processing goes through
            (at least electrically, that is).

          • Because we have no way of knowing if the person has detected the input until there is a neurological response, yes, I'm saying we can't.

            The only meaningful test here is ABX [wikipedia.org]. Present player with A, B, and X. A is the system with less latency than B, and X is randomly either A or B. Run test multiple times and see whether player's determination of X is significantly different than it would be by pure chance (50%). The player doesn't have to be able to quote the latency difference, merely detect it, perhaps b
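
            A rough sketch of how such a run could be scored (hypothetical trial counts, and assuming the only question is whether the player beats 50/50 guessing):

            #include <cmath>
            #include <cstdio>

            // Probability of getting at least k of n ABX trials right by pure
            // guessing (upper tail of Binomial(n, 0.5)).  A tiny value suggests
            // the latency difference really was detectable.
            double guess_tail(int n, int k) {
                double p = 0.0;
                for (int i = k; i <= n; ++i) {
                    double c = 1.0;                        // builds n-choose-i
                    for (int j = 0; j < i; ++j)
                        c = c * (n - j) / (j + 1);
                    p += c * std::pow(0.5, n);
                }
                return p;
            }

            int main() {
                int n = 20, k = 16;                        // e.g. 16 of 20 correct
                std::printf("P(>= %d/%d by chance) = %.4f\n", k, n, guess_tail(n, k));
            }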

        • I think you guys are referring to two different "we"s. The "how much lag we can detect" we was referring to how much on-screen lag players can detect while playing, while you seem to be referring to how much mental lag researchers have found in people's responses.

            I think you guys are referring to two different "we"s. The "how much lag we can detect" we was referring to how much on-screen lag players can detect while playing, while you seem to be referring to how much mental lag researchers have found in people's responses.

            Close. I'm looking at the entire system, not just the technology side but also the human side. Granted, the computer and its peripherals are the easiest to modify by far, but looking at the entire loop (Computer-display-person-input-computer) is the only way to make informed choices about improving the quality of real-time applications (which is the ultimate goal of this research).

            • by MattRC ( 1571463 )
              Personally, I don't care about the human side of the input lag question. That's irrelevant to me because at present that line of questioning is unlikely to lead anywhere. I don't see any evidence for the argument that "looking at the entire loop ... is the only way to make informed choices about ...". As I see it, I can make an informed choice by looking at the display alone - choosing the right display will result in a clear, measurable improvement. It sounds like choosing the right platform will have
              • Whether it's 150 ms or 1,500 ms, I can't change it, and everyone else in my age group is on the same playing field.

                No, but if you want a game to appeal to a wider audience, maybe a game that isn't as latency-sensitive would be beneficial. This way, 30 year old gamers wouldn't be outgunned by 20 year old gamers on account of a 50ms reaction time difference.

                • Re:Reality check (Score:5, Insightful)

                  by sahonen ( 680948 ) on Sunday September 06, 2009 @05:40PM (#29334527) Homepage Journal
                  Actually, raw reaction time, which doesn't even change much between 20 and 30, is not the primary element of skill at first-person shooters. I've looked at the raw reaction time (i.e. click your mouse when you see a light turn on) of many gamers, some who absolutely dominate me and some at or below my level, and there was no real correlation between that reaction time and skill. From what I've gathered, skill at FPS games is more a function of experience and training than of raw reaction time.

                  The basic categories that set an elite gamer apart from an average or newbie gamer go something like this:

                  Predicting your opponent and being unpredictable yourself: Knowing where your opponent is going to be, and acting in a manner that your opponent can't predict. If you can put your crosshair where you know your enemy is going to be, and he can't do the same, you're going to win even if he has better raw reaction time than you. This is a function of experience with the game.

                  Decision making: Evaluating the importance of the various high-level goals in the game, deciding which ones to prioritize, and acting on that decision. Making better decisions, making them faster. Again, a function of experience with the game.

                  Aiming skill: If an enemy appears on your screen away from your crosshair, how quickly and accurately you can move your mouse to put the crosshair over him. This is a function of training, learning exactly how much mouse movement corresponds to how much movement on screen, and being able to precisely produce that movement with your hand. This is often confused for reaction time when watching people play, but really, the reaction time component is only in seeing the enemy and deciding to shoot him. The rest is muscle memory.

                  This is where input lag really hurts: it's very, very important that your field of view appears to correspond to your mouse movements with absolutely no lag. Console games don't suffer from this because aiming with console controllers is far less precise than using a mouse, so the input lag "hides" behind the imprecision of the joystick. When the game meets the PC, where people are using mice, the lag between moving your mouse and your on-screen view changing becomes perceptible.

                  Movement skill: The ability to manipulate your controls to allow you to travel faster. Not just finding the most efficient routes, but being able to use quirks in the game's movement code to give yourself more velocity. Another function of training, getting the control inputs just right can be difficult to master.

                  Teamwork: In team-based games, communication, chemistry, planning, and effective group decision making.
        • I meant we as gamers

          • Re: (Score:1, Insightful)

            by Anonymous Coward

            There are two different lags.

            1. Something happens on screen, small lag, you press a key.

            2. You press a key, small lag, something happens on screen.

            The latter is very, very detectable, while the former doesn't matter so much.

    • Re:Reality check (Score:4, Interesting)

      by Mprx ( 82435 ) on Sunday September 06, 2009 @02:14PM (#29332925)

      The only inherent display latency of a CRT is the time taken for the beam to arrive at any particular part of the screen. In the worst case this is one frame, which at a reasonable refresh rate (100Hz+) will be only 10ms or less. A good LCD (there's only one on the market, the ViewSonic VX2268wm) updates in the same line by line fashion as a CRT, and will add only a few more milliseconds switching time latency.

      Of course you still have the latency in the input/processing/rendering stages, but this doesn't have to be very high (increase input sampling rate, avoid any interpolation, disable graphics buffering, etc). The only reason most modern console games are unplayable is because reviewers all ignore latency, and low latency can be traded for higher graphics detail which the reviewers pay attention to.

      Perceived latency has nothing to do with reaction time.

      • Real reality check (Score:1, Insightful)

        by Anonymous Coward

        Ignoring the flamebait (only one good LCD? Really? I'm pretty happy with my Philips), I must say that as someone who has been a PC gamer most of his conscious life, I'm pretty impressed with what consoles had (and still have) to offer. When I was introduced to the Zeldas for the N64 there were so many things going through my head at once that I couldn't tell what the first impression was, but I'm pretty sure it wasn't "unplayable". Neither did I notice severe amounts of lag, but even if it's true that conso

      • Often the real problem players have isn't the latency itself, because our brains will accommodate almost any lag as long as it's uniform (witness the lack of "frames" for most movies, despite being (usually) at a mere 24fps). What causes the problem is actually when you have more than one set of stimuli going at different rates. This is most noticeable with audio and video not being in sync.

        With an LCD display, this is magnified greatly unless you are going directly from the computer or machine t

    • Considering that until very recently all displays had an inherent lag of about 70ms

      CRTs have a lag of nearly zero. Perhaps ones with 3D comb filters have more. Back in the old days (NES, Atari), a video game could directly affect the current color at the electron beam, giving a lag of nearly zero. It's only gotten worse since. Same for controllers, where they either had a separate wire for each button (e.g. Atari), or had a simple shift register that could be read in under a millisecond.
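
      For the shift-register case, a sketch of the idea (NES-style register address and button order; this is my own illustration, with the memory-mapped I/O faked by a stub so it compiles and runs): latch once, then clock out one button per read, eight reads in total and no buffering anywhere.

      #include <cstdint>
      #include <cstdio>

      static uint8_t fake_buttons = 0x09;                  // stub state: pretend A and Start are held

      void write_io(uint16_t, uint8_t) {}                  // stub for the strobe writes
      uint8_t read_io(uint16_t) {                          // stub: shift out one bit per read
          uint8_t bit = fake_buttons & 1;
          fake_buttons >>= 1;
          return bit;
      }

      uint8_t read_pad() {
          write_io(0x4016, 1);                             // strobe high: latch button states
          write_io(0x4016, 0);                             // strobe low: start shifting
          uint8_t state = 0;
          for (int i = 0; i < 8; ++i)                      // A, B, Select, Start, Up, Down, Left, Right
              state |= (read_io(0x4016) & 1) << i;
          return state;
      }

      int main() { std::printf("pad state: 0x%02X\n", read_pad()); }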

    • Most large-screen LCD TVs do a lot of digital processing before you get to see the output. For most applications this is fine, but for important ones (like playing Melee*), it makes the TV unusable. In these cases, you usually have to dig through the menus to find a game mode option and turn it on. It doesn't fix the whole problem, though; the best way is to go with a CRT.

      *Yes, my priorities are a bit unconventional and possibly screwed up.

    • by jadin ( 65295 )

      The average human response time for auditory or visual input is 160-220ms.

      So what you're saying is that, from the initial stimulus to seeing my response happen, it's 293-353ms? Whereas if our technology were 'perfect' it would be 160-220ms?

      Seems pretty obvious why people want faster response technology..

      [*]apologies if I'm misinterpreting the data

    • "The average human response time for auditory or visual input is 160--220ms."

      Where did these numbers come from?
    • by ildon ( 413912 )

      Human brain lag and input lag don't overlap, they add on to each other. So it's kind of moot for this discussion.

    • My LCD monitor (which is about 2 years old now) claims to have a response time of 8ms. Is that very fast, or am I missing something?
    • by Tom ( 822 )

      Sorry, but human visual processing time does not figure into the equation. The brain compensates for that, which is why our experience of the world appears to be immediate despite the processing time required.

      Lag is also something you can train for, unfortunately. If you are playing a lot of low-latency FPS games, you become more aware of it, because you're training your brain for fast reaction times. Like everything else in the human body and mind, how well you perform depends on how much you train/use it.

    • Re: (Score:3, Informative)

      by Hurricane78 ( 562437 )

      The average human response time for auditory or visual input is 160-220ms.

      You know exactly that you're talking bullshit. The statement is true but irrelevant, because that is the response time only when the pipelining of predicted actions does not work. How else would we be able to perform any high-speed actions?

      The brain *expects* a bang and a flash when we press the pistol trigger. If it's too late, this will show later, when the predictions and reality are compared again.

      You see the monster and pipeline a shot; some ms later, your hands press the trigger. Now you get the signal of

    • Human lag time mostly doesn't matter. The brain compensates for it. For example, when you catch a ball, your brain knows how it needs to correct for the fact that what it sees is what was some time ago, and what it tells the muscles to do won't happen for some time, and it's done this your entire life so it has a pretty good idea exactly how long those delays will be.

      Toss in an additional 133 ms, and you've totally fucked it all up. The brain tries to calculate responses just as it always has, yet the cal

  • DDR? (Score:3, Interesting)

    by koinu ( 472851 ) on Sunday September 06, 2009 @01:51PM (#29332735)

    Can anyone comment on how lag affects gameplay in DDR? I'm still hesitating to buy an LCD TV and sticking with my CRT, because I'm not sure about it. When playing DDR, I usually go by the music and the rhythm, so I really don't know exactly what would happen with an LCD TV.

    I've seen people playing DDR with Samsung LCD TVs on YouTube. It seems to work well.

    • Re:DDR? (Score:5, Insightful)

      by bjorniac ( 836863 ) on Sunday September 06, 2009 @02:07PM (#29332859)

      DDR or any rhythm/timing based game will be perfectly fine with a fair amount of lag so long as the lag is consistent. The game isn't based much on reaction times, more hitting the pads at the right intervals. Once you get accustomed to the lag (which should happen naturally as you dance) the actual amount won't matter so much - you just have to move 160ms before the arrow hits the circle or whatever, something you will have been doing already, moving to land on the beat, rather than waiting for the beat and then moving. This differs from, say, a shooter like counter-strike, where you have to react as fast as possible to what is a non-rhythmic, supposedly non-predictable event (unless the opposing team comes out in synchronized swimming formation).

      Inconsistency in lag would be a killer here, as it is everywhere, as it would essentially add a random component to your timing that you have no control over. But any time you do rhythmic work you're doing predictable lag compensation already - e.g. clapping on the beat requires you to start the motion before the beat happens rather than react to it.
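
      A toy illustration of that compensation (made-up numbers, not from any actual game): judging a step against the arrow time plus a constant lag works fine; the same trick is impossible when the lag jitters.

      #include <cstdio>

      int main() {
          const double lag_ms        = 160.0;     // assumed constant display lag
          const double arrow_ms      = 10000.0;   // chart says the step lands at 10.000 s
          const double player_hit_ms = 10150.0;   // player stomps 150 ms "late"
          double raw_error  = player_hit_ms - arrow_ms;             // looks terrible
          double comp_error = player_hit_ms - (arrow_ms + lag_ms);  // actually 10 ms early
          std::printf("raw error: %+.0f ms, lag-compensated: %+.0f ms\n",
                      raw_error, comp_error);
      }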

      • by koinu ( 472851 )

        I'm almost sure that when you have audio lag, the result would be pretty bad (at least you can correct some values in the main settings). I've heard people complain about songs that are off-sync. And the second thing is... there are people who can read about 1000 arrows a minute. That means 60ms between arrows!! I play at an average of about 250ms between arrows, and I pretty much suck at this game. When you look at 160ms average lag and my reaction time of 250ms (and less, because it's an average, of course!)

      • Actually one of the most fun things I've tried with an FPS was writing a very simple program that would move the mouse five pixels in a random direction 20 times a second.

        It starts out as insanely annoying, especially on the desktop, but after a few minutes in the game, you end up finding it a lot more challenging than normal. Coop becomes even more fun when you're running in formations because you might accidentally shoot your friends. Or miss them when you're actually trying to frag them in revenge.

      • DDR != any rhythm/timing based game. For example, early IIDX and Pop'n Music, pre-timing adjustment settings, are more or less unplayable on modern TVs. You're actually "building" the songs -- constructing them note by note -- as the basic framework plays on. Try doing that with terrible lag, it ain't pretty. In short, this:

        "...any rhythm/timing based game will be perfectly fine with a fair amount of lag so long as the lag is consistent."

        is completely false.
    • Re:DDR? (Score:5, Informative)

      by Anpheus ( 908711 ) on Sunday September 06, 2009 @02:29PM (#29333061)

      One thing Rock Band has done, and presumably this came from somewhere else or has propagated to Guitar Hero and other rhythm games, is let you set the video and audio latencies separately and fine-tune the system so that it looks and sounds the way you want it to.

      Rock Band 2's guitar controller actually has a tiny light-sensitive component and a cheap microphone, so that the game can auto-calibrate itself. It's really very handy, and took only fifteen seconds or so. The result is that the moment a note crosses the game's "active line" is exactly when I should strum it / hit it / sing it and hear the result.

      Are you certain there is no way to do the same thing with DDR?
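
      Either way, for what it's worth, here's roughly what a game presumably does with the two measured offsets (a sketch with made-up values and names, not Rock Band's actual code): shift drawing and audio earlier by their respective delays so both line up with the strum.

      #include <cstdio>

      struct AvOffsets { double video_ms, audio_ms; };      // from the calibration step

      int main() {
          AvOffsets cal{ 66.0, 30.0 };                      // hypothetical measurements
          double note_ms  = 42000.0;                        // song time at which the note is judged
          double draw_ms  = note_ms - cal.video_ms;         // draw early to beat display lag
          double sound_ms = note_ms - cal.audio_ms;         // queue audio early likewise
          std::printf("draw at %.0f ms, queue audio at %.0f ms, judge strum at %.0f ms\n",
                      draw_ms, sound_ms, note_ms);
      }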

      • by koinu ( 472851 )

        I haven't seen such a thing, but I can manually correct the audio delay in the main settings, as far as I remember. I don't need it, though, because I have a simple CRT and an analog hi-fi system. There shouldn't be any big lag. At least I can play without problems.

      • The PS1/PS2 versions did have such an option. I haven't played a home DDR game since DDR Extreme, so I don't know if they do anymore. I think beatmania IIDX still does, though.

    • I've spent a lot of time comparing how my rhythm games perform on various CRTs and LCDs, and I can tell you that the experience is orders of magnitude better on a CRT. However, if you're playing at low difficulties (1-10 steps) or low BPM (500 or so), then you are probably okay with an LCD. This range encompasses essentially all play that is done with your feet, so if you are physically dancing to your rhythm games, then by all means go for it. However, if you are playing rhythm games with your fingers o
      • Re:DDR? (Score:4, Insightful)

        by mwvdlee ( 775178 ) on Sunday September 06, 2009 @03:02PM (#29333349) Homepage

        I'm sorry, perhaps I'm misunderstanding you, but in the world of music 500 BPM is far from "low". Most "danceable" music generally sits somewhere between 120 and 130 BPM; drum and bass (which most people would consider quite fast) is about 170-180 BPM. Finding anything over 200 BPM is uncommon and usually for novelty's sake. Perhaps the measurement you're talking about is something other than beats per minute?

        • by Mprx ( 82435 )
          And a lot of people dance to drum and bass at half tempo. I used to dance at full speed, but I find it too tiring now. 500bpm is fast enough that it's hard to distinguish individual beats.
        • by koinu ( 472851 )
          Yeah... 500 bpm is pretty high in my opinion, too. Perhaps he is speaking about the highest rate during the song; there are songs where you go above that for a very short time. But on average, it's a bit difficult to keep a constant rate of 500 bpm. That doesn't mean it's not possible, of course.

          See here (from ITG):
          http://www.youtube.com/watch?v=CpTcN2zTqKY [youtube.com]

          • Re: (Score:2, Informative)

            He shouldn't have referenced "BPM" because it's not really accurate, but by "500 BPM," he's talking about the rate at which the notes are falling down the screen. Many people take advantage of Hi-speed settings which allow you to increase this rate, thus decreasing the total number of notes your brain has to process at any given time. So a 125BPM song at Hi-speed 4 scrolls the notes at a rate of "500BPM." The actual beats per minute remain the same, though, obviously.
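
            In other words, the scroll rate is just song BPM times the Hi-speed multiplier; a trivial sketch of the example above:

            #include <cstdio>

            int main() {
                double song_bpm = 125.0, hi_speed = 4.0;    // the example above
                std::printf("scroll rate: %.0f \"BPM\"\n", song_bpm * hi_speed);  // 500
            }
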
            • Re: (Score:3, Informative)

              by Judinous ( 1093945 )
              Yes, by BPM I was referring to the DDR setting used to control the speed at which the notes flow past the screen. The reason that it must be turned up for higher difficulty songs has less to do with the number of notes on the screen at once, and more to do with the amount of separation between them. At low speeds, there are not enough vertical pixels separating the notes to distinguish the order that they are actually coming in, and whether they are simultaneous (jumps) or not. When played at "normal" sp
            • by koinu ( 472851 )

              Hmm... I see. What I was talking about is: when you have 400 steps in your song and the song's length is 1:40, you get 240 steps per minute (on average, of course). This is pretty fast, in my opinion. But songs usually can have faster passages (where the steps are very dense). Well... but I don't need to tell you this, I think.

    • Re: (Score:3, Informative)

      by nbates ( 1049990 )

      I used to have a monitor connected to my Wii. Then I bought a Samsung LCD TV and noticed the lag. Not directly, but indirectly: both my partner and I noticed that we got worse at playing. We seemed to miss the markers every time.

      I went through the manual and didn't find any lag data, but I found a "game mode" option. Turning the option on improved the experience and our scores. So I guess that you should read the manual before you buy an LCD TV to check if it has a "game mode". I read that this mode redu

  • Just as a point of comparison, the typical latency you want in pro audio applications between when a guitarist plucks a string, and when they hear the note, is less than 15ms. This makes me think that the 80ms might be *acceptable*, but it's by no means ideal.
  • by silverspell ( 1556765 ) on Sunday September 06, 2009 @02:20PM (#29332969)
    It may be that console gamers have learned to expect around 100-150ms of input latency, perhaps thanks to visual cues that help to justify the latency on some level. (If I decide to jump, it takes a certain amount of time to react to my thought and make that happen; if I tell Mario to jump, maybe he takes about the same amount of time to react to the stimulus. It makes a certain kind of sense.)

    But I assure you that musicians find that level of latency unacceptable. When you're playing a software synth live, performing with other musicians, even 75ms of latency is very noticeable and makes you feel like you're playing through molasses. Same thing with recording -- if it takes longer than 25-30ms to hear my own sound coming back at me, I definitely notice it. Virtuosic music regularly packs events closer together than 50ms apart!
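
    For comparison, the latency contributed by one audio buffer is simply its size over the sample rate; a quick sketch with typical values (not tied to any particular interface):

    #include <cstdio>

    int main() {
        const double sample_rate = 44100.0;                // Hz
        const int buffers[] = { 64, 128, 256, 1024 };      // frames per buffer
        for (int frames : buffers)
            std::printf("%4d-frame buffer ~= %.1f ms\n",
                        frames, 1000.0 * frames / sample_rate);
        // 64-256 frames (~1.5-5.8 ms) feels immediate; 1024 frames is already
        // ~23 ms from this single stage alone.
    }
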
  • Something I never understood in arcade games was the "shooting pause", where a maximum number of button presses (usually on the fire button) yielded a maximum number of projectiles before the game stopped registering additional presses. A noticeable break in visual projectiles was observed before the additional button presses were again registered. I can't think of a single game I've ever played where the shooting pause increased dramatic tension, added to the atmosphere of enjoyment or balanced
    • by Mprx ( 82435 )
      The point of limiting the number of player projectiles on screen is to provide a risk/reward mechanic by encouraging you to move closer to the enemies. You'll do more damage, but you'll have less time to react. Rewarding risky behavior is generally good design.
    • Re: (Score:3, Informative)

      I think what you're seeing is simply hitting the maximum number of in-flight bullets. It's software-limited, yes, but probably based on what the hardware can handle.
      If the game uses hardware sprites (quite possible) it may be limited by the total number of sprites on screen.

      So when you hit this max number you won't be able to fire any "new" bullets until an old one hits something or goes offscreen.

      • Yes, a good example of a limited number of projectiles on screen at any one time is Galaxian. An example of a shooting pause was Centipede. In both instances you fire projectiles at different speeds, but both seem to have an AI exploit where an incoming opponent "knows" you are approaching your shooting pause and will attempt to collide with you. The Centipede example is an extremely frustrating experience, since you are essentially on rapid fire with random pauses. The Galaxian example is also strange
      • by slim ( 1652 )

        when you hit this max number you won't be able to fire any "new" bullets until an old one hits something or goes offscreen.

        It's a classic gameplay mechanic. In Space Invaders, there's one player bullet on screen at a time -- so if you miss you've a long wait before you can fire again. In Asteroids it's three.

    • The real reason they did it is that if too many things are on the screen at the same time, the game slows down. By limiting the number of bullets, they even out the workload. Otherwise you could have many dozens on the screen and the gameplay would suffer. ... or at least that was the original reason; the original arcade games were really up against performance brick walls they tried very hard to hide, and nearly always succeeded.

      These days it may well be a motif they use, or simply to help with gamepl

  • by Visoblast ( 15851 ) on Sunday September 06, 2009 @02:47PM (#29333223) Homepage

    On the old Atari 2600, the game has to be written around rendering fields (half frames) of video. On NTSC, that is 59.94 fields per second, or a little under 16.7ms. Input is usually read during vertical blanking between fields. That makes for not much more than 33.3ms latency in the worst case of input change just after vertical blanking.
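
    The worst case described above, spelled out with the same NTSC numbers (a small sketch, not code from any 2600 game):

    #include <cstdio>

    int main() {
        const double field_ms = 1000.0 / 59.94;            // one NTSC field, ~16.7 ms
        std::printf("worst case: %.1f ms (waiting for the next vblank to be read) + "
                    "%.1f ms (the field that shows the result) = ~%.1f ms\n",
                    field_ms, field_ms, 2.0 * field_ms);
    }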

    Maybe new isn't really better.

    • by Ant P. ( 974313 )

      New is better up to a point.

      In the 16-bit era you could do processing in the hblank too, forget the vblank. Sega used it for better-looking water (palette swap), Amiga could do the same to get a huge amount of colours on screen, or even run two different horizontal resolutions at the same time.

    • Lower latency isn't necessarily better. As long as latency is below a certain threshold, it's fine. Trading off the capability of a modern console vs. the capability of a vintage console, I'll take the modern console any day.

      Gameplay style, on the other hand, is a different matter. The modern games I enjoy playing the most are the ones they classify as "retro". Mega Man 9, Geometry Wars, Pac Man Championship Edition, etc. You could port most of these games to Atari 2600, Colecovision, or NES, and sti

  • Kernel developers have complained that UI latency doesn't have very good measures under Linux. Now here's a methodology for measuring it. This could lead to kernels better optimized for the user experience, and provably so.

    I don't think, though, for the Linux kernel or for a video game, that pure latency is exactly the right measure. I think the standard deviation of latency is an important measure too. A user should be able to reliably predict the latency. They may not consciously do so, but their

  • I don't know what caused it, but my normal Call of Duty: World at War server, which I used to ping 50 to, just jumped to about 120 on average. It really messed with my accuracy for a few rounds until I adjusted. I didn't think it would make a difference, but it definitely did.
  • by billcopc ( 196330 ) <vrillco@yahoo.com> on Sunday September 06, 2009 @03:30PM (#29333609) Homepage

    OK, I'll be the first to concede that I am more sensitive (or attentive) to lag issues, being an audio/video hack myself, but how can 4+ frames of lag be ignored or even tolerated in any action game ?

    I already consider the 3-frame LCD lag unacceptable and utterly shameful... I mean, the data is there; put it up already! If the de-crapifying filters need that much lookahead to function, they need to be refactored to use look-behind, and if the copycat engineers can't fix it, at least give an option to disable it per-port so we can play our games.

    Now on the development side, as a so-so game dev myself, I can't think of any valid excuse for Killzone's 12 frames of lag. What the hell are they doing in the loop ? Here's what a game loop is supposed to look like :


    for (;;)
    {
        if(button_pushed(1) && ga_hasammo(ga_PEW_PEW))
        {
            ga_plWeapon::spawn_bullet(); /* MOTHERFUCKING PEW PEW!!!1!! */
        }

        render_scene();
    }

    Notice the lack of "sleep(9000)" statements ? So that's what, 20 usec worth of code ? Take input, spawn bullet, play sound and draw the goddamned frame already! If that takes you 200 msec to process, then your game is really running at 5 fps with a shit ton of interpolated frames in-between, and you should probably go back to writing Joomla plugins.

    Ten years ago, this shit would not have flown. We used to tweak the everloving crap out of our loops, and VSYNC was the norm, which made late frames painfully obvious. To deal with it, we used hard-timed loops and every single piece of code had to obey the almighty strobe. You had 16 or 33ms to render your frame, and if that wasn't enough, well, you had to tweak your code. Today, now that even game consoles have gone multicore, there is no excuse. You could even have one thread acting as a clock watcher, monitoring the other tasks and telling them to hustle (e.g. degrade) if they're falling behind.
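
    A rough sketch of that clock-watcher idea (my own toy version with fake workloads and hypothetical names, not anyone's shipping code): one thread watches the last frame time and flips a flag that the game loop reads to shed optional work.

    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <thread>

    std::atomic<double> last_frame_ms{16.7};   // written by the game loop
    std::atomic<bool>   degrade{false};        // read by rendering, effects, etc.

    void clock_watcher(double budget_ms, int iterations) {
        for (int i = 0; i < iterations; ++i) { // bounded so the demo terminates
            degrade.store(last_frame_ms.load() > budget_ms);
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
        }
    }

    int main() {
        std::thread watcher(clock_watcher, 16.7, 120);     // 60 Hz frame budget
        for (int frame = 0; frame < 120; ++frame) {        // fake game loop
            last_frame_ms.store(frame < 60 ? 14.0 : 22.0); // pretend load spikes late on
            if (degrade.load() && frame % 30 == 0)
                std::printf("frame %d: over budget, dropping optional effects\n", frame);
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
        }
        watcher.join();
    }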

    To prioritize anything else is to betray the game's purpose: to entertain via interactivity. If a game is going to sacrifice interactivity, I might as well go watch Mythbusters instead :P

    • Re: (Score:1, Informative)

      by Anonymous Coward

      A lot of this comes from developers trying to exploit the concurrency possible in modern systems. So, at 30 fps, if you sample input in the main thread (hopefully early in the frame, so 33 ms before the simulation is done) -> render thread runs behind the main thread (up to 33 ms) -> GPU runs behind the render thread (up to 33 ms) -> CPU/SPU post-processing (up to 33 ms) -> wait for next vsync (if you're unlucky you miss it) -> any frame processing the TV does (god knows how many ms), and then
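
      Adding up that chain at 30 fps (made-up but plausible stage costs) lands right around the article's ~166 ms figure:

      #include <cstdio>

      int main() {
          const double frame_ms = 1000.0 / 30.0;
          struct Stage { const char* name; double ms; } chain[] = {
              { "input sampled early in the game frame", frame_ms },
              { "render thread runs a frame behind",     frame_ms },
              { "GPU runs a frame behind that",          frame_ms },
              { "post-processing / waiting on vsync",    frame_ms },
              { "TV's own processing",                   33.0     },  // varies wildly
          };
          double total = 0.0;
          for (const Stage& s : chain) {
              total += s.ms;
              std::printf("%-40s +%5.1f ms\n", s.name, s.ms);
          }
          std::printf("total: ~%.0f ms before the player sees anything\n", total);
      }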

    • by Anonymous Coward

      Most modern console games process graphics in parallel with engine updates. So on a given frame, the engine moves the entities around (simulating physics, applying animation, etc). On the next frame, the graphics code renders each entity in its new position. Then there are another 1-3 frames of buffering due to CPU-GPU communication, triple buffering, and hardware output lag (the number of frames depends on how the developers configure things).

      For a game running at 60 fps, 4-5 frames of delay

    • Today, now that even game consoles have gone multicore,

      That doesn't help things. In fact, it makes them worse. Concurrency causes lots of issues that each have their own solutions. One simple one is double-buffering all the data. This puts all your threads a frame behind, but it means that you get to use more CPU, since every thread has data in its input buffer instead of waiting on other threads.

      Notice the lack of "sleep(9000)" statements ? So that's what, 20 usec worth of code ? Take input, spawn bullet,

    • by Barny ( 103770 )

      One of the reasons I think some PC devs "get it": Team Fortress 2 recently added a delay to the Scout's pistol, because they had inadvertently made it fire "as fast as the input device would allow", so you had smart arses using programmable keyboards and mice to fire the whole clip in less time than a human could fire 2 bullets. So, as simple as your loop looks, obviously a WHOLE lot more code needs to be there, such as checking rate of fire, for instance

  • At the Hot Chips symposium last month, Rich Hilleman, Creative Director for Electronic Arts, commented on the 100ms delay inherent in the Wii remote (Wiimote). I assumed there was an issue in the delay involved in sensing the accelerometers, but this article shows 100ms is not any different from other consoles.

    I wonder what Rich Hilleman was really getting at? Maybe people are more sensitive to delays when they are the result of full-body movements rather than a button press.

    This is interesting
  • One video frame? With a normal camera? That's 1000/30 = 33.333... ms. From making music, I know when you start to notice lag: some people can notice it at around 10 ms, and I get into trouble above 30 ms. So you would need at least double the temporal resolution to get useful results.

    • by Barny ( 103770 )

      Why 30? I avoid games whose engine isn't good enough to give me at least 60 FPS.

      16.67ms... That explains why I don't like consoles so much :)

  • Seeing lots of comments about LCD screens - I'm thinking of upgrading from my old 17" to something around a 21-24" LCD widescreen.

    Am a gamer (not completely hardcore though) - so response time would be good. Am aware of the refresh issues with LCDs. Also do some photography stuff, so good colour reproduction would be handy (after calibration etc), but viewing angle not so important.

    Any ideas? Looked around for reviews and found a few conflicting reports - suggestions much appreciated! Budget is low
    • I have an Acer P241W screen (actually, four of them). 2ms response time, which is pretty good. 1920x1200. No real complaints - the "default" color settings seem to be quite different from one screen to the next, but rgb/brightness/etc. are all pretty configurable, so you can calibrate them. If you research, the major complaints I'd expect to see are stuck pixels (seems common; I kept trading in at Staples until I found a "clean" one), and there's some weird thing with XP + NVidia + a VGA cable and not being able to
    • by Mprx ( 82435 )

      A large proportion of latency is a multiple of the frame time, so increasing frame rate will have great latency advantages as well as improving the motion quality and reducing the sample and hold blur.

      There are currently 3 true 120Hz LCDs.

      ViewSonic VX2265wm (defective brightness control)
      Samsung 2233rz (which has slightly higher latency, and also defective brightness control)
      ViewSonic VX2268wm (only LCD without serious defects).

      Note that these are all TN panel, so they will have unacceptable color unless vie

  • The amount of latency is not really an issue as much as the consistency of latency. There's nothing more frustrating than getting fragged because YOUR input was processed late because of too much going on, or for any other reason. I recall missing tons of jumps in Mega Man 2 because of this, so it's hardly a new problem.

  • Just some thoughts from research I've done that used or at least looked at reactions and reaction times. If game makers are already thinking about these things, good for them. If not, got an opening for a cognitive psychologist in game design?

    As noted, reaction times are greater than your response lags. A good human reaction time is around a third of a second. If your lag times are cut according to the refresh rate, a person's reactions could get placed with an earlier or a later frame. But because percepti

    • by Mprx ( 82435 )
      Training enough that you can react before conscious thought is the only way to play twitch games competitively. Ordinary hardware carefully selected and tuned for low latency is enough for this.
  • An LCD (or preferably OLED) monitor + source running at 100fps, or even better, 200fps would mean no more flicker, super smooth video, and almost no input lag, and in the case of OLED, longer display lifetime (because less voltage is needed since the pixels' duty cycle can be higher). It's a win all round.

    Let's all switch already. Okay, recorded video data will be 2-4 times bigger, but it'll be so worth it.

  • With Street Fighter II for the SNES, I videotaped a play session because I was suspicious. I then stepped through it frame by frame.

    Usually, when both the computer player and I were defending and I started an attack... the opponent's counter-attack animation would begin before my attack animation started...

    That is, the game artificially lagged in order to increase the difficulty beyond what a human opponent could provide. (aka, it cheated)

  • Anyone who played TF1 played at 600ms+ on their 14.4k modem. We grapple-hooked just fine. Console noobs have it easy. Wow, I sound like him [timecube.com]

  • There are so many types of game where this kind of lag doesn't matter, or can be compensated for.

    Starting with the obvious ones: anything turn-based is unaffected. Input could get very sluggish indeed before it broke a game like Civ or XBLA Carcassonne. Battles in RPGs like Final Fantasy are the same.

    Even a lot of action games don't depend on instant responses. Yes, something like Quake III is all about twitching. But something like Bioshock has a much more measured style, which is *not* ruined by a three fr
