
Standards Group Adds Adaptive-Sync To DisplayPort

MojoKid (1002251) writes "Over the past nine months, we've seen the beginnings of a revolution in how video games are displayed. First, Nvidia demoed G-Sync, its proprietary technology for ensuring smooth frame delivery. Then AMD demoed its own free standard, dubbed FreeSync, which accomplishes much the same thing. Now, VESA (the Video Electronics Standards Association) has announced support for 'Adaptive-Sync' as an addition to DisplayPort. The new capability will debut with DisplayPort 1.2a. The goal of these technologies is to synchronize output from the GPU and the display to ensure smooth delivery. When the two fall out of sync, the display will either stutter due to a mismatch of frames (if V-Sync is enabled) or visibly tear (if V-Sync is disabled). Adaptive-Sync will allow a DisplayPort 1.2a-compatible monitor and video card to perform FreeSync without needing the expensive ASIC that characterizes G-Sync. You'll still need a DP 1.2a cable, monitor, and video card (DP 1.2a monitors are expected to ship by year's end). Unlike G-Sync, however, a DP 1.2a monitor shouldn't cost any additional money: the updated ASICs being developed by various vendors will bake the capability in by default."


  • by gigaherz ( 2653757 ) on Tuesday May 13, 2014 @05:00AM (#46987595)
    The protocol used for digital signaling is, internally, surprisingly similar in concept to its analog equivalent. The idea behind "adaptive" sync is that instead of starting a new frame after a fixed, exact period, that period becomes a minimum: the next frame can start then or later. There's no other technology involved beyond allowing a frame to come late.
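    The "fixed period vs. that period or later" distinction can be modeled in a few lines. This is a hypothetical sketch, not any vendor's implementation: the function names and the per-frame render times are made up for illustration, and the panel is assumed to have a 60 Hz maximum refresh rate.

    ```python
    # Sketch: when does each frame actually appear on screen under a fixed
    # 60 Hz refresh versus an adaptive ("this period or later") refresh?
    # All numbers are illustrative assumptions.

    REFRESH = 1000 / 60  # fixed refresh period in ms (~16.67 ms)

    def fixed_display_times(render_ms):
        """Fixed refresh: each frame waits for the next scheduled tick."""
        t, out = 0.0, []
        for r in render_ms:
            t += r                            # frame finishes rendering at t
            next_tick = (int(t // REFRESH) + 1) * REFRESH
            out.append(next_tick)             # shown at the next refresh tick
        return out

    def adaptive_display_times(render_ms, min_period=REFRESH):
        """Adaptive refresh: the panel refreshes as soon as a frame is ready,
        but never faster than its minimum refresh period."""
        t, last, out = 0.0, 0.0, []
        for r in render_ms:
            t += r
            shown = max(t, last + min_period)  # "this period or later"
            out.append(shown)
            last = shown
        return out

    frames = [14.0, 20.0, 18.0]  # example per-frame render times in ms
    print(fixed_display_times(frames))     # frames snap to 16.67 ms ticks
    print(adaptive_display_times(frames))  # late frames shown on arrival
    ```

    The second frame finishes at 34 ms: the fixed schedule holds it until the 50 ms tick, while the adaptive panel shows it immediately.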
  • Re:It's a great idea (Score:3, Informative)

    by michelcolman ( 1208008 ) on Tuesday May 13, 2014 @06:18AM (#46987869)

    I have to wonder why the idea of adaptive vsync wasn't thought of, or built into display standards, earlier. It just seems like such an obvious idea once you've heard it. Surely someone else in the graphics/display industry must have had the idea before Nvidia?

    It's just a vicious compatibility circle.

    CRTs have a fixed frame rate for technical reasons.
    Therefore graphics cards have a fixed frame rate to support CRTs
    Therefore LCD displays have a fixed frame rate to support graphics cards
    Therefore graphics cards continue to have a fixed frame rate

    New stuff has to remain compatible with old stuff, so nobody even thinks of breaking the circle. Until now, fortunately.

  • by 50000BTU_barbecue ( 588132 ) on Tuesday May 13, 2014 @09:21AM (#46988667) Journal

    It's tragic to hear the kind of nonsense people tell themselves. It's like a cyclist buying a car and saying "that's silly, why would a car have a speed?"

    It's the same thing, dingus!

    A monitor is just a high-speed serial device. Stuff comes in at some rate. The only reason CRTs had such tight timing requirements was because of the humongous amount of reactive power flowing in the deflection coils. You can just short them out but then all that reactive power becomes real (waste) heat. Lots of it. So people didn't do that.

    Remember how old Multisync monitors used to click relays as they shifted to different horizontal frequencies? That was the monitor swapping in different capacitors to create the LC tank with the deflection coils. So they could swap the power around between the coil and the cap instead of dissipating it.

    But that meant the graphics card had better be ready to send those pixels when the monitor was ready for them. The monitor couldn't wait.

    There is no such large power being bounced around inside an LCD, it's really just thousands of analog voltages being sent to a glass panel. It can wait a bit, the picture won't fade that quickly. Eventually the capacitor that is formed by the LCD shutter will leak, but that takes time.
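    The relay-clicking behavior described above boils down to the LC resonance formula, f = 1/(2π√(LC)): for each horizontal scan frequency the monitor needs a different capacitor to resonate with the (fixed) deflection coil. A rough worked example, with an assumed 1 mH coil inductance and the scan frequency (rather than the actual retrace interval) used purely for illustration:

    ```python
    import math

    # Solve C = 1 / ((2*pi*f)^2 * L): the capacitance that forms a resonant
    # LC tank with the deflection coil at scan frequency f.
    # The 1 mH inductance is an assumed, illustrative value.

    def tank_capacitance(f_hz, l_henry):
        """Capacitance resonating with l_henry at f_hz."""
        return 1.0 / ((2 * math.pi * f_hz) ** 2 * l_henry)

    L_COIL = 1e-3  # 1 mH, assumed
    for f in (31_500, 64_000):  # e.g. VGA vs. a faster multisync mode, Hz
        c = tank_capacitance(f, L_COIL)
        print(f"{f} Hz -> {c * 1e9:.1f} nF")
    ```

    Different scan rates need capacitances tens of nanofarads apart, which is why the monitor had to physically switch capacitors with relays rather than retune anything electronically.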

  • Re:It's a great idea (Score:5, Informative)

    by Immerman ( 2627577 ) on Tuesday May 13, 2014 @10:18AM (#46989127)

    That's clever in that they presumably accomplished it without a back-buffer, back when RAM was expensive, but basically you're describing the vsync-based rendering that has been the standard for decades: wait until the screen starts updating (the vsync), then start working on the next frame to maximize rendering time. It's nothing like G-Sync/FreeSync/Adaptive-Sync, though: you still have the issue that if your screen updates at 60FPS, you have exactly 1/60 of a second (~16.67ms) to render each frame.

    Adaptive sync means that if you finish rendering an easy frame in only 14ms, the screen can display it immediately instead of waiting an extra 2.67ms for the next scheduled refresh. Even more importantly, if a complex frame takes 20ms to render, you don't miss the refresh and then have to wait an extra 13.3ms for the next scheduled one, wasting almost an entire frame; instead the screen can hold off on refreshing until rendering is complete.

    TLDR: Adaptive sync means that if you enter a graphically intensive area that you can only render at 50fps, then your monitor will automatically refresh at 50fps, instead of continuing to try to refresh at 60fps and having to spend every other frame waiting for the rendering to finish, for an effective framerate of only 30fps. (or possibly a jittery 40fps with double-buffering: update,update, wait,u,u,w,...)
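    The arithmetic in that TLDR can be checked directly. This is a simplified, assumed model (classic double-buffered vsync with rendering blocked until the buffer flip), not a claim about any particular driver:

    ```python
    import math

    # Illustrative model: frames take 20 ms to render (50 fps) on a 60 Hz panel.
    REFRESH_HZ = 60
    RENDER_MS = 20.0

    refresh_ms = 1000 / REFRESH_HZ  # ~16.67 ms per refresh

    # Double-buffered vsync: a 20 ms frame misses one refresh, so each frame
    # occupies ceil(20 / 16.67) = 2 refresh intervals -> 60 / 2 = 30 fps.
    refreshes_per_frame = math.ceil(RENDER_MS / refresh_ms)
    vsync_fps = REFRESH_HZ / refreshes_per_frame

    # Adaptive sync: the panel just waits the extra ~3.33 ms, so the display
    # rate simply matches the render rate.
    adaptive_fps = 1000 / RENDER_MS

    print(vsync_fps, adaptive_fps)
    ```

    So the 50fps workload displays at 30fps under strict double-buffered vsync but at the full 50fps with adaptive sync, matching the numbers in the comment above.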
