Mobile G-SYNC Confirmed and Tested With Leaked Driver
jones_supa writes: A few weeks ago, an ASUS Nordic support representative inadvertently made available an interim build of the NVIDIA graphics driver. This was a mobile driver build (version 346.87) targeted at the ASUS G751 line of laptops. The driver was pulled shortly afterward, but PC Perspective managed to get their hands on a copy and installed it on an ASUS G751 review unit. To everyone's surprise, a 'G-SYNC display connected' system tray notification appeared. It turned out to actually be a functional NVIDIA G-SYNC setup on a laptop. PC Perspective found a 100Hz LCD panel inside, ran some tests, and also noted that G-SYNC is picky about the TCON implementation of the LCD, which can lead to glitches if not implemented meticulously. NVIDIA confirmed that G-SYNC on mobile is coming in the near future, but the company wasn't yet able to discuss an official arrival date or technology specifics.
No surprise (Score:1)
A dubiously useful, expensive option marketed at people who are willing to pay through the nose to have their games run poorly on a laptop...
Seems like the right audience to market it at to me.
Wut? (Score:3, Insightful)
I'm trying to interpret the summary, but I cannot.
They found a 100Hz LCD inside of what? A graphics card? A laptop? The driver?
Re:Wut? (Score:5, Informative)
The original story goes like this:
1. Nvidia claims that it needs an expensive FPGA module to enable variable refresh rates on the current range of monitors. It calls this G-Sync; the tech adds significant cost, and Nvidia takes an additional licensing fee from monitors that include the said FPGA board.
2. AMD finds the adaptive sync spec in the current VESA spec for embedded DisplayPort used in laptops, and gets VESA to add it to the upcoming DisplayPort 1.2a spec for desktops. This does mostly the same thing without needing the FPGA board or the additional costly licensing fee. Monitors with adaptive sync end up about 100 USD cheaper than monitors with G-Sync and the same specs.
3. Nvidia openly states that it cannot make G-Sync cards compatible with adaptive sync any time soon and that it will continue supporting G-Sync. Many pundits wonder how long Nvidia can keep attempting this kind of vendor lock-in on monitors before ceding its position, given the rather extreme price differential between G-Sync and adaptive sync monitors.
4. Laptops use eDP (embedded DisplayPort) to connect the display to the GPU, and eDP already has adaptive sync in the spec.
5. An alpha driver for Nvidia mobile GPUs surfaces which is made to work with adaptive sync over eDP, and which the driver itself calls "G-sync".
Conclusion: Nvidia lied about its adoption of adaptive sync, and it now appears extremely likely that Nvidia will be using adaptive sync in its future products and just calling it "G-Sync Mobile" or something similar.
Re: (Score:2, Informative)
An animation running with fixed vsync on a 60 Hz display can only run at 60, 30, 15, ... Hz.
So if the game engine only manages to render at 59 Hz, the display will effectively drop to 30 Hz due to vsync.
Adaptive vsync allows the system to run at 59 Hz without tearing.
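To see where the 30 Hz figure comes from, here is a minimal sketch (plain Python, not driver code) of double-buffered vsync on a 60 Hz panel when each frame takes slightly longer than one refresh interval to render:

    import math

    REFRESH_HZ = 60.0   # panel refresh rate
    RENDER_FPS = 59.0   # what the engine can actually sustain

    refresh_interval = 1.0 / REFRESH_HZ
    render_time = 1.0 / RENDER_FPS

    def effective_rate(n_frames):
        """Simulate n_frames flips and return the rate the viewer sees."""
        t = 0.0          # when the current frame finishes rendering
        presented = []
        for _ in range(n_frames):
            t += render_time
            # vsync: the flip waits for the next refresh boundary
            next_vsync = math.ceil(t / refresh_interval) * refresh_interval
            presented.append(next_vsync)
            # double buffering: the next frame can't start rendering
            # until the back buffer is released at the flip
            t = next_vsync
        return (len(presented) - 1) / (presented[-1] - presented[0])

    print(effective_rate(120))   # ~30.0, not 59

Under these assumptions every frame just barely misses a refresh and waits a full extra interval, which is why 59 FPS quantizes down to 30 Hz.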
Re: (Score:2)
Isn't double buffering supposed to make it run at 60 Hz with a single frame doubled?
Yep, single buffering would cause a progressive tear constantly moving down the screen.
Re: (Score:2)
I thought vertical sync just removed the tearing and would show 59 Hz on the 60 Hz display, with one frame being shown twice as long.
Anyway, Adaptive VSync isn't Adaptive-Sync.
Adaptive VSync is an Nvidia thing which switches vertical sync on and off depending on the frame rate.
Adaptive-Sync is the VESA standard which lets the graphics hardware decide when the monitor should refresh.
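For illustration, a toy sketch of the Adaptive VSync policy as described above (the real driver heuristic is Nvidia's and not public, so treat the threshold as an assumption):

    def adaptive_vsync_on(recent_fps, refresh_hz):
        # Sync when the engine keeps up with the panel; otherwise turn
        # vsync off and accept tearing rather than quantizing to 30 Hz.
        return recent_fps >= refresh_hz

    for fps in (75, 61, 59, 40):
        state = "on" if adaptive_vsync_on(fps, 60) else "off"
        print(fps, "FPS on a 60 Hz panel -> vsync", state)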
Re: (Score:1)
Or well, forget the first line.
I'm tired and didn't really think it through; I just didn't think it would be limited to 30 Hz.
I expected it to vary with the in-between frame rates: the monitor still updates at its 60 Hz, but the graphics card only ever delivers full frames, so depending on the actual wait period it either drops a frame or provides no new content for that refresh.
Re:Wut? (Score:5, Insightful)
For power-sensitive situations, being able to modify the refresh rate can reduce power consumption, especially if combined with 'panel self refresh', also part of the embedded DisplayPort standard. That calls for the panel driver to have enough RAM to store the entire frame, so that the display driver and DP link can be shut down entirely while a static image is being displayed, waking back up only when the image needs to change.
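As a rough sketch of that idea (hypothetical names; real eDP panel self refresh is negotiated through DPCD registers, not a Python class):

    import hashlib

    class PanelSelfRefresh:
        def __init__(self):
            self.last_digest = None
            self.link_active = True   # DP link powered up

        def submit_frame(self, framebuffer):
            digest = hashlib.sha256(framebuffer).digest()
            if digest == self.last_digest:
                # Static image: the panel keeps refreshing itself from
                # its own RAM, so the link can power down.
                self.link_active = False
            else:
                # Image changed: wake the link and push the new frame.
                self.link_active = True
                self.last_digest = digest

    psr = PanelSelfRefresh()
    frame = bytes(16)                  # stand-in for real frame data
    psr.submit_frame(frame)
    print(psr.link_active)             # True: first frame goes out
    psr.submit_frame(frame)
    print(psr.link_active)             # False: unchanged, link sleeps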
The other advantage (and the one that Nvidia would be shooting for) is that it allows you to avoid the 'tearing' you get when your GPU's frame rate differs from your panel's refresh rate and you end up with part of one frame and part of the next drawn on the panel at the same time. If you can change the panel refresh rate, you can ensure that it refreshes when, and only when, the GPU has a new frame ready. (Obviously, you can't push the panel above a certain refresh rate, even if the GPU is doing something simple and could spit out hundreds or thousands of FPS; but it's a lot easier to cap at X FPS than it is to ensure that the framerate never dips under heavy load.)
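A minimal sketch of that pacing rule, assuming a 100 Hz panel like the one found in the G751 (the numbers and function name are illustrative):

    MAX_REFRESH_HZ = 100.0
    MIN_INTERVAL = 1.0 / MAX_REFRESH_HZ

    def next_refresh(frame_ready_at, last_refresh_at):
        # Refresh as soon as the new frame is ready, but never faster
        # than the panel's maximum refresh rate allows.
        return max(frame_ready_at, last_refresh_at + MIN_INTERVAL)

    last = 0.0
    for ready in (0.005, 0.030, 0.031):   # GPU finishes frames here
        last = next_refresh(ready, last)
        print("frame ready at %.3fs -> refresh at %.3fs" % (ready, last))

The first and third frames arrive too quickly and get held to the 100 Hz cap; the second is displayed the moment it is ready.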
Mod parent up (Score:1)
You're completely correct, but I'd like to cite one situation where screen tearing is a huge problem: the accurate preservation of old arcade games and computer systems.
Because arcade machines usually had wildly varying resolutions from game to game, it was actually pretty unusual for one to have a fixed refresh rate of 60Hz. Many of them had a vertical refresh rate in the neighborhood of 53-59Hz. Some of them had refresh rates slightly above 60Hz, such as Pac-Man, which had a refresh rate of
Re: (Score:3)
well, there are some other hacked drivers for desktop, however,
which provide the benefits of G-Sync without G-Sync hardware. Indeed, it seems G-Sync just provides (a) a certification that the monitor can handle the DisplayPort extensions being used, and (b) proof that the monitor manufacturer paid for it.
and the choice of the specific FPGA chip was for security: the chip does shit-all _nothing_ to enable "G-Sync" on the monitor except provide authentication to the drivers that it's there and that it's safe to use the extensions, basically.
what i
Re: (Score:2)
This clearly, um, incentivises innovation and, er, stuff. It's too bad, really. I could really use a GPU with better thermal efficiency; but Nvidia are being such a bunch of dicks t
Re: (Score:2)
On the plus side, with 'DisplayID' replacing legacy EDID,
Well, that could be good; EDID is fucking awful and has been a sticking point for Linux since forever. Is DisplayID also fucking awful?
Re: (Score:2)
Not only is it awful, but no one uses it :) DP monitors still use EDID :)
Re: (Score:2)
1. The FPGA *was* required for the tech to work on the desktop panels it was installed in.
2. FreeSync (as I've witnessed so far), as well as the most recent adaptive sync demos, cannot achieve the same result across as wide a refresh rate range as G-Sync currently can.
3. Nvidia could 'make it work', but it would not be the same experience as can be had with a G-Sync module, even with an adaptive sync panel (as evidenced by how the adaptive sync panel in this laptop intermittently blanks out at 30 FPS or when
Re: (Score:2)
1. Actually, the current claim is that the FPGA is mostly for DRM, as it's basically a DRM wrapper around adaptive sync, which is what G-Sync appears to be. According to the guy behind this discovery, at least.
2. Factually incorrect. G-Sync is in fact inferior to adaptive sync in refresh rate range in the current implementation. The G-Sync range is 30-144 Hz, whereas adaptive sync can handle (depending on the scaler) 36-240 Hz, 21-144 Hz, 17-120 Hz and 9-60 Hz.
Source: http://www.geforce.com/hardwar... [geforce.com]
http://www.anandtech.com/sho [anandtech.com]
Re: (Score:2)
1. That is a false claim - Gamenab didn't even cite the correct FPGA model when he made that DRM claim.
2. G-Sync is actually good down to 1 FPS: it adaptively inserts additional redraws in between frames at rates below 30, so as to minimize the possibility of judder (an incoming frame during an already started panel refresh pass). FreeSync (in its most recently demoed form) reverts back to the VSYNC setting at the low end. Further, you are basing the high end of G-Sync only on the currently released panels. Noth
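For what it's worth, the redraw-insertion trick described above can be sketched like this (panel limits are assumed for illustration; Nvidia hasn't published the actual algorithm):

    PANEL_MIN_HZ = 30.0    # assumed panel minimum
    PANEL_MAX_HZ = 144.0   # assumed panel maximum

    def redraws_per_frame(source_fps):
        # Smallest repeat count that lifts the effective refresh rate
        # back above the panel's minimum.
        n = 1
        while source_fps * n < PANEL_MIN_HZ:
            n += 1
        return n

    for fps in (45, 25, 10, 1):
        n = redraws_per_frame(fps)
        print("%g FPS -> draw each frame %dx -> panel at %g Hz"
              % (fps, n, fps * n))

So a 10 FPS source gets each frame drawn three times and the panel stays at 30 Hz, and even a 1 FPS source keeps the panel inside its supported range.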
Not a huge surprise... (Score:2)
It's Nvidia FREESYNC (Score:3)
http://gamenab.net/2015/01/26/... [gamenab.net]
Sure, it's ASUS releasing the driver, suure
Re: (Score:2)
Unfortunately, that site already got nvidiadotted....
Re: (Score:2)
Gamenab stumbled across the leaked driver and tried to use it to spread a bunch of conspiracy theory FUD. I hope most people here can correctly apply Occam's razor, as opposed to believing the alternative: that he supposedly designed those changes himself, that they then made it into an internal driver build that was inadvertently leaked, and that the leak just happened to apply to the exact laptop he already owned.
ExtremeTech picked apart his BS in more detail: http://www.extremetech.com/ext... [extremetech.com]
G-WHAT? (Score:1)
Is G-SYNC some new kind of graphics card heat sink that runs at 100Hz?