Nvidia Is Ditching Dedicated G-Sync Modules To Push Back Against FreeSync's Ubiquity (arstechnica.com) 45
An anonymous reader quotes a report from Ars Technica, written by Andrew Cunningham: Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display's refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today. The issue for Nvidia is that G-Sync isn't what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside of displays, increasing the costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync's most important features without requiring extra hardware. Nvidia more or less acknowledged that the free-to-use, cheap-to-implement VRR technologies had won in 2019 when it announced its "G-Sync Compatible" certification tier for FreeSync monitors. The list of G-Sync Compatible screens now vastly outnumbers the list of G-Sync and G-Sync Ultimate screens.
Today, Nvidia is announcing a change that's meant to keep G-Sync alive as its own separate technology while eliminating the requirement for expensive additional hardware. Nvidia says it's partnering with chipmaker MediaTek to build G-Sync capabilities directly into scaler chips that MediaTek is creating for upcoming monitors. G-Sync modules ordinarily replace these scaler chips, but they're entirely separate boards with expensive FPGA chips and dedicated RAM. These new MediaTek scalers will support all the same features that current dedicated G-Sync modules do. Nvidia says that three G-Sync monitors with MediaTek scaler chips inside will launch "later this year": the Asus ROG Swift PG27AQNR, the Acer Predator XB273U F5, and the AOC AGON PRO AG276QSG2. These are all 27-inch 1440p displays with maximum refresh rates of 360 Hz.
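To make the variable-refresh idea in the summary concrete, here is a minimal sketch comparing fixed-refresh vsync with a VRR display. It is not Nvidia's or AMD's implementation; the refresh rates and frame times are made-up numbers chosen purely for illustration.

```python
# Minimal sketch: fixed-refresh vsync vs. variable refresh rate (VRR).
# All numbers are illustrative assumptions, not any vendor's implementation.
import math

def present(frame_times_ms, mode, refresh_hz=144, vrr_max_hz=360):
    """Return (ready, shown) times in ms for each rendered frame."""
    tick = 1000.0 / refresh_hz        # fixed refresh interval
    floor = 1000.0 / vrr_max_hz       # fastest a VRR panel can refresh
    out, ready, last = [], 0.0, 0.0
    for ft in frame_times_ms:
        ready += ft                                   # GPU finishes the frame
        if mode == "vsync":
            # Wait for the next fixed tick that is also after the previous flip.
            shown = max(math.ceil(ready / tick) * tick, last + tick)
        else:
            # VRR: flip as soon as the frame is ready, within the panel's limits.
            shown = max(ready, last + floor)
        out.append((ready, shown))
        last = shown
    return out

frames = [6.5, 7.5, 6.5, 7.5, 6.5, 7.5]   # ~133-154 fps, straddling the 144 Hz tick
for mode in ("vsync", "vrr"):
    times = present(frames, mode)
    gaps = [round(b[1] - a[1], 1) for a, b in zip(times, times[1:])]
    lag = [round(shown - ready, 1) for ready, shown in times]
    print(f"{mode}: frame-to-frame gaps {gaps}, render-to-screen lag {lag}")
```

Under these made-up numbers, the fixed-refresh path shows a visible hitch and a steady few milliseconds of render-to-screen lag, while the VRR path scans each frame out as soon as it finishes.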
I will wait some more, then (Score:2)
My current monitor (Samsung 34C890H - yes, I know, terrible name) is starting to act up. When cold booting after being off for a few hours, there's a band of horizontal lines at the bottom which gradually disappears.
I have been looking at monitors for the last few months, but have yet to find a good one at a reasonable price.
4K is a must, no bigger than 34" diagonal, G-Sync preferred (G-Sync compatible, maybe), IPS display or better (current is VA and is decent), 144 Hz preferable.
Options were slim to none here
Re: (Score:2)
I bought the AOC AG324UX 2 years ago for ~€1,000, which seems to tick all your boxes apart from G-Sync. It's FreeSync Premium certified, which is basically the same thing.
The main selling point for me was the USB-C KVM with 90W power delivery, which allows me to quickly plug in my work laptop when working from home.
Re:I will wait some more, then (Score:4, Funny)
Careful - a very vocal minority here gets triggered by any mention of AOC whatsoever.
Re: (Score:2)
I don't care about brand names. It could be anything, as long as it passes testing.
If not, back to the store it goes.
Re: (Score:2)
I think it was a joke about Alexandria Ocasio-Cortez...
Re: (Score:2)
You think correctly!
Re: (Score:2)
Not the same AOC :P
Re: (Score:2)
Re: (Score:2)
Thank you, it's almost perfect.
I've been referring to the nVidia compatibility matrix (https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/) and that monitor is not there.
I also looked at DisplayNinja's list and the monitor is there, but not tested.
Have you tested the monitor using nVidia's G-Sync Pendulum test?
The USB-C Power Delivery is certainly desirable; I have the same thing on my Samsung monitor and power one of my laptops that way, with the extended display feature as a bonus.
Re: (Score:2)
I don't have any Nvidia card in my build, unfortunately; I ditched them a long time ago, mostly for Linux compatibility.
Re: (Score:2)
Understood.
That's the problem with "compatible" labels. They look like they work, in theory, until they don't, in practice.
And I'm too old and jaded to spend endless hours trying to debug this or that incompatibility or behavior. I'd rather pay extra for the guarantee it would work.
Re: (Score:2)
So... with this piece of news, I hope my monitor will last for one more year or so. Then we'll see.
There are many G-Sync Compatible options available now.
The only thing that's happening here is that Nvidia is going to officially bless the tech that's already being used by compatible monitors in the market right now. No new tech is going to be available in a year. You may get a new sticker on an existing monitor though.
Re: (Score:2)
" there's a horizontal lines band at the bottom which gradually disappears."
The edge ICs are starting to come loose and only remake contact once the display heats up. They need to be re-bonded properly (and the alignment to do that requires a very, very high degree of precision).
Re: (Score:2)
Yeah, not going to bother.
When it becomes unusable, off to recycling it goes.
Nvidia finally creates ASIC for expensive feature. (Score:4, Interesting)
I'm amazed it's taken this long for them to make the shift, and that they announced it like this instead of the feature just 'magically' appearing in most monitors at the same cost as without it (or close to it).
FPGAs aren't for production products... they're for prototypes, or things you need to change often, even at the customer's premises (I don't think G-Sync needs that, but...). The Killer NIC used them, if that helps point at a product pattern. They're always going to be more expensive, use more power, etc.
Guess it wasn't a priority for them to optimize it, and sell more. Or the 'free' alternatives scared them off from spending the effort/money on it before now. Of course, AI has been a valuable direction to focus on instead.
Guess we'll have to see whether the governments' attempts to force them to allow external access to CUDA (or some other way to increase 'AI competition') go anywhere. Like that group with Intel hoping to reimplement the CUDA API for their parts too (for AI inference tasks only). I've only seen a couple of projects attempt something like that (WINE, for instance), and I assume there are 'reasons' why (I'm not a lawyer, etc.).
While I agree NVIDIA spent a lot of effort and time on CUDA, I'm also a fan of competition, and I believe there needs to be a balance there somehow. Companies shouldn't need monopolies to function, and customers always end up paying more with them.
Re: (Score:2)
NV wants to sell graphics cards, not chips for your monitor.
NV wants money, nothing more, nothing less. G-Sync was a very easy way to not only do that (trivial to manufacture, easy to bill) but also tie it into selling more graphics cards.
Re: (Score:2)
They don't use FPGAs anymore. They did for a while (certainly early modules did), but transitioned to ASICs.
It was kind of silly, they used huge FPGAs that had single-unit MSRPs typically several times higher than the monitors themselves. It actually made financial sense for people who needed small quantities of the FPGAs to just buy a bunch of G-Sync monitors, rip the FPGA out, and throw away the rest of the monitor. Doing so was a big cost saving over buying the FPGAs in anything but large quantities.
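For illustration of the economics described above, here is a back-of-the-envelope sketch; every price in it is hypothetical, and no real FPGA or monitor pricing is implied.

```python
# Illustration of the "buy the monitor, harvest the FPGA" economics described above.
# All prices are made-up assumptions, not real FPGA or monitor pricing.
fpga_single_unit = 2500.0    # hypothetical one-off price of a large FPGA
fpga_volume = 600.0          # hypothetical price at large order quantities
gsync_monitor = 800.0        # hypothetical street price of a G-Sync monitor

saving_vs_single = fpga_single_unit - gsync_monitor   # per chip, small quantities
penalty_vs_volume = gsync_monitor - fpga_volume       # per chip, large quantities

print(f"Harvesting the FPGA from a monitor saves {saving_vs_single:.0f} per chip "
      f"versus single-unit pricing,")
print(f"but costs {penalty_vs_volume:.0f} more per chip than volume pricing, "
      f"so it only makes sense for small quantities.")
```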
Re: (Score:2)
I wish nvidia would just support freesync and work to improve it instead of insisting on a proprietary feature nobody wants. Same with OpenCL vs CUDA, OpenGL/Vulkan vs DirectX, etc. I refuse to support nvidia as well; it's dumb and frustrating.
1440p? (Score:2)
I would have guessed everything new would be at least 2160p.
Is 300Hz really worth it for gaming when resolution could be so much better?
1440p was what I got on a laptop in 2009.
You can still easily find 1080p monitors (Score:2)
Re: (Score:2)
There is NO CHANCE AT ALL that a modern 3D game will run anywhere near 300FPS at 4K+ with all settings on high. It's just not a thing.
But when you back off the resolution to 1080, or even 720.... but still have one of those "4K" GPUs....
Nonetheless, I think an old CRT at 1600x1200, which probably tops out at 75 Hz, is the best possible experience.
Re: (Score:2)
"There is NO CHANCE AT ALL that a modern 3D game will run anywhere near 300FPS at 4K+ with all settings on high. Its just not a thing."
So? I just want 300FPS for Quake 3 so I can lock the engine to that and SPEEEEEEEEEEED, and for fucking once have a monitor that can keep the fuck up with me.
Re: (Score:2)
Re: (Score:2)
300FPS was doable at high resolutions, and over 1000FPS was possible, on a GeForce Ti4400.
No, I need a modern monitor that can actually handle the refresh rate.
Re: (Score:2)
Re: (Score:2)
I find text on 4k much clearer. It's not a question of seeing detail, it's the clarity of the letter shapes.
Re: (Score:2)
Re: (Score:2)
Well over half play at 1080p according to the latest Steam hardware survey. 1440p is about a fifth, and 4K is so tiny it's almost irrelevant at 3.65%.
Re: (Score:2)
Re: (Score:2)
I'm more interested in low latency between input and the next frame than in a flat refresh rate number. But it turns out that the time until the next frame can be scanned out and your maximum refresh rate are related metrics.
So yes. 120, 240 and 300 Hz are all really nice features to have in a variable refresh rate monitor when your graphics stack can make use of it.
Alternatively, you can run at 2160p with vsync disabled and things will run great. The input latency will still be somewhat low if you can push 4K.
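A quick worked example of the relationship mentioned above: at a given refresh rate, a frame that just misses a refresh can wait up to one full refresh interval before it is scanned out (a simplified model that ignores panel response time).

```python
# Worst-case extra wait before the next scanout at common refresh rates.
for hz in (60, 120, 144, 240, 360):
    interval_ms = 1000.0 / hz
    print(f"{hz:3d} Hz: refresh interval {interval_ms:5.2f} ms "
          f"(worst-case extra wait before the next scanout)")
```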
Re: (Score:2)
Latency is what kills most emulation for me. Between the latency of the screen, poor LCD motion quality, and latency from input devices, it just feels off.
Shame because FreeSync is great for emulation.
Re: (Score:2)
Totally. I have never been satisfied with arcade emulation. Bullet hell games like 1942 feel slightly different, and instead of being a challenge of skill it feels more like lots of cheap shots and unfair random luck.
Re: (Score:2)
Is 300Hz really worth it for gaming when resolution could be so much better?
If you are a competitive FPS player, and if it's coupled with very low latency, maybe.
1440p was what I got on a laptop in 2009.
It's about frame rates. Most people don't have GPUs that can drive 60fps at higher than 1440p. See:
https://store.steampowered.com... [steampowered.com]
~95% play at 1440p or lower.
Re: (Score:2)
Clearly GPU performance at 1440p doesn't matter even a tiny little bit to the majority. It's never a question, so it certainly ain't a motivation.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
2160p is really expensive, rendering-wise, and not much visually better than 1440p.
Even as GPUs have gotten faster and could easily render 2009 content at 2160p, the content has ramped up the geometry and textures and lighting to more than make up for the extra GPU capacity.
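As a rough illustration of the rendering-cost gap described above, here are the raw pixel counts; pixel count is only a first-order proxy for GPU cost.

```python
# Raw pixel counts behind the 1440p vs. 2160p rendering-cost comparison.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
base = resolutions["1440p"][0] * resolutions["1440p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>10,} pixels ({pixels / base:.2f}x 1440p)")
```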
Is it any better than FreeSync? (Score:2)
Why are they relying on hardware additions when clearly monitor firmware/software can already 'sync' to variable frame rates?
Re: (Score:2)
Why are they relying on hardware additions when clearly monitor firmware/software can already 'sync' to variable frame rates?
That's the whole post. They aren't going to rely on hardware any longer, and are going to bless software solutions.
Today, Nvidia is announcing a change that's meant to keep G-Sync alive as its own separate technology while eliminating the requirement for expensive additional hardware.
Re: (Score:2)
Re: (Score:3)
Nvidia cards do support FreeSync. But G-Sync has additional features that require communication outside of the standards.
Things like their low-latency mode, which reports back when the display is drawing the frame, or Pulsar, which is a 1000Hz backlight strobe that syncs to VRR. Neither of these is supported in the FreeSync standard.