Displays

Nvidia Is Ditching Dedicated G-Sync Modules To Push Back Against FreeSync's Ubiquity (arstechnica.com)

An anonymous reader quotes a report from Ars Technica, written by Andrew Cunningham: Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display's refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today. The issue for Nvidia is that G-Sync isn't what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside of displays, increasing the costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync's most important features without requiring extra hardware. Nvidia more or less acknowledged that the free-to-use, cheap-to-implement VRR technologies had won in 2019 when it announced its "G-Sync Compatible" certification tier for FreeSync monitors. The list of G-Sync Compatible screens now vastly outnumbers the list of G-Sync and G-Sync Ultimate screens.

Today, Nvidia is announcing a change that's meant to keep G-Sync alive as its own separate technology while eliminating the requirement for expensive additional hardware. Nvidia says it's partnering with chipmaker MediaTek to build G-Sync capabilities directly into scaler chips that MediaTek is creating for upcoming monitors. G-Sync modules ordinarily replace these scaler chips, but they're entirely separate boards with expensive FPGA chips and dedicated RAM. These new MediaTek scalers will support all the same features that current dedicated G-Sync modules do. Nvidia says that three G-Sync monitors with MediaTek scaler chips inside will launch "later this year": the Asus ROG Swift PG27AQNR, the Acer Predator XB273U F5, and the AOC AGON PRO AG276QSG2. These are all 27-inch 1440p displays with maximum refresh rates of 360 Hz.
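
The underlying idea is easy to sketch: with a fixed-refresh display, a frame that finishes rendering just after a scanout has begun waits for the next tick, while a VRR display starts its next refresh when the frame is ready, within the panel's supported range. Below is a minimal Python sketch of that timing difference; the 48-360 Hz VRR window and the frame times are made-up illustration values, not figures from Nvidia or the monitors above.

    # Toy timing model: when does a finished frame actually reach the screen?
    FIXED_HZ = 60.0                       # conventional fixed-refresh display
    VRR_MIN_HZ, VRR_MAX_HZ = 48.0, 360.0  # assumed variable-refresh window

    def fixed_display_time(ready_ms):
        """Fixed refresh: the frame waits for the next scanout tick."""
        interval = 1000.0 / FIXED_HZ
        return (int(ready_ms // interval) + 1) * interval

    def vrr_display_time(ready_ms, prev_scanout_ms):
        """VRR: scan out as soon as the frame is ready, clamped to the panel's
        minimum and maximum refresh intervals."""
        earliest = prev_scanout_ms + 1000.0 / VRR_MAX_HZ
        latest = prev_scanout_ms + 1000.0 / VRR_MIN_HZ
        return min(max(ready_ms, earliest), latest)

    prev = 0.0
    for ready in (12.0, 27.0, 39.0, 55.0):   # irregular frame completion times (ms)
        shown_vrr = vrr_display_time(ready, prev)
        print(f"frame ready at {ready:4.1f} ms -> fixed 60 Hz shows it at "
              f"{fixed_display_time(ready):4.1f} ms, VRR shows it at {shown_vrr:4.1f} ms")
        prev = shown_vrr

In this toy run the fixed 60 Hz display holds each frame back to the next 16.7 ms tick, producing uneven on-screen pacing, while the VRR display shows each frame as soon as it is done.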

  • My current monitor (Samsung 34C890H - yes, I know, terrible name) is starting to act up. When cold booting after being off for a few hours, there's a band of horizontal lines at the bottom that gradually disappears.
    I have been looking at monitors for the last few months, but have yet to find a good one that's reasonably priced.
    4K is a must, no bigger than 34" diagonal, G-Sync preferred (G-Sync Compatible, maybe), IPS display or better (current is VA and is decent), 144 Hz preferred.
    Options were slim to none here.

    • by r1348 ( 2567295 )

      I bought the AOC AG324UX two years ago for ~€1,000, which seems to tick all your boxes apart from G-Sync. It's FreeSync Premium certified, which is basically the same thing.

      The main selling point for me was the USB-C KVM with 90W power delivery, which allows me to quickly plug in my work laptop when working from home.

      • by 93 Escort Wagon ( 326346 ) on Tuesday August 20, 2024 @07:08PM (#64722182)

        Careful - a very vocal minority here gets triggered by any mention of AOC whatsoever.

      • Thank you, it's almost perfect.
        I've been referring to the Nvidia compatibility matrix (https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/) and that monitor is not there.
        I also looked at DisplayNinja's list and the monitor is there, but not tested.

        Have you tested the monitor using Nvidia's G-Sync Pendulum demo?

        The USB-C Power Delivery is certainly desirable. I have the same thing on my Samsung monitor, and I power one of my laptops that way, with the extended display as a bonus.

        • by r1348 ( 2567295 )

          I don't have any Nvidia card in my build, unfortunately; I ditched them a long time ago, mostly for Linux compatibility.

          • Understood.
            That's the problem with "compatible" labels. They look like they work, in theory, until they don't, in practice.
            And I'm too old and jaded to spend endless hours trying to debug this or that incompatibility or behavior. I'd rather pay extra for the guarantee it would work.

    • So... with this piece of news, I hope my monitor will last for one more year or so. Then we'll see.

      There are many G-Sync Compatible options available now.

      The only thing that's happening here is that Nvidia is going to officially bless the tech that's already being used by compatible monitors in the market right now. No new tech is going to be available in a year. You may get a new sticker on an existing monitor though.

    • by Khyber ( 864651 )

      " there's a horizontal lines band at the bottom which gradually disappears."

      Edge ICs are starting to come undone and remake contact after the TV heats up. They need to be re-glued back down properly (and the alignment to do so requires a very very high degree of precision)

  • by OneOfMany07 ( 4921667 ) on Tuesday August 20, 2024 @04:13PM (#64721796)

    I'm amazed it's taken this long for them to make the shift. And that they announced it like this instead of the feature just 'magically' appearing in most monitors for the same cost as without (or close to).

    FPGAs aren't for production products... they're for prototypes, or things you need to change often, even at the customer's premises (I don't think G-Sync needs that, but...). The Killer NIC used them, if that helps point at a product pattern. They're always going to be more expensive, use more power, etc.

    Guess it wasn't a priority for them to optimize it, and sell more. Or the 'free' alternatives scared them off from spending the effort/money on it before now. Of course, AI has been a valuable direction to focus on instead.

    Guess we'll have to see whether the governments trying to force them to allow external access to CUDA (or some other way to increase 'AI competition') get anywhere. Like that group with Intel hoping to reimplement the CUDA API for their parts too (for AI inference tasks only). I've only seen a couple of projects attempt something like that (Wine, for instance), and I assume there are 'reasons' why (I'm not a lawyer, etc.).

    While I agree NVIDIA spent a lot of effort and time on CUDA, I'm also a fan of competition. And believe there needs to be a balance there somehow. Companies shouldn't need monopolies to function, and customers always end up paying more with them.

    • by Guspaz ( 556486 )

      They don't use FPGAs anymore. They did for a while (certainly early modules did), but transitioned to ASICs.

      It was kind of silly: they used huge FPGAs with single-unit MSRPs typically several times higher than the monitors themselves. It actually made financial sense for people who needed small quantities of the FPGAs to just buy a bunch of G-Sync monitors, rip the FPGA out, and throw away the rest of the monitor. Doing so was a big cost saving over buying the FPGAs in anything but large quantities.

  • I would have guessed everything new would be at least 2160p.

    Is 300 Hz really worth it for gaming when resolution could be so much better?

    1440p was what I got on a laptop in 2009.

    • I prefer them since it's enough space to code on and I game at 1080p. My old man eyes aren't going to see the finer details anyway. And this way I don't end up downscaling from the panel's native resolution.
      • Good on you.

        There is NO CHANCE AT ALL that a modern 3D game will run anywhere near 300 FPS at 4K+ with all settings on high. It's just not a thing.

        But when you back off the resolution to 1080p, or even 720p... but still have one of those "4K" GPUs...

        Nonetheless, I think an old CRT at 1600x1200, which probably tops out at 75 Hz, is the best possible experience.
        • by Khyber ( 864651 )

          "There is NO CHANCE AT ALL that a modern 3D game will run anywhere near 300FPS at 4K+ with all settings on high. Its just not a thing."

          So? I just want 300FPS for Quake 3 so I can lock the engine to that and SPEEEEEEEEEEED, and for fucking once have a monitor that can keep the fuck up with me.

        • CRT strobing means excellent motion clarity at any frame rate. Below 90 Hz the flicker becomes obnoxious, though.
      • by AmiMoJo ( 196126 )

        I find text on 4K much clearer. It's not a question of seeing detail, it's the clarity of the letter shapes.

    • by Luckyo ( 1726890 )

      Well over half play at 1080p according to the latest Steam hardware survey. 1440p is about a fifth, and 4K is so tiny it's almost irrelevant at 3.65%.

      • I have a 4K 120 Hz TV plugged into my PC. I almost never game on it; the 1080p gaming monitor is just better.
    • I'm more interested in low latency between input and the next frame than in a flat refresh rate number. But it turns out that the time to scan out the next frame and your maximum refresh rate are related metrics.

      So yes, 120, 240, and 300 Hz are all really nice features to have in a variable refresh rate monitor when your graphics stack can make use of them.

      Alternatively, you can run at 2160p with vsync disabled and things will run great. The input latency will still be somewhat low if you can push 4K.
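
      Rough numbers for that relationship (illustrative arithmetic only, ignoring pixel response and any display-side processing): with a fixed refresh, a frame that just misses a scanout waits up to one full refresh interval, and the scan itself takes roughly another interval to reach the bottom of the screen. The loop below just prints those bounds; VRR largely removes the waiting term as long as the frame rate stays inside the panel's range.

          # Back-of-the-envelope latency bounds per refresh rate (toy numbers)
          for hz in (60, 120, 240, 300, 360):
              interval_ms = 1000.0 / hz      # time between scanouts
              worst_wait = interval_ms       # frame finished just after a scanout started
              scanout = interval_ms          # top-to-bottom scan takes about one interval
              print(f"{hz:3d} Hz: interval {interval_ms:5.2f} ms, "
                    f"worst-case ready-to-bottom-of-screen ~{worst_wait + scanout:5.2f} ms")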

      • by AmiMoJo ( 196126 )

        Latency is what kills most emulation for me. Between the latency of the screen, poor LCD motion quality, and latency from input devices, it just feels off.

        Shame because FreeSync is great for emulation.

        • Totally. I have never been satisfied with arcade emulation. Bullet hell games like 1942 feel slightly different, and instead of being a challenge of skill it feels more like lots of cheap shots and unfair random luck.

    • Is 300 Hz really worth it for gaming when resolution could be so much better?

      If you are a competitive FPS player, and if it's coupled with very low latency, maybe.

      1440p was what I got on a laptop in 2009.

      It's about frame rates. Most people don't have GPUs that can drive 60 fps at higher than 1440p. See:
      https://store.steampowered.com... [steampowered.com]

      ~95% play at 1440p or lower.

      • 57% have 1920x1080 as their primary desktop resolution.

        Clearly GPU performance at 1440p doesn't matter even a tiny little bit to the majority. It's never a question, so it certainly ain't a motivation.
      • The motion blur from looking around in an FPS game even at 120 Hz makes me feel sick. Good strobing in VR games spoiled me; it'll be a must-have feature for my next gaming monitor. Alternatively, frame rates in the ballpark of 1,000 Hz could work, but since games would need to run that fast as well, I don't think that's going to happen.
    • For things in motion, without strobing, much more blur comes from the frames being far apart temporally than from 1080p being too low a resolution. 1080p makes tons of sense when anything moves, if it enables higher frame rates.
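
      A rough way to put numbers on that (illustrative values only, using the standard sample-and-hold approximation): on a non-strobed display, an object your eye is tracking smears across roughly its on-screen speed times the frame time, regardless of panel resolution.

          # Approximate persistence blur on a sample-and-hold display:
          # blur width ~= tracking speed (px/s) * frame time (s).
          speed_px_per_s = 1920   # e.g. an object crossing a 1080p-wide screen in one second
          for hz in (60, 120, 240, 360):
              blur_px = speed_px_per_s / hz
              print(f"{hz:3d} Hz: ~{blur_px:5.1f} px of motion blur")

      At 60 Hz that works out to about 32 px of smear, far larger than the pixel-size difference between 1080p and 1440p; at 240-360 Hz it drops to single digits.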
    • by Junta ( 36770 )

      2160p is really expensive, rendering-wise, and not much visually better than 1440p.

      Even as GPUs have gotten faster and could easily render 2009 content at 2160p, the content has ramped up the geometry and textures and lighting to more than make up for the extra GPU capacity.

  • Why not just be compatible with FreeSync? Even this old 75 Hz monitor from 2016 has FreeSync support.

    Why are they relying on hardware additions when clearly monitor firmware/software can already 'sync' to variable frame rates?
    • Why are they relying on hardware additions when clearly monitor firmware/software can already 'sync' to variable frame rates?

      That's the whole post. They aren't going to rely on hardware any longer, and are going to bless software solutions.

      Today, Nvidia is announcing a change that's meant to keep G-Sync alive as its own separate technology while eliminating the requirement for expensive additional hardware.

      • That's not at all what it says. Instead of it being their own scaler hardware (the G-Sync module), they're getting it built into MediaTek's scalers, which the G-Sync module previously stood in for.
    • Nvidia cards do support FreeSync. But G-Sync has additional features that require communication outside of the standards.

      Things like their low-latency mode, which reports back when the screen is drawing. Or Pulsar, which is a 1,000 Hz backlight strobe that syncs with VRR. Neither of these is supported in the FreeSync standard.
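
      As a conceptual sketch only (not Nvidia's actual Pulsar algorithm; the pulse width, settle time, and refresh intervals below are made-up values), the tricky part of combining strobing with VRR is that the backlight pulse has to be re-timed every single frame, because the refresh interval keeps changing:

          # Toy model: fire one short backlight pulse per refresh, after the
          # pixels have had time to settle, using a predicted refresh interval.
          PULSE_MS = 0.5    # assumed strobe width
          SETTLE_MS = 1.5   # assumed pixel-transition time to keep dark

          def pulse_time(scanout_ms, predicted_interval_ms):
              """Schedule the pulse late in the predicted refresh, but never
              before the panel has settled."""
              latest_sensible = scanout_ms + predicted_interval_ms - PULSE_MS
              return max(scanout_ms + SETTLE_MS, latest_sensible)

          t = 0.0
          for interval in (4.2, 6.9, 3.1, 5.5):   # variable refresh intervals (ms)
              print(f"refresh at {t:5.1f} ms, predicted interval {interval:.1f} ms: "
                    f"strobe at {pulse_time(t, interval):5.1f} ms")
              t += interval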

"Mach was the greatest intellectual fraud in the last ten years." "What about X?" "I said `intellectual'." ;login, 9/1990

Working...