AMD Stops Certifying Monitors, TVs Under 144 Hz For FreeSync (arstechnica.com) 49

An anonymous reader quotes a report from Ars Technica: AMD announced this week that it has ceased FreeSync certification for monitors or TVs whose maximum refresh rates are under 144 Hz. Previously, FreeSync monitors and TVs could have refresh rates as low as 60 Hz, allowing for screens with lower price tags and ones not targeted at serious gaming to carry the variable refresh-rate technology. AMD also boosted the refresh-rate requirements for its higher AdaptiveSync tiers, FreeSync Premium and FreeSync Premium Pro, from 120 Hz to 200 Hz.

The new minimum refresh-rate requirements for FreeSync haven't changed for laptops, and AMD will continue supporting already-certified FreeSync displays even if they don't meet the new requirements. Interestingly, AMD's minimum refresh-rate requirement for TVs goes beyond 120 Hz, which is where many premium TVs currently max out, given that the current-generation Xbox and PlayStation support maximum refresh rates of 120 frames per second (FPS). Announcing the changes this week in a blog post, Oguzhan Andic, AMD FreeSync and Radeon product marketing manager, claimed that the changes were necessary, noting that 60 Hz is no longer "considered great for gaming." Andic wrote that the majority of gaming monitors are now 144 Hz or higher, compared to 2015, when FreeSync debuted and even 120 Hz was "a rarity."

  • Complete nonsense (Score:4, Insightful)

    by gweihir ( 88907 ) on Friday March 08, 2024 @05:30PM (#64301367)

    60Hz is perfectly fine for most gaming. It is even ok for shooters, as most players are kidding themselves about any differences higher rates make, although for a few exceptional players a higher FPS rate may make a small difference.

    • Re:Complete nonsense (Score:4, Interesting)

      by Luckyo ( 1726890 ) on Friday March 08, 2024 @05:38PM (#64301379)

      Depends on age. If you feel they're the same, you may have aged out of the reflexes needed to notice the difference. Most young people can easily benefit from going from 60 to 144. Most older people can as well, but some lose their reflexes faster than others. Young, trained players can benefit from going up to at least 240. There's at least one live demonstration video on YouTube showing this:

      https://www.youtube.com/watch?... [youtube.com]

      • That's only an artefact of games tying input processing to the output rate. If they're actually independent, and the frame rate is 60 (not 30, as with another bug), it's impossible for humans to notice a difference.

        • by Luckyo ( 1726890 )

          This is factually false, because the main input the game gives the player is visual. If one player is getting an update every 1/60th of a second and the other every 1/240th of a second, the one at 1/240th has at least the following advantages, even assuming instant input processing for both:

          Faster ability to notice new objects appearing.
          Faster ability to correctly judge the relative speed, range, and acceleration of any object on screen.

          The former is critical for the "who shoots first wins" dynamic that is prevalent in pretty

          • by gweihir ( 88907 )

            Sure, "faster". For a value of "faster" that does not matter for regular people because their reflexes are dog-slow anyways. You are just kidding yourself.

            • by Luckyo ( 1726890 )

              The value of being faster in a shooter is the difference between coming in first or second.

              Reward for first place is victory. Reward for second place is death.

          • Re: (Score:3, Interesting)

            Search for "best human reflexes ever recorded". Average is 250+ms. Exceptional reflexes are ~150 ms. Let's say 100ms to be generous with our example here. Now, let's cut that in half to really emphasize the point here. 50ms.

            60 FPS is 16ms per frame. Given that you can fit three 60 FPS frames in this impossible figure of human reflexes, I think it is safe to say that there is zero advantage to higher FPS. It may just look smoother. The only thing I'd consider is the input processing being different with high

            • by hvdh ( 1447205 )

              Depending on a lot of details, a game event at a random time will be visible to a 120 Hz player 4-16 ms earlier than to a 60 Hz player.
              After seeing it, both players' reflex time starts. For equal reflexes (no matter how slow they are), the 120 Hz player's reaction will also happen 4-16 ms earlier.
              For most types of game logic, the 120 Hz player has an advantage: he shot first, reached the flag first, etc.
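
              For a rough sense of the magnitudes being argued here, below is a minimal sketch (my own illustration, not from the thread) that estimates the average added display delay at each refresh rate, assuming an event lands at a uniformly random moment within a frame and is only shown at the next refresh:

```python
import random

def avg_display_delay_ms(refresh_hz: float, trials: int = 100_000) -> float:
    """Average time (ms) from a randomly timed game event until the next refresh."""
    frame_ms = 1000.0 / refresh_hz
    # The event lands at a uniformly random phase within a frame, so the wait
    # until the next refresh is uniform on [0, frame_ms); its mean is frame_ms / 2.
    return sum(random.uniform(0.0, frame_ms) for _ in range(trials)) / trials

for hz in (60, 120, 240):
    print(f"{hz:>3} Hz: ~{avg_display_delay_ms(hz):.1f} ms average added display delay")
# Roughly 8.3 ms at 60 Hz, 4.2 ms at 120 Hz, and 2.1 ms at 240 Hz on average,
# consistent with the single-digit-millisecond advantage described above.
```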

            • by Luckyo ( 1726890 )

              >I think it is safe to say that there is zero advantage to higher FPS.

              There's a video link in my initial post that features a top tier player, a casual player and someone in between the two extremes.

              All of them objectively gained in success rate when going from 60 frames per second to 240 frames per second. You're not objecting to a hypothesis. You're objecting to proven reality.

              And the reason why reality is the way it is is actually extremely simple. In competitive games that require fast reaction such

              • That is a sponsored marketing video. They are getting paid to convince people that it matters. Firstly, I think it is arguable that it does matter in Counter Strike, since the Source engine is pretty stupid with bullet interpolation. However, any advantage would be coming from the game loop running better, and not what you are seeing on your screen. I'd wager that nobody would be able to tell the difference if the game was running at 240hz but it was motion blurred down to a 60FPS display.

                > For first, yo

                • by Luckyo ( 1726890 )

                  You can't even address the point raised. You continue to claim that #2 place is totally cool because you're only behind by so much; you don't understand how system latency even works and think that what is essentially a metronome test is relevant here; you still can't process that motion tracking matters; and the only opposing point you can generate is "but they're getting paid for making this video".

                  I guess there's nothing left to discuss.

                  • Actually, I do understand how system latency works, and, as stated, the margin of error was determined to be 2 milliseconds, as tested with an auto-clicker. It is not measuring against any set points; it is measuring how well you can space one-second intervals, searching for a "best fit" of your clicks (giving you a further advantage).

    • You can get by with 60 Hz in the overwhelming majority of situations in day-to-day computing and even in gaming.

      Higher refresh rates have a lower frame time, and this can become helpful in scenarios where an engine can't maintain a full 60 Hz refresh. The pathological case of missing every other frame tends to drive the input lag and refresh rate to something more equivalent to 30 Hz, rather than simply stepping down to 59 Hz or whatever you think it ought to be. If your system can't push 60 Hz / 16.67 ms
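
      Below is a minimal sketch (my own, assuming classic double-buffered vsync rather than adaptive sync) of that missed-frame effect: a frame that misses its deadline waits for the next refresh, so the effective rate collapses much harder on a 60 Hz display than on a faster one.

```python
import math

def effective_fps_with_vsync(refresh_hz: float, render_ms: float) -> float:
    """Effective frame rate when every frame takes render_ms under strict vsync."""
    frame_budget_ms = 1000.0 / refresh_hz
    # A finished frame is only shown at the next refresh boundary.
    refreshes_per_frame = math.ceil(render_ms / frame_budget_ms)
    return refresh_hz / refreshes_per_frame

# Rendering at 17 ms per frame, i.e. just missing the ~16.7 ms budget of 60 Hz:
print(effective_fps_with_vsync(60, 17))    # 30.0 -> drops straight to 30 Hz
print(effective_fps_with_vsync(144, 17))   # 48.0 -> a much gentler fall-off
```

      Variable refresh rate, the subject of the article, sidesteps this quantization entirely, which is why it matters most when a game can't hold the display's maximum rate.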

    • I've never liked 60Hz. It used to be standard for CRT monitors to run at 85Hz, and you could clearly see the difference. Of course it's a different technology, but there's a huge difference between 60Hz and something higher. I can't use anything lower than 75Hz personally.

      • 60Hz CRT monitors flickered visibly; I could perceive flicker up to about 85Hz in my peripheral vision. Everything from LCDs onward doesn't.

        • Most LCDs don't "fade" like CRTs do. A CRT has a single electron beam scanning over every pixel of the screen. An LCD has one massive backlight and a display layer in which every pixel has its own "shutter". The pixels fade and turn black on a CRT, hence the flicker and the half images on a camera. LCDs don't.
    • Maybe your eyes are getting old, but while I agree that 60Hz is fine for most games, that's predicated on most games not involving any kind of fast action. 144Hz may be over the top, but Stevie Wonder would prefer gaming on a 120Hz screen over a 60Hz screen. This even dates well back into the CRT days.

      Personally I chase other qualities in a monitor, so mine only does 60Hz, but I don't pretend that any game with any fast movement doesn't benefit from something higher.

    • 60Hz is perfectly fine for most gaming. It is even ok for shooters, as most players are kidding themselves about any differences higher rates make, although for a few exceptional players a higher FPS rate may make a small difference.

      It's about quality. The quality differences between 60hz, 144hz, and 280hz are all noticeable. Yes, I play at 280hz, and occasionally Windows would fuck up and drop to 144hz and I'd notice it instantly upon launching a game. It may or may not make you a better player, but gaming at

    • I agree. I play a lot of FPS games. I normally use a gaming laptop connected to a 60hz monitor. I'm generally happy if I can get a frame rate of 60fps. If I used a desktop, I could see wanting higher frame rates.

      For my laptop screen, I can switch it between 60hz and 240hz. In games, I can't notice a difference. On the Windows desktop, the only way I can tell the difference is if I drag a window around in rapid circles. The 240hz is a bit smoother.

      That being said, when it's time to buy a new monitor,

      • by gweihir ( 88907 )

        The dragging is an interference effect. You can use it to detect higher frequencies with a sensor (your eyes) that cannot actually see them. It does not mean you get any advantage from the higher frequencies.

    • by antdude ( 79039 )

      I can see flickers at 60hz. 75hz is better. ;)

      • by gweihir ( 88907 )

        Nobody can see flickers at 60Hz. The human eye is simply not capable of it. Some people are capable of seeing flicker up to around 30Hz, but they are a small minority. You are likely seeing some interference effect.

        • by antdude ( 79039 )

          Even with old-school CRTs via VGA? Because I could back in those days.

          • by gweihir ( 88907 )

            No, you could not. You can perceive indirectly, via interference, that there is flicker, but you cannot see that flicker above something like 30 Hz. Unless you are talking about interlaced displays; those refresh one half of the screen at 25 or 30Hz, and that is just barely visible for most people.

            • by antdude ( 79039 )

              I don't know if they were interlaced displays. I just remember SVGA resolutions with big CRT monitors via KVMs & VGA connections before I went to flatscreen monitors.

            • a) at 60Hz on a CRT, I could see flicker, but only from the side of my eye. That is, if the monitor were in the "corner of my eye".
              b) regardless of being able to see it, I would get a headache.
              Pretty sure this required resolutions above 640x480. Don't recall if it was for 800x600 or 1024x768. It *has* been over 15 years after all.

              Regardless of any claim you can make that that is impossible...

              Next thing you'll tell me that I couldn't hear the whine of a flyback transformer in a TV or monitor.

              • by gweihir ( 88907 )

                No, you cannot. What you see is an inference effect coming from your eyes "vibrating" and it gives you zero speed advantage. That inference is caused by flicker, but you would need to be able to see the flicker itself to get any speed advantage.

                The "whine" of a flyback transformer is audible to younger people as it is around 15kHz.

                Seriously, get some _basics_.

    • by Targon ( 17348 )
      There is a threshold of around 85Hz that really feels like a big improvement over just 60Hz, but in this day and age, you see 60Hz, 120Hz, 144Hz, and then that 200+ range. Honestly, we need to see another generation from both NVIDIA and AMD. For those who care about frame rate, NVIDIA really relies on DLSS across most of its video card lineup, and in general, AMD is doing decently as long as you don't turn on ray tracing.
      • by gweihir ( 88907 )

        You "see" none of these. The human Eye is limited to about 20Hz on average with some (few) people going as high as 30Hz. That was the reason for the original rate of 24 FPS for movies.

        What you can see is inference effects. While these may increase or decrease your sense of quality, they do _not_ influence your reaction speed.

        • by mrbax ( 445562 )

          The human eye is limited to about 20Hz on average, with some (few) people going as high as 30Hz.

          The human eye can detect flicker at 50–90 Hz, but reports show it may be possible to distinguish between steady and modulated light up to 500 Hz. [nih.gov]

          What you can see is inference effects.

          Presumably you mean "interference effects", but what you refer to is actually a sampling effect in time (aliasing). This is a different (but related) issue to flicker.

          • by gweihir ( 88907 )

            I mean interference. And that is how the human eye can _detect_ flicker, but it cannot actually see flicker. In order to get any speed advantage from faster displays, it would need to be able to see flicker. It cannot.

            • by mrbax ( 445562 )
              You are wrong.
              • 1. I cited scientific evidence to the contrary.
              • 2. There is nothing to "interfere" with.
              • 3. I can see 60 Hz flicker. Consciously.
              • by gweihir ( 88907 )

                Nope. You just have no clue how things work.
                1. You did not understand what you quoted
                2. Sure there is, your eyes are "vibrating" and that causes interference. You could not even really see without that vibration.
                3. You can see the flicker. You cannot see the individual images. And only that would give you speed advantages.

                • by mrbax ( 445562 )

                  You just have no clue how things work.

                  Oh really. I am an imaging scientist with a PhD. What are you?

                  1. You did not understand what you quoted

                  Garbage. You are out of your depth.

                  2. Sure there is, your eyes are "vibrating" and that causes interference.

                  Nonsense. You clearly understand neither human vision nor sampling theory.

                  You could not even really see without that vibration.

                  What on earth are you talking about? Try using the correct scientific terms instead of "gobbledygook".

                  Nobody can see flicker

    • Just because YOU can't tell the difference between 120 FPS and 60 FPS does NOT mean 60 FPS is "fine". For me 120 is silky smooth, 60 is NOT. Instead of speaking for everyone you should ONLY be speaking for yourself: "I found 60Hz is perfectly fine for most gaming but YMMV."

      The sweet spot is around ~100 Hz, where diminishing returns kick in. I discovered this back in the late 90's when I had a CRT monitor and noticed that when looking at it from the corner of my eye it would flicker at 60 Hz but not at 100 H

  • Good (Score:3, Insightful)

    by gotamd ( 903351 ) on Friday March 08, 2024 @05:40PM (#64301383)
    This has been a long time coming. Companies have been throwing FreeSync labels on cheap, 60hz panels for a while and selling them as "gaming" monitors. 120hz is a much better baseline these days.
    • Honestly, I was surprised they raised the minimum to 144Hz instead of 120Hz, because for broad-spectrum use outside of purely twitch-reflex gaming, 120Hz is WAY better: it can do 24Hz, 30Hz, and 60Hz without any odd frame-doubling issues, so it works far better than 144Hz for watching movies and shows, and it avoids that weird 'this cutscene feels particularly OFF' effect that can happen when a game has its cutscenes locked to 30Hz but keeps rendering at the vsync rate.
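
      To make that frame-doubling arithmetic concrete, here is a minimal sketch (my own, not from the comment) checking which common content cadences divide evenly into each refresh rate; an uneven division means some frames must be held for more refreshes than others (judder) whenever VRR isn't active:

```python
CONTENT_RATES = (24, 30, 60)  # common film, TV, and console cadences (fps)

def repeats_evenly(refresh_hz: int, content_fps: int) -> bool:
    """True if every content frame can be shown for the same number of refreshes."""
    return refresh_hz % content_fps == 0

for refresh in (60, 120, 144):
    evenly = [fps for fps in CONTENT_RATES if repeats_evenly(refresh, fps)]
    print(f"{refresh} Hz divides evenly by: {evenly}")
# 60 Hz  -> [30, 60]      (24 fps film needs 3:2 pulldown)
# 120 Hz -> [24, 30, 60]  (everything maps cleanly)
# 144 Hz -> [24]          (30 and 60 fps content needs uneven frame repetition)
```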

  • It's just a certification. There will still be adaptive sync monitors below 144hz.

  • Things are janky and flickery at 60Hz and I notice it. I have the option of going 120Hz or 144Hz, but I go 120 because my gaming rig's fans spin slower compared to 144, and I don't notice a difference in the games I play. So 120Hz is the absolute sweet spot for me for both gameplay smoothness and fan noise.
  • The requirements for 1080p are what's being discussed here, but the new requirements for 1440p and 4K are not mentioned. Those who aren't as worried about competitive multiplayer games will still want at least 120Hz, but will then go to 1440p or 4K displays for higher-quality visuals. A 120Hz 4K monitor still won't help if your video card can't hit 60fps at 4K.
