AMD Debuts Radeon FreeSync 2 For Gaming Displays With Stunning Image Quality (venturebeat.com)

AMD announced Tuesday it is introducing Radeon FreeSync 2, a new display technology that will enable monitors to show the exact intended image pixels that a game or other application wants to. The result will be better image quality for gamers, according to AMD. From a report on VentureBeat: With the FreeSync 2 specification, monitor makers will be able to create higher-quality monitors that build on the two-year-old FreeSync technology. Sunnyvale, Calif.-based AMD is on a quest for "pixel perfection," said David Glen, senior fellow at AMD, in a press briefing. With FreeSync 2, you won't have to mess with your monitor's settings to get the perfect setting for your game, Glen said. It will be plug-and-play, deliver brilliant pixels that have twice as much color gamut and brightness over other monitors, and have low-latency performance for high-speed games. AMD's FreeSync technology and Nvidia's rival G-Sync allow a graphics card to adjust the monitor's refresh rate on the fly, matching it to the computer's frame rate. This synchronization prevents the screen-tearing effect -- with visibly mismatched graphics on different parts of the screen -- which happens when the refresh rate of the display is out of sync with the computer.
  • >> AMD's FreeSync technology and Nvidia's rival G-Sync allow a graphics card to adjust the monitor's refresh rate on the fly, matching it to the computer's frame rate.

    Hmmm...this sounds like something that could cause "screen lag" if the card tries a 1+ second refresh rate because it thinks the computer is busy. (We see it in networking - and networked games - way too often.) Can someone please tell me I'm wrong?
    • by Tukz ( 664339 )

      In a manner, yes. With F/G-Sync you get increased input lag.
      It's not as bad as with Vertical Sync, but it's still there.

      • by Anonymous Coward

        Isn't it quite the opposite? To avoid the negative effects of vertical sync, double or triple buffering is often used. If the graphics card controls the refresh, it can show the full image as soon as it is done rendering. This potentially eliminates the lag introduced by the buffering that is currently used.

        How can you reduce the lag even further under the assumption of a fixed FPS?

      • With F/G-Sync you get increased input lag.

        Why?

    • by aliquis ( 678370 )

      I thought it signaled the monitor "refresh now" when it had a whole image rendered.
      I don't know whether, without V-sync, just one buffer is normally used, whether that's the one sent to the monitor, and whether the graphics card starts rendering onto it. If so, what has already been drawn there (and eventually gets sent to the screen) would be more up to date than if you waited until the whole scene was drawn. Or whether one always lets it render completely, uses two buffers, and switches, which is the

      • by fisted ( 2295862 )

        I don't know if normally without V-sync just one buffer is used

        No. Double buffering (or even triple buffering) is a decades-old thing that has nothing to do with vsync.

        • I wouldn't say it has nothing to do with vsync. It helps ensure you have a frame ready for the next vertical sync by displaying a couple of frames behind - at the expense of screen lag (unless the game is rendering a couple frames ahead with some kind of predictive algorithm).

          • by fisted ( 2295862 )

            The point is to only draw an image once it has been completely rendered. Synchronizing that with the display's refresh rate (vsync) came way later.
            With no double-buffering you get horrible flicker because you're modifying the frame buffer while the display is drawing it, so what gets drawn are partial images, or the frame buffer right after it has been erased for the next image, etc.
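
            To make that concrete, here is a minimal sketch of the two approaches (Framebuffer, clear and draw_scene are made-up stand-ins, not any real API):

                #include <utility>  // std::swap

                struct Framebuffer { /* pixel storage elided */ };

                // Hypothetical stand-ins for real drawing calls.
                void clear(Framebuffer&) {}
                void draw_scene(Framebuffer&) {}

                // Single buffer: the display scans out the same memory we draw into,
                // so it can pick up a half-cleared or half-drawn frame (the flicker
                // and partial images described above).
                void render_single_buffer(Framebuffer& fb) {
                    clear(fb);       // a scanout happening now shows a blank frame
                    draw_scene(fb);  // a scanout happening now shows a partial frame
                }

                // Double buffering: draw into a back buffer the display never sees,
                // then swap once the frame is complete.
                void render_double_buffer(Framebuffer& front, Framebuffer& back) {
                    clear(back);
                    draw_scene(back);
                    std::swap(front, back);  // only completed frames are ever scanned out
                }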

            • Thanks for not disagreeing and repeating me...I guess.

              • by fisted ( 2295862 )

                I'll give you the benefit of the doubt and assume you're trolling. Since you're doubtlessly going to deny then, I'll follow up by recommending to re-read my post real slow. That and perhaps familiarize yourself with the basics of the topic you're trying to discuss.

                • You just gave a verbose version of exactly what I said. And now you're not even refuting that fact, just jumping to some weird trolling claim.

                  Did you not even realize I wasn't the GP poster and not continuing their argument? I'll agree with you that they have no idea how vsync/buffering works together.

                  The only argument I made was that double-buffering and vsync are related because one is necessary for the other to work well (despite one coming along way before the other).

                  • by fisted ( 2295862 )

                    You just gave a verbose version of exactly what I said.

                    To me, it sounded like you were largely conflating double buffering and vsync, or implying that people came up with double buffering primarily to be able to vsync.
                    I pointed out that double-buffering and vsync are entirely different concepts (also working at different levels). If you're rendering fast enough, there would be nothing stopping you from doing vsync with a single buffer. People don't do that, not because it's a stupid idea (it is, of course), but rather because by the time vsync became a thing, d

                    • To me, it sounded like you were largely conflating double buffering and vsync, or implying that people came up with double buffering primarily to be able to vsync.

                      You didn't get that from anything I said.

                      If you're rendering fast enough, there would be nothing stopping you from doing vsync with a single buffer.

                      Scheduling. You would have to precisely schedule your drawing for the VBI period rather than whenever the CPU/GPU are free. That means putting literally everything on hold to draw a frame at a specific time rather than doing it whenever you have spare cycles.

                      I'm fairly sure nobody would have thought there was a point to creating such a feature as vsync without the existence of double/triple buffering when its entire purpose is to avoid tearing.

                    • by fisted ( 2295862 )

                      Scheduling. You would have to [...]

                      Hence "if you're rendering fast enough", and the part that you missed when jumping from that sentence to the 'reply' button.

                    • It's not just about speed. If you render really fast at any other point, you will have tearing - no matter how fast you can render. And honestly, I'm not sure the GPU gives you that information to schedule off of.

    • Hmmm...this sounds like something that could cause "screen lag" if the card tries a 1+ second refresh rate because it thinks the computer is busy.

      The graphics card doesn't set a frame rate based on how busy the computer is - the new thing is that it tells the monitor what to display and the monitor does it right away, instead of waiting for the next 60th of a second to roll around.

      If your computer's having a tough time rendering, and can only manage 50fps, this would have previously resulted in stutter as the monitor's apparent output varied between 60fps and 30fps, because frames could only be displayed at each 1/60th of a second interval. Now they c
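
      To put rough numbers on that stutter, here is a tiny simulation (a simplified model with illustrative frame times, not figures from the article):

          #include <cmath>
          #include <cstdio>

          int main() {
              const double slot = 1000.0 / 60.0;                // fixed 60 Hz refresh slots (~16.7 ms)
              const double render_ms[] = {15, 18, 15, 19, 15};  // frame times hovering around 60 fps

              double t = 0.0, prev_shown = 0.0;
              for (double r : render_ms) {
                  double done  = t + r;                          // frame finishes rendering...
                  double shown = std::ceil(done / slot) * slot;  // ...but must wait for the next slot
                  if (prev_shown > 0)
                      std::printf("frame displayed for %4.1f ms\n", shown - prev_shown);
                  prev_shown = shown;
                  t = shown;  // simplification: the GPU stalls until the swap (vsync + double buffering)
              }
              // Frames that just miss a slot are held for 33.3 ms, the rest for 16.7 ms,
              // so the display flips between an apparent 60 fps and 30 fps: stutter.
              // With FreeSync the panel would simply refresh at the 15-19 ms intervals.
          }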

    • by GuB-42 ( 2483988 ) on Tuesday January 03, 2017 @01:05PM (#53598345)

      This Freesync 2 technology should be able to give you the best possible response time your display is capable of without artifacts (i.e. no tearline).

      The way rendering works is by using a double buffer. The back buffer is the canvas where the GPU draws the picture, while the front buffer contains the completed previous frame, intended for display. When the GPU's part of the drawing is complete, the buffers are swapped.
      Further down, the RAMDAC (is that still what it's called?) scans the front buffer and sends the data to the monitor, line by line. The monitor then processes the data and displays the image.
      The problem with the usual fixed framerate is that the scanning is a continuous process, going top to bottom ($refresh_rate) times per second no matter how fast the GPU is drawing new frames, which means the image may change mid-display, creating a tearline effect. To avoid this, it is possible to wait for the current scan to finish before swapping, but that causes lag (that's vsync). It meant that gamers had to choose between an ugly tearline and increased lag.
      Freesync/G-Sync fix the problem by synchronizing the GPU rendering, the RAMDAC scanning and the display. When a frame is complete, the scanning starts immediately afterwards and the monitor starts the display process at the same time. If the monitor is able to follow, there is no extra lag.

      Freesync 2 goes a step further and addresses the data-processing part of the monitor. Unlike old CRTs, modern monitors do plenty of things before lighting up pixels: contrast, scaling, color correction, etc., and that can cause more lag. This is too bad because it is something your GPU can do better and faster. And that is exactly what Freesync 2 does: it takes some image processing out of the monitor and moves it onto the GPU, where it belongs.
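
      As a sketch of the three presentation strategies described above (the function names are hypothetical stand-ins, not any particular API):

          // Hypothetical driver hooks, with empty bodies so the sketch compiles;
          // real APIs (OpenGL, Vulkan, DXGI) expose their own equivalents.
          void render_frame_into_back_buffer() {}
          void swap_buffers() {}             // make the finished back buffer the front buffer
          void wait_for_vertical_blank() {}  // block until the fixed refresh restarts
          void signal_display_refresh() {}   // adaptive sync: tell the monitor to scan out now

          enum class Mode { NoVsync, Vsync, AdaptiveSync };

          void present(Mode mode) {
              render_frame_into_back_buffer();
              switch (mode) {
              case Mode::NoVsync:             // fast, but the swap can land mid-scan,
                  swap_buffers();             // leaving a tearline on screen
                  break;
              case Mode::Vsync:               // no tearing, but the finished frame
                  wait_for_vertical_blank();  // sits waiting for the next fixed slot,
                  swap_buffers();             // which is the extra input lag
                  break;
              case Mode::AdaptiveSync:        // FreeSync/G-Sync: no tearing and no
                  swap_buffers();             // waiting, because the refresh starts
                  signal_display_refresh();   // as soon as the frame is ready
                  break;
              }
          }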

      • A RAMDAC is used for analog signals. Digital signals don't use a Digital to Analog Converter.
    • by epyT-R ( 613989 )

      Essentially, G-Sync/FreeSync is variable refresh rate capability. This is different from the old fixed-refresh standard, which comes from the days of CRT refresh strobes. Regardless of what the video chip was doing, the screen had to be refreshed before the phosphor decayed too much, or all the eye would see is a flickering mess. The 'multisync' monitors that came later still operated at fixed rates once a mode was set (60 Hz, 75, 120, etc.). LCD panels mimicked this behavior to retain compatibility with esta

  • So now our console ports will look like console ports! Oh, wait...
  • Take our word for it folks, we have all the pixels!
  • by arth1 ( 260657 ) on Tuesday January 03, 2017 @11:39AM (#53597643) Homepage Journal

    It will be plug-and-play, deliver brilliant pixels that have twice as much color gamut and brightness over other monitors, and have low-latency performance for high-speed games.

    Can someone please explain how FreeSync2 has any influence at all on any of that?
    (Except possibly increase latency slightly, because you can only delay drawing through synchronization, never display what hasn't been rendered yet.)

    • Re: (Score:3, Informative)

      by JustNiz ( 692889 )

      It doesn't. This is a blatantly misleading (and therefore presumably bought-and-paid-for by AMD) article.

      • It's a certification that ensures:

        1) a certain refresh-rate range (higher than FreeSync 1 requires)
        2) that the monitor supports LFC (low framerate compensation, further extending the supported refresh range)
        3) that the monitor supports HDR
        4) that, if the game engine bothers, HDR output is supported WITHOUT forcing the in-monitor chips to recalculate colors; it comes straight out of the GPU

    • by Kjella ( 173770 ) on Tuesday January 03, 2017 @12:57PM (#53598263) Homepage

      Can someone please explain how FreeSync2 has any influence at all on any of that?

      FreeSync 2 comes with a developer API that will let developers access an HDR pipeline on non-HDR operating systems (that's the extended gamut and brightness bit) while skipping the HDR display's own layer of tone mapping (that's the lower latency). If the developers don't do anything, or you already do HDR, you only get the latter. And they can do that because FreeSync 2 monitors tell the GPU what HDR capabilities they have, so the GPU can deliver a custom-tailored output and the monitor can display it unprocessed.
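
      A rough sketch of that capability handshake, purely to illustrate the idea (this is NOT AMD's actual API; the struct, function and numbers are invented):

          // What a FreeSync 2 display might report at plug-in time.
          struct DisplayHdrCaps {
              float max_nits;                            // peak luminance the panel can reach
              float min_nits;                            // black level
              float red_xy[2], green_xy[2], blue_xy[2];  // native colour primaries (CIE xy)
          };

          // Invented example values for a 600-nit wide-gamut panel.
          DisplayHdrCaps query_display_caps() {
              return {600.0f, 0.05f, {0.68f, 0.32f}, {0.265f, 0.69f}, {0.15f, 0.06f}};
          }

          // With these numbers in hand, the game engine can tone-map straight to the
          // panel's real range and primaries, and the monitor displays the result
          // unprocessed instead of remapping it a second time.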

    • by Gadget_Guy ( 627405 ) on Tuesday January 03, 2017 @01:11PM (#53598397)

      Freesync 2 is all about adding HDR support for the existing Freesync standard. There is more information in the arstechnica article [arstechnica.co.uk]:

      HDR on PC is a more complex beast than just panel brightness, though. First, a game performs colour tone mapping after an engine renders a scene. Then, when the frame is passed to a monitor, it's tone-mapped yet again to fit the display's supported colour range. That may or may not be the same colour space required by HDR10 or Dolby Vision. This two-stage process takes time and introduces latency.

      With FreeSync 2, AMD is removing the second step, connecting the game engine directly to the HDR display. When you plug in a FreeSync 2 display, the display announces its HDR capabilities, and the AMD graphics driver will shuttle that information over to the game engine. This ensures that gamers get the best possible image quality, because the game tone-maps to the screen's native colour space, while also reducing input lag. Unfortunately, it also means that in order for FreeSync 2 and HDR to work, AMD needs the specific colour and brightness capabilities of every FreeSync 2 monitor, while games and video players must be enabled via AMD's API. AMD is going to have to win over a lot of hardware partners to make FreeSync 2 a reality.

      So they are getting more colours by mandating HDR, and increasing performance by removing a stage from the rendering process: the game gets to use the exact colour space of the monitor.
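
      In other words, something like the following (illustrative mapping functions only; the real pipelines are far more involved):

          #include <algorithm>  // std::clamp, std::min

          // Stage 1: the game tone-maps its scene into a standard HDR container
          // (HDR10 nominally covers 0..10000 nits), regardless of the actual panel.
          float tonemap_to_hdr10(float scene_nits) {
              return std::clamp(scene_nits, 0.0f, 10000.0f);
          }

          // Stage 2 (the step FreeSync 2 removes): the monitor's own processor remaps
          // that signal into what the panel can really do, adding processing latency.
          float monitor_remap(float hdr10_nits, float panel_max_nits) {
              return std::min(hdr10_nits, panel_max_nits);
          }

          // FreeSync 2 path: the display told the GPU its limits up front, so the game
          // tone-maps straight to the panel's range and the monitor shows it unprocessed.
          float tonemap_to_native(float scene_nits, float panel_max_nits) {
              float x = scene_nits / panel_max_nits;
              return panel_max_nits * (x / (1.0f + x));  // simple Reinhard-style curve
          }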

      • How does that work on a technical level: do you still use a 24-bit RGB color space, or do you need something else?

        • You need more than that. From Microsoft's DirectX Graphics Infrastructure (DXGI) documentation [microsoft.com]:

          As displays support greater ranges of color and luminance (e.g. HDR), apps should take advantage of this by increasing bit depth. 10-bit/channel color is an excellent starting point. 16-bit/channel color may work well in some cases. Games that want to use HDR to drive an HDR display (a TV or Monitor) will want to use at least 10bit, but could also consider using 16bit floating point, for the format for the final
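
          In DXGI terms those two options are just swap-chain back-buffer formats; a minimal sketch (assuming a reasonably recent Windows SDK):

              #include <dxgi1_2.h>

              // The two formats the quoted docs suggest for HDR output:
              // 10-bit unsigned per channel, or 16-bit float per channel.
              DXGI_SWAP_CHAIN_DESC1 hdr_swap_chain_desc(UINT width, UINT height, bool use_fp16) {
                  DXGI_SWAP_CHAIN_DESC1 desc = {};
                  desc.Width            = width;
                  desc.Height           = height;
                  desc.Format           = use_fp16 ? DXGI_FORMAT_R16G16B16A16_FLOAT   // 64 bits per pixel
                                                   : DXGI_FORMAT_R10G10B10A2_UNORM;   // 32 bits per pixel
                  desc.SampleDesc.Count = 1;
                  desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
                  desc.BufferCount      = 2;
                  desc.SwapEffect       = DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL;
                  return desc;
              }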

      • AMD also applies stricter rules, e.g. LFC must be supported to get FreeSync 2 certified, etc.

      • by K10W ( 1705114 )

        Freesync 2 is all about adding HDR support for the existing Freesync standard. There is more information in the arstechnica article [arstechnica.co.uk]:

        HDR on PC is a more complex beast than just panel brightness, though. First, a game performs colour tone mapping after an engine renders a scene. Then, when the frame is passed to a monitor, it's tone-mapped yet again to fit the display's supported colour range. That may or may not be the same colour space required by HDR10 or Dolby Vision. This two-stage process takes time and introduces latency. With FreeSync 2, AMD is removing the second step, connecting the game engine directly to the HDR display. When you plug in a FreeSync 2 display, the display announces its HDR capabilities, and the AMD graphics driver will shuttle that information over to the game engine. This ensures that gamers get the best possible image quality, because the game tone-maps to the screen's native colour space, while also reducing input lag. Unfortunately, it also means that in order for FreeSync 2 and HDR to work, AMD needs the specific colour and brightness capabilities of every FreeSync 2 monitor, while games and video players must be enabled via AMD's API. AMD is going to have to win over a lot of hardware partners to make FreeSync 2 a reality.

        So they are getting more colours by mandating HDR, and increasing performance by removing a stage from the rendering process: the game gets to use the exact colour space of the monitor.

        Hmmm, I am not convinced until I see some real analysis of this. TFA sounds like marketing BS with no real numbers or facts to back it up. The HDR has less to do with the gamut, and they don't even mention what they are comparing against; the shittiest TN panel is my guess (there are some decent ones now). Gamut is more a factor of the screen's native bit depth, and is often extended with FRC (frame rate control). The HDR, if genuine, giving a lower black point with real shadow detail and a higher white point WITHOUT just applying a

    • by Luthair ( 847766 )

      Unfortunately, VentureBeat is really not the site that should be linked to. According to more technical sites, specs like HDR10 apparently perform tone mapping on both the source and the display, which adds latency; AMD is proposing a protocol where the GPU performs the tone mapping into the space used by that specific monitor, which would eliminate that latency in the pipeline.

      It's too bad they're choosing to use the FreeSync brand for this, since it doesn't appear to be related to the original.

    • by AHuxley ( 892839 )
      Think of it as what the better photography and movie production hardware and software have had for years.
      The OS, software, driver, GPU knows about the LCD and its factory settings.
      In the past a user would have to buy a supported brand of display with a supporting display card and software.
      i.e. the hardware and software would be "broadcast" ready without needing a user to worry about creating new settings for every project.
      Now games and consumer GPUs are getting the same ability to detect each other and
  • by BenJeremy ( 181303 ) on Tuesday January 03, 2017 @11:42AM (#53597653)

    One thing is not related to the other.

    Freesync is just a way to handle variable refresh rates without screen tearing. Those refreshes can happen faster or slower. If a refresh happens faster than the LCD can make the transition (which is rare, will really only be an issue on whole-scene changes, and you'll likely never see the ghosting anyway), it will still happen.

    That said, sync tech is about how humans perceive on-screen changes, making them respond more precisely and eliminating stutter (which happens because a finished frame can't be shown until the next cyclical vertical refresh, itself mostly an artifact of CRT tech anyway). It is frustrating that Nvidia has pushed proprietary sync tech that is costly to implement, rather than going with "Free Sync", which only requires firmware changes for most basic monitor controllers.

    It seems like AMD's real push here is to maintain Free Sync capability as monitor manufacturers increase the color gamut and enhance LCD response times.

    • One thing is not related to the other.

      Freesync is just a way to handle variable refresh rates without screen tearing. Those refreshes can happen faster or slower. If a refresh happens faster than the LCD can make the transition (which is rare, will really only be an issue on whole-scene changes, and you'll likely never see the ghosting anyway), it will still happen.

      That said, sync tech is about how humans perceive on-screen changes, making them respond more precisely and eliminating stutter (which happens because a finished frame can't be shown until the next cyclical vertical refresh, itself mostly an artifact of CRT tech anyway). It is frustrating that Nvidia has pushed proprietary sync tech that is costly to implement, rather than going with "Free Sync", which only requires firmware changes for most basic monitor controllers.

      It seems like AMD's real push here is to maintain Free Sync capability as monitor manufacturers increase the color gamut and enhance LCD response times.

      Yeah. As far as I can tell, they are trying to say they have added some HDR metadata to FreeSync and called it FreeSync 2. Not sure how much of that is just marketing speak, though.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      > Freesync has nothing to do with color gamut.

      Sure, but if you RTFAnandtechArticleOnTheTopic you find that Freesync 2 is:

      Freesync (with guaranteed LFC) + EDID information that's _correct_ (did you know that EDID lets monitors inform video cards of their color gamut? It's true!) + an API for software to uniformly and reliably query that EDID information + some uniform and correct instructions on how to do proper colorspace transformation + a testing lab to verify that a manufacturer (whether hardware or s
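
      The colourspace-transformation part is essentially a pair of 3x3 matrix multiplies; a minimal sketch (the second matrix is left as a parameter because it has to be derived from the primaries each monitor reports via EDID):

          #include <array>

          using Vec3 = std::array<double, 3>;
          using Mat3 = std::array<Vec3, 3>;

          // Standard matrix taking linear sRGB to CIE XYZ (D65 white point).
          const Mat3 kSrgbToXyz = {{{0.4124, 0.3576, 0.1805},
                                    {0.2126, 0.7152, 0.0722},
                                    {0.0193, 0.1192, 0.9505}}};

          Vec3 mul(const Mat3& m, const Vec3& v) {
              return {m[0][0]*v[0] + m[0][1]*v[1] + m[0][2]*v[2],
                      m[1][0]*v[0] + m[1][1]*v[1] + m[1][2]*v[2],
                      m[2][0]*v[0] + m[2][1]*v[1] + m[2][2]*v[2]};
          }

          // xyz_to_display_rgb differs for every panel; it is built from the gamut
          // the monitor advertises, which is exactly the EDID data described above.
          Vec3 srgb_to_display(const Vec3& linear_srgb, const Mat3& xyz_to_display_rgb) {
              return mul(xyz_to_display_rgb, mul(kSrgbToXyz, linear_srgb));
          }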

    • by AmiMoJo ( 196126 )

      The relationship is that the PC controls the monitor directly, controlling when it refreshes the image and now how it displays colour. Without Freesync the monitor takes the video signal, processes it and refreshes the display on its own.

      This sort of thing has existed in other display technologies for a while. First you had THX-certified video that dictated your brightness and colour settings, and with HDR 4K the video signal actually includes extra data for setting those parameters in order to make up for the l

    • Freesync doesn't. Freesync 2 does.

  • >> FreeSync 2,... will enable monitors to show the exact intended image pixels that a game or other application wants to.

    Since when was this ever NOT happening? Especially with digital interfaces such as HDMI. This is total bullshit.

    • by Gadget_Guy ( 627405 ) on Tuesday January 03, 2017 @01:25PM (#53598519)

      Since when was this ever NOT happening? Especially with digital interfaces such as HDMI.

      Consumer-level LCD monitors don't show the full colour gamut [lifewire.com]. It varies between monitors exactly how much of the computer's idea of the colour range can be displayed. This will only get more complicated as monitors start offering HDR.

      FreeSync 2 allows the monitor to tell the computer exactly what it can display, so the graphics card can output the exact colours that are supported. This eliminates the need for the monitor to convert the video's colours on the fly. Supposedly this makes it faster to display, although given how fast monitors are now, I'm not sure how much difference it will make.

    • Many monitors do not support the full color spectrum; instead they dither colors to be 'closer'. This can cause unwanted artifacts. It also makes some pixels 'not what the computer software intended'. Using a digital interface such as HDMI or DisplayPort will not help in this case.

      • Sort of right, but you're looking at the wrong problem. This doesn't solve dithering. You still have to dither if you want to fill in the gaps in what the monitor can display; but even if you didn't want to, that is a problem limited to the cheapest and nastiest of screens, built on a technology that is rapidly going out of fashion (6-bit TN film panels).

        Instead, what is being targeted here is a new range of monitors with a larger colour gamut than standard, and the fact that there's currently no way to tell
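
        For what it's worth, FRC-style dithering amounts to something like this rough sketch (a simplified model, not any panel's actual algorithm):

            #include <cstdint>

            // A 6-bit panel fakes the two missing bits by alternating between the
            // two nearest displayable levels over successive refreshes, so the
            // time-averaged output approaches the requested 8-bit value.
            uint8_t frc_level_6bit(uint8_t value8, unsigned frame_index) {
                uint8_t base      = value8 >> 2;   // nearest-below 6-bit level (0..63)
                uint8_t remainder = value8 & 0x3;  // the 2 bits the panel cannot show
                bool bump = (frame_index % 4) < remainder;  // bump on `remainder` of every 4 frames
                uint8_t level = base + (bump ? 1 : 0);
                return level > 63 ? 63 : level;
            }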

    • If you send 255,0,0 to your monitor, I guarantee you'll end up with a different red shade than when I send it on mine. The monitor is oblivious to what colour you actually want to display. In Windows we call this colour management. In games we call this non-existent.

      This is the problem trying to be solved here.

  • Is this as opposed to all those monitors that just display whatever pixels they want to?

    Those must suck.

    • So all of them, then. After all, there's no way to tell a monitor what colour it should display. At the moment, all you can tell a monitor is what to do with the output of each pixel. The resulting colour is entirely left up to the monitor itself.

  • It has more support and no mouse lag. Nvidia for life

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      You understand that G-Sync is an objectively inferior standard when it comes to latency, right? That's because it requires additional hardware processing on the display's end to enable G-Sync, whereas FreeSync makes use of an optional VESA standard extension for LCD screens (one that was mostly designed for laptops as a battery-saving measure but is now being used on desktops to stop screen tearing) and therefore doesn't require any additional hardware cost.

      Say what you want about 'support' (I personally have opinions
