First AMD FreeSync Capable Gaming Displays and Drivers Launched, Tested
MojoKid writes: Soon after NVIDIA unveiled its G-SYNC technology, AMD announced that it would pursue an open standard, dubbed FreeSync, leveraging technologies already available in the DisplayPort specification to offer adaptive refresh rates to users of some discrete Radeon GPUs and AMD APUs. AMD's goal with FreeSync was to introduce a technology that offered end-user benefits similar to NVIDIA's G-SYNC, that didn't require monitor manufacturers to employ any proprietary add-ons, and that could be adopted by any GPU maker. Today, AMD released its first FreeSync-capable set of drivers, and this first look at the sleek ultra-widescreen LG 34UM67, an IPS panel with a native resolution of 2560x1080 and a max refresh rate of 75Hz, showcases some of the benefits. To fully appreciate how adaptive refresh rate technologies work, it's best to experience them in person. In short, the GPU scans a frame out to the monitor, where it's drawn on-screen, and the monitor doesn't update until that frame is done drawing. As soon as a frame is done, the monitor will update again as quickly as it can with the next frame, in lockstep with the GPU. This completely eliminates tearing and jitter issues that are common in PC gaming. Technologies like NVIDIA G-SYNC and AMD FreeSync aren't a panacea for all PC gaming anomalies, but they do ultimately enhance the experience, improving image quality and reducing eye strain.
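To make the "lockstep" idea concrete, here is a minimal, hypothetical Python sketch (not AMD's or NVIDIA's actual implementation) comparing when frames appear on a fixed 60Hz panel versus an adaptive one. The 40-75Hz window and the per-frame render times are made-up illustrative numbers, and buffering/backpressure is ignored:

```python
# Hypothetical sketch: when does each rendered frame actually appear on screen?
# Fixed refresh shows a frame at the next 60 Hz vblank; adaptive refresh
# (FreeSync/G-SYNC style) scans out as soon as the frame is finished, within
# the panel's supported window (here an assumed 40-75 Hz range).
import math

FIXED_HZ = 60.0
MIN_HZ, MAX_HZ = 40.0, 75.0
MIN_INTERVAL = 1000.0 / MAX_HZ   # panel can't refresh faster than ~13.3 ms
MAX_INTERVAL = 1000.0 / MIN_HZ   # ...or hold a frame longer than 25 ms

render_ms = [16.4, 16.0, 17.1, 15.8, 22.3, 16.2]  # illustrative GPU frame times

def fixed_display_times(frames):
    """Frame appears at the first 60 Hz vblank at/after it finishes rendering."""
    period = 1000.0 / FIXED_HZ
    t, last, out = 0.0, 0.0, []
    for r in frames:
        t += r                       # frame finishes rendering at time t
        vblank = max(math.ceil(t / period) * period, last + period)
        out.append(vblank)
        last = vblank
    return out

def adaptive_display_times(frames):
    """Frame appears as soon as it finishes, within the panel's refresh window."""
    t, shown, out = 0.0, 0.0, []
    for r in frames:
        t += r
        shown = min(max(t, shown + MIN_INTERVAL), shown + MAX_INTERVAL)
        out.append(shown)
    return out

for name, times in (("fixed   ", fixed_display_times(render_ms)),
                    ("adaptive", adaptive_display_times(render_ms))):
    gaps = [f"{b - a:5.1f}" for a, b in zip(times, times[1:])]
    print(name, "gaps between displayed frames (ms):", gaps)
```

With these numbers, the one 22.3ms frame becomes a 33.3ms gap on the fixed-refresh panel but only a 22.3ms gap on the adaptive one, which is exactly the stutter adaptive sync is meant to remove.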
Re: (Score:3)
Vsync prevents the frame buffer from being written to while it's being displayed, avoiding tearing. FreeSync/G-Sync makes the monitor wait for the frame to be rendered and ready for display, instead of the other way around.
Re: (Score:2)
I imagine this will be standard in the next generation of display connectors.
I don't think it's being mandated in DisplayPort 1.3 though, for whatever reason. [wikipedia.org]
Re: (Score:2, Informative)
Vsync keeps a frame from being written to the monitor until the monitor is ready for it. Say, for example, your card can generate 70 FPS and your display refreshes at 60 Hz: your card will draw a frame, wait for the end of the previous refresh, send the newest frame, then start generating the next frame. You lose out on the extra 10 FPS that your card could be producing. On the other hand, if you run up against a heavy scene and your card's capability drops to 59 FPS, each frame will take more than a single refresh interval to arrive, so the previous frame gets shown twice and your effective frame rate drops to 30 FPS.
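Putting numbers on this, here's a quick hypothetical calculation of the effective frame rate under double-buffered vsync (the simplest model; triple buffering behaves differently):

```python
# Hypothetical back-of-envelope: effective FPS under double-buffered vsync.
# Each frame occupies a whole number of 60 Hz refresh intervals, so a frame
# that takes even slightly longer than 16.67 ms holds the screen for two.
import math

REFRESH_HZ = 60.0
interval_ms = 1000.0 / REFRESH_HZ

for gpu_fps in (70, 60, 59, 45, 30):
    render_ms = 1000.0 / gpu_fps
    # the frame is scanned out at the next vblank after it finishes
    intervals = math.ceil(render_ms / interval_ms)
    effective_fps = REFRESH_HZ / intervals
    print(f"GPU capable of {gpu_fps} fps -> vsync delivers {effective_fps:.0f} fps")
```

This reproduces the parent's point: 70 FPS of capability is capped at 60, while dropping to 59 FPS halves delivery to 30 FPS.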
Re: (Score:2)
A good summary. Here [geforce.com] is Nvidia's overview of G-SYNC.
Comment removed (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I don't know if you're being obtuse on purpose or not, but I'll explain in any case: if a person wants to be able to take advantage of adaptive vsync but has an NVIDIA GPU, their only choices are to buy into NVIDIA's lock-in or to buy a new GPU *and* a new display. Both situations are anything but ideal.
Or buy a monitor without adaptive sync and live without it.
Or buy a monitor with adaptive sync (freesync) and wait for Nvidia to support it in newer drivers.
Or buy a monitor with adaptive sync (freesync) and buy a new gpu that supports open standards.
Buying a new monitor or gpu? Sell the old one.
Re: (Score:2)
Yep, monitors and GPUs are very easy-to-sell commodity items.
Re: (Score:1)
It's bullshit. NVIDIA should work with AMD and the other manufacturers on this, not against them.
The mistake you're making is to assume that in a free market what's best for consumers is also best for corporations.
Re: (Score:1)
It seems like the laptop version of G-Sync uses the same protocol as FreeSync (i.e. it doesn't require any special hardware).
http://www.extremetech.com/ext... [extremetech.com]
So, maybe somebody could hack Nvidia's driver to make it compatible with FreeSync monitors?
Never leave (Score:1)
Re: (Score:2)
Tit-Synced !?
Re: (Score:3)
Playing a video with breasts in motion is actually one of the best ways to check for screen tearing issues... :)
Re: (Score:3)
Now if AMD drivers could not suck (Score:3)
Now, if AMD's Linux drivers could really not suck, that would be awesome.
Because their drivers are crappy. Their FOSS driver is crappy and their proprietary driver is crappy. They are really putting the cart before the horse here. What they really need to do is a massive bug hunt in their drivers. Right now they are lacking.
Oh, and it's hurting sales, because people won't buy AMD cards while they are known to be buggy. Even after they fix them, it's going to take a long time for them to be seen as reliable again.
Re:Now if AMD drivers could not suck (Score:5, Insightful)
What's the problem with the FOSS driver? I haven't had any issues in years. It runs anything available on Steam, with a 7950 at least.
It's just an nVidiot screaming the same old bullshit from over a decade ago.
Why still 1080? (Score:4, Interesting)
I've got a 24" monitor that's 10 yeas old and it's native resolution is1900x1200. Why the regression in recent years back to 1080? You'd think monitors today would have continued advancing. Sure, give them 1080 capability, but still they should have a much higher native resolution by now.
Re: (Score:3)
Economy of scale. The HDTV standard settled on 1080p. That was worse than the 1200p that was getting quite commonplace at the time, but close enough that manufacturers could justify consolidating their product ranges into mostly making 1080p for everybody, thus reducing their operating costs. The price of 1080p went down, and the price of 1200p was raised as manufacturers' inclination to supply them dwindled, causing a resultant reduction in demand, and so 1080p became the standard. It's a pity, because 1920x1200 really was the nicer resolution for desktop work.
Re: (Score:2)
lots of options available still (Score:3)
Dell U2412M, U2413, U2415 are all 24" monitors with 1920x1200 screens.
Or you can jump up to 27" 2560x1440, 30" 2560x1600 or even 34" 3440x1440
Or you can go to a 4K screen or even a 5K one.
Re: (Score:2)
It's hard to call it a regression when it was driven by popularity.
Also isn't your comment about a year too late to be relevant? Right now there are more WUXGA and QHD screens on the market than ever, let alone the incoming 4K screens. And they are affordable too!
Re: (Score:2)
Also isn't your comment about a year too late to be relevant?
Indeed. Even Walmart is offering greater than 1080p choices now, for both monitors and TVs.
1920x1200 is no longer something to brag about... adjust accordingly lest you start to sound like an old person. :P
Re: Why still 1080? (Score:2)
4K TV is happening ... hang onto your hat and wait for the 4K 60Hz 4:4:4 panels later this year. Almost there.
Of course I'm buying the first 50" 8K display I can get my hands on for less than $2K. After 30 years of upgrading displays, I think I will be done.
Re: (Score:2)
What does Moore's Law have to do with processing power?
Your 4850 was built on a 55nm process. Current GPUs are 28nm. Mobile processors are down to 14nm. Next year it'll be 10nm.
Tearing (Score:1, Interesting)
"This completely eliminates tearing and jitter issues that are common in PC gaming."
Adaptive sync should fix tearing, but it won't do much for jitter. That has to be fixed in the game program. Jitter occurs when frames, each representing a point in time, are displayed at different times than the ones they represent. A game program must try to advance the simulation time for each frame by an amount that matches the time that will elapse before the frame is displayed, but it can be difficult to know what the simulation step should be in advance, since frame times vary.
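A tiny hypothetical sketch of the problem (the moving-average predictor and the frame times are made up for illustration): the game has to pick a simulation step before it knows how long the frame will actually take, and any misprediction shows up as jitter even if the display is perfectly in sync with the GPU:

```python
# Hypothetical game-loop timing: the engine advances simulation time by a
# *predicted* frame duration, then the frame is shown after the *actual*
# duration; the mismatch between the two is perceived as jitter.
actual_ms = [16.0, 16.5, 16.2, 24.0, 16.1, 16.3]  # real frame times (they vary)
history   = [16.0]                                 # seed for the predictor

for actual in actual_ms:
    predicted = sum(history[-4:]) / len(history[-4:])  # moving-average guess
    # the frame depicts sim-time advanced by `predicted`, but appears
    # `actual` ms after the previous frame; the difference is visible jitter
    error = actual - predicted
    print(f"predicted {predicted:5.2f} ms, actual {actual:5.2f} ms, "
          f"jitter {error:+5.2f} ms")
    history.append(actual)
```

Note how the one 24ms spike produces jitter not only on that frame but on the frames after it, while the predictor recovers.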
Re: (Score:2)
Jitter is usually caused by small fluctuations in frame rates, which happen in pretty much every game. With G-SYNC, the monitor displays the frame as soon as it is rendered, so if one frame takes 16.42 ms to render and the next takes 16.01 ms, the motion will still be smooth. With a normal monitor you would get a slight jitter.
The only jitters G-SYNC can't fix are in poorly programmed engines, typically seen in bad console ports.
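Building on the parent's numbers with one extra, made-up slow frame: a fixed 60Hz panel turns a 20ms frame into a full 33ms gap, while an adaptive panel shows exactly the 20ms the GPU actually took:

```python
# Hypothetical numbers: a fixed 60 Hz panel can only change the image at
# vblank boundaries, so a frame that misses its vblank waits for the next one.
import math

VBLANK = 1000.0 / 60.0                     # 16.67 ms refresh interval
render = [16.00, 16.42, 20.00, 16.01]      # per-frame render times (ms)

finish = [sum(render[:i + 1]) for i in range(len(render))]
fixed  = [math.ceil(t / VBLANK) * VBLANK for t in finish]   # next vblank
sync   = finish                                             # shown when done

print("fixed gaps:", [f"{b - a:.1f}" for a, b in zip(fixed, fixed[1:])])
print("sync gaps: ", [f"{b - a:.1f}" for a, b in zip(sync, sync[1:])])
```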
Most important question for me: Films? (Score:2)
Re: (Score:2)
Many monitors already support 24000/1001 Hz (23.976 Hz) refresh rates.
But yes, playing video with the FreeSync technology is going to be a possibility. For AMD cards below the R9 290 it will actually be the only supported mode (i.e. no FreeSync gaming support).
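For the curious, here's the back-of-envelope arithmetic behind film judder on fixed-refresh panels, and what adaptive sync buys you (a hypothetical illustration, not any particular driver's behavior):

```python
# 24 fps film doesn't divide evenly into a 60 Hz refresh, so fixed-rate panels
# use 3:2 pulldown (frames held for alternating 3 and 2 vblanks), which makes
# motion uneven. An adaptive-sync panel can hold each frame for exactly one
# film-frame period instead.
FILM_FPS = 24000 / 1001             # NTSC film rate, ~23.976 fps
PANEL_HZ = 60.0

vblanks_per_frame = PANEL_HZ / FILM_FPS          # ~2.5: not a whole number
print(f"vblanks per film frame at 60 Hz: {vblanks_per_frame:.3f}")

# 3:2 pulldown: successive frames held for 3, then 2, refresh intervals
hold_ms = [n * 1000 / PANEL_HZ for n in (3, 2, 3, 2)]
print("pulldown frame durations:", [f"{t:.1f}ms" for t in hold_ms])

# adaptive sync: every frame held for exactly one film-frame period
print(f"adaptive frame duration: {1000 / FILM_FPS:.1f}ms")
```

The 50ms/33ms alternation is the classic pulldown judder; adaptive sync replaces it with a uniform ~41.7ms per frame.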
Most "scaler" chip manufacturers (Score:1)
Most "scaler" chip manufacturers support AMD's FreeSync already. So gg nVidia...
That's great, now about your shitty response rates (Score:2)
Any word on when we'll get a flat panel that isn't like watching an oil painting smear around in real time?
Re: (Score:2)
Let me guess...
You buy the cheapest panels available and drive them from a VGA port.