Radeon Graphics Cards To Support HDR Displays and FreeSync Over HDMI In 2016 (hothardware.com)
MojoKid writes: AMD's Radeon Technologies Group has announced a couple of new features for Radeon graphics support in 2016. FreeSync over HDMI support will be coming to all Radeons that currently support FreeSync. FreeSync over HDMI will, however, require new displays. The HDMI specification doesn't currently support variable refresh rates, but it does allow for vendor-specific extensions, and the Radeon Technologies Group is using these extensions to enable the technology. A number of FreeSync-over-HDMI-compatible displays are slated to arrive early next year from brands including LG, Acer, and Samsung. The first notebook with FreeSync has also launched: Lenovo's Y700 gaming notebook is the first with a validated, FreeSync-compatible panel. The Radeon Technologies Group also announced that support for DisplayPort 1.3, HDMI 2.0a, and HDR displays is coming in the 2016 pipeline as well. With current 8-bit panels, the range of colors, contrast, and brightness presented to users is only a fraction of what the human eye can see. When source material is properly mapped to an HDR panel, colors are more accurately displayed, representing more closely what the human eye would see in the real world.
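A rough way to see what the 8-bit limitation means in practice; this is a back-of-the-envelope sketch rather than anything from the article, and the 1,000-nit peak and the linear division of the range are assumptions chosen purely for illustration (real pipelines use non-linear transfer functions):

def quantization_step(peak_nits, bits):
    # Average step size if the 0..peak range were divided linearly.
    # Only a rough way to see how many distinct levels each bit depth offers.
    levels = 2 ** bits
    return peak_nits / (levels - 1)

for bits in (8, 10):
    step = quantization_step(1000, bits)   # assume a 1,000-nit peak for illustration
    print(f"{bits}-bit: {2 ** bits} code values, ~{step:.2f} nits per step")

On those assumptions, an 8-bit panel offers 256 code values at roughly 3.9 nits per step, while a 10-bit panel offers 1,024 values at under 1 nit per step.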
Re: (Score:2)
And what does the summary have to do with airplanes, exactly?
Re: (Score:2)
The NEC PA241W [necdisplay.com] does. It's a wide gamut IPS LCD screen with no LEDs except the power light.
By "LED-based," do you mean LED backlighting, or OLED?
Re: (Score:2)
Narrow bandwidth is exactly what you want from a display's color primaries. The purer the primaries, the more saturated they are and the wider the color gamut. If someone could really create an LCD with a perfect 630 nm red, 550 nm green, and 450 nm blue without any other frequencies, it would be a fantastic display. The ultimate displays available today are lasers, because they naturally produce extremely narrow frequency bands and nothing else. The second best are probably LCDs + quantum dot. W
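As a rough illustration of how purer primaries widen the gamut, here is a small sketch comparing the gamut triangles of Rec. 709 and Rec. 2020 on the CIE 1931 xy diagram; the chromaticity coordinates are the published ones for those standards (Rec. 2020 uses monochromatic primaries), not the exact wavelengths quoted above:

def triangle_area(p1, p2, p3):
    # Shoelace formula for the area of a triangle given (x, y) vertices.
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B primaries
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]   # monochromatic primaries

a709, a2020 = triangle_area(*rec709), triangle_area(*rec2020)
print(f"Rec. 2020 triangle is ~{a2020 / a709:.2f}x the area of Rec. 709")

The narrower Rec. 2020 primaries sit closer to the spectral locus, and the enclosed triangle comes out roughly 1.9x the area of Rec. 709 in xy coordinates.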
Re: (Score:2)
Never mind being flat wrong about old CRTs, which are 6 bits per color except for some insanely pricey specialized monitors that don't work with standard equipment. Also wildly wrong about 64-bit color depth. Perhaps this particular internet monkey was thinking of 64 bits total across all components, i.e. about 21 bits per component.
Re: (Score:2)
You illustrated my point about specialized hardware nicely.
Freesync is AMD's Dynamic Refresh Rate Thing (Score:2)
https://en.wikipedia.org/wiki/FreeSync [wikipedia.org]. FreeSync is AMD's answer to NVIDIA's G-Sync. Both are dynamic refresh rate schemes: instead of refreshing at a fixed interval, the monitor refreshes when the GPU finishes rendering a frame, which eliminates tearing without the judder of traditional V-Sync. It works over DisplayPort, but if you want to use it over HDMI you'll probably need to buy a new monitor (almost certainly your TV doesn't support it yet). It's apparently marketed toward gamers.
Re: (Score:3)
Re: (Score:1)
HDR is about mapping bigger values than what the bit depth allows through a non-linear color profile
Hey, we could use a power function. intensity = luminance ^ 2.2 should match the human visual system pretty well.
Oh, wait...
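For reference, a minimal sketch of the plain power-law ("gamma") transfer the joke is pointing at; this is essentially what SDR video already does, with signal and luminance normalized to [0, 1] and the 2.2 exponent used as the conventional approximation:

GAMMA = 2.2

def decode(signal, gamma=GAMMA):
    # Display side: convert an encoded signal value to relative luminance.
    return signal ** gamma

def encode(luminance, gamma=GAMMA):
    # Camera/encoder side: the inverse, compressing luminance into a signal.
    return luminance ** (1.0 / gamma)

# A mid-gray signal (0.5) maps to roughly 22% of peak luminance.
print(f"decode(0.5) = {decode(0.5):.3f}")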
Re: (Score:2)
No, HDR is about fucking with contrast in one part of the image and fucking with contrast differently in another part of the image.
It looks terrible every fucking time, and it's less accurate than just linearly plotting everything after setting your curves once for the whole image.
It's absolutely retarded to have a curve that is different over different parts of the same image.
HDR photography involves a person doing this once per image, typically based on a source with a higher range (like a camera's RAW ou
Re: (Score:2)
Don't confuse HDR photography with HDR display.
HDR display comes closer to reproducing, on the display, the luminance captured by imagers. Today, linear light-level capture on imagers is crushed by a "gamma" function for transmission and display (luminance out = code value ^ 2.4).
The gamma function inherently limits effective dynamic range to about 7 to 9 f-stops before you start to see banding between adjacent code values.
Alternative "electro-optic transfer functions" than gamma (such as SMPTE ST 2084 "per
Re: (Score:2)
HDR means "high dynamic range".
"High" is a joke - there's nothing higher about the range for photography or displays.
"Dynamic" means changing across different areas of the image.
"Range" refers to the range of signals (colors) that can be output.
- For "HDR" photos the range is fixed at whatever format you're exporting to, typically someone takes a RAW file, fucks up the contrast in different areas, then exports it as a JPG or other 8-bits-per-channel format.
- For "HDR" displays, this refers t
Re: (Score:2)
It is impossible to separate "HDR" photography from "HDR" displays because they both do the same thing - fuck with contrast in different areas of an image differently in order to overcome limitations of the resolution/gamut of the format/display.
Wow, ignorance and attitude, what a lovely combination.
Neither inherently 'fucks with contrast'. HDR photography is just capturing a high dynamic range of values. So SDR would be 0.01 nits -> 100 nits; HDR would be 0.01 nits -> 1,000 nits. Almost every decent camera today can capture at least that much dynamic range.
The display you used as an example is "HDR" via the "Peak Illuminator" feature. It's just dimming the LED array, as all "HDR" displays are.
Nope. HDR is overdriving the LED array, not dimming it. Yes, most HDR displays do use localized dimming, but OLED doesn't; it just drives each pixel by itself, and it can get up to 60
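The arithmetic behind those figures, for the curious: dynamic range in f-stops is log2 of brightest over darkest, and the nit values here are the ones quoted above.

import math

def stops(min_nits, max_nits):
    # Dynamic range in f-stops (doublings of luminance).
    return math.log2(max_nits / min_nits)

print(f"SDR 0.01 -> 100 nits:  ~{stops(0.01, 100):.1f} stops")    # ~13.3
print(f"HDR 0.01 -> 1000 nits: ~{stops(0.01, 1000):.1f} stops")   # ~16.6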
Re: (Score:2)
No, HDR is about fucking with contrast in one part of the image and fucking with contrast differently in another part of the image.
It looks terrible every fucking time, and it's less accurate than just linearly plotting everything after setting your curves once for the whole image.
It's absolutely retarded to have a curve that is different over different parts of the same image.
Effectively every sentence you wrote is completely wrong. What you are calling "HDR" is actually localized tone mapping; it's a filter effect like adjusting the histogram, not HDR. Your reaction is like someone looking at a red/green anaglyph stereo image (http://www.designcommunity.com/scrapbook/images/125.jpg) without glasses on and saying "This 3D business is AWFUL, the colors are all weird and the image is doubled. It doesn't look dimensional at all!"
HDR just means "High Dynamic Range".
Re: (Score:2)
Full LED displays (NOT LCDs with LED backlights) can easily reproduce a wider range of colors if they use the right LEDs (some displays have added an extra color like yellow, some simply move the RGB LEDs further apart on the spectrum). Other display types can do this as well, but it's not as simple as with LEDs.
Full LED displays are only used in things like ballpark scoreboards and billboards, and their color is terrible. Unless you mean OLED, in which case just call it OLED. Almost every LCD display today uses a white LED and an array of filters.
Most "Full LED" displays are just 6500k LEDs with a colored filter. So a white LED + LCD filter array is pretty much the same as a white LED + Filter coating. 'Regular' LCDs are also just as good as OLED at color gamut if not better. You take a UV LED and you use quantu
Re: (Score:2)
Not with an HDR profile. HDR is about mapping bigger values than what the bit depth allows through a non-linear color profile; it's a bit like a 24-bit RGB JPEG with an AdobeRGB profile.
This would interest me a lot more if it weren't 2015 and we still didn't have a decent fucking colour management system at the OS level that actually forces colours to render correctly on a display.
I feared exactly this situation... (Score:1)
First, NVIDIA comes out with their own implementation of variable refresh rates, G-SYNC, saying they'll never support AMD's approach. Then AMD digs their heels in, insisting that their approach is more of an industry standard. Now AMD is using "proprietary extensions" to enable it over HDMI, and you know that even if NVIDIA were secretly considering supporting the AMD approach for DisplayPort, they sure as heck won't support AMD's extensions just to add support over HDMI.
As a nerd and a gamer, I fee
Vendor specific extensions != proprietary (Score:5, Interesting)
Now, AMD is using "proprietary extensions" to enable it over HDMI
The featured article uses the term "vendor specific extensions". I imagine that AMD has every right to license this extension royalty-free to HDMI display manufacturers, just as it did for the DisplayPort version of FreeSync.
Re: (Score:2)
There's nothing proprietary in those extensions since FreeSync is part of VESA standards. If anything, blame HDMI for not following VESA.
Say what (Score:2)
When source material is properly mapped to an HDR panel, colors are more accurately displayed
HDR = high dynamic range = the difference in intensity between the brightest and dimmest details; nothing to do with color. Well, you could fuck with the colors to increase intensity, because obviously you can only use 1/4 of the pixels in an RGBG pattern for red while you can use 4/4 for white, but I don't think anybody would make that trade-off. Rec. 2020, on the other hand, will give you a wider color space, and 10-bit color better accuracy; they're baked together in UHD broadcast/disc, but HDR is just one part and not this
Re: (Score:2)
Brightness is important to color rendition. If you have a pure, saturated red screen but it's at 50 nits, it's going to look like an unsaturated, drab screen. Crank that pure red screen up to 5,000 nits and the color will be perceived as eye-searing fire. If you can only ever reproduce a hue at 100 nits, you're missing out on all the colors which are very bright. Take a simple gradient: http://onlineteachingtoolkit.c... [onlineteac...oolkit.com]
If your display could only handle the bottom half of the luminance you wouldn't be able
Sync stability (Score:2)
Extra bit depth is all very nice, but what I would really like from a new HDMI standard is faster, more stable syncing when switching resolutions and inputs. Any chance I'm more likely to see that with HDMI 2.0 devices than HDMI 1.3/1.4?
Or is this something caused by HDCP that HDMI can't fix?
Re: (Score:2)
Nooooope. I have a Vizio UHD TV, and fullscreen Netflix apparently changes your display settings; even over HDMI 2.0, when I maximize or minimize the Netflix window I lose the input, and sometimes the TV completely loses sync and says "No Input" until you switch to a different HDMI input and back. Sadly, it seems to have gotten way worse.