VESA Embedded DisplayPort 1.4a Paves Way For 8K Displays, Longer Battery Life
MojoKid writes: The VESA standards organization has published the eDP v1.4a specification (Embedded DisplayPort) that has some important new features for device manufacturers as they bump up mobile device displays into the 4K category and start looking towards even higher resolutions. eDP v1.4a will be able to support 8K displays, thanks to a segmented panel architecture known as Multi-SST Operation (MSO). A display with this architecture is broken into two or four segments, each of which supports HBR3 link rates of 8.1 Gbps. The updated eDP spec also includes VESA's Display Stream Compression (DSC) standard v1.1, which can improve battery life in mobile devices. In another effort to conserve battery power, VESA has tweaked its Panel Self Refresh (PSR) feature, which saves power by letting GPUs update portions of a display instead of the entire screen.
No mention of refresh rate (Score:3, Interesting)
It seems like the bigger sticking point with 4K is how most interconnects have been locked to 30 Hz. eDP 1.4a supports 8K, but at what refresh rate?
Re:No mention of refresh rate (Score:5, Informative)
Other articles are claiming eDP 1.4a can support 8K (actually 7680 x 4320) resolution at 60 Hz: http://www.tomshardware.com/ne... [tomshardware.com]
Re: (Score:3)
...other articles are claiming eDP 1.4a can support 8K (actually 7680 x 4320) resolution at 60 Hz
:-) Not quite sure you caught the bolded part that is not in the summary. Otherwise, carry on!
Re: (Score:2, Informative)
No, the refresh rate is *not* listed there, you smug moron. In case you don't have the synapses to deduce this, refresh rate is approximately a function of the display pixel count and link bandwidth.
You could drive an 8K display over a carrier-pigeon physical link, but your refresh rate would be almost literally glacial. As GP was noting, many systems that claim 4K support can only drive those 4K displays at a gimped, tearing 30 Hz rather than 60 Hz. Thus, this prompted the inquiry regarding whether
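The parent's point, that refresh rate is roughly pixel count divided into link bandwidth, is easy to put numbers on. A back-of-envelope sketch (my own illustration, ignoring blanking intervals and line-coding overhead, so real usable rates are noticeably lower):

```python
# Rough ceiling on refresh rate for a given link, ignoring blanking
# and coding overhead (real DisplayPort links lose a sizable fraction).
def max_refresh_hz(link_gbps, h_px, v_px, bits_per_pixel=24):
    bits_per_frame = h_px * v_px * bits_per_pixel
    return link_gbps * 1e9 / bits_per_frame

# Four lanes of HBR3 (8.1 Gbps raw each) driving 8K, 24-bit color:
print(max_refresh_hz(4 * 8.1, 7680, 4320))   # ~40.7 Hz raw ceiling
# The same link driving 4K:
print(max_refresh_hz(4 * 8.1, 3840, 2160))   # ~163 Hz raw ceiling
```

Which is exactly why a single link doesn't cut it for 8K at 60 Hz, and why the spec leans on segmented panels (MSO) and DSC compression.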
Re: (Score:2)
Actually, with enough pigeons and big enough storage on them, you could have great frame rates, just horrible latency :-)
Re: (Score:2)
I was thinking a pair of 128 GB microSDXC, one for each leg...
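In the RFC 1149 spirit, the sneakernet math actually checks out. A sketch, with my own assumed numbers (256 GB total payload, an ~80 km/h cruising speed, a 10 km one-way trip):

```python
# Pigeon-link throughput: huge average bandwidth, terrible latency.
payload_bits = 256 * 8e9          # two 128 GB microSDXC cards
trip_s = 10 / 80 * 3600           # 10 km at 80 km/h = 450 s one way
throughput_gbps = payload_bits / trip_s / 1e9
print(round(throughput_gbps, 1))  # ~4.6 Gbps sustained
print(trip_s / 60)                # ~7.5 minutes of latency per frame batch
```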
Re: (Score:2)
Geez, you could at least read the question to which I was replying about REFRESH RATE FOR 8K which was NOT in the summary, you fucking ignorant blowhard.
Re: (Score:3)
> @ 60 Hz.
While this is a great step in the right direction, it is still rubbish.
The sweet spot is between 96 Hz and 120 Hz for flicker-free and headache-free displays.
Once you've gamed on a 120 to 144 Hz monitor, there is no going back to crappy 60 Hz.
Re: (Score:3)
You need 120 Hz for VR displays, which is exactly where 8K embedded displays are actually useful, and the same reason John Carmack wants to try cramming that into an intelligent interlaced format optimized for 60 Hz-capable hardware.
Re:Do we need 8K, except for special purposes? (Score:4, Interesting)
Nope. I saw an 8K video at CES. It's jaw dropping, like looking out a window. It's clearly superior to 4K.
Re: (Score:2)
Nice shill, but unless you were standing within about a foot of it, or have 20/10 vision, you simply can't appreciate the difference. Multiple tests have shown that, unless it's a very specific image meant to test the limits of human vision (line-pair tests), people don't really care about a 50" TV having anything over 1080p from the average viewing distance of six feet away.
While human vision can far exceed even 8K in total, this is only with specific imagery; meanwhile, for average use, a 1440p screen looks
Re: (Score:3)
Some manufacturers are making Adobe RGB (1998) monitors. Even wider gamut is technically possible, but requires engineering compromises like narrow-bandwidth filters, which are less efficient, or expensive options like more than 3 channels of LED backlights, laser backlights, etc. Sharp did introduce a 4-color TV; has this given them any market advantage?
High brightness is of little use indoors, and is a disadvantage for motion picture display because it requires a higher framerate. Tom's Hardware consider
Re: (Score:3)
It's clearly superior to 4K.
How could you tell it's clearly superior? Did they have side by side demos? I'm also wondering at what size it actually matters - or perhaps more accurately, the size to viewing distance ratio. Very often, these demos have you watching a very large TV from a relatively short distance away which makes it very easy to see the improved quality.
I've got a 60" LCD TV at home, and I'd guess I watch TV from about eight feet away or so. I can more or less discern the difference between 720p and 1080p content in
Re: (Score:3)
What you really need to know is this: cinemas *at best* have DCI 4K, which has essentially the same resolution as UHD (4096x2160 theoretical; 3996x2160 actual for 1.85:1 and 4096x1716 for 2.39:1). Do cinemas, which fill an entire wall using extremely expensive projectors, look pixelated to you? No? Then you don't need 8K.
Re: (Score:2)
Still images require greater resolution than movies for the same level of visual satisfaction.
Early laser printers had 300 dpi, which was visibly inferior to printed material. That's 2550 pixels across a letter-size sheet of paper. Can you honestly say there's no use for having three sheets of paper visible at once?
Re: (Score:2)
The GP probably saw Ultra HD, not just 8K resolution. I've seen it too, and it is a major step up from 4K. It's not just the 8K resolution; it's the 120 Hz frame rate and increased colour gamut. Sure, some TVs can fake 120 Hz, but you get artefacts. Some TVs can fake the extended colour gamut, but look somewhat artificial. Ultra HD really is like looking through a window; the images don't look like a screen any more. If it were not for the lack of parallax you would be hard pushed to tell it wasn't real, with the
Re: (Score:2)
I can more or less discern the difference between 720p and 1080p content in most cases.
Then you need new glasses or retinas or something. The difference between 720 and 1080 is massive.
Maybe to view slashdot in the future ... (Score:2)
I seem to need wider than 1024 pixel window to prevent overlapping columns on Slashdot's main page now.
Re: (Score:2)
Nope. I saw an 8K video at CES. It's jaw dropping, like looking out a window. It's clearly superior to 4K.
What you are seeing is the exceptional quality of a properly calibrated top-of-the-line TV. Provided that you are not sitting right in front of it, you will not notice a difference from a top-of-the-line 4K TV, and if you take a few steps even further back, even a top-of-the-line HD TV will look just as good. I really love it when people compare old TFT LCDs to new higher-quality IPS LCD displays and proclaim to be able to tell the huge difference Retina makes. Hint: it has little to do with resolution.
Re: (Score:2)
So Sharp was the only company at CES that knows how to calibrate their TVs?
Re: (Score:2)
I could imagine absolutely humongous curved screens being really cool -- the periphery might not contain any information relevant to the plot of the movie, but it would make for a very immersive experience. I call it the 4pi steradian display...
Re: (Score:2)
Apparently, you'd have to sit closer than 3 feet away to a 96 inch 8K screen to be able to see any pixelation. Close indeed.
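That 3-foot figure can be sanity-checked against the common 1-arcminute acuity rule for 20/20 vision (the rule itself is debated elsewhere in this thread, but the arithmetic is what it is). A sketch, assuming a 16:9 panel:

```python
import math

# Distance at which one pixel of a 96-inch 16:9 8K panel subtends
# 1 arcminute, a common 20/20 visual-acuity threshold.
diag_in, h_px = 96, 7680
width_in = diag_in * 16 / math.hypot(16, 9)   # ~83.7 inches wide
pixel_pitch_in = width_in / h_px              # ~0.011 inches per pixel
one_arcmin_rad = math.radians(1 / 60)
distance_ft = pixel_pitch_in / math.tan(one_arcmin_rad) / 12
print(round(distance_ft, 1))                  # ~3.1 ft
```

So the quoted "closer than 3 feet" is consistent with that model.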
Re: (Score:2)
Isn't that beyond human perception?
No, it's approximately the same resolution as my 24" desktop display (1920x1200) at 50".
I fully intend to replace my 4-arm LCD display rig with a single 50" panel as soon as a good 8K display hits $2000. I'm excited there's a connector for it now.
I believe I'll finally be done buying displays at that point, save for device failure. It's great when technology reaches "good enough". I'm just not used to it after buying progressively better displays for the past 35 year
Re:Irrelevent (Score:4)
Yep.
I have an M6400, and rather than upgrade when the motherboard finally gave up the ghost, I bought a new motherboard. Why? Screens have gone backwards. I have an RGB-LED-backlit 1920x1200 display, and the new ones have just white LEDs backlighting 1080p displays. Give me another RGB-LED option that is 1440p (in a 17" form factor) and I'll upgrade to a new Precision right now. Until then I'll keep my M6400 chugging along. :-(
What is the point (Score:2, Interesting)
Of 8K, for the most part? I mean, OK, if it's for VR I get it. But for a TV you'd have to have something the size of an IMAX screen to appreciate that resolution in any way whatsoever. Heck, even for 4K you need a 100"+ screen to actually care at all.
Dear TV makers: Where are our Rec. 2020 color gamut screens? Or screens with a brightness of 5k nits or more? Or our 10,000-to-1 contrast ratios? You know, things our eyes can readily see a difference in and appreciate beyond "moar pixels!!!" I'd buy a glassesless 3D
Re: (Score:1)
> Heck even for 4k you need a 100"+ screen to actually care at all.
Nonsense. It's even apparent on a 27" monitor.
Re: What is the point (Score:2)
Most people have better than 20/20 vision and can resolve finer objects than this oft-repeated but fallacious formula states. Don't take my word for it - the infallible experts who wrote the Wikipedia visual acuity page agree.
I know my vision is significantly better than 20/20. When I had my eyes tested last week, I could very easily read text that was a size smaller than the 20/20 line. Additionally, opticians seek to correct vision to 20/20 for each eye working individually. Anyone who has had a recent ey
Re: (Score:1)
No one will ever need more than 640K of RAM! You kids and your technology, it's all a fad!
...so yeah, uh... I can tell the difference quite well even on small screens. Now, my wife might not be letting me replace the main screen in the TV room with a 4K yet, but... she'll come around. The AVR and content providers we use already support it. If you can't see the difference between 1k, 4k, and 8k, then... your loss?
Re: (Score:2)
Cute, but a flawed analogy. TV and monitor resolutions are upper-bounded in terms of useful resolution by the inherently limited resolution of our own Mark I Eyeballs.
Here's a handy chart [carltonbale.com] so you can see whether you or your wife should win the argument over whether to get a 4K set. As an example, if you're sitting about eight feet away from your TV, you'll need about a 70-inch TV to even start to see the difference between 1080p and 4K displays, and you'd need to jump up to 120 inches to get the full
Re: (Score:2)
A 194-inch screen will technically still fit under an 8-foot ceiling.
Difference between TV and monitor use cases. (Score:1)
Not really (Score:5, Interesting)
People confuse the difference between perceptible and optimal. So yeah, to see every pixel on a 4K screen, it needs to be pretty big (or you need to be pretty close). However, we should stop wanting that. Computer monitors have for too long taught us that we should work at a resolution where we can make out each and every pixel. Rather, the individual pixels should be so small that they are completely imperceptible under any circumstances. That requires a lot more pixels.
As for your other requests, have you done any research on what is available, and the difficulties of what you are asking? This is the real world here, there are real engineering challenges. Let's go one by one:
Rec 2020: That requires laser illuminants. Since the primaries are points along the spectral curve, you have to have monochromatic light sources, meaning lasers. You can get that from laser projectors currently, if you are willing to pay, but no consumer displays. Of course, it matters little, since there is no Rec 2020 content. However, you can have a DCI display no problem; the Panasonic 4K displays are just shy of a DCI gamut. Oh, and Rec 2020 specifies an 8K resolution, by the way.
5k nits brightness: You don't have a power plug in your house sufficient for that kind of brightness, nor would you want to crank a display that high. Go have a look at commercial displays sometime; go see one of these things turned up to 700-800 nits. They are painfully bright in anything but a very brightly lit space. We are talking stuff made for direct-sunlight usage. You don't want that in your home. That aside, you'd need a massive amount of power to deal with something like that, and noisy cooling fans to go with it.
10,000:1 contrast ratios: You can have that right now. High-end LCDs pull it off with backlight dimming; OLEDs can handle it as is. You want an LCD that does it statically? Not going to happen, and a basic understanding of how the blocking technology works will tell you why. Emissive screens like OLED can do it without much trouble, but of course you are going to have real issues if you want a high-brightness display out of those, since brightness is a killer for emissive technologies.
Seriously, less with the silly whining. If you truly are interested in display technology, go learn about it and the limitations and issues. But don't just bitch and act like people should be able to magically figure out a way around tough engineering challenges. If it was easy, it'd be done already. If you think you have a solution well then, get on that. Go solve it and make a bunch of money.
Re: (Score:2)
Right on all points except one: Rec 2020 is achievable with OLEDs. My 4-year-old phone actually hits the 3 primaries defined by it almost dead on when measured with a spectral analyser.
But agreed, the GP was just whining. If you look up the Wikipedia page on Rec 2020 you'll find a long and ever-increasing list of announcements of displays, cameras, storage standards, projectors, etc. that are starting to support the standard.
Higher quality when creating final product (Score:2)
Because it's easier to maintain quality in post-processing if your shoots and edits take place at a higher resolution and you downsample to get the final product. 8K edited down to 4K is going to look better than 4K to 4K. Same as why movies are shot in 5K or high
Re: (Score:2)
35mm motion picture film is usually shot across the short axis, which means a width of 24mm. 6000/24 = 250 pixels per mm, or 125 cycles per mm. Lenses that can achieve that with a decent MTF across the whole frame run about $30,000, if they exist at all. IMAX is talking about the theoretical limit of the film itself, and probably not even that, since high-resolution color film, at least for consumers, is no longer manufactured.
It gets worse. IMAX is abusing the technical term "depth of field" to imply some sort of i
Re: (Score:2)
Ultra HD is what you want: 8K resolution, Rec. 2020 colour gamut, 120 fps. There are already TVs on the market that support it.
Bypass all this stuff. (Score:1)
Wait for 3D holographic displays and projectors coming *real soon now*.
8K monitor = 4 x 4K monitors (Score:1)
What happens with thunderbolt? (Score:1)
What happens with Thunderbolt? Will there be data-only Thunderbolt? Will Apple have to have display-only ports on the next Mac Pro?
Re: (Score:1)
This is a VESA standards deal. Apple will shun it.
They seek to have their proprietary stuff adopted as 'standards' that they own.
Can anyone really see the difference? (Score:2)
The latest iMac sure looks nice, but I wonder if 4K at close distance would be any different. After all, it's only considered useful for pretty big TVs. Sounds like number-based marketing, like clock speed in the Pentium 4 days. What would the framerate be like if I tried to play a game on this thing?
13/15" laptops have had the equivalent resolution (Score:1)
I use a 40" HD monitor as my main display, but I would definitely notice the difference in having a 4K monitor (when I run the iPhone simulator, the resolution is high enough that it will not fit on the screen). A 40-inch screen is 600+% larger than that 15-inch laptop. So to keep the real resolution (pixel size) the same, a 40" monitor would require 5K.
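The 5K claim checks out if we assume the laptop in question is a 1920x1080 15" panel (my assumption; the poster only says "HD"). A quick pixel-density comparison:

```python
import math

# Pixels per inch from diagonal size and resolution.
def ppi(diag_in, h_px, v_px):
    return math.hypot(h_px, v_px) / diag_in

print(round(ppi(15, 1920, 1080)))   # ~147 PPI on the 15" laptop
print(round(ppi(40, 1920, 1080)))   # ~55 PPI: same panel stretched to 40"
print(round(ppi(40, 5120, 2880)))   # ~147 PPI again: 5K restores the density
```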
Re: (Score:2)
Diablo III @ 5120x2880: 31 fps [barefeats.com]. That's with the R295X (mobile) card, and presumably also with the i7. I'm not sure if that's average, or minimum. Some games may be playable at those sorts of framerates, but they might not be enjoyable.
Text and photos on the iMac look as high resolution as those in a glossy magazine-- that's the main benefit.
Other possible benefits include editing 4K video with room for palettes and the like.
Re: Can anyone really see the difference? (Score:2)
Standard deviation would be a useful addition to benchmarking.
Multilink? (Score:1)
Madness (Score:2)
It is madness to increase the data rate of video signals even further. A monitor cable transmitting 32.4 Gbps is not a good solution. It is something engineers can boast about, but it also limits cable lengths and makes them more expensive. Getting 4K UHD over 20 m is as expensive as the PC and the display together.
Instead they should adopt and standardize low-latency compression methods for the transmission of video signals. 1 Gbps should be more than enough to transport an 8K UHD signal at 100 Hz (
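It's worth putting numbers on what that 1 Gbps figure implies. A sketch, assuming uncompressed 24-bit RGB (my assumption, not stated by the poster):

```python
# Raw bandwidth of 8K at 100 Hz, 24 bits per pixel, versus a 1 Gbps link.
raw_gbps = 7680 * 4320 * 24 * 100 / 1e9
ratio = raw_gbps / 1.0               # compression ratio needed for 1 Gbps
print(round(raw_gbps, 1))            # ~79.6 Gbps raw
print(round(ratio))                  # ~80:1 compression required
# For comparison, VESA's DSC targets roughly 3:1 visually lossless, so
# hitting 1 Gbps means far more aggressive, video-codec-style compression,
# which is exactly where the latency concern comes in.
```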
Re: (Score:2)
a) Compression would probably even reduce power consumption. (De)compression is an inexpensive task if done not by a G/CPU but by a specialized chip. The power consumption of a monitor is determined by size and brightness. The rest is a rounding error ;-).
b) Costs for a chip to do the compression would be $1 if we look at the costs for HD/4K streams (for which most TV sets already have decoding chips).
The only thing I am unsure of: latency. You need a progressive compression that will not increase the lat
Re: (Score:2)
I have a rather old mac pro, and the mplayerx I believe does not use the GPU and proba
Re: (Score:2)
Compression is cheap if it is only of one type (e.g. video). General-purpose compression is much more difficult (you first have to determine the best algorithm).
On networks, the lack of compression is also due to the fact that both sides need to have it, and those may be far apart (spatially as well as economically). But it is used increasingly nonetheless.
You have to have dedicated hardware for it. With current equipment it cannot be done properly (NVIDIA has some of it in their SHIELD approach (detached second sc
Beneficial or not, like it or not, it's coming. (Score:2)
I personally see little value in 8K displays under the size of an /entire wall/ of your house.
That being said, in 10 years, who knows? Maybe we'll lay a flat OLED sheet on a black-painted wall and... you know, a 240" TV?
As for the frame rate, for the /most part/ I see little value in high-refresh displays, but there are uses. If this is 'open' (no license fee) and powerful, well, so be it. The more performance the better.
That being said, I did just discover this recently
https://www.google.com.au/sear.. [google.com.au]