VESA Embedded DisplayPort 1.4a Paves Way For 8K Displays, Longer Battery Life

MojoKid writes: The VESA standards organization has published the eDP v1.4a specification (Embedded DisplayPort) that has some important new features for device manufacturers as they bump up mobile device displays into the 4K category and start looking towards even higher resolutions. eDP v1.4a will be able to support 8K displays, thanks to a segmented panel architecture known as Multi-SST Operation (MSO). A display with this architecture is broken into two or four segments, each of which supports HBR3 link rates of 8.1 Gbps. The updated eDP spec also includes VESA's Display Stream Compression (DSC) standard v1.1, which can improve battery life in mobile devices. In another effort to conserve battery power, VESA has tweaked its Panel Self Refresh (PSR) feature, which saves power by letting GPUs update portions of a display instead of the entire screen.
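
The summary's numbers sanity-check roughly like this (a back-of-envelope sketch, assuming 24 bits per pixel, 8b/10b line coding, and no blanking overhead):

    # Raw pixel data for 8K (7680x4320) at 60 Hz, 24 bits per pixel
    raw_8k60_gbps = 7680 * 4320 * 60 * 24 / 1e9
    print(raw_8k60_gbps)                  # ~47.8 Gbps

    # One 4-lane link at HBR3 (8.1 Gbps/lane), ~80% usable after 8b/10b coding
    payload_gbps = 4 * 8.1 * 0.8
    print(payload_gbps)                   # ~25.9 Gbps

    print(raw_8k60_gbps / payload_gbps)   # ~1.8x over budget uncompressed, which is
    # why the spec pairs MSO segmentation (each panel segment handles only its own
    # slice of the image) with DSC compression (targeting roughly 3:1)
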
  • by Anonymous Coward on Friday February 13, 2015 @06:13PM (#49051589)

    It seems like the bigger sticking point with 4K is how most interconnects have been locked to 30 Hz. eDP 1.4a supports 8K, but at what refresh rate?

    • by rubycodez ( 864176 ) on Friday February 13, 2015 @06:26PM (#49051657)

      Other articles are claiming eDP 1.4a can support 8K (actually 7680 x 4320) resolution at 60 Hz: http://www.tomshardware.com/ne... [tomshardware.com]
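
      Rough math on why 60 Hz is plausible (my own estimate, assuming four HBR3 lanes, 8b/10b coding, and DSC compressing 24 bpp down to about 8 bpp; blanking ignored):

          # Usable payload of a 4-lane HBR3 link after 8b/10b coding: ~25.9 Gbps
          payload_bps = 4 * 8.1e9 * 0.8
          # Frames per second at 7680x4320 with ~8 bits per pixel after DSC
          print(payload_bps / (7680 * 4320 * 8))   # ~97 fps, so 60 Hz has headroom;
                                                   # uncompressed 24 bpp tops out near 32 fps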

      • > @ 60 Hz.

        While this is a great step in the right direction, it is still rubbish.

        The sweet spot is between 96 Hz and 120 Hz for flicker-free and headache-free displays.

        Once you've gamed on a 120-144 Hz monitor, there is no going back to crappy 60 Hz.

  • What is the point (Score:2, Interesting)

    by Anonymous Coward

    Of 8K, for the most part? I mean, OK, if it's for VR I get it. But for a TV you'd have to have something the size of an IMAX screen to appreciate that resolution in any way whatsoever. Heck, even for 4K you need a 100"+ screen to actually care at all.

    Dear TV makers: Where are our Rec 2020 color gamut screens? Or screens with a brightness of 5k nits or more? Or our 10,000 to 1 contrast ratios? You know, things our eyes can readily see a difference in and appreciate, beyond "moar pixels!!!" I'd buy a glassesless 3D

    • > Heck even for 4k you need a 100"+ screen to actually care at all.

      Nonsense. It's even apparent on a 27" monitor.

    • by dAzED1 ( 33635 )

      No one will ever need more than 640K of RAM! You kids and your technology, it's all a fad!

      So yeah, uh... I can tell the difference quite well even on small screens. Now, my wife might not be letting me replace the main screen in the TV room with a 4K yet, but... she'll come around. The AVR and content providers we use already support it. If you can't see the difference between 1K, 4K, and 8K, then... your loss?

      • Cute, but a flawed analogy. TV and monitor resolutions are upper-bounded in terms of useful resolution because of the inherently limited resolution of our own Mark I Eyeballs.

        Here's a handy chart [carltonbale.com] so you can see whether you or your wife should win the argument over whether to get a 4K set. As an example, if you're sitting about eight feet away from your TV, you'll need about a 70-inch TV to even start to see the difference between 1080p and 4K displays, and you'd need to jump up to 120 inches to get the full
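
        A rough sketch of the math behind that kind of chart, assuming 20/20 vision resolves about one arcminute (the screen size and threshold here are illustrative):

            import math

            ARCMIN = math.radians(1 / 60)  # ~1 arcminute, a common 20/20 acuity figure

            def max_useful_distance_ft(diagonal_in, horiz_px, aspect=16 / 9):
                """Farthest distance (feet) at which adjacent pixels still subtend
                about one arcminute, i.e. the resolution is still resolvable."""
                width_in = diagonal_in * aspect / math.hypot(aspect, 1)
                return (width_in / horiz_px) / math.tan(ARCMIN) / 12

            for px, label in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
                print(label, round(max_useful_distance_ft(70, px), 1), "ft on a 70-inch set")
            # Roughly 9 ft for 1080p, 4.5 ft for 4K, 2.3 ft for 8K -- sit farther back
            # than that and the extra pixels stop being individually resolvable.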

    • 4K and 8K are not as much of a benefit to TVs as they are to monitors. When pictures are in motion, the extra image detail is much less noticeable to the eye. On a monitor in a work environment, where images are not in motion, 4K and 8K are very easily noticeable. They give the image more depth, and you will be able to use the extra workspace on a 40"+ monitor. If you shrink down an image, it is more recognizable than if you display the same image at the same size on an HD monitor.
    • Not really (Score:5, Interesting)

      by Sycraft-fu ( 314770 ) on Friday February 13, 2015 @08:27PM (#49052375)

      People confuse perceptible with optimal. So ya, to see every pixel on a 4K screen, it needs to be pretty big (or you need to be pretty close). However, we should stop wanting that. Computer monitors have too long taught us that we should work at a resolution where we can make out each and every pixel. Rather, the individual pixels should be so small that they are completely imperceptible under any circumstances. That requires a lot more pixels.

      As for your other requests, have you done any research on what is available, and on the difficulties of what you are asking? This is the real world; there are real engineering challenges. Let's go one by one:

      Rec 2020: That requires laser illuminants. Since the primaries are points on the spectral curve, you have to have monochromatic light sources, meaning lasers. You can get that from laser projectors currently, if you are willing to pay, but there are no consumer displays. Of course it matters little, since there is no Rec 2020 content. However, you can have a DCI display no problem; the Panasonic 4K displays are just shy of a DCI gamut. Oh, and Rec 2020 specifies an 8K resolution, by the way.

      5k nits brightness: You don't have a power plug in your house sufficient for that kind of brightness, nor would you want to crank a display that high. Go have a look at commercial displays sometime; go see one of those things turned up to 700-800 nits. They are painfully bright in anything but a very brightly lit space. We are talking about stuff made for direct-sunlight use. You don't want that in your home. That aside, you'd need a massive amount of power to drive something like that, and noisy cooling fans to go with it.

      10,000:1 contrast ratios: You can have that right now. High-end LCDs pull it off with backlight dimming; OLEDs can handle it as is. You want an LCD that does it statically? Not going to happen, and a basic understanding of how a light-blocking technology works will tell you why. Emissive screens like OLED can do it without much trouble, but of course you are going to have real issues if you also want high brightness out of those, since brightness is a killer for emissive technologies.

      Seriously, less with the silly whining. If you truly are interested in display technology, go learn about it and its limitations and issues. But don't just bitch and act like people should be able to magically figure out a way around tough engineering challenges. If it were easy, it'd be done already. If you think you have a solution, well then, get on that. Go solve it and make a bunch of money.

      • Right on all points except one: Rec 2020 is achievable with OLEDs. My four-year-old phone actually hits the three primaries defined by it almost dead-on when measured with a spectral analyser.

        But agreed, the GP was just whining. If you look up the Wikipedia page on Rec 2020 you'll find a long and ever-increasing list of announcements of displays, cameras, storage standards, projectors, etc. that are starting to support the standard.

    • Of 8k for the most part? I mean, ok if it's for VR I get it. But for a TV you'd have to have something the size of an Imax screen to appreciate that resolution in any way whatsoever. Heck even for 4k you need a 100"+ screen to actually care at all.

      Because it's easier to maintain quality in post-processing if your shoots and edits take place at a higher resolution and you downsample to get the final product. 8K edited down to 4K is going to look better than 4K to 4K. Same as why movies are shot in 5K or high
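
      A toy way to see the oversampling benefit numerically (nothing to do with any real camera pipeline; the noise level is made up): averaging each 2x2 block of a noisy "8K" frame roughly halves the noise in the resulting "4K" frame.

          import numpy as np

          rng = np.random.default_rng(0)
          # Fake 8K frame: flat gray plus Gaussian noise (illustrative numbers only)
          frame_8k = 0.5 + rng.normal(0, 0.05, size=(4320, 7680))

          # Downsample to 4K by averaging each 2x2 block (simple box filter)
          frame_4k = frame_8k.reshape(2160, 2, 3840, 2).mean(axis=(1, 3))

          print(frame_8k.std())   # ~0.050
          print(frame_4k.std())   # ~0.025, i.e. about half the noise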

    • by AmiMoJo ( 196126 ) *

      Ultra HD is what you want: 8K resolution, Rec 2020 colour gamut, 120 fps. There are already TVs on the market that support it.

  • Wait for 3D holographic displays and projectors coming *real soon now*.

  • People have been focusing on 8K monitors specifically (7680 x 4320 at 60 Hz), as if this cable were hard-wired to only support one specific config. It supports multiple MST configurations, which means the ability to combine panels into one screen using multiple streams, or to drive multiple monitors with multiple streams. That means it has more than enough support to drive 3 x 4K monitors using this cable, which I personally could easily put to work (I have 4 monitors of various HD resolutions and sizes). I would like havi
  • What happens with Thunderbolt? Will there be data-only Thunderbolt? Will Apple have to have display-only ports on the next Mac Pro?

    • The revisions are universal across both dedicated DisplayPort connectors and embedded Thunderbolt.
    • This is a VESA standards deal. Apple will shun it.

      They seek to have their proprietary stuff adopted as 'standards' that they own.

  • The latest iMac sure looks nice, but I wonder if 4K at close distance would be any different. After all, it's only considered useful for pretty big TVs. Sounds like number-based marketing, like clock speed in the Pentium 4 days. What would the framerate be like if I tried to play a game on this thing?

    • I did not hear much discussion about how HD on a 13/15-inch laptop was overkill when it became standard more than a decade ago.

      I use a 40" HD monitor as my main display, but I would definitely notice the difference in having a 4K monitor (when I run the iPhone simulator, the resolution is high enough that it will not fit on the screen). A 40-inch screen is 600+% larger than that 15-inch laptop. So to keep the real resolution (pixel size) the same, a 40" monitor would require 5K.
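
      Rough pixel-density math behind that claim (assuming a 15.6-inch 1080p laptop panel and a 16:9 40-inch screen):

          import math

          def ppi(diagonal_in, horiz_px, vert_px):
              """Pixels per inch for a panel of the given diagonal and resolution."""
              return math.hypot(horiz_px, vert_px) / diagonal_in

          laptop_ppi = ppi(15.6, 1920, 1080)                  # ~141 PPI
          width_40 = 40 * (16 / 9) / math.hypot(16 / 9, 1)    # ~34.9 inches wide
          print(round(width_40 * laptop_ppi))                 # ~4900 horizontal pixels,
                                                              # i.e. roughly a "5K" panel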
    • Diablo III @ 5120x2880: 31 fps [barefeats.com]. That's with the Radeon R9 M295X (mobile) card, and presumably also with the i7. I'm not sure if that's an average or a minimum. Some games may be playable at those sorts of framerates, but they might not be enjoyable.

      Text and photos on the iMac look as high-resolution as those in a glossy magazine -- that's the main benefit.

      Other possible benefits include editing 4K video with room for palettes and the like.

  • Why don't they just support multiple cables to one monitor for the higher bitrates?
    • The first monitors with higher bandwidth requirements did support multiple cables, but consumers rejected that and preferred the single-cable DisplayPort solution. The standards organizations are just trying to make sure they can support what consumers generally want.
  • Madness

    It is madness to increase the data rate of video signals even further. A monitor cable carrying 32.4 Gbps is not a good solution. It is something engineers can boast about, but it also limits cable lengths and makes cables more expensive. Getting 4K UHD over 20 m already costs as much as the PC and the display together.

    Instead, they should adopt and standardize low-latency compression methods for the transmission of video signals. 1 Gbps should be more than enough to transport an 8K UHD signal at 100 Hz (
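
    Back-of-envelope on what that would require (my own numbers, assuming 24 bits per pixel and ignoring blanking and protocol overhead):

        # Raw 8K UHD at 100 Hz, 24 bits per pixel
        raw_gbps = 7680 * 4320 * 100 * 24 / 1e9
        print(raw_gbps)          # ~79.6 Gbps uncompressed

        # Compression ratio needed to fit that into a 1 Gbps link
        print(raw_gbps / 1.0)    # ~80:1 -- video-codec territory, well beyond the
                                 # ~3:1 visually lossless target of line codecs like DSC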

    • So instead of a slightly more expensive cable, you want the monitor to consume more energy and to basically have a fairly powerful CPU/GPU inside it. Both of those increase the base cost of the monitor far more than the more expensive cable does, and they increase the monitor's ongoing power consumption. Is that really a better solution? For the select few who want to put their computer 20 m away, just use a display with a Thunderbolt port and an optical cable (yes it is
      • by mseeger ( 40923 )

        a) Compression would probably even reduce power consumption. (De)compression is an inexpensive task if it is done not by a CPU/GPU but by a specialized chip. The power consumption of a monitor is determined by size and brightness; the rest is a rounding error ;-).

        b) The cost of a chip to do the compression would be about $1, judging by the cost of HD/4K stream decoding (for which most TV sets already have dedicated chips).

        The only thing I am unsure of: latency. You would need a progressive compression scheme that does not increase the lat
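
        For a sense of scale, a rough sketch of the latency cost of a line-buffered scheme (the buffer depth is my own illustrative assumption; slice-based codecs like DSC buffer on this order):

            # Added delay from buffering N lines of an 8K panel before transmission,
            # assuming 4320 active lines refreshed at 60 Hz (blanking ignored)
            lines_buffered = 8
            line_time_s = 1 / (4320 * 60)                 # ~3.9 microseconds per line
            print(lines_buffered * line_time_s * 1e6)     # ~31 microseconds added
            # versus ~16,700 microseconds for one whole frame at 60 Hz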

        • If compression/decompression were such a cheap thing to make standard, why is there no compression/decompression in every network switch and network card? My networking, using standard low-speed 1 Gbit for the local network, is rather slow. For that matter, how about the SATA controller and hard drives? Or why not compress between the CPU and the PCIe bus itself? If compression were cheap, then all of those could use it.

          I have a rather old Mac Pro, and MPlayerX, I believe, does not use the GPU and proba
          • by mseeger ( 40923 )

            Compression is cheap if it only has to handle one type of data (e.g. video). General-purpose compression is much more difficult (you first have to determine the best algorithm).

            On networks, the lack of compression is also due to the fact that both sides need to support it, and those sides may be far apart (spatially as well as economically). But it is used increasingly nonetheless.

            You have to have dedicated hardware for it. With current equipment it cannot be done properly (NVIDIA has some of it in their SHIELD approach (detached second sc

        • Also, some people have already noticed lag in mouse-pointer updates when using 4K monitors at 30 Hz (caused by double buffering in combination with the low refresh rate) - now increase the latency further and people would complain that it is unusable.
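
          For a quick sense of scale (my own arithmetic, assuming a worst case of two buffered frames of delay):

              def pointer_lag_ms(refresh_hz, buffered_frames=2):
                  """Worst-case display-side delay before a cursor move shows up."""
                  return buffered_frames * 1000 / refresh_hz

              print(pointer_lag_ms(30))    # ~66.7 ms at 30 Hz -- easily perceptible
              print(pointer_lag_ms(60))    # ~33.3 ms at 60 Hz
              print(pointer_lag_ms(120))   # ~16.7 ms at 120 Hz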
  • I personally see little value in 8K displays under the size of an /entire wall/ of your house.
    That being said, in 10 years, who knows, maybe we'll lay a flat OLED sheet on a black-painted wall and... you know, a 240" TV?

    As for the frame rate, for the /most part/ I see little value in high-refresh displays, but there are uses. If this spec is 'open' (no license fee) and powerful, well, so be it. The more performance the better.

    That being said, I did just discover this recently:
    https://www.google.com.au/sear.. [google.com.au]
