
4K Displays Ready For Prime Time

Posted by Soulskill
from the all-the-pixels dept.
An anonymous reader writes "After the HD revolution, display manufacturers rolled out gimmick after gimmick to try to recapture that burst of purchasing (3-D, curved displays, 'Smart' features, form-factor tweaks, etc.). Now, we're finally seeing an improvement that might actually be useful: 4K displays are starting to drop into a reasonable price range. Tech Report reviews a 28" model from Asus that runs $650. They say, 'Unlike almost every other 4K display on the market, the PB287Q is capable of treating that grid as a single, coherent surface. ... Running games at 4K requires tons of GPU horsepower, yet dual-tile displays don't support simple scaling. As a result, you can't drop back to obvious subset resolutions like 2560x1440 or 1920x1080 in order to keep frame rendering times low. ... And single-tile 4K at 30Hz stinks worse, especially for gaming. The PB287Q solves almost all of those problems.' They add that the monitor's firmware is not great, and while most options you want are available, they often require digging through menus to set up. The review ends up recommending the monitor, but notes that, more importantly, its capabilities signify 'the promise of better things coming soon.'"
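
A quick aside on those "obvious subset resolutions": why 1920x1080 is the natural fallback on a 3840x2160 panel is plain arithmetic (this sketch is editorial, not from the review). An integer scale factor lets each low-res pixel map onto a whole block of panel pixels; a fractional one forces interpolation.

    # Which fallback resolutions map cleanly onto a 3840x2160 grid?
    native_w, native_h = 3840, 2160
    for w, h in [(2560, 1440), (1920, 1080), (1280, 720)]:
        scale = native_w / w
        if scale == native_h / h and scale.is_integer():
            kind = "integer, scales cleanly"
        else:
            kind = "fractional, needs interpolation"
        print(f"{w}x{h}: {scale:g}x ({kind})")
    # 2560x1440 -> 1.5x (interpolated); 1920x1080 -> 2x (clean)
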
  • Where's The Content? (Score:3, Interesting)

    by CycleFreak (99646) on Friday May 30, 2014 @11:02AM (#47128841)

So I can get a 4K display for less than $700. Where can I get content worth watching on that display? Not just worth watching, but somehow made better by all those extra pixels.

    All that aside, seems like it would make for a really nice PC monitor.

    • by by (1706743) (1706744) on Friday May 30, 2014 @11:09AM (#47128883)
      If the BBC released their nature documentary series (Life, Planet Earth, Africa, Frozen Planet, etc.) in 4k, that would really be tempting...

      I'm sure David Attenborough's voice would sound even better in 4k ;)
    • by fuzzyfuzzyfungus (1223518) on Friday May 30, 2014 @11:16AM (#47128949) Journal
      For the moment, a PC monitor is pretty much the compelling use case. There are a few pricey white-elephant 4K video sources, but not many. PCs, by contrast, just see a bigger monitor (barring a tediously long list of gotcha interactions, some of them GPU-vendor- or even model-specific, with the hacks certain 4K displays use to cope with the fact that none of the common interfaces are quite there yet for 4K: with DisplayPort, a monitor trying to use MST can get... interesting; with HDMI, I hope you like 30 Hz, because them's the breaks; and I assume the EDID is total garbage, as ever). If you do comparatively lightweight stuff, even a modest GPU can probably drive it without incident. Gaming will require some serious punch; but anything remotely modern can run at the resolution it is told to, if you have the power.
    • by houstonbofh (602064) on Friday May 30, 2014 @11:27AM (#47129053)
      This is what you use to create it. As a monitor for content generation and video editing, it rocks. And without a cheap way to create content, how would it be created?
    • by Rei (128717) on Friday May 30, 2014 @11:45AM (#47129203) Homepage

      I did the calculations once and don't care to repeat them, but depending on your use case, it might help... or it might be totally imperceptible (a rough version is sketched at the end of this comment). For a medium-large TV on the other side of a good-sized living room, your eyes shouldn't be able to see the difference. On the other hand, on a large computer monitor right in front of you, in many situations you will be able to see the difference. Note that human eyesight isn't a simple matter of resolution comparisons; it gets kind of complicated... there are basic measures of how far apart two black dots or lines separated by white can be before they merge into one, but the lower the contrast, the greater the separation they need (absolute brightness matters too, as does distance from the center of your field of vision, and all sorts of other stuff), and of course your ability to perceive fine detail drops tremendously when viewing moving objects. But for relatively static, high-contrast images on a large screen near the viewer (say, a nice computer monitor), most people shouldn't have trouble seeing the difference in a side-by-side comparison.

      The only problem with this gimmick is that we're basically running into a resolution dead end; there's only so far you can go before the added detail becomes imperceptible. I hope for their sake that they come up with true (non-stereoscopic) 3D or something of that nature, or they're going to run out of TV-sales gimmicks.

      Hmm, I just thought of something I heard about a good while back but haven't seen any movement on: "peripheral vision" TVs. I seem to recall reading, years ago, about a type of TV that used lights around its edges to dimly project the colors at the borders of the image onto the wall around the set, giving your peripheral vision the illusion of an expansive screen. I could envision improving on that with a video format that includes a lower-resolution peripheral video stream, and side projectors instead of simple side lights. Maybe that could be the next gimmick. ;)
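
      For anyone who wants to redo the calculation alluded to above, here is a minimal sketch (editorial, not the poster's original numbers), using the textbook figure of about one arcminute for 20/20 two-point acuity; as the comment notes, real vision is messier than this.

          import math

          # Distance beyond which single pixels subtend less than one
          # arcminute and blur together for a 20/20 viewer.
          ARCMIN = math.radians(1 / 60)

          def max_useful_distance_ft(diag_in, horiz_px):
              width_in = diag_in * 16 / math.hypot(16, 9)  # 16:9 panel
              pitch_in = width_in / horiz_px
              return pitch_in / math.tan(ARCMIN) / 12

          for diag, px, label in [(55, 3840, '55" 4K TV'),
                                  (55, 1920, '55" 1080p TV'),
                                  (28, 3840, '28" 4K monitor')]:
              print(f"{label}: pixels merge beyond "
                    f"~{max_useful_distance_ft(diag, px):.1f} ft")

      By this crude measure, a 55" 4K TV stops mattering past roughly 3.5 feet, while a 28" 4K monitor is right at the limit at a typical 2-foot desk distance, which matches the TV-versus-monitor distinction drawn above.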

      • by Bodhammer (559311) on Friday May 30, 2014 @12:02PM (#47129341)
        http://www.research.philips.co... [philips.com]
        Roll your own:
        https://learn.adafruit.com/ada... [adafruit.com]
      • by Tx (96709) on Friday May 30, 2014 @12:11PM (#47129393) Journal

        "Hmm, I just thought of something that I heard about a good while back but haven't seen any movement on - "peripheral vision" TVs. I seem to recall reading years ago about a type of TV that used lights around the edges to dimly shine the peripheral colors on the TV image around the room parallel to the TV, giving the illusion to your peripheral vision of an expansive screen."

        Philips Ambilight.

      • by strikethree (811449) on Friday May 30, 2014 @12:27PM (#47129501) Journal

        The only problem with this gimmick is that we're basically running into a resolution dead end; there's only so far you can go before the added detail becomes imperceptible.

        Why would you even discuss this now? We are NOWHERE near the kinds of resolutions my eyes are happy with. Yes, I am an elitist snob who couldn't tell a pixel from a hole in the ground. I do not care. Stop whining about how none of us can tell the difference. I can tell the difference, and even if I can not, I believe I can tell the difference.

        I do NOT want to see even a hint of blockiness or fuzziness at the edge of a font. I want curves that appear to be perfect curves. As it stands now, I can clearly see blockiness in all fonts. With hinting turned on, some of the blockiness goes away, but it is still there... and now the fonts are fuzzy too. Will 4K solve that? Not even close. Will it be much, much better than what we have now? Yes!

        Stop blocking progress with your negative whining about arcs and distinguishability. I may not be able to argue against your science and maths, but science always loses out to reality. Look at the blockiness and fuzziness in this message and dare to tell me that I am wrong.

        • by ColdWetDog (752185) on Friday May 30, 2014 @12:53PM (#47129733) Homepage

          Inch. Away. From. The. Screen.

          Slowly.

        • by Gothmolly (148874) on Friday May 30, 2014 @01:36PM (#47130125)

          Maybe you need glasses, it looks fine to me. Or you're being a douche because this is Slashdot. Do you have aerodynamically square Monster cables too?

          • by strikethree (811449) on Friday May 30, 2014 @03:21PM (#47131115) Journal

            Nah. I am not trying to be a douche but I am undoubtedly succeeding. *sigh* Such is life. Here is where I am coming from:

            While the Commodore Amiga 500 was not my first computer, it was the one that brought me deeply into computing. I first hooked it up to an NTSC television set. The fonts were extremely jagged and the images were extremely blurry. I should probably add that the color was absolutely terrible too. But it worked. I fell deeply in love with my Amiga 500. It was the most awesome computer on the planet. It had a flat 32-bit memory space and preemptive multitasking. It was a god compared to the standard IBM PC and Microsoft DOS.

            I eventually was able to afford to buy a used "real" monitor for it. Essentially the same resolution but much higher quality. The fonts were still jagged though.

            Through the years, I have upgraded my monitors continuously, one of the best being the Apple 30 inch Cinema Display running at 2560x1600. A _very_ nice monitor. Currently, I am using a Samsung 48 inch 1920x1080 screen as a display.

            One thing that was common across ALL of these displays is that curves never looked like continuous curves and fonts always looked blocky. It is possible the problem is not resolution, but I doubt it. I look at photographs of handwriting, images that should show continuous curves, and they still do not look "right". Either they are fuzzy or the pixels intrude.

            Maybe I put my face too close to the monitor. Maybe I just expect too much. Maybe I notice things that other people do not. Regardless, no matter how much anti-aliasing I use in Grand Theft Auto IV, lines that are not perfectly vertical or horizontal have a staircase effect. No matter what type of font hinting I use, fonts seem blocky and/or fuzzy. Perhaps 1920x1080 is enough and I just want too much.

            4K screens look gorgeous. I look at them at the Sony store in the mall. My eyes are still drawn to the imperfections in the red-headed girl's hair (in the demo), despite the fact that it is mathematically and scientifically impossible to see them. *shrug*

      • by kcitren (72383) on Friday May 30, 2014 @12:45PM (#47129665)

        For a medium-large TV on the other side of a good-sized living room, your eyes shouldn't be able to see the difference

        That's simply not true. While you won't notice it in the level of detail, you will notice it in the finer dithering and smoother color gradients. Things will look better at all normal viewing distances. My real hope for the future, though, is ultra-ultra-ultra-high-definition displays (think something like the equivalent of a 32K 46" monitor). With that, new possibilities actually open up: tie it to a lenticular lens system and you'll have multiple-angle high-definition viewing. Imagine a teleconferencing system where they place the monitor and multiple cameras at the edge of the conference table, with a similar setup on the other end. The effect would be more like looking through a piece of glass dividing the table than looking at a flat monitor.

      • by AmiMoJo (196126) * <mojo@NOspaM.world3.net> on Friday May 30, 2014 @12:59PM (#47129793) Homepage

        there are basic measures of how far apart two black dots or lines separated by white can be before they merge into one

        This is a really common misunderstanding of how human vision works. While it's true you might not be able to distinguish two dots, you can distinguish varying line widths and the sharpness of high-contrast edges like text.

        That is why text looks sharper on a 4K display, even at some distance. It's why people can distinguish a 4K display from a 2K display at normal viewing distances. It's why people can tell the difference between a Nexus 7 and an iPad Retina (323 ppi vs 264 ppi) even though both are seemingly beyond the ability of the human eye to distinguish individual pixels. It's why printers and publishers use 300 DPI or more.
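
        To illustrate with numbers (an editorial sketch; the ppi figures are the published ones, the 15-inch viewing distance is an assumption): by the naive one-arcminute model, both tablets are already past the limit, yet people tell them apart, which is exactly what edge and line-width cues explain.

            import math

            # Angular size of one pixel at a given viewing distance.
            def pixel_arcmin(ppi, distance_in):
                return math.degrees(math.atan(1 / ppi / distance_in)) * 60

            for name, ppi in [("Nexus 7 (2013)", 323), ("iPad Retina", 264)]:
                print(f"{name}: {pixel_arcmin(ppi, 15):.2f} arcmin/pixel at 15 in")
            # Both come out under the 1-arcmin two-point limit, but vernier
            # acuity is commonly cited near 5 arcseconds (~0.08 arcmin), an
            # order of magnitude finer, so sharper edges remain noticeable.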

    • by EvilSS (557649) on Friday May 30, 2014 @12:39PM (#47129615)

      All that aside, seems like it would make for a really nice PC monitor.

      It probably seems that way because it is a PC monitor.

    • by Hadlock (143607) on Friday May 30, 2014 @12:45PM (#47129673) Homepage Journal

      Netflix? House of Cards and all of their new original series are shot and displayed in 4K if your device and display support it.
       
      Also, there's a much higher-quality Samsung 4K display [techreport.com] for $50 more, which is probably the model you want.

    • by Blaskowicz (634489) on Friday May 30, 2014 @03:27PM (#47131163)

      There are tons of trivial sources of content, either dynamically generated, made by others, or made by yourself.
      Text and PDFs, of course, and unix-like terminals. Pictures and photographs; there's nothing special to do, just open a picture straight out of the camera and look at it.
      For games, you can probably play old RTS titles and such even on a low/mid-end graphics card. Otherwise, this monitor should let you play at non-native resolutions.

      There are even new kinds of content that such a high-res display makes possible. Scans of old books, 16th to 19th century, with their funny letters, fonts, and illustrations, or large-format 20th-century magazines. You can read them already, but most often it looks like garbage, too low-res, like fax quality. (OCR fails or requires special packages, by the way, when you have funny stuff like the way s and t are linked in "forest", or the long s that looks like the integration symbol, and then old orthographies and such.)
      High-quality scans require lots of storage, bandwidth, and display pixels. Now we can have all three on a desktop PC. Reading a 1680s edition of a book written by Newton would be fun, or whatever it is you're interested in.

    • Who cares about content? The good news is that display makers are finally getting off their collective asses and producing computer displays at resolutions higher than 1920x1200.

  • by wbr1 (2538558) on Friday May 30, 2014 @11:09AM (#47128885)
    The burst of HD purchases, and the resultant gimmicks, were largely about home TV use. The display reviewed here is discussed as a computer display, specifically for PC gaming. The two are NOT the same. Please quit comparing them.

    I may have use for a 4K monitor. I doubt I will ever need a 4K TV, even if source material were readily available. My rarely watched 1080p set does just fine. Most consumers would likely agree. For TV and movie viewing, 4K IS a gimmick.

    • by DocSavage64109 (799754) on Friday May 30, 2014 @11:18AM (#47128965)
      I was at Costco the other day and they had a 4K TV on display running a demo with scenes of flowers and mountains. The detail was amazing, and I can certainly see myself buying one once they are reasonably priced. For the time being, I'm fine with my 6- or 7-year-old 50" 720p plasma TV.
    • Hey, tell that to the manufacturers... For whatever insane reason, they've recently been on a kick of adding features to TVs that would actually be useful on monitors, then acting vaguely confused when the public goes out and buys whatever reasonably big TV is cheapest. Curvature might actually make a difference when you are 18 inches from a rather large screen, but it barely matters from across the room, except for making the thing harder to wall-mount. And 4K, for which there is essentially zero movie, TV, or cable content, is something PCs can spit out on demand.

      Thankfully, "TV" now mostly just means "LCD monitor with ATSC tuner and probably more HDMI ports", so using TVs as monitors isn't a big deal (sorry brits, pay your BBC fee!); but it's still weird.
    • by timeOday (582209) on Friday May 30, 2014 @11:25AM (#47129039)
      I disagree that 4K TVs are a gimmick, particularly as they get bigger. Football and soccer would look GREAT on an 80-inch 4K TV. It will change how the games are shot, so you can really get a sense of what everybody is doing instead of following the ball so closely.

      Granted, for now bitrate rather than resolution is the limiting factor, since cable/satellite/broadcast signals aren't even 1080p; they're 1080i or 720p, with inadequate bitrates at that (rough numbers in the sketch below). But the quality of video through Netflix or Amazon Prime right now was almost unimaginable when YouTube launched, less than 10 years ago(!), with 320x240 video only, seemingly doomed to crash the Internet.

      I also play split-screen games on my TV with my son and it would be great for that, although not with the current generation of consoles.
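
      On the bitrate point above, a back-of-envelope comparison (editorial; the channel figures are ballpark: an ATSC broadcast mux tops out around 19.4 Mbps, and 2014-era 4K streaming runs roughly 15 Mbps):

          # Raw (uncompressed) video rates vs. what delivery channels carry.
          def raw_mbps(w, h, fps, bits_per_pixel=24):
              return w * h * fps * bits_per_pixel / 1e6

          raw_1080p30 = raw_mbps(1920, 1080, 30)  # ~1,493 Mbps
          raw_uhd24 = raw_mbps(3840, 2160, 24)    # ~4,778 Mbps
          print(f"1080p30 into 19.4 Mbps: ~{raw_1080p30 / 19.4:.0f}:1 compression")
          print(f"4K24 into 15 Mbps: ~{raw_uhd24 / 15:.0f}:1 compression")

      Either way, the codec is throwing away well over 98% of the raw signal, which is why bitrate, not pixel count, is what you actually see.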

      • by Hadlock (143607) on Friday May 30, 2014 @01:17PM (#47129961) Homepage Journal

        Football and soccer would look GREAT on an 80-inch 4K TV. It will change how the games are shot, so you can really get a sense of what everybody is doing instead of following the ball so closely.

        Football is shot at close angles specifically to tell a narrative; broadcasters will not, and are required not to, show the full field during a play. The full-field view is what coaches get on a closed loop. It is available to the public, but only four hours after the game ends, and you have to pay for a special subscription to get it.

    • by houstonbofh (602064) on Friday May 30, 2014 @11:30AM (#47129087)
      You lost that battle years ago. Once widescreen laptops took over, the LCD manufacturers decided that no one needed more than 1080p anymore. That is why I am on an "antique" 24-inch Dell UltraSharp.
    • I don't see how 3D gets permanent "gimmick" status while 4K doesn't... there are times when seeing things in 3D gives you a completely different perspective, feel, immersion, and experience than something not in 3D. There are times when higher resolution does the same. And there are times when both actually seem to make things worse. Curved TVs as well... I run three monitors on my desktop, and I'd be ecstatic if I could get the same resolution in a single curved display. If it weren't curved, though, I'd have to sit farther away to see the edges properly, and that distance is beyond the "retina" distance for my monitors' resolution, so I'd be wasting pixels.

      Much of the math is different between TVs and monitors and, yes, much of what is gimmicky in one situation is definitely not in another.

    • by jratcliffe (208809) on Friday May 30, 2014 @02:37PM (#47130679)

      Honestly, it's not a gimmick. 4K does look MUCH better than 1080p. It's really pretty spectacular. Of course, it's hugely dependent on screen size and viewing distance. If you're sitting 20 feet away from a 32" screen, 4K/1080/720/480 are going to look exactly the same.

      What 4K gives you is some combination of the two: a better picture at the same screen size, or a larger screen at comparable picture quality.

  • by kimvette (919543) on Friday May 30, 2014 @11:11AM (#47128913) Homepage Journal

    But when are the 144 Hz or 120 Hz 3D-capable models going to be available? Even if 3D is limited to 1440p or 1080p, I want the capability for 3D gaming and watching 3D movies on my PC. Right now the best I can get is a 1080p or, very soon, a 1440p monitor, and I will have to buy a separate 4K 2D monitor for 4K. :-(

    • by houstonbofh (602064) on Friday May 30, 2014 @11:35AM (#47129123)
      The problem is bandwidth. How do you get all that data down the wire? The original Korean 2560x1440 monitors used dual-link DVI, essentially two links in one cable. This worked, but was poorly supported. The Seiki 4K is HDMI 1.4, so it is stuck at 30 Hz. To get 60 Hz, you need HDMI 2.0, and that is far from common right now. Then you start looking at bus speeds... 4K at 120 Hz is a LOT of data.
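
      To put numbers on that (an editorial sketch; the effective link rates after 8b/10b line coding are approximate, and real links also need blanking overhead, so the requirements below are understated):

          # Uncompressed 4K data rates vs. common link capacities (Gbps,
          # effective video payload after 8b/10b coding, approximate).
          links = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DisplayPort 1.2": 17.28}

          def gbps(w, h, fps, bpp=24):
              return w * h * fps * bpp / 1e9  # ignores blanking intervals

          for fps in (30, 60, 120):
              need = gbps(3840, 2160, fps)
              fits = ", ".join(n for n, cap in links.items() if cap >= need)
              print(f"4K @ {fps} Hz: ~{need:.1f} Gbps -> "
                    f"fits: {fits or 'none of the above'}")

      Which is the whole story of this generation: 30 Hz squeaks through HDMI 1.4, 60 Hz needs HDMI 2.0 or DisplayPort 1.2, and 120 Hz doesn't fit anything common yet.
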
    • by level_headed_midwest (888889) on Friday May 30, 2014 @12:33PM (#47129549)

      They will probably be available in a year or two. We moved from hackish 30 Hz split-input panels to native 60 Hz single-input panels in about a year. However, anything beyond 60 Hz is pretty much useless except for bragging rights, as you can't see it anyway. Broadcast TV and movies are shot at 29.97 and 24 frames per second, respectively. The lack of benefit from higher refresh rates is especially true on a display that can hold a static image, like an LCD.

      • by kimvette (919543) on Friday May 30, 2014 @03:38PM (#47131305) Homepage Journal

        120 Hz is great for 3D because then you get 60 Hz for each of the left and right eyes. Remember, to get the refresh rate you really want in 3D, you need to double the frame rate, because most screens use shuttered glasses rather than prismatic displays.
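
        The halving arithmetic, spelled out (trivial, but it frames the bandwidth problem from the comment above):

            # Active-shutter 3D alternates whole frames between eyes,
            # so each eye sees half the panel's refresh rate.
            for panel_hz in (60, 120, 144):
                print(f"{panel_hz} Hz panel -> {panel_hz // 2} Hz per eye in 3D")
            # A 120 Hz panel keeps each eye at a comfortable 60 Hz;
            # a 60 Hz panel leaves each eye at a flickery 30 Hz.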

    • by CrashNBrn (1143981) on Friday May 30, 2014 @03:34PM (#47131255)
      Monitors are a dead end. Real 3D or "natural resolution" will come from hologram-like tech. Of course, it's unlikely most of us will live to see that advance.
      • by kimvette (919543) on Friday May 30, 2014 @03:46PM (#47131413) Homepage Journal

        By your logic, so are PCs, and even smartphones, cars, and aircraft; so let's not ask for anything better today, because generations from now something will come along to make all of it totally obsolete. Let's not bother with space exploration with current tech either, because someday teleportation or wormholes will be practical.

        Or, we can demand better products today and enjoy them in the meantime.

  • by RyanFenton (230700) on Friday May 30, 2014 @11:13AM (#47128925)

    I got it recently, and it does 4K at 60 Hz in a 28" size - great for programming.

    Review link [ubergizmo.com]

    Just to try it, I was able to get all the single-player PC Ultima games running in about half the screen real estate:

    ALL THE ULTIMAS [imgur.com]

    It's around $600 when it's on sale, so I think it just about matches the model slashvertised here.

    Ryan Fenton

  • OSX is not ready (Score:4, Interesting)

    by timeOday (582209) on Friday May 30, 2014 @11:35AM (#47129127)
    I jumped the gun a while ago and got the Dell P2815Q, which is one of those that only do 4K at 30 Hz. I can confirm this is not adequate for a large number of uses :)

    What surprised me is the poor OSX support for 4K. Windows can scale everything (although I had to manually add a display mode in the NVIDIA advanced settings just to get 1080p!?), but OSX cannot. I am running it on a recent MacBook Pro 15" with discrete graphics.

    The problem is that you cannot choose to run at a lower resolution. Display preferences lists ONLY the native resolution. Using QuickRes (a third-party add-on for more resolution choices), none of the lower resolutions work, at least until you go all the way down to 1080p.

    In particular, you cannot use HiDPI on an external display (where the application sees a lower resolution, but the OS renders fonts at full resolution). (No, it does not help to enable HiDPI with Quartz Debug, nor with the QuickRes "Enable HiDPI" button.) So the menus and all applications are absolutely tiny.

    You could adjust the size of everything on a per-application basis, but then things won't look right when you're working on the laptop display, unless you use something like QuickRes to run the laptop display at its native resolution. I guess I will try that for a few days. For now, I mainly use my older, power-hungry 2560x1600 30" displays.

    If I could just select the highest of the HiDPI resolutions available for the laptop display in the System Preferences, and mirror *exactly* that to this display, I would be a happy camper. You can't do that.

    I understand an upcoming release will improve support with HiDPI on external displays. But as it stands, I could not recommend a 4K display for a Mac - or a Mac for a 4K display.

  • by BlackHawk-666 (560896) <ivan.hawkes@gmail.com> on Friday May 30, 2014 @12:33PM (#47129555) Homepage

    Maybe I'm getting old, but a 24" monitor running 1080p about 40 cm from my face seems pretty damn good. About as good as I will ever need. My eyesight is not likely to improve, and despite the fact that it is pretty good for my age, I don't really see any gains to be had from doubling the resolution of my monitors (x3).

    I'm the sort of guy who buys the 42" TV because...he knows he can just fucking sit a few feet closer to it if he wants the pixels and screen to appear larger!

    The day I need a 4K monitor for programming is the day I get neck strain from looking up, down, and all around.

  • by the eric conspiracy (20178) on Friday May 30, 2014 @12:37PM (#47129603)

    I want a 4K 40" OLED display for photo work. That would come a lot closer to the capability of the sensor in my camera than anything I can buy now.

    In addition, the high resolution would be great for displaying large amounts of text, that is, for programming.

    28" with crappy color gamat and a ridiculous dot pitch isn't close to what I want.

  • by think_nix (1467471) on Friday May 30, 2014 @01:22PM (#47129999)

    While 4K is technologically cool, the joke is again on the consumer. Take Blu-ray "Mastered in 4K", which isn't really 4K but "re-mastered" and downscaled to 2K. IIRC they are still having difficulties getting true 4K onto disc? Then there are the few streaming services (available in the US only (TM)). Sounds like more hype from the content providers to make even more money, unfortunately.

  • by Jartan (219704) on Friday May 30, 2014 @02:40PM (#47130727)

    At that res I really want at least a 32-37 inch screen.

  • by Toshito (452851) on Friday May 30, 2014 @03:05PM (#47130965)

    4K will look awful on cable...

    They're already compressing the hell out of regular 1080p...

    I would much prefer uncompressed 1080p to compressed 4K.

  • I have a "old" 2006 SONY 40 inch TV which maxes out at 1360x768. I watched an 1080p version of "The Good, The Bad, and The Ugly" recently, which, of course, is scaled down to meet my max resolution.

    The picture detail is very detailed. Frankly, "The Ugly" is much more detailed than I care to see.

  • by guacamole (24270) on Friday May 30, 2014 @03:26PM (#47131157)

    I really don't understand how retailers and manufacturers are still getting away with selling $700-800 laptops with those awful 1366x768 or 720p displays. A few times I went looking for a basic laptop with an entry-level CPU and memory and a 14-15 inch screen with a nice resolution at an affordable price at Fry's or Best Buy. But the salespeople always direct me to loaded models that cost $1000+ to get that screen.
