Displays

4K Monitors: Not Now, But Soon

An anonymous reader writes: 4K monitor prices have fallen into the range where mainstream consumers are starting to consider them for work and for play. There are enough models that we can compare and contrast, and figure out which are the best of the ones available. But this report at The Wirecutter makes the case that absent a pressing need for 8.29 million pixels, you should just wait before buying one. They say, "The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter half the refresh rate we're used to on TVs and monitors. Connect a 4K monitor at 30 Hz via HDMI and you'll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you'll be locked to a maximum of 30 frames per second in your games—playable, but not that smooth. ... Most people don't own a system that's good enough for gaming on a 4K display—at least, not at the highest quality settings. You'll be better off if you just plan to surf the Web in 4K: Nvidia cards starting with the 600 series and AMD Radeon HD 6000- and 7000-series GPUs can handle 4K, as can systems built with integrated Intel HD 4000 graphics or AMD Trinity APUs. ... There's a light on the horizon. OS support will strengthen, connection types will be able to handle 4K displays sans digital tricks, and prices will drop as more 4K displays hit the market. By then, there will even be more digital content to play on a 4K display (if gaming or multitasking isn't your thing), and 4K monitors will start to pull in fancier display technology like Nvidia's G-Sync for even smoother digital shootouts."
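The refresh-rate limits quoted above follow from simple bandwidth arithmetic. A rough sketch, checking whether a mode fits HDMI 1.4's link budget: the 24-bit color, 8b/10b coding, and ~20% blanking-overhead figures are simplifying assumptions standing in for the exact CEA-861 timings, not spec values.

```python
# Rough bandwidth check: does a video mode fit in HDMI 1.4's link budget?
# Assumptions (illustrative, not exact spec timings): 24-bit color,
# 8b/10b line coding, and ~20% blanking overhead.
HDMI_14_MAX_GBPS = 10.2          # total TMDS rate across 3 lanes, HDMI 1.4
CODING_EFFICIENCY = 8 / 10       # 8b/10b: 8 payload bits per 10 line bits
BLANKING_OVERHEAD = 1.20         # approximate horizontal/vertical blanking

def mode_fits(width, height, hz, bits_per_pixel=24):
    """Return True if the mode's payload rate fits the usable link rate."""
    payload_gbps = width * height * hz * bits_per_pixel * BLANKING_OVERHEAD / 1e9
    return payload_gbps <= HDMI_14_MAX_GBPS * CODING_EFFICIENCY

print(mode_fits(4096, 2160, 24))  # True:  4K cinema at 24 Hz squeaks through
print(mode_fits(3840, 2160, 30))  # True:  UHD at 30 Hz fits
print(mode_fits(3840, 2160, 60))  # False: UHD at 60 Hz does not
```

Even with these back-of-the-envelope numbers, the result matches the article: 4K at 24–30 Hz fits HDMI 1.4's budget, while 4K at 60 Hz needs roughly double the bandwidth.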

  • Get a TV (Score:3, Informative)

    by TechyImmigrant ( 175943 ) on Tuesday June 17, 2014 @07:06PM (#47258697) Homepage Journal

    Why pay $1000+ for a 4K monitor tomorrow when you can pay $500 for a TV today?

    http://tiamat.tsotech.com/4k-i... [tsotech.com]

  • Re:Display Port (Score:5, Informative)

    by sexconker ( 1179573 ) on Tuesday June 17, 2014 @07:50PM (#47258965)

    Why is there no mention of DisplayPort? Current 4K LCDs all accept it, and with the right GPU you can most certainly drive them at 60 Hz, full resolution.

    This is more about HDMI being a broken standard to me. I just don't like DisplayPort because it's sort of Apple's thing.

    DisplayPort is AMD's thing, through VESA. It's not Apple's thing.
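The same rough bandwidth check explains why the comments point to DisplayPort as the working 4K@60 connection today. This sketch assumes DP 1.2 running HBR2 on 4 lanes, with the same illustrative 24-bit color, 8b/10b coding, and ~20% blanking figures as above.

```python
# Rough check: DisplayPort 1.2 (HBR2, 4 lanes) vs. a 4K@60 mode.
# Assumptions (illustrative): 24-bit color, 8b/10b coding, ~20% blanking.
DP12_RAW_GBPS = 4 * 5.4          # 4 lanes at HBR2, 5.4 Gbit/s per lane
CODING_EFFICIENCY = 8 / 10       # 8b/10b line coding
BLANKING_OVERHEAD = 1.20         # approximate blanking overhead

def dp12_fits(width, height, hz, bits_per_pixel=24):
    """Return True if the mode fits DP 1.2's usable link rate."""
    payload_gbps = width * height * hz * bits_per_pixel * BLANKING_OVERHEAD / 1e9
    return payload_gbps <= DP12_RAW_GBPS * CODING_EFFICIENCY

print(dp12_fits(3840, 2160, 60))  # True: DP 1.2 can drive 4K at 60 Hz
```

With roughly 17 Gbit/s of usable payload against a ~14 Gbit/s requirement, 4K@60 fits comfortably, which is why a 2009-era standard handles what HDMI 1.4a cannot.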

  • Ow, the ignorance (Score:5, Informative)

    by jtownatpunk.net ( 245670 ) on Tuesday June 17, 2014 @08:11PM (#47259115)

    Was that summary written by someone who's never used a 30 Hz 4K display?

    A 30 Hz feed to an LCD panel is not like a 30 Hz feed to a CRT. CRT phosphors need to be refreshed frequently or the image fades; that's why 30 Hz was all flickery and crappy back in the 90s. But 30 Hz to an LCD isn't like that: the image stays solid until it's changed. A 30 Hz desktop on an LCD is rock solid and works fine for a workstation. I know. I've seen me do it. Right now. There are no "transition" issues, whatever that is supposed to mean. Nothing weird happens when I switch between applications. Multitasking works fine. I'm playing multiple HD videos without a hitch, the same way 30 Hz 1080 programming from cable and satellite plays just fine on LCDs. Gaming's not great, but turn on vertical sync and it's not terrible. I'd rather be running at 60 Hz, but I got my 4K panel for $400. It'll hold me over until displays and video cards with HDMI 2.0 are common.
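The comment's distinction can be put in numbers: a sample-and-hold LCD keeps a static frame solid at any refresh rate, so what 30 Hz actually costs you is motion granularity, i.e. how far a moving object jumps between frames. The 1200 px/s drag speed below is an arbitrary illustrative figure.

```python
# Static images are unaffected by refresh rate on an LCD; motion is not.
# At half the refresh rate, each moving object jumps twice as far per frame.
DRAG_SPEED_PX_PER_S = 1200  # illustrative: a window dragged across the screen

def pixel_jump(speed_px_per_s, hz):
    """Distance (px) a moving object advances between consecutive frames."""
    return speed_px_per_s / hz

for hz in (30, 60):
    print(f"{hz} Hz: {1000 / hz:.1f} ms/frame, "
          f"{pixel_jump(DRAG_SPEED_PX_PER_S, hz):.0f} px jump per frame")
```

At 30 Hz the dragged window moves in 40-pixel steps instead of 20-pixel steps, which is the "choppier animations" the summary describes, and also why a static spreadsheet looks identical at either rate.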

  • Re:display port (Score:2, Informative)

    by Anonymous Coward on Tuesday June 17, 2014 @08:26PM (#47259203)

    Oh you mean v1.2 which came out in 2009, and virtually every DP capable graphics card and monitor supports?

  • Re:Oculus Rift (Score:5, Informative)

    by Solandri ( 704621 ) on Tuesday June 17, 2014 @09:50PM (#47259767)

    Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

    Why spend a shitload of money on a new 4K screen and the video card necessary for an acceptable gaming experience when I'll be able to do VR at a fraction of the cost with my existing hardware setup?

    You're making a fundamental error many people make when it comes to display resolution. What matters isn't resolution or pixels per inch. It's pixels per degree. Angular resolution, not linear resolution.

    I've got a 1080p projector. When I project a 20 ft image onto a wall 10 ft away, the pixels are quite obvious and I wish I had a 4K projector. If I move back to 20 ft from the wall, the image becomes acceptable again. It's the angle of view that matters, not the size or resolution. 20/20 vision is defined as the ability to distinguish a line pair with 1 arc-minute separation. So within one degree (60 arc-minutes) you'd need 120 pixels to fool 20/20 vision.

    This is where the 300 dpi standard comes from. Viewed from 2 ft away, one inch covers just about 2.5 degrees, which is 150 arc-minutes, which can be fully resolved with 300 dots. So for a printout viewed from 2 ft away, you want about 300 dpi to match 20/20 vision. If it's not necessary to perfectly fool the eye, you can cut this requirement to about half.

    In terms of the Oculus Rift, a 1080p screen is 2203 pixels diagonal, so this corresponds to 18.4 degrees to fool 20/20 vision, or about 37 degrees to be adequate. If you want your VR display to look decent while covering a substantially wider angle of view than 37 degrees, you will want better than 1080p resolution. I'm gonna go out on a limb and predict that most people will want more than a 37 degree field of view in their VR headset.
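The angular-resolution arithmetic in this thread checks out and is easy to reproduce. A sketch using the comment's own thresholds of 120 px/degree (fool 20/20 vision) and 60 px/degree (adequate); the exact-chord formula is used, though a small-angle approximation would give nearly identical numbers.

```python
import math

# Angular resolution: 20/20 vision resolves ~1 arc-minute line pairs,
# so ~120 px/degree fully fools the eye and ~60 px/degree is "adequate".
PX_PER_DEG_FULL = 120
PX_PER_DEG_OK = 60

def degrees_subtended(size_inches, distance_inches):
    """Visual angle (degrees) covered by an object at a viewing distance."""
    return math.degrees(2 * math.atan(size_inches / (2 * distance_inches)))

def required_ppi(distance_inches, px_per_deg=PX_PER_DEG_FULL):
    """Pixels per inch needed at a viewing distance to hit a px/deg target."""
    return px_per_deg * degrees_subtended(1, distance_inches)

print(round(required_ppi(24)))               # ~286: the "300 dpi" rule at 2 ft
diag_px = math.hypot(1920, 1080)             # ~2203 px diagonal on 1080p
print(round(diag_px / PX_PER_DEG_FULL, 1))   # 18.4 deg to fool 20/20 vision
print(round(diag_px / PX_PER_DEG_OK, 1))     # 36.7 deg to be adequate
```

Running the numbers confirms both claims: 300 dpi at 2 ft sits just above the 20/20 threshold, and a 1080p panel stretched past roughly 37 degrees of field of view drops below even the relaxed standard, which is exactly the problem for wide-FOV VR headsets.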
