Displays

4K Monitors: Not Now, But Soon (186 comments)

Posted by Soulskill
from the wait-for-16K dept.
An anonymous reader writes 4K monitor prices have fallen into the range where mainstream consumers are starting to consider them for work and for play. There are enough models that we can compare and contrast, and figure out which are the best of the ones available. But this report at The Wirecutter makes the case that absent a pressing need for 8.29 million pixels, you should just wait before buying one. They say, "The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we're used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you'll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you'll be locked to a maximum of 30 frames per second for your games—it's playable, but not that smooth. ... Most people don't own a system that's good enough for gaming on a 4K display—at least, not at highest-quality settings. You'll be better off if you just plan to surf the Web in 4K: Nvidia cards starting in the 600 series and AMD Radeon HD 6000 and 7000-series GPUs can handle 4K, as can systems built with integrated Intel HD 4000 graphics or AMD Trinity APUs. ... There's a light on the horizon. OS support will strengthen, connection types will be able to handle 4K displays sans digital tricks, and prices will drop as more 4K displays hit the market. By then, there will even be more digital content to play on a 4K display (if gaming or multitasking isn't your thing), and 4K monitors will even start to pull in fancier display technology like Nvidia's G-Sync for even smoother digital shootouts."
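For anyone wondering where those refresh-rate ceilings come from, here's a rough back-of-the-envelope sketch in Python (the 4400x2250 total raster and the ~340 MHz HDMI 1.4 TMDS limit are assumed standard-timing figures, not something from the article):

```python
# Rough sketch: why HDMI 1.4 tops out around 30 Hz at 3840x2160.
# Assumes the common CTA-861 4K timing (4400 x 2250 total, including blanking)
# and HDMI 1.4's ~340 MHz maximum TMDS clock; exact figures vary by mode.

HDMI_14_MAX_TMDS_MHZ = 340.0

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a given total raster size and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

for hz in (24, 30, 60):
    clock = pixel_clock_mhz(4400, 2250, hz)
    verdict = "fits within" if clock <= HDMI_14_MAX_TMDS_MHZ else "exceeds"
    print(f"3840x2160 @ {hz} Hz needs a ~{clock:.0f} MHz pixel clock, "
          f"which {verdict} HDMI 1.4")

# 24 Hz -> ~238 MHz (fits), 30 Hz -> 297 MHz (fits), 60 Hz -> 594 MHz (exceeds),
# which is why 60 Hz UHD needs DisplayPort 1.2 or HDMI 2.0.
```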
  • Get a TV (Score:3, Informative)

    by TechyImmigrant (175943) on Tuesday June 17, 2014 @07:06PM (#47258697) Journal

    Why pay $1000+ for a 4K monitor tomorrow when you can pay $500 for a TV today?

    http://tiamat.tsotech.com/4k-i... [tsotech.com]

    • by houstonbofh (602064) on Tuesday June 17, 2014 @07:09PM (#47258711)
      I have 2 clients with Seiki 4K TVs as monitors and it is fantastic for them. Another case of "This is not what I need, so no one needs it."
      • Re:Get a TV (Score:5, Insightful)

        by TechyImmigrant (175943) on Tuesday June 17, 2014 @07:20PM (#47258761) Journal

        Frame rate is for gamers. Programmers need pixels.

        That's why TFA is missing the right angle.
        4K is great for programming:
              1 - You can see more lines of code
              2 - It doesn't require silly refresh rates
        4K for gaming is silly. It doesn't meet the basic requirements:
              1 - Your card can't drive it
              2 - The framerate is low

        Arguing that 4K is bad because it's no good for gamers is like arguing mobile phones are bad because you can't program on one effectively.
         

        • Frame rate is for gamers. Programmers need pixels.

          What do game programmers need?

          • by Cryacin (657549) on Tuesday June 17, 2014 @07:33PM (#47258843)

            What do game programmers need?

            Sleep, generally.

          • by NormalVisual (565491) on Tuesday June 17, 2014 @09:44PM (#47259735)
            What do game programmers need?

            Multiple displays that work well for the task at hand.
            • by houstonbofh (602064) on Tuesday June 17, 2014 @11:47PM (#47260233)

              What do game programmers need? Multiple displays that work well for the task at hand.

              A 39 inch 4kTV is the equivalent of 4 20 inch 1080p monitors together.
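              The geometry behind that claim is easy to check; a quick sketch, assuming square pixels and ignoring bezels:

```python
# Quick check of the "four 1080p monitors" equivalence: a 3840x2160 panel
# splits into four 1920x1080 quadrants, each with half the parent diagonal.
from math import hypot

def quadrant_diagonal_inches(panel_diagonal_in, w_px=3840, h_px=2160):
    ppi = hypot(w_px, h_px) / panel_diagonal_in      # pixels per inch
    return hypot(w_px / 2, h_px / 2) / ppi           # one 1920x1080 quadrant

print(f"{quadrant_diagonal_inches(39):.1f} in")      # ~19.5 in, close to the 20 in claimed
```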

              • by NormalVisual (565491) on Wednesday June 18, 2014 @12:20AM (#47260345)
                A 39 inch 4kTV is the equivalent of 4 20 inch 1080p monitors together.

                But useless when you need to run the program being debugged full-screen while watching what happens in a debugger, network sniffer, etc. at the same time. There really are times when you need multiple displays not just for the added screen area, but because each display is being used for something different.
                • by pla (258480) on Wednesday June 18, 2014 @06:53AM (#47261271) Journal
                  But useless when you need to run the program being debugged full-screen while watching what happens in a debugger, network sniffer, etc. at the same time. There really are times when you need multiple displays not just for the added screen area, but because each display is being used for something different.

                  You can still run a cheap 20" 1080p 2nd monitor while using a 4k as your primary.

                  I say this as a developer, who until recently used a 4-headed machine for most of my work - I haven't bothered to turn on a 2nd monitor since I got my 4k panel. It has the same combined screen real-estate, covers the same portion of my visual field, and has no annoying bezels between sections of screen. Not to mention, every time a single digitally-connected monitor turns off for any reason (oops, bumped the "input" button again on #3, damn!), Win7 "conveniently" dumps all your icons and programs to your primary... No longer an issue!

                  I don't claim that doesn't still leave situations where a 2nd head could come in useful (such as the case you mention), but I'll gladly trade a significant improvement 99% of the time for a minor nuisance the three times a year it comes up. :)
          • I use 3 x 24 inch 120 Hz monitors. The source code goes on my middle screen, the right screen has a browser open to whatever information I need to be looking at while writing the code, and the other monitor usually has a mix of things open, e.g. another copy of VS 2013 with another (dependent or related) solution open to the code I need to be viewing, designer screens for parts of the game (when running the editor for that), etc.

            My three screens have a combined resolution of roughly 6000 x 1080, allow me to have three separate apps running windowed fullscreen, and can do so at a refresh rate of 120 hz. Even better, I don't get a crick in my neck from looking up and down all the time. I can simply rotate a little in my chair if I need to give one of the side monitors most of my attention for a while.

        • Re:Get a TV (Score:5, Insightful)

          by sexconker (1179573) on Tuesday June 17, 2014 @07:47PM (#47258947)

          Frame rate is for gamers. Programmers need pixels.

          That's why TFA is missing the right angle.
          4K is great for programming:

                1 - You can see more lines of code

                2 - It doesn't require silly refresh rates
          4K for gaming is silly. It doesn't meet the basic requirements:

                1 - Your card can't drive it

                2 - The framerate is low

          Arguing that 4K is bad because it's no good for gamers is like arguing mobile phones are bad because you can't program on one effectively.

          Are you kidding me? Staring at 30 Hz console output is maddening, and plenty of GPUs can handle 4K @ 60 fps for modern games. I'm sorry if you're trying to run Ubisoft's latest gimped turd, but that's an issue with the game, not a modern flagship GPU. Beyond that plenty of monitors can handle 4K 60 Hz. I have no idea why the fuck this shit got front paged. HDMI 2.0. WELCOME TO THE PRESENT. DisplayPort 1.2. WELCOME TO THE YEAR 2010.

        • by Twinbee (767046) on Tuesday June 17, 2014 @08:43PM (#47259317) Homepage
          Enjoy your mouse cursor and window frame moving at 30fps then, and the associated lag that will bring.

          Instead we should be encouraging movement the other way - towards 120fps, which allows for much smoother, more lifelike motion. YouTube being stuck at 30fps is a thorn in the side of the whole online video sector.
        • by aaronb1138 (2035478) on Tuesday June 17, 2014 @09:01PM (#47259463)
          For gaming (not text or web), if the refresh rate is high enough (30 Hz is not), scaled resolutions look fine. We've hit high enough resolutions that certain scaling operations just look like anti-aliasing instead of blurring.

          Scaling rightfully got a bad name when it was upscaling 800x600 content to a 1024x768 or 1280x1024 17" monitor. It looked blurry. Scaling 1920x1080 to 2560x1440 on a 27" monitor looks really good. On the gaming side, I'm more interested in whether these 4K TVs will take 1920x1080 or 2560x1440 at 60 Hz and maintain that refresh rate (technically, if the panel is 120 Hz it should, but I have my doubts about their scalers). Doing productivity work at full resolution would mostly be fine at 30 Hz, if occasionally annoying.
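          The integer-versus-fractional scaling point is easy to see from the scale factors alone; a small sketch (the integer/fractional framing is a general observation, not something from the post above):

```python
# Horizontal scale factors for the upscaling cases mentioned above.
# Fractional factors force interpolation (source pixels straddle target pixels),
# which reads as blur; integer factors can map each source pixel to a clean block.
cases = [
    ("800x600   -> 1280x1024", 800, 1280),
    ("1920x1080 -> 2560x1440", 1920, 2560),
    ("1920x1080 -> 3840x2160", 1920, 3840),
]
for name, src_w, dst_w in cases:
    factor = dst_w / src_w
    kind = "integer" if factor.is_integer() else "fractional"
    print(f"{name}: {factor:.2f}x ({kind})")
```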
        • by mikael (484) on Tuesday June 17, 2014 @10:18PM (#47259883)

          And graphics programmers need both frame rate and pixels. 120Hz seems perfect, but once you try using 3D vision glasses, those LCD shutters bring back the flicker.

          The problem with the higher resolutions is that application developers just seem to think they can then make their application main window even bigger so it still fills the entire screen. Then they have to use bigger fonts to maintain compatibility with past versions of the same application.

          • by AC-x (735297) on Wednesday June 18, 2014 @06:58AM (#47261305)

            And graphics programmers need both frame rate and pixels. 120Hz seems perfect, but once you try using 3D vision glasses, those LCD shutters bring back the flicker.

            We're not using CRTs anymore; LCD panels don't flicker with the refresh rate, so 24 Hz, 30 Hz, 60 Hz, and 120 Hz will all look equally steady.

        • by dbIII (701233) on Wednesday June 18, 2014 @05:01AM (#47260979)
          Me - I want e-ink and can live with a one second refresh rate for plain text.
        • by donaldm (919619) on Wednesday June 18, 2014 @06:52AM (#47261263)

          Frame rate is for gamers. Programmers need pixels.

          Basically, if you display 80 or even 120 lines of code, it does not matter whether the monitor is 1080 or 2160 pixels in height. Sure, the higher resolution will display a well-designed, highly detailed font better than a lower resolution would, but that is all; programmers normally use a monospaced font like Courier, so a finely detailed font is pointless.

          Displaying more than 100 lines of code in the window/screen is IMHO stupid, because the human eye (and consequently the brain) is not going to help you debug or even write code any better than if you used 24 to 80 lines. When coding you need to know what you are writing the code for, and you should be writing the code in such a way that it is easy to understand and hopefully easy to debug.

          As for gamers, a screen with a 30 Hz refresh rate is pretty much the "sweet spot" for general gaming; however, fast action games such as FPSes and racers benefit from higher refresh rates of 60 Hz and above.

          If a programmer is coding for a game, they will need a high-resolution monitor with a good refresh rate, assuming they are going to use the same monitor for both tasks. In practice, though, game programmers normally use two or more screens: at least one lower-resolution display for programming and a high-performance one to test what they are developing.

          • by swimboy (30943) on Wednesday June 18, 2014 @09:41AM (#47262079)

            Basically if you display 80 or even 120 lines of code it does not matter if the monitor is 1080 pixels or 2160 pixels in height. Sure the higher resolution will display a well designed highly detailed font better than a lower resolution font but that is all, however programmers normally use a mono-spaced font like "Courier" so a fine detailed font is pointless.

            Spoken by one who hasn't done much programming on a HiDPI monitor. I can tell you from first-hand experience that the higher resolution display significantly reduces eye fatigue. I have two 24" 1920x1080 external displays connected to my 15" rMBP. I always put my main window on the small 15" screen because the text is much easier on the eyes at 220 dpi. As a matter of fact, text is the only thing that looks dramatically better on the retina display than on a standard display. Images and icons may be more detailed, but it's not nearly as noticeable as the improvement in the display of text.

            I'd rather give up my external monitors than my retina display.

        • by nabsltd (1313397) on Wednesday June 18, 2014 @08:43AM (#47261663)

          Frame rate is for gamers. Programmers need pixels.

          The mouse lag on a 24 or 30Hz display will drive you nuts when you are trying to select a block of text.

          If you are a keyboard-only editor, it's not as bad, but even highlighting text or trying to page down quickly will likely send you back to a high-speed multi-monitor setup.
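          The "lag" here is mostly just the frame interval; a tiny sketch of the arithmetic:

```python
# Worst-case gap between cursor/screen updates at various refresh rates.
for hz in (24, 30, 60, 120):
    print(f"{hz:>3} Hz: up to {1000 / hz:.1f} ms between frames")
# 24 Hz ~41.7 ms, 30 Hz ~33.3 ms, 60 Hz ~16.7 ms, 120 Hz ~8.3 ms
```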

        • by Dcnjoe60 (682885) on Wednesday June 18, 2014 @10:16AM (#47262395)

          The article was talking about 4K for mainstream consumers, which most likely would be closer to gamers than programmers.

      • by strack (1051390) on Tuesday June 17, 2014 @09:25PM (#47259631)
        I mean, seriously, Seiki needs to hurry up and release a 60 Hz 4K version of its 38.5 inch display, preferably with DisplayPort. A 38.5 inch 4K 60 Hz VA panel would blow the weak-ass 28 inch 4K TN panels everyone seems to be pushing today out of the water, especially if they keep their current price point. Ditch the TV tuner and smart TV crap, put in DisplayPort and adaptive sync, and watch it become the monitor for All The Computers In The World.
    • by mcvos (645701) on Wednesday June 18, 2014 @06:16AM (#47261165)

      I don't understand why 4K TVs exist before 4K monitors do. Firstly, TVs simply don't need a crazy resolution like that. Look at how long it took before HD finally took hold. Is anything actually being broadcast in 4K? And if it's impossible to get a decent signal to it, how do those 4K broadcasts end up on the TV?

  • Oculus Rift (Score:3, Insightful)

    by ZouPrime (460611) on Tuesday June 17, 2014 @07:10PM (#47258715)

    Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

    Why spend a shitload of money on a new 4K screen and the video card necessary for an acceptable game experience when I'll be able to do VR at a fraction of the cost and with my existing hardware setup?

    Obviously that's a gamer perspective - I'm sure plenty of people will find 4K useful for what they are doing.

    • Re:Oculus Rift (Score:2, Interesting)

      by Anonymous Coward on Tuesday June 17, 2014 @07:27PM (#47258805)

      In its present iteration, the Oculus Rift might very well work with your current hardware, but the requirements for getting a decent number of pixels per degree of view in VR are brutal. Michael Abrash's post on the matter is very enlightening: http://blogs.valvesoftware.com/abrash/when-it-comes-to-resolution-its-all-relative/. In short, you'll most likely need ultra-responsive, insanely dense mini-displays, each boasting a 4K x 4K resolution per eye. This kind of resolution plus the latency requirements for VR will indeed demand a very powerful gaming rig.

    • by Your.Master (1088569) on Tuesday June 17, 2014 @07:36PM (#47258869)

      It's not entirely clear that VR is going to displace PC gaming to that significant of a degree.

      As a fairly avid gamer, most games I play are not in the first person perspective and I don't want them to be. I don't like FPS, and that's a huge portion of all first-person games (though I do like the sort of FPS-stealth-subgenre that encompasses Hitman, Dishonoured, Deus Ex, etc., and I can see how VR would be an asset there).

      Platformers, most RPGs (the Elder Scrolls series is a popular exception, but I have never liked them), strategy and/or tactics games, most adventure games, most puzzle games, most "unique" / "indie" games, etc. -- these things and others are generally not first-person, and VR almost implies a first person perspective.

      Most of those things I listed (aside from platformers) are already more popular on the PC than on console competitors.

      • by RedWizzard (192002) on Tuesday June 17, 2014 @10:21PM (#47259897)

        It's not entirely clear that VR is going to displace PC gaming to that significant of a degree.

        As a fairly avid gamer, most games I play are not in the first person perspective and I don't want them to be. I don't like FPS, and that's a huge portion of all first-person games... and VR almost implies a first person perspective.

        Only if you've got no imagination. What this iteration of VR brings is head tracking, and that allows massive virtual screens. I think the Rift and similar products are going to break into the non-gaming market as a cost-effective way of getting giant flat displays.

    • by vux984 (928602) on Tuesday June 17, 2014 @08:01PM (#47259057)

      Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

      As a gamer I'm not really concerned about 4K either. I'm much more interested in better support for 3-view type setups. And 4K 3-view is just all the gamer problems of 4K times 3 :)

      Oculus... I'm not sold on it. I see it as niche at best. Very cool in that niche though.

      I would like to see head tracking go mainstream though.

    • by Osgeld (1900440) on Tuesday June 17, 2014 @08:24PM (#47259187)

      You have been able to do that for two decades, so the question is why haven't you?

      I will give you a hint: there is a reason for that, and that reason is that strapping a thing to your face gets old really fucking quick.

    • Re:Oculus Rift (Score:5, Informative)

      by Solandri (704621) on Tuesday June 17, 2014 @09:50PM (#47259767)

      Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

      Why spend a shitload of money on a new 4K screen and the video card necessary for an acceptable game experience when I'll be able to do VR at a fraction of the cost and with my existing hardware setup?

      You're making a fundamental error many people make when it comes to display resolution. What matters isn't resolution or pixels per inch. It's pixels per degree. Angular resolution, not linear resolution.

      I've got a 1080p projector. When I project a 20 ft image onto a wall 10 ft away, the pixels are quite obvious and I wish I had a 4k projector. If I move back to 20 ft away from the wall, the image becomes acceptable again. It's the angle of view that matters not the size or resolution. 20/20 vision is defined as the ability to distinguish a line pair with 1 arc-minute separation. So within one degree (60 arc-minutes) you'd need 120 pixels to fool 20/20 vision.

      This is where the 300 dpi standard comes from. Viewed from 2 ft away, one inch covers just about 2.5 degrees, which is 150 arc-minutes, which can be fully resolved with 300 dots. So for a printout viewed from 2 ft away, you want about 300 dpi to match 20/20 vision. If it's not necessary to perfectly fool the eye, you can cut this requirement to about half.

      In terms of the Oculus Rift, a 1080p screen is about 2203 pixels across the diagonal, so this corresponds to roughly 18.4 degrees of view to fool 20/20 vision, or about 39 degrees to be adequate. If you want your VR display to look decent while covering a substantially wider angle of view than 39 degrees, you will want better than 1080p resolution. I'm going to go out on a limb and predict that most people will want more than a 39-degree field of view in their VR headset.
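      Putting the parent's numbers into a small sketch (Python; the 120 px/degree and 60 px/degree thresholds are the ones derived above):

```python
from math import hypot

# Diagonal field of view (degrees) a display can cover before falling below a
# given angular resolution: ~120 px/degree to fully fool 20/20 vision,
# ~60 px/degree to be merely adequate.
def max_fov_degrees(width_px, height_px, px_per_degree):
    return hypot(width_px, height_px) / px_per_degree

for name, w, h in [("1080p ", 1920, 1080), ("4K UHD", 3840, 2160)]:
    print(f"{name}: {max_fov_degrees(w, h, 120):5.1f} deg at 120 px/deg, "
          f"{max_fov_degrees(w, h, 60):5.1f} deg at 60 px/deg")

# 1080p : ~18.4 deg / ~36.7 deg  (roughly the parent's 18.4 and 39 degree figures)
# 4K UHD: ~36.7 deg / ~73.4 deg
```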

      • by AmiMoJo (196126) * <mojo AT world3 DOT net> on Wednesday June 18, 2014 @07:29AM (#47261407) Homepage

        So within one degree (60 arc-minutes) you'd need 120 pixels to fool 20/20 vision.

        That is a common misunderstanding. 20/20 vision is the ability to distinguish two lines with 1 arc-minute separation from a single thicker line. Beyond that human eyes can still distinguish a 0.5 arc-minute wide line from a 1 arc-minute wide line, and can tell if a 0.5 arc-minute line is jagged or smooth.

        That's why there is a noticeable difference between 300 PPI and 450+ PPI phone displays at normal viewing distances. It's why people with normal vision can differentiate 1080p and 4k on a 127cm screen from a couple of metres back.

    • by mestar (121800) on Wednesday June 18, 2014 @05:29AM (#47261043)

      "Why spend a shitload of money of a new 4K screen and the video card necessary for an acceptable game experience when I'll be able to do VR with a fraction of the cost and with my existing hardware setup?"

      Oh boy, somebody is going to get very disappointed.

    • by lordofthechia (598872) on Wednesday June 18, 2014 @10:41AM (#47262667)

      as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

      Same here. I've been due for a monitor upgrade for a while (I was running triple 19" monitors), but it doesn't make sense to do that now since the budget can be used to snatch up Oculus CV1s when they come out (for less money)!

  • display port (Score:5, Interesting)

    by rogoshen1 (2922505) on Tuesday June 17, 2014 @07:14PM (#47258731)

    DisplayPort doesn't have the same limitations that HDMI has at those resolutions, and it is available now.

    Nvidia 6xx and ATI 7xxx cards (not to mention Intel HD 4000) are not exactly brand new, and they are available now.

    If anything, this sounds like "HDMI is showing its age, use DisplayPort."

  • I'm not a young person anymore, but I've been on the tech wagon since I was 8 years old. And I have to admit that I was one of those people touting the high-resolution thing and pushed it forward all the time (I even made a living in the graphics industry).

    But there is such a thing as too much. After 720p... from over 2 meters away from the television set, despite having air-pilot-approved eyes, I still could not HONESTLY see the difference between a 50 inch 720p and a 50 inch 1080p set - I could not!

    I'd rather have a TV that can be seen perfectly from any angle, with a super-fast response for my gaming needs (my current 47" LG TV sports a 4 ms response time), but there is still room for improvement. And I'd love for these screens to move to OLED instead of the LED (a.k.a. TFT with an LED backlight) panels we have now.
    • If you watched something with high resolution and a clean picture, like Disney's "Frozen," on a high-quality display, like a Samsung 55", then you should be able to tell the difference between 720p and 1080p easily. For many things, it is hard to tell the difference at a reasonable distance. Monitors are different in that you're usually much closer to one. At 24", 720p monitors look like crap compared to 1080p. 4K, however, seems like overkill at anything below 30".

      For gaming, I'm totally with you. For computer gamers, what's really popular are the 27" 2560x1440 monitors that can be overclocked, ideally to 120 Hz, and that do not have a scaler, which improves response time (it also means the monitor can only be run at 2560x1440 and has a single dual-link DVI input). Many cheaper monitors will advertise sort of bogus or software-corrected response times that are not representative of real-world use, so it's important to read the reviews. For the more mainstream models, tftcentral is a very good resource. It's trickier if you import from Korea trying to get the magic 120 Hz overclock.

      • I regularly use a 1080p monitor in the 24" range and I can tell you I would *definitely* like the resolution to be higher. I do a lot of text-based work and I can see the letters start to get blocky if I reduce the text size while I know for a fact I could easily read text even smaller when printed on a decent laser printer.

        Try it one day. Use a word processor to print "the quick brown fox jumped over the lazy dog" in steadily reduced font sizes down the page. Print that page, hold it next to the computer screen at a comfortable viewing distance, and find the smallest font size you can read on the printed version and the on-screen version. If they turn out to be the same size (as measured by a real-life ruler), you may want to see an optometrist.
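        If you'd rather not fiddle with a word processor, a minimal sketch that generates the test page as printable HTML (the file name and size range here are arbitrary):

```python
# Write an HTML page with the test sentence in steadily shrinking font sizes.
# Open it in a browser, print it, and compare the printout with the screen.
SENTENCE = "the quick brown fox jumped over the lazy dog"
SIZES_PT = range(12, 2, -1)   # 12 pt down to 3 pt

lines = [f'<p style="font-size:{pt}pt; margin:2pt 0">{pt} pt: {SENTENCE}</p>'
         for pt in SIZES_PT]
with open("font_size_test.html", "w", encoding="utf-8") as f:
    f.write("<!doctype html>\n<html><body>\n" + "\n".join(lines) + "\n</body></html>\n")
```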

      • Everybody says this. It has been repeated hundreds of times on Slashdot. And it is just wrong. Fuzzy text looks fuzzy whether you're 2 inches away or 2 feet away. You might not be able to see individual pixels, but you can clearly see that the resolution is not high enough to allow clear and crisp font rendering. I'm over 40 and my eyesight is worse than most people's, but I sure as hell know that zooming out does not make a fuzzy picture look smooth.
    • You need a good quality 1080p film (or picture) to be able to see the difference. I have a 1080p setup at home, and I can definitely tell the difference between 720p and 1080p sources.
    • TFA is about monitors. The pixel density on my laptop is about double that of the 28" display on my desk. This is really noticeable for text rendering, where it's clear and crisp on the laptop screen, but looks a little blurry around the edges due to the sub-pixel AA on the bigger display. I'd love to replace the one on my desk with a 4K display once the prices become a bit less silly. And, yes, my laptop can drive a 4K display.
  • What?! (Score:5, Interesting)

    by RyanFenton (230700) on Tuesday June 17, 2014 @07:19PM (#47258755)

    I'm typing this on a monitor with 3840x2160 resolution, at 60hz right now. I posted about it weeks ago:

    Clicky [slashdot.org]

    It's like $600 when on sale, and it works superbly for coding and playing games. Skyrim and Saints Row 4 play fine on a GTX 660 at 4K resolution; you just disable any AA (not needed) but enable vsync (tearing is more visible at 4K, so just use it). Perhaps that's just me - but things seem fine at 4K on a medium-cost graphics card.

    A few generations of video cards, and everything will be > 60-FPS smooth again anyway (partially thanks to consoles again), so I don't really need to wait for a dynamic frame smoothing algorithm implementation to enjoy having a giant screen for coding now.

    I don't see any reason why you'd want to wait - it's as cheap as two decent monitors, and if you're slightly near-sighted like me, it's just really great. See my previous post for a review link and an image of all the PC Ultima games on screen at once.

    Ryan Fenton

    • by MindPrison (864299) on Tuesday June 17, 2014 @07:23PM (#47258773) Journal
      I agree, for coding - the more resolution, the better...no doubt about that.
    • by Namarrgon (105036) on Tuesday June 17, 2014 @09:15PM (#47259557) Homepage

      The other nice thing about the Samsung UD590, apart from 4K @ 60Hz, is that it presents itself as a single 4K monitor, rather than two half-size monitors tiled next to each other. That can make a big difference to some uses, like running games at lower resolutions. The Asus PB287Q is another such single-tile 4K monitor.

    • by drinkypoo (153816) <martin.espinoza@gmail.com> on Wednesday June 18, 2014 @01:01AM (#47260477) Homepage Journal

      I don't see any reason why you'd want to wait - it's as cheap as two decent monitors, and if you're slightly near-sighted like me, it's just really great.

      I'll wait until I can get as great a deal as I did on this 120% color 25.5" 1920x1200 IPS, which was fifty bucks. Hooray for storage lockers, I guess. And also for flea markets, so I don't have to buy storage lockers myself. Also, I just bought a used video card and it doesn't have displayport. IOW, the obvious reason is that one is a cheap bastard.

      I did buy an Eye-One Display LT for fifty bucks to go with my fifty buck monitor. And the color is still spot on, amazingly. It wanted calibration, but it still has all of its range.

      I'll get 4k when I can get it for $100 for the monitor and $100 for the card. That seems fair, it's twice-ish the pixels I have now :)

  • by Anonymous Coward on Tuesday June 17, 2014 @07:24PM (#47258779)

    But all I really need is a LCD running 720p.
    Truthfully all I really need is a super vga CRT.
    In all honesty I could live with the warm glow of an ega screen.
    Net net I miss a nice monochrome to get me through.
    All things considered, teletype handles 99% of my day to day needs.
    Actually, I feel like anything more than a single blinking indicator light is pretty decadent.

  • by gman003 (1693318) on Tuesday June 17, 2014 @07:24PM (#47258783)

    This seems to be a time when monitor features are growing fast. I'm personally going to stick with my 1440p screen until it stabilizes a bit.

    The G-Sync/FreeSync battle is going to start. For gamers, this is going to be big. Right now, G-Sync only works with Nvidia cards, and FreeSync will probably only work with AMD cards. FreeSync is much better licensed, and I expect it will probably win eventually, but I tend to prefer Nvidia cards so I'm willing to wait until we get a clear winner.

    Basically, my dream monitor right now would be:
    under 28" diagonal
    full AdobeRGB gamut or better, factory-calibrated (if significantly wider than AdobeRGB, needs 10-bit color support)
    refresh rates up to at least 120Hz, variable using either Sync method as long as it works with any card I buy
    resolution of 3840x2400 or higher (16:10 aspect ratio)
    no need for multiple data links (as some current 2160p monitors do)
    sub-millisecond input latency

    I would naturally be willing to compromise on many of those points, but the way the market is going, I might not have to. And what I have right now is plenty good enough to last me until things become more future-proof.

    • by Twinbee (767046) on Tuesday June 17, 2014 @08:52PM (#47259381) Homepage
      Which is technically better out of G-Sync and FreeSync?

      I agree with your ideal choice of monitor btw! Apart from the size, which should be bigger but placed further away - that way your eyes would be more relaxed.
      • by gman003 (1693318) on Tuesday June 17, 2014 @10:13PM (#47259857)

        I live in a rather small apartment and would really like a triple monitor setup. So I prefer smaller hardware. I'm also nearsighted and usually take my glasses off when computing for a long period, so smaller, closer displays are actually more relaxing. But to each his own.

        As far as which is technically better, I haven't seen any solid comparisons. G-Sync does use proprietary hardware in the display, which means it has the potential to do a lot more. FreeSync works with existing panels provided they support variable V_BLANK, which not many do yet, and none of them expose it to the GPU.

        FreeSync has been incorporated into the DisplayPort standard (as "Adaptive-Sync", an option in DP1.2a and 1.3) but no displays have made it to market yet. G-Sync has the advantage of shipping, but unless it's either far superior in a technical manner, or Nvidia flat-out refuses to support Adaptive-Sync, I expect it to die sometime next year when the competition arrives.

  • by alen (225700) on Tuesday June 17, 2014 @07:26PM (#47258793)

    when the 4K content starts coming out
    because you know, they will stop selling these soon and you will never be able to buy one to view all the 4K content coming out soon
    or they will drop in price to the point where kids can afford them on their allowance, but you have to buy it NOW and before this happens, just to be the first one to watch 4K content

  • by asmkm22 (1902712) on Tuesday June 17, 2014 @07:26PM (#47258797)
    It just seems like the options down the road for media that can store 4K are a bit limited. Streaming seems out of the question when we can't even get consistent 1080p streams out to people. Blu-ray would need some major overhauls unless people want to have 4K movies come on 10 to 20 discs, and something tells me people aren't going to rush out to embrace a new media format even if it did get that overhaul. I just can't help but think 4K tech will have to be targeted at niche industries like photo editing and maybe CAD type stuff. I could also see a push towards the medical industry. But the average consumer? Not happening.
  • by Kjella (173770) on Tuesday June 17, 2014 @07:28PM (#47258815) Homepage

    I've got a UHD @ 60 Hz single-stream-transport monitor here, the Samsung U28D590D. There's not much video content yet except for a few porn sites, but for stills it's brilliant. Software support for increasing font size is mediocre in many apps, but they're usually functional, just ugly. I wish there were some way to just tell Windows to draw a window at 200% size instead. Gaming is cool, though my graphics card chokes on the resolution when things get heavy - I guess it needs an upgrade now that it's pushing 4x the pixels. Overall I'm happy; yes, I'm an early adopter, but the bleeding edge is more like a paper cut.

  • by TheSync (5291) on Tuesday June 17, 2014 @07:28PM (#47258817) Journal

    I can confirm that the Panasonic TC-L65WT600 [panasonic.com] 65" 4K UHDTV can play 60 fps 4K over its HDMI 2.0 connector (yes, I actually have access to 4K/60p content and a 4K/60p video server). I have seen it for as low as $3500 on BestBuy.com.

  • by rsborg (111459) on Tuesday June 17, 2014 @07:29PM (#47258819) Homepage

    4K displays @ 60Hz with Retina pixel doubling = fantastic coding display [1]
    Of course, I don't have this at work - I have two separate 24" monitors, but I spend most of my time on my 15" retina screen.

    [1] http://support.apple.com/kb/ht... [apple.com]

  • by Rinikusu (28164) on Tuesday June 17, 2014 @07:32PM (#47258835)

    I've been considering one of these bad boys for a while now. Cheap, and for what I intend to use it for (software dev and video editing, where the 30 Hz refresh isn't a big deal), good enough. It's not something I'd use for gaming, at least at 4K, but hey... $500.

  • by the eric conspiracy (20178) on Tuesday June 17, 2014 @07:37PM (#47258879)

    Having a full color gamut is important too. And a really good contrast ratio.

    So I'm saving my pennies for an OLED 4K display. At 80". And none of that curved bullshit.

  • by rossdee (243626) on Tuesday June 17, 2014 @07:38PM (#47258883)

    A 30 inch monitor, 16:10 aspect ratio, and 2560 x 1600.

    The only reason I would want much higher resolution than that is to overcome the problem of scaling on digital displays; in the old days of analog monitors we could run different resolutions without it looking like shit.

    I currently have a 28 inch 1920 x 1200 monitor, but they don't make those anymore.

    • by KozmoStevnNaut (630146) <henrikstevn@nOSPam.gmail.com> on Wednesday June 18, 2014 @08:08AM (#47261523)

      I used to have a 24" 1920x1200 display, but I upgraded to a 27" 2560x1440. It's 16:9, but I find I can just about fit three browser windows side by side (or two for wider layout pages). It's not the aspect ratio that's important, it's the number of vertical pixels you have available. I put the main KDE tool bar on the left-hand side of the screen to make the most of the vertical resolution, and it's working pretty well for me.

      I've finally reached the point where I don't feel like I have to vertically maximize every window to make full use of my desktop resolution.

    • by BitZtream (692029) on Wednesday June 18, 2014 @10:24AM (#47262491)

      This isn't exactly what you're asking for, but it's close:

      Apple's Thunderbolt Display and whatever they call the plain DisplayPort ones are 2560x1440 at 27"; of course, they cost $999 too :(

      I don't know that they are "worth the money". But I definitely approve of mine.

  • Ow, the ignorance (Score:5, Informative)

    by jtownatpunk.net (245670) on Tuesday June 17, 2014 @08:11PM (#47259115)

    Was that summary written by someone who's never used a 30Hz 4k display?

    A 30Hz feed to an LCD panel is not like a 30Hz feed to a CRT. The CRT phosphors need to be refreshed frequently or the image fades. That's why 30Hz was all flickery and crappy back in the 90s. But 30Hz to an LCD isn't like that. The image stays solid until it's changed. A 30Hz display on an LCD is rock solid and works fine for a workstation. I know. I've seen me do it. Right now. There are no "transition" issues, whatever that is supposed to mean. Nothing weird happens when I switch between applications. Multitasking works fine. I'm playing multiple HD videos without a hitch. Same way the 30hz 1080 programming from cable and satellite plays just fine on LCDs. Gaming's not great but turn on vertical sync and it's not terrible. I'd rather be running at 60Hz but I got my 4k panel for $400. It'll hold me over until displays and video cards with HDMI 2 are common.

  • by EmperorOfCanada (1332175) on Tuesday June 17, 2014 @08:26PM (#47259209)
    I don't know any of my tech friends who are breathlessly awaiting 4K monitors. If I go to Staples to replace my monitors some day and see that the 4K one is $50 more than the regular one, then OK, I'll happily buy one. But if it is $200 more, then no, I'll wait.

    I am not saying that 4K is a stupid idea, or that I hate 4K. If it turned out that one of my present monitors had a switch on the back that would flip it to 4K, I would be delighted. But when it comes to budgeting my money, there are a huge number of things that would make my workflow a whole lot better that I would rather spend it on. 4K is nice but just not needed. I think I speak for most people who aren't doing video editing.

    But I suspect that for the next 3-5 years I am going to be reading various tech blogs as they breathlessly review the latest 4K monitors dropping lower and lower in price. But again, the spread between regular and 4K will have to be pretty small before I make the jump.

    A 4K TV on the other hand would be pretty cool and I think that Netflix has some programming 4K ready so I would probably make that leap long before a monitor.
    • A 4K TV on the other hand would be pretty cool and I think that Netflix has some programming 4K ready so I would probably make that leap long before a monitor.

      You have that pretty backwards. UltraHD is immediately useful for a monitor, if you actually do work with a computer and aren't one of these people who think work can only be done in a maximized window. There's not much video in that resolution yet and at any distance it's not immediately obvious what resolution a TV is, but you can put all the text you want on screen at that resolution and you sit within arm's length of your monitor.

  • by Touvan (868256) on Tuesday June 17, 2014 @10:54PM (#47260039) Homepage

    Computers can handle multiple monitors at 60Hz, so why not 4K with dual inputs? Is that feasible, and are there some models on the horizon that have multiple HDMI, dual-DVI, or dual DisplayPort (pre-Thunderbolt-2 DisplayPort - I don't know the version numbers)? It seems it could be possible.

    • by Areyoukiddingme (1289470) on Tuesday June 17, 2014 @11:32PM (#47260169)

      Computers can handle multiple monitors at 60Hz, so why not 4K with dual inputs? Is that feasible, and are there some models on the horizon that have multiple HDMI, dual-DVI, or dual DisplayPort (pre-Thunderbolt-2 DisplayPort - I don't know the version numbers)?

      The Asus PB287Q has two HDMI and one DisplayPort and supports dual simultaneous input from any two of them. They call it Picture-by-Picture mode. They put two HD displays side by side, with black bars above and below, from two different machines. It's slightly silly, since it's not exactly convenient to switch to that mode, but it's available. It will also do Picture-In-Picture mode, displaying one input across the full screen and the other in a window up in the corner, all rescaled in software transparently to the machines outputting the signals.

  • by jcdr (178250) on Wednesday June 18, 2014 @06:58AM (#47261309)

    I just bought this setup last week and it works perfectly well at 3840 x 2160 @ 60 Hz with Debian Jessie and the latest fglrx 14.20 driver. The monitor comes packed with a DP cable (and an HDMI cable, by the way), so it worked out of the box with an FM2A88X Extreme6+ motherboard. I have seen 30 Hz displays before, but sorry, I can't enjoy them for testing 1080p60 applications in a window while keeping code and a debugger in other windows.

    Electronics schematics and board routing are a pleasure on a UHD monitor, as is coding side by side with the full documentation and output results. On the fun side, Google Earth is probably the most impressive, with so much detail that it's a bit like immersion. The APU by itself is still capable of yielding a few frames per second in full-screen mode; you really need a powerful GPU if you expect more...

    The monitor has a little bug in its power management that sometimes requires cycling its power to wake it up, but this is a minor issue for me.

  • by DrXym (126579) on Wednesday June 18, 2014 @07:42AM (#47261451)
    The DPI in some tablets and laptops is so high that applications running on desktop operating systems (Windows, OS X and Linux) render like postage stamps, with tiny fonts, toolbars and other buttons. To counter this, the OS can upscale any non-high-DPI-aware app's window, but that makes everything look blurry.

    So that shiny new 4K monitor may end up delivering an inferior desktop experience while requiring a GPU to work 4x as hard. That might change as more desktop apps become high-DPI aware, but obviously legacy apps are never going to get fixed.
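    For what it's worth, "high-DPI aware" on Windows is something the application has to declare for itself; a hedged sketch of what that opt-in looks like from Python via ctypes (Windows-only, and the constant 2 is the documented per-monitor-aware value):

```python
# Sketch: opting a process into DPI awareness on Windows so the OS stops
# bitmap-upscaling (and blurring) its windows. Non-aware apps get the blurry
# upscale described above. Safe no-op on other platforms.
import ctypes
import sys

def opt_into_dpi_awareness():
    if not sys.platform.startswith("win"):
        return
    try:
        # 2 = PROCESS_PER_MONITOR_DPI_AWARE (shcore.dll, Windows 8.1+)
        ctypes.windll.shcore.SetProcessDpiAwareness(2)
    except (AttributeError, OSError):
        # Older fallback: system-wide DPI awareness (user32.dll, Vista+)
        ctypes.windll.user32.SetProcessDPIAware()

opt_into_dpi_awareness()
```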

    • by rjstanford (69735) on Wednesday June 18, 2014 @12:26PM (#47263733) Homepage Journal

      The DPI in some tablets and laptops is so high that applications running on desktop operating systems (Windows, OS X and Linux) render like postage stamps, with tiny fonts, toolbars and other buttons. To counter this, the OS can upscale any non-high-DPI-aware app's window, but that makes everything look blurry.

      I'm with you on Windows and Linux. OSX has been doing this natively, with no blurriness, since they first started shipping retina laptops. It really is amazingly nicer than the old low-res screens. Source: been using it personally since 2012.

  • by swb (14022) on Wednesday June 18, 2014 @07:57AM (#47261501)

    The problem I have with super high-res displays is the limitations of window management. I have yet to find a decent tool for Windows that lets me subdivide a very large display into multiple virtual monitors. You end up with maximized windows that make poor use of screen real estate, like this dinky box I'm typing in on a mostly empty page.

    And what about window content scaling? It'd be nice to scale the content of a window so that I could display more in the same window or make it larger, especially when combined with a way to scale subdivided display regions.

  • by neminem (561346) <neminem&gmail,com> on Wednesday June 18, 2014 @10:55AM (#47262787) Homepage

    I don't really have any pressing need for 4k. I mean, I'd take it, though I feel like it would require not only sufficient hardware, but also an OS with a UI better designed for it (I imagine there'd be a lot of times, with a screen that large, that you would want to tell windows to "maximize" onto only one quadrant, for instance.)

    What I would really like is for monitors to just not have *regressed*. My laptop's about 3 1/2 years old. I'd be tempted to buy a new one sometime kinda-soon (I was looking, and drooling over the fact that affordable laptops these days finally tend to come with bays for both an SSD and an HDD, often even an SSD and *two* hard drives)... but screw 16:9. All I want is the choice of 1920x1200. That's 1200, not 1080. (Though it's also surprising how many mid-level desktop-replacement laptops don't even have 1920x1080.)
