Displays

4K Monitors: Not Now, But Soon

An anonymous reader writes 4K monitor prices have fallen into the range where mainstream consumers are starting to consider them for work and for play. There are enough models that we can compare and contrast, and figure out which are the best of the ones available. But this report at The Wirecutter makes the case that absent a pressing need for 8.29 million pixels, you should just wait before buying one. They say, "The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we're used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you'll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you'll be locked to a maximum of 30 frames per second for your games—it's playable, but not that smooth. ... Most people don't own a system that's good enough for gaming on a 4K display—at least, not at highest-quality settings. You'll be better off if you just plan to surf the Web in 4K: Nvidia cards starting in the 600 series and AMD Radeon HD 6000 and 7000-series GPUs can handle 4K, as can systems built with integrated Intel HD 4000 graphics or AMD Trinity APUs. ... There's a light on the horizon. OS support will strengthen, connection types will be able to handle 4K displays sans digital tricks, and prices will drop as more 4K displays hit the market. By then, there will even be more digital content to play on a 4K display (if gaming or multitasking isn't your thing), and 4K monitors will even start to pull in fancier display technology like Nvidia's G-Sync for even smoother digital shootouts."
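The 30 Hz ceiling quoted above falls straight out of the link budget. A rough sketch in Python (the 10.2 Gbit/s raw rate and 8b/10b coding are nominal HDMI 1.4 spec figures; the ~10% blanking overhead is my assumption):

```python
# Why HDMI 1.4 tops out at 3840x2160 @ 30 Hz: the link simply runs out
# of bandwidth at 60 Hz. Figures below are nominal spec numbers.
HDMI14_RAW_GBPS = 10.2                        # max TMDS rate, 3 channels
HDMI14_DATA_GBPS = HDMI14_RAW_GBPS * 8 / 10   # 8b/10b coding -> ~8.16

def required_gbps(width, height, hz, bpp=24, blanking=1.1):
    """Approximate payload rate needed; ~10% blanking overhead assumed."""
    return width * height * hz * bpp * blanking / 1e9

for hz in (30, 60):
    need = required_gbps(3840, 2160, hz)
    fits = need <= HDMI14_DATA_GBPS
    print(f"3840x2160 @ {hz} Hz: ~{need:.1f} Gbit/s -> "
          f"{'fits' if fits else 'exceeds'} HDMI 1.4")
```

At 30 Hz the frame fits with room to spare; at 60 Hz it needs roughly 13 Gbit/s, well past what 1.4 can carry, which is exactly the gap HDMI 2.0 and DisplayPort 1.2 close.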
  • Get a TV (Score:3, Informative)

    by TechyImmigrant ( 175943 ) on Tuesday June 17, 2014 @06:06PM (#47258697) Homepage Journal

    Why pay $1000+ for a 4K monitor tomorrow when you can pay $500 for a TV today?

    http://tiamat.tsotech.com/4k-i... [tsotech.com]

    • I have 2 clients with Seiki 4K TVs as monitors and it is fantastic for them. Another case of "This is not what I need, so no one needs it."
      • Re:Get a TV (Score:5, Insightful)

        by TechyImmigrant ( 175943 ) on Tuesday June 17, 2014 @06:20PM (#47258761) Homepage Journal

        Frame rate is for gamers. Programmers need pixels.

        That's why TFA is missing the right angle.
        4K is great for programming:
              1 - You can see more lines of code
              2 - It doesn't require silly refresh rates
        4K for gaming is silly. It doesn't meet the basic requirements:
              1 - Your card can't drive it
              2 - The framerate is low

        Arguing that 4K is bad because it's no good for gamers is like arguing mobile phones are bad because you can't program on one effectively.
         

        • Frame rate is for gamers. Programmers need pixels.

          What do game programmers need?

          • by Cryacin ( 657549 ) on Tuesday June 17, 2014 @06:33PM (#47258843)

            What do game programmers need?

            Sleep, generally.

          • What do game programmers need?

            Multiple displays that work well for the task at hand.
            • What do game programmers need? Multiple displays that work well for the task at hand.

              A 39 inch 4kTV is the equivalent of 4 20 inch 1080p monitors together.
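That equivalence is easy to sanity-check: 3840x2160 is exactly a 2x2 grid of 1920x1080 panels, and the pixel densities come out nearly identical (a quick sketch, using the panel sizes stated above):

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch from native resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diag_in

print(f'39" 4K TV:       {ppi(3840, 2160, 39):.0f} PPI')
print(f'20" 1080p panel: {ppi(1920, 1080, 20):.0f} PPI')
```

Both land around 110-113 PPI, so a 39" 4K set really is, pixel for pixel, a borderless 2x2 wall of 20" 1080p monitors.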

              • A 39 inch 4kTV is the equivalent of 4 20 inch 1080p monitors together.

                But useless for when you need to run the debugged program full-screen while watching what happens in a debugger, network sniffer, etc. at the same time. There really are times when you need multiple displays not just for the added screen area, but because each display is being used for something different.
                • by pla ( 258480 )
                  But useless for when you need to run the debugged program full-screen while watching what happens in a debugger, network sniffer, etc. at the same time. There really are times when you need multiple displays not just for the added screen area, but because each display is being used for something different.

                  You can still run a cheap 20" 1080p 2nd monitor while using a 4k as your primary.

                  I say this as a developer, who until recently used a 4-headed machine for most of my work - I haven't bothered to turn
          • I use 3 x 24 inch 120 Hz monitors. The source code goes on my middle screen, the right screen has a browser open to whatever information I need while writing the code, and the other monitor usually has a mix of things open, e.g. another copy of VS 2013 with another (dependent or co-related) solution open to the code I need to be viewing, designer screens for parts of the game (when running the editor for that), etc.

            My three screens have a combined resolution of roughly 6000 x 1080, allow me to hav

        • Re:Get a TV (Score:5, Insightful)

          by sexconker ( 1179573 ) on Tuesday June 17, 2014 @06:47PM (#47258947)

          Frame rate is for gamers. Programmers need pixels.

          That's why TFA is missing the right angle.
          4K is great for programming:

                1 - You can see more lines of code

                2 - It doesn't require silly refresh rates
          4K for gaming is silly. It doesn't meet the basic requirements:

                1 - Your card can't drive it

                2 - The framerate is low

          Arguing that 4K is bad because it's no good for gamers is like arguing mobile phones are bad because you can't program on one effectively.

          Are you kidding me? Staring at 30 Hz console output is maddening, and plenty of GPUs can handle 4K @ 60 fps for modern games. I'm sorry if you're trying to run Ubisoft's latest gimped turd, but that's an issue with the game, not a modern flagship GPU. Beyond that plenty of monitors can handle 4K 60 Hz. I have no idea why the fuck this shit got front paged. HDMI 2.0. WELCOME TO THE PRESENT. DisplayPort 1.2. WELCOME TO THE YEAR 2010.

        • by Twinbee ( 767046 )
          Enjoy your mouse cursor and window frames moving at 30 fps then, and the associated lag that will bring.

          Instead we should be encouraging movement the other way - towards 120 fps, which allows for much more lifelike, smoother motion. YouTube being stuck at 30 fps is a thorn in the side of the whole online video sector.
          • Of course, you do realize all major motion pictures are shot at 24fps with the exception of a handful.

        • For gaming (not text or web), if the refresh rate is high enough (30 Hz is not), scaled resolutions look fine. We've hit high enough resolutions that certain scaling operations just look like anti-aliasing instead of blurring.

          Scaling rightfully got a bad name when it was upscaling 800x600 content to a 1024x768 or 1280x1024 17" monitor. It looked blurry. Scaling 1920x1080 to 2560x1440 on a 27" monitor looks really good. I'm more interested on the gaming side if these 4K TVs will take 1920x1080 or 2560x1440 at
        • by mikael ( 484 )

          And graphics programmers need both frame rate and pixels. 120Hz seems perfect, but once you try using 3D vision glasses, those LCD shutters bring back the flicker.

          The problem with the higher resolutions is that application developers just seem to think they can then make their application main window even bigger so it still fills the entire screen. Then they have to use bigger fonts to maintain compatibility with past versions of the same application.

          • by AC-x ( 735297 )

            And graphics programmers need both frame rate and pixels. 120Hz seems perfect, but once you try using 3D vision glasses, those LCD shutters bring back the flicker.

            We're not using CRTs anymore; LCD panels don't flicker with the refresh rate, so 24 Hz, 30 Hz, 60 Hz, and 120 Hz will all be just as steady.

        • by dbIII ( 701233 )
          Me - I want e-ink and can live with a one second refresh rate for plain text.
        • by donaldm ( 919619 )

          Frame rate is for gamers. Programmers need pixels.

          Basically, if you display 80 or even 120 lines of code, it does not matter whether the monitor is 1080 or 2160 pixels high. Sure, the higher resolution will display a well-designed, highly detailed font better than a lower resolution, but that is all; programmers normally use a mono-spaced font like "Courier", so a finely detailed font is pointless.

          Displaying more than 100 lines of code in the window/screen is IMHO stupid because the human eye

          • by swimboy ( 30943 )

            Basically, if you display 80 or even 120 lines of code, it does not matter whether the monitor is 1080 or 2160 pixels high. Sure, the higher resolution will display a well-designed, highly detailed font better than a lower resolution, but that is all; programmers normally use a mono-spaced font like "Courier", so a finely detailed font is pointless.

            Spoken by one who hasn't done much programming on a HiDPI monitor. I can tell you from first-hand experience that the higher resolution display significantly reduces eye fatigue. I have two 24" 1920x1080 external displays connected to my 15" rMBP. I always put my main window on the small 15" screen because the text is much easier on the eyes at 220 dpi. As a matter of fact, text is the only thing that looks dramatically better on the retina display than a standard display. Images and icons may be more detaile

        • Frame rate is for gamers. Programmers need pixels.

          The mouse lag on a 24 or 30 Hz display will drive you nuts when you are trying to select a block of text.

          If you are a keyboard-only editor, it's not as bad, but even highlighting text or trying to page down quickly will likely send you back to a high-speed multi-monitor setup.

        • The article was talking about 4K for mainstream consumers, which most likely would be closer to gamers than programmers.

      • by strack ( 1051390 )
        I mean, seriously, Seiki needs to hurry up and release a 60 Hz 4K version of its 38.5 inch display, preferably with DisplayPort. A 38.5 inch 4K 60 Hz VA panel would blow the weak-ass 28 inch 4K TN panels everyone seems to be pushing today out of the water, especially if they keep the current price point. Ditch the TV tuner and smart TV crap, put in DisplayPort and adaptive sync, and watch it become the monitor for All The Computers In The World.
    • by mcvos ( 645701 )

      I don't understand why 4K TVs exist before 4K monitors do. Firstly, TVs simply don't need a crazy resolution like that. Look at how long it took before HD finally took hold. Is anything actually being broadcast in 4K? And if it's impossible to get a decent signal to it, how do those 4K broadcasts end up on the TV?

  • Occulus Rift (Score:3, Insightful)

    by ZouPrime ( 460611 ) on Tuesday June 17, 2014 @06:10PM (#47258715)

    Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

    Why spend a shitload of money on a new 4K screen and the video card necessary for an acceptable gaming experience when I'll be able to do VR at a fraction of the cost and with my existing hardware setup?

    Obviously that's a gamer perspective - I'm sure plenty of people will find 4K useful for what they are doing.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      In its present iteration, the Oculus Rift might very well fit your current hardware, but the requirements for getting a decent number of pixels per view-angle in VR are brutal. Michael Abrash's post on the matter is very enlightening: http://blogs.valvesoftware.com/abrash/when-it-comes-to-resolution-its-all-relative/. In short, you'll most likely need ultra-responsive, insanely dense mini-displays, each boasting a 4K x 4K resolution per eye. This kind of resolution plus the latency requirements for VR will

    • It's not entirely clear that VR is going to displace PC gaming to that significant of a degree.

      As a fairly avid gamer, most games I play are not in the first person perspective and I don't want them to be. I don't like FPS, and that's a huge portion of all first-person games (though I do like the sort of FPS-stealth-subgenre that encompasses Hitman, Dishonoured, Deus Ex, etc., and I can see how VR would be an asset there).

      Platformers, most RPGs (the Elder Scrolls series is a popular exception, but I have n

      • It's not entirely clear that VR is going to displace PC gaming to that significant of a degree.

        As a fairly avid gamer, most games I play are not in the first person perspective and I don't want them to be. I don't like FPS, and that's a huge portion of all first-person games... and VR almost implies a first person perspective.

          Only if you've got no imagination. What this iteration of VR is bringing is head tracking, and that allows massive virtual screens. I think the Rift and similar products are going to break into the non-gaming market as a cost-effective way of getting giant flat displays.

        • by Osgeld ( 1900440 )

          head tracking has been around in toy grade vr helmets like the rift since the 90's ... those serial ports on them were not there for the sound

        • by mestar ( 121800 )

          The word "flat", it doesn't mean what you mean it means.

    • by vux984 ( 928602 )

      Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

      As a gamer I'm not really concerned about 4K either. I'm much more interested in better support for 3-view type setups. And 4K 3-view is just all the gamer problems of 4K, times 3 :)

      Oculus... I'm not sold on it. I see it as niche at best. Very cool in that niche though.

      I would like to see head tracking go mainstream though.

    • by Osgeld ( 1900440 )

      You have been able to do that for two decades, so the question is: why haven't you?

      I will give you a hint. There is a reason for that, and that reason is that strapping a thing to your face gets old really fucking quick.

      • Strapping something on your face may get old, but today it is better than the helmet you had to wear before, and the fixed device before that; the current trend is smaller, faster, lighter. How long before the VR solution is only slightly more uncomfortable than a pair of glasses?

        • by Osgeld ( 1900440 )

          the last model I had, from the early 2000s, actually weighed less than the Rift (8 whole ounces total), had head tracking, and had displays on par with then-current resolutions (not to mention it only cost like 200 bucks), so the current trend seems to be a rubber band and not predictable by anyone

          http://www.mindflux.com.au/pro... [mindflux.com.au]

    • Re:Occulus Rift (Score:5, Informative)

      by Solandri ( 704621 ) on Tuesday June 17, 2014 @08:50PM (#47259767)

      Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

      Why spend a shitload of money on a new 4K screen and the video card necessary for an acceptable gaming experience when I'll be able to do VR at a fraction of the cost and with my existing hardware setup?

      You're making a fundamental error many people make when it comes to display resolution. What matters isn't resolution or pixels per inch. It's pixels per degree. Angular resolution, not linear resolution.

      I've got a 1080p projector. When I project a 20 ft image onto a wall 10 ft away, the pixels are quite obvious and I wish I had a 4K projector. If I move back to 20 ft away from the wall, the image becomes acceptable again. It's the angle of view that matters, not the size or resolution. 20/20 vision is defined as the ability to distinguish a line pair with 1 arc-minute separation. So within one degree (60 arc-minutes) you'd need 120 pixels to fool 20/20 vision.

      This is where the 300 dpi standard comes from. Viewed from 2 ft away, one inch covers just about 2.5 degrees, which is 150 arc-minutes, which can be fully resolved with 300 dots. So for a printout viewed from 2 ft away, you want about 300 dpi to match 20/20 vision. If it's not necessary to perfectly fool the eye, you can cut this requirement to about half.

      In terms of the Oculus Rift, a 1080p screen is 2203 pixels diagonal, so this corresponds to 18.4 degrees to fool 20/20 vision, or 39 degrees to be adequate. If you want your VR display to look decent while covering a substantially wider angle of view than 39 degrees, you will want better than 1080p resolution. I'm gonna go out on a limb and predict that most people will want more than a 39 degree field of view in their VR headset.
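The arithmetic in the comment above can be reproduced directly, treating 20/20 vision as ~120 pixels per degree (a sketch; the 2 ft / 300 dpi print is the parent's own example):

```python
import math

def pixels_per_degree(ppi, distance_in):
    """How many pixels fit in one degree of view at a given distance."""
    pixel_angle_deg = math.degrees(math.atan2(1 / ppi, distance_in))
    return 1 / pixel_angle_deg

# 20/20 vision resolves ~1 arc-minute, i.e. ~120 px/degree to fully fool it.
print(f"300 dpi print at 24 in: {pixels_per_degree(300, 24):.0f} px/deg")
```

The result is about 126 px/degree, just past the ~120 px/degree threshold, which is where the 300 dpi rule of thumb comes from.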

      • by AmiMoJo ( 196126 ) *

        So within one degree (60 arc-minutes) you'd need 120 pixels to fool 20/20 vision.

        That is a common misunderstanding. 20/20 vision is the ability to distinguish two lines with 1 arc-minute separation from a single thicker line. Beyond that human eyes can still distinguish a 0.5 arc-minute wide line from a 1 arc-minute wide line, and can tell if a 0.5 arc-minute line is jagged or smooth.

        That's why there is a noticeable difference between 300 PPI and 450+ PPI phone displays at normal viewing distances. It's why people with normal vision can differentiate 1080p and 4k on a 127cm screen from
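The 0.5 arc-minute figure maps directly onto the 300-vs-450 PPI phone argument above. A quick check (the ~12 inch viewing distance is my assumption, not from the comment):

```python
import math

def ppi_for_arcmin(arcmin, distance_in):
    """PPI at which one pixel subtends the given angle at this distance."""
    return 1 / (distance_in * math.tan(math.radians(arcmin / 60)))

print(f"1.0 arc-min/pixel at 12 in: {ppi_for_arcmin(1.0, 12):.0f} PPI")
print(f"0.5 arc-min/pixel at 12 in: {ppi_for_arcmin(0.5, 12):.0f} PPI")
```

About 286 PPI satisfies the classic 1 arc-minute criterion, but rendering 0.5 arc-minute features takes roughly 573 PPI, which is why 450+ PPI panels can still look visibly sharper.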

    • by mestar ( 121800 )

      "Why spend a shitload of money of a new 4K screen and the video card necessary for an acceptable game experience when I'll be able to do VR with a fraction of the cost and with my existing hardware setup?"

      Oh boy, somebody is going to get very disappointed.

    • as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.

      Same here, I've been due for a monitor upgrade for a while (was running a triple 19" monitors), but it doesn't make sense to do that now since the budget can be used to snatch up Oculus CV1s when they come out (for less money)!

  • display port (Score:5, Interesting)

    by rogoshen1 ( 2922505 ) on Tuesday June 17, 2014 @06:14PM (#47258731)

    DisplayPort doesn't have the same limitations that HDMI has at those resolutions, and it's available now.

    Nvidia 6xx and ATI 7xxx (not to mention Intel HD 4000) are not exactly brand new, and they're available now.

    If anything, this sounds like "HDMI is showing its age; use DisplayPort."

    • Re:display port (Score:4, Interesting)

      by complete loony ( 663508 ) <Jeremy.Lakeman@nOSpaM.gmail.com> on Tuesday June 17, 2014 @06:41PM (#47258915)
      HDMI was showing its age the moment it was designed. All of the design and planning behind HDTVs was short-sighted, as if they never planned to replace it.
      • Honestly, they didn't plan to replace it, at least not on any appreciable timescale. People don't buy TVs like they buy games consoles or phones or iPods, so the sensible thing at the time was to roll out yet another twenty-year standard and get around to thinking about succession later. Of course, if you're a Sony or a Samsung looking at your briefly revitalised TV business tailing off again as people finish upgrading, maybe you're regretting this.

    • DisplayPort did not support 2160p60 out-of-the-box either; it needed v1.2 to get there.

      HDMI can do 2160p60 too, just needs v2.0.
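Plugging in the nominal spec bandwidths shows why those two versions are the cutoffs (raw rates are from the published specs; the ~10% blanking overhead is an assumption):

```python
# Approximate data rate needed for 3840x2160 @ 60 Hz, 24 bpp,
# with ~10% blanking overhead.
need_gbps = 3840 * 2160 * 60 * 24 * 1.1 / 1e9  # ~13.1 Gbit/s

# Raw link rates (Gbit/s); HDMI 1.4/2.0 and DP 1.2 all use 8b/10b coding,
# so usable payload is roughly raw * 0.8.
links = {"HDMI 1.4": 10.2, "HDMI 2.0": 18.0, "DP 1.2": 21.6}

for name, raw in links.items():
    usable = raw * 0.8
    verdict = "handles" if usable >= need_gbps else "can't handle"
    print(f"{name}: ~{usable:.1f} Gbit/s usable -> {verdict} 2160p60")
```

HDMI 1.4's ~8.2 Gbit/s falls well short of the ~13 Gbit/s that 2160p60 needs, while HDMI 2.0 and DP 1.2 both clear it.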

      • Re: (Score:2, Informative)

        by Anonymous Coward

        Oh you mean v1.2 which came out in 2009, and virtually every DP capable graphics card and monitor supports?

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          Building on that, is HDMI 2.0 even shipping yet?

          2009 vs 2015, maybe?

    • Totally agree. Nvidia 6xx has been out for a long time, and a 660 costs like $150. Anyone who buys a 4k monitor for $1000+ is not going to think twice about getting a matching video card. For gamers, in all likelihood, they probably already have one. The article claims a hardware barrier that is simply not an issue.

      The real issue here is the price point. 2560x1440 27" monitors have been around for a long time, but it wasn't until it dropped under $400 that gamers started chomping them up. When they ge

    • by Twinbee ( 767046 )
      One major consideration in choosing the exact model of my new gfx card (a 750 Ti) was whether it had DisplayPort. The EVGA version was one of the only ones to have it.
  • I'm not a young person anymore, but I've been on the tech wagon since I was 8 years old. And I have to admit that I was one of those people touting the high-resolution thing and pushed it forward all the time (I even made a living in the graphics industry).

    But there is such a thing as too much. After 720p... from over 2 meters away from the television set, despite having Air-Pilot-approved eyes, I honestly could not see the difference between a 50 inch 720p set and a 50 inch 1080p set - I could not!

    I
    • If you watched something with high resolution and a clean picture, like Disney's "Frozen," on a high-quality display, like a Samsung 55", then you should be able to tell the difference b/w 720p and 1080p easily. For many things, it is hard to tell the difference at a reasonable distance. Monitors are different in that you're usually much closer to one. At 24", 720p monitors look like crap compared to 1080p. 4K, however, seems like overkill at anything below 30".

      For gaming, I'm totally with you. For com

      • I regularly use a 1080p monitor in the 24" range and I can tell you I would *definitely* like the resolution to be higher. I do a lot of text-based work and I can see the letters start to get blocky if I reduce the text size while I know for a fact I could easily read text even smaller when printed on a decent laser printer.

        Try it one day. Use a word processor to print "the quick brown fox jumped over the lazy dog" in steadily reduced font size down the page. Print that page and hold it next to the computer

      • Everybody says this. It has been repeated hundreds of times on Slashdot. And it is just wrong. Fuzzy text looks fuzzy whether you're 2 inches away or 2 feet away. You might not be able to see individual pixels, but you can clearly see the resolution is not sufficiently high to allow clear, crisp font rendering. I'm over 40 and my eyesight is worse than most people's, but I sure as hell know that zooming out does not make a fuzzy picture look smooth.
    • You need a good quality 1080p film (or picture) to be able to see the difference. I have a 1080p setup at home and I can tell for sure the difference between 720p and 1080p sources.
    • TFA is about monitors. The pixel density on my laptop is about double that of the 28" display on my desk. This is really noticeable for text rendering, where it's clear and crisp on the laptop screen, but looks a little blurry around the edges due to the sub-pixel AA on the bigger display. I'd love to replace the one on my desk with a 4K display once the prices become a bit less silly. And, yes, my laptop can drive a 4K display.
  • What?! (Score:5, Interesting)

    by RyanFenton ( 230700 ) on Tuesday June 17, 2014 @06:19PM (#47258755)

    I'm typing this on a monitor with 3840x2160 resolution, at 60hz right now. I posted about it weeks ago:

    Clicky [slashdot.org]

    It's like $600 when on sale, and it works superbly for coding and playing games. Skyrim and Saints Row 4 play fine on a GTX 660 at 4K resolution; you just disable any AA (not needed) but enable vsync (tearing is more visible at 4K, so just use that). Perhaps that's just me - but things seem fine at 4K res on a medium-cost graphics card.

    A few generations of video cards, and everything will be > 60-FPS smooth again anyway (partially thanks to consoles again), so I don't really need to wait for a dynamic frame smoothing algorithm implementation to enjoy having a giant screen for coding now.

    I don't see any reason why you'd want to wait - it's as cheap as two decent monitors, and if you're slightly near-sighted like me, it's just really great. See my previous post for a review link and an image of all the PC Ultima games on screen at once.

    Ryan Fenton

    • I agree, for coding - the more resolution, the better...no doubt about that.
    • The other nice thing about the Samsung UD590, apart from 4K @ 60Hz, is that it presents itself as a single 4K monitor, rather than two half-size monitors tiled next to each other. That can make a big difference to some uses, like running games at lower resolutions. The Asus PB287Q is another such single-tile 4K monitor.

    • I don't see any reason why you'd want to wait - it's as cheap as two decent monitors, and if you're slightly near-sighted like me, it's just really great.

      I'll wait until I can get as great a deal as I did on this 120% color 25.5" 1920x1200 IPS, which was fifty bucks. Hooray for storage lockers, I guess. And also for flea markets, so I don't have to buy storage lockers myself. Also, I just bought a used video card and it doesn't have displayport. IOW, the obvious reason is that one is a cheap bastard.

      I did buy an Eye-One Display LT for fifty bucks to go with my fifty buck monitor. And the color is still spot on, amazingly. It wanted calibration, but it still

  • by Anonymous Coward

    But all I really need is a LCD running 720p.
    Truthfully all I really need is a super vga CRT.
    In all honesty I could live with the warm glow of an ega screen.
    Net net I miss a nice monochrome to get me through.
    All things considered, teletype handles 99% of my day to day needs.
    Actually, I feel like anything more than a single blinking indicator light is pretty decadent.

    • To be fair, a teletype would solve 80% of what I need, with a video-capable tablet providing the rest...

  • This seems to be a time when monitor features are growing fast. I'm personally going to stick with my 1440p screen until it stabilizes a bit.

    The G-Sync/FreeSync battle is going to start. For gamers, this is going to be big. Right now, G-Sync only works with Nvidia cards, and FreeSync will probably only work with AMD cards. FreeSync is much better licensed, and I expect it will probably win eventually, but I tend to prefer Nvidia cards so I'm willing to wait until we get a clear winner.

    Basically, my dream mo

    • by Twinbee ( 767046 )
      Which is technically better out of G-Sync and FreeSync?

      I agree with your ideal choice of monitor btw! Apart from the size, which should be bigger but further away. That way your eyes would be more relaxed.
      • I live in a rather small apartment and would really like a triple monitor setup. So I prefer smaller hardware. I'm also nearsighted and usually take my glasses off when computing for a long period, so smaller, closer displays are actually more relaxing. But to each his own.

        As far as which is technically better, I haven't seen any solid comparisons. G-Sync does use proprietary hardware in the display, which means it has the potential to do a lot more. FreeSync works with existing panels provided they support

  • When the 4K content starts coming out.
    Because, you know, they will stop selling these soon and you will never be able to buy one to view all the 4K content coming out soon.
    Or they will drop in price to the point where kids can afford them on their allowance, but you have to buy one NOW, before this happens, just to be the first one to watch 4K content.

    • by Twinbee ( 767046 )
      I admire those first customers, because without them and the rich, the ball would never get rolling and we'd all be without forever.
  • It just seems like the options down the road for media that can store 4k are a bit limited. Streaming seems out of the question when we can't even get consistent 1080p streams out to people. Blu Ray would need some major overhauls unless people want to have 4k movies come on 10 to 20 disks, and something tells me people aren't going to rush out to embrace a new media format even if it did get that overhaul. I just can't help but think 4k tech will have to be targeted at niche industries like photo editin
  • I got a UHD @ 60 Hz single-stream-transport monitor here, the Samsung U28D590D. There's not much video content yet except for a few porn sites, but for stills it's brilliant. Software support for increasing font size is mediocre in many apps, but they're usually functional, just ugly. I wish there was some way to just tell Windows to draw a window at 200% size instead. Gaming is cool, though my graphics card is choking on the resolution when it gets heavy; I guess it needs an upgrade now that it's pushing 4x the pi

  • I can confirm that the Panasonic TC-L65WT600 [panasonic.com] 65" 4K UHDTV can play 60 fps 4K over its HDMI 2.0 connector (yes, I actually have access to 4K/60p content and a 4K/60p video server). I have seen it for as low as $3500 on BestBuy.com.

  • 4K displays @ 60Hz with Retina pixel doubling = fantastic coding display [1]
    Of course, I don't have this at work - I have two separate 24" monitors, but I spend most of my time on my 15" retina screen.

    [1] http://support.apple.com/kb/ht... [apple.com]
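The "Retina pixel doubling" in [1] is just HiDPI scaling: the OS lays the desktop out in logical points and renders each point as a 2x2 block of physical pixels, so a 4K panel behaves like a razor-sharp 1080p workspace. A minimal sketch:

```python
def hidpi_workspace(native_w, native_h, scale=2):
    """Logical desktop size when each point is drawn as scale x scale pixels."""
    return native_w // scale, native_h // scale

print(hidpi_workspace(3840, 2160))  # -> (1920, 1080)
```

Text laid out at 1080p but rendered with 4x the pixels is what produces the sharpness the rMBP users in this thread describe.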

  • I've been considering one of these bad boys for a while now. Cheap, and for what I intend to use it for (software dev and video editing, where the 30 Hz refresh isn't a big deal), good enough. It's not something I'd use for gaming, at least at 4K, but hey... $500.

    • by Squash ( 2258 )

      Really, it's fine for anything this side of gaming. Even YouTube and local media play just fine. Very little out there has a framerate over that 30 Hz mark. The only real downside is that you can only fit one of them on your desk at a time.

    • I got one of the early Dell 4K 30 Hz monitors and I can NOT recommend it. You think, "hey, 30 Hz isn't bad even for a game; it should be fine for web-surfing and Word editing." The problem is that mouse cursor motion at only 30 Hz is downright annoying! YMMV, but for me that alone ruins it. Now I run that monitor at 1080p (60 Hz) most of the time and only kick it into 4K to look at maps (which look great!)
  • Having a full color gamut is important too. And a really good contrast ratio.

    So I'm saving my pennies for an OLED 4K display. At 80". And none of that curved bullshit.

    • Having a full color gamut is important too. And a really good contrast ratio.

      Check out the reviews of the Asus PB287Q. Very nearly full color gamut. These ain't your daddy's TN panels.

      Yeah OLED would be nice, but I'd be surprised if an UltraHD or 4K OLED display is affordable this decade.

      • There is progress though, the new Samsung Galaxy Tab S has a 10.5" 2560x1600 AMOLED.

        I know I don't want to upgrade my TV until I can get a 50" 4k OLED for about $1K. My crystal ball says that will happen in 2018 :)

  • A 30 inch monitor, 16:10 aspect ratio, and 2560 x 1600.

    The only reason I would want much higher resolution than that is to overcome the problem of scaling on digital displays; in the old days of analog monitors we could run different resolutions without it looking like shit.

    I currently have a 28 inch 1920 x 1200 monitor, but they don't make those anymore,

    • I used to have a 24" 1920x1200 display, but I upgraded to a 27" 2560x1440. It's 16:9, but I find I can just about fit three browser windows side by side (or two for wider layouts). It's not the aspect ratio that's important; it's the number of vertical pixels you have available. I put the main KDE panel on the left-hand side of the screen to make the most of the vertical resolution, and it's working pretty well for me.

      I've finally reached the point where I don't feel like I have to vertically maximize.

    • This isn't what you're asking for exactly, but it's close:

      Apple's Thunderbolt Display, and whatever they call the regular DisplayPort ones, are 2560x1440 at 27"; of course they cost $999 too :(

      I don't know that they are "worth the money". But I definitely approve of mine.

  • Ow, the ignorance (Score:5, Informative)

    by jtownatpunk.net ( 245670 ) on Tuesday June 17, 2014 @07:11PM (#47259115)

    Was that summary written by someone who's never used a 30Hz 4k display?

    A 30Hz feed to an LCD panel is not like a 30Hz feed to a CRT. The CRT phosphors need to be refreshed frequently or the image fades; that's why 30Hz was all flickery and crappy back in the 90s. But 30Hz to an LCD isn't like that. The image stays solid until it's changed. A 30Hz display on an LCD is rock solid and works fine for a workstation. I know. I've seen me do it. Right now.

    There are no "transition" issues, whatever that is supposed to mean. Nothing weird happens when I switch between applications. Multitasking works fine. I'm playing multiple HD videos without a hitch, the same way 30Hz 1080 programming from cable and satellite plays just fine on LCDs. Gaming's not great, but turn on vertical sync and it's not terrible. I'd rather be running at 60Hz, but I got my 4k panel for $400. It'll hold me over until displays and video cards with HDMI 2.0 are common.
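    The 30 Hz ceiling is just link-budget arithmetic. A rough Python sketch (assuming 24-bit color and ignoring blanking overhead, so real requirements are somewhat higher; the bandwidth constants are HDMI's usable video rate after 8b/10b coding):

```python
# Rough uncompressed video bandwidth needed for 4K, vs. what HDMI offers.
# Blanking intervals are ignored, so actual link requirements are higher.

BITS_PER_PIXEL = 24  # 8 bits per channel, RGB

def video_gbps(width, height, refresh_hz, bpp=BITS_PER_PIXEL):
    """Raw pixel-data rate in gigabits per second."""
    return width * height * refresh_hz * bpp / 1e9

uhd_30 = video_gbps(3840, 2160, 30)   # ~5.97 Gbps
uhd_60 = video_gbps(3840, 2160, 60)   # ~11.94 Gbps

HDMI_1_4_GBPS = 8.16   # 10.2 Gbps raw, minus 8b/10b coding overhead
HDMI_2_0_GBPS = 14.4   # 18 Gbps raw, minus 8b/10b coding overhead

print(f"4K@30: {uhd_30:.2f} Gbps -> fits HDMI 1.4: {uhd_30 < HDMI_1_4_GBPS}")
print(f"4K@60: {uhd_60:.2f} Gbps -> fits HDMI 1.4: {uhd_60 < HDMI_1_4_GBPS}, "
      f"fits HDMI 2.0: {uhd_60 < HDMI_2_0_GBPS}")
```

    Even before blanking overhead, 4K at 60 Hz needs roughly 12 Gbps of pixel data, well past HDMI 1.4's budget, which is why the parent poster is waiting on HDMI 2.0 hardware.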

    • by AmiMoJo ( 196126 ) *

      30Hz is terrible for scrolling. It's not so bad in a text editor, but for web pages it's annoying. At 30Hz you can't read text while it's moving; at 60Hz you can, and you don't have to pause while the page moves up. It's a small thing perhaps, but it was one of the reasons I originally switched from Firefox to Chrome.

  • None of my tech friends are breathlessly awaiting 4K monitors. If I go to Staples to replace my monitors some day and see that the 4K one is $50 more than the regular one, then OK, I'll happily buy one. But if it is $200 more, then no, I'll wait.

    I am not saying that 4K is a stupid idea, or that I hate 4K; if it turned out that one of my present monitors had a switch on the back that would switch it to 4K I would be delighted, but when it comes to budgeting my money there are a huge numb
    • A 4K TV, on the other hand, would be pretty cool, and I think Netflix has some 4K-ready programming, so I would probably make that leap long before a monitor.

      You have that pretty backwards. UltraHD is immediately useful for a monitor, if you actually do work with a computer and aren't one of these people who think work can only be done in a maximized window. There's not much video in that resolution yet and at any distance it's not immediately obvious what resolution a TV is, but you can put all the text you want on screen at that resolution and you sit within arm's length of your monitor.

  • Computers can handle multiple monitors at 60Hz, so why not 4K with dual inputs? Is that feasible, and are there some models on the horizon that have multiple HDMI, dual DVI, or dual DisplayPort (pre-Thunderbolt-2 DisplayPort - I don't know the version numbers)? It seems it could be possible.

    • Computers can handle multiple monitors at 60Hz, so why not 4K with dual inputs? Is that feasible, and are there some models on the horizon that have multiple HDMI, dual DVI, or dual DisplayPort (pre-Thunderbolt-2 DisplayPort - I don't know the version numbers)?

      The Asus PB287Q has two HDMI inputs and one DisplayPort, and supports dual simultaneous input from any two of them. They call it Picture-by-Picture mode. It puts two HD displays side by side, with black bars above and below, from two different machines. It's slightly silly, since it's not exactly convenient to switch to that mode, but it's available. It will also do Picture-in-Picture mode, displaying one input across the full screen and the other in a window up in the corner, all rescaled in software transparently.

  • I just bought this setup last week and it works perfectly well at 3840 x 2160 @ 60 Hz with Debian Jessie and the latest fglrx-14.20 driver. The monitor ships with a DP cable (and an HDMI cable, by the way), so it worked out of the box with an FM2A88X Extreme6+ motherboard. I have seen 30 Hz displays before, but sorry, I can't use them to test 1080p60 applications in a window while keeping code and debug tools in other windows.

    Electronics schematics and hardware routing are a pleasure on a UHD monitor, as is coding side by side.

  • The DPI in some tablets / laptops is so high that applications running on desktop operating systems (Windows, OS X and Linux) render like postage stamps with tiny fonts, toolbars and other buttons. To counter this the OS can upscale any non-high-DPI-aware app's window, but that makes everything look blurry.

    So that shiny new 4K monitor may end up delivering an inferior desktop experience while requiring a GPU working 4x as hard. That might change as more desktop apps become high-DPI aware, but obviously any le

    • The DPI in some tablets / laptops is so high that applications running on desktop operating systems (Windows, OS X and Linux) render like postage stamps with tiny fonts, toolbars and other buttons. To counter this the OS can upscale any non-high-DPI-aware app's window, but that makes everything look blurry.

      I'm with you on Windows and Linux. OS X has been doing this natively, with no blurriness, since they first started shipping Retina laptops. It really is amazingly nicer than the old low-res screens. Source: been using it personally since 2012.
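      The "postage stamp or blurry" trade-off is just pixel arithmetic. A sketch of the two strategies the OS can take for a non-DPI-aware app (the 184 DPI figure is an assumption, roughly a 24" 4K panel; the toolbar size is illustrative):

```python
# Why non-DPI-aware apps look either tiny or blurry on a HiDPI panel:
# two strategies the OS can take, sketched as pixel arithmetic.

BASE_DPI = 96  # the logical DPI most legacy desktop apps assume

def no_scaling_inches(logical_px, panel_dpi):
    """App draws 1:1 in device pixels: physical size shrinks as DPI rises."""
    return logical_px / panel_dpi

def bitmap_upscale_px(logical_px, panel_dpi):
    """OS stretches the app's 96-DPI bitmap to the right physical size.
    Each source pixel is interpolated across ~(panel_dpi/96) device pixels
    per axis, which is exactly what reads as 'blurry'."""
    return logical_px * panel_dpi / BASE_DPI

# A 300-logical-pixel-tall toolbar on a ~184 DPI panel (assumed figures):
print(f"unscaled: {no_scaling_inches(300, 184):.2f} in tall "
      f"(vs {no_scaling_inches(300, 96):.2f} in on a 96 DPI screen)")
print(f"upscaled: {bitmap_upscale_px(300, 184):.0f} device px, interpolated")
```

      Unscaled, the toolbar shrinks to about half its intended physical height; upscaled, 300 rows of source pixels get smeared across ~575 device rows. A DPI-aware app simply renders at the native 184 DPI and gets both the right size and full sharpness.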

  • The problem I have with super-high-res displays is the limitations of window management. I have yet to find a decent tool for Windows that lets me subdivide a very large display into multiple virtual monitors. You end up with maximized windows that make poor use of screen real estate, like this dinky box on a mostly empty page I'm typing in.

    And what about window content scaling? It'd be nice to scale the content of a window so that I could display more in the same window, or make it larger.
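    The subdivision being asked for is what window-snapping tools do under the hood: carve the desktop into fixed rectangles and size windows to them. A minimal sketch of the geometry (function name and grid choice are my own, for illustration):

```python
# Splitting one big display into tiled "virtual monitor" regions,
# the kind of thing a window-snapping tool computes under the hood.

def grid_regions(width, height, cols, rows):
    """Return (x, y, w, h) rectangles tiling the display in a cols x rows grid."""
    w, h = width // cols, height // rows
    return [(c * w, r * h, w, h) for r in range(rows) for c in range(cols)]

# Four quadrants of a UHD panel -- each one a full 1080p workspace:
for rect in grid_regions(3840, 2160, 2, 2):
    print(rect)
```

    The appeal of 4K for this is that the numbers divide evenly: a 2x2 grid yields four exact 1920x1080 regions, so each "virtual monitor" is a familiar full-HD workspace.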

  • I don't really have any pressing need for 4k. I mean, I'd take it, though I feel like it would require not only sufficient hardware, but also an OS with a UI better designed for it (I imagine there'd be a lot of times, with a screen that large, that you would want to tell windows to "maximize" onto only one quadrant, for instance.)

    What I would really like is for monitors to just not have *regressed*. My laptop's about 3 1/2 years old. I'd be tempted to buy a new one sometime kinda-soon (was looking, and dro

Whoever dies with the most toys wins.
