Displays

4K Computer Monitors Are Coming (But Still Pricey)

First time accepted submitter jay age writes "When TV makers started pushing 4K screens on an unsuspecting public that had only just upgraded to 1080p, many doubted what value they would bring consumers. A fair thought: 1080p is, at the screen sizes and viewing distances commonly found in homes, good enough. However, PC users such as myself have watched this development with great hope. TV screens must have something to do with the market being littered with monitors stuck at a puny 1080p resolution. If 4K TVs push PC makers to offer 4K screens too, wouldn't that be great? Well, they are coming. ASUS has just announced one!" You could hook a computer up to one of the available 4K displays today, but you will generally be paying a lot more for the privilege; this one is "only" about $5,000, according to ExtremeTech.
Comments Filter:
  • by Anonymous Coward

    The question is... what content will take advantage of this? Most consumable content is at 1080p, and I've yet to see a game that can run at these resolutions, let alone the newest CryEngine.

    • by Anonymous Coward on Friday May 31, 2013 @06:33PM (#43879263)

      The question is... what content will take advantage of this?

      Video? Content? None will take advantage of it. Text. Text is the #1 driver of high-density displays. Smooth text is pleasing to the eye. Developers and photo editors will buy this.

      • by AmiMoJo ( 196126 ) *

        CAD users too. When you have lots of overlapping layers and fine detail the extra resolution really helps separate them.

    • by Nyder ( 754090 )

      The question is... what content will take advantage of this? Most consumable content is at 1080p, and I've yet to see a game that can run at these resolutions, let alone the newest CryEngine.

      Well, since currently (granted, the next gen of consoles is around the corner) all the games on the market are made for the Xbox 360 & PS3 and then ported to the PC, and those games are barely 720p on the consoles, I wouldn't want to see any of those games on a 4K monitor.

      Maybe if the next-gen games are made at a resolution of 1080p, then yes, they'll probably be nice on the new 4K monitors. Of course, we don't have any of those next-gen games out, so we really do NOT know if the new consoles can even push

      • Well, since currently (granted, the next gen of consoles is around the corner) all the games on the market are made for the Xbox 360 & PS3 and then ported to the PC...

        This right here is one of the saddest things of the last 15 years or so. How much has gaming been held back and stunted by this one fact? :(

        • Look at the bright side: even a modest PC could run the newest games well, because those games were designed for the modest hardware of consoles.

    • by JDG1980 ( 2438906 ) on Friday May 31, 2013 @06:41PM (#43879333)

      The question is... what content will take advantage of this?

      Anyone who edits (or views) photos should appreciate the higher resolution. Even a cheap modern digital camera can usually take a picture with a resolution about as high as this monitor's.

      But the biggest advantage is in smooth text (and vector UI elements where available). You aren't supposed to run this at standard DPI and squint at tiny boxes; you're supposed to run it at 200% scaling and get far smoother text than usual, since it gets 4x the number of pixels at the same point size.
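
      To put rough numbers on the 200% scaling point, here's a quick back-of-the-envelope sketch in plain Python (the 10x16 glyph cell below is just an illustrative assumption, not anything from a spec sheet):

        # What 200% (HiDPI) scaling means in practice: same layout, 4x the pixels per glyph.
        physical = (3840, 2160)
        scale = 2.0                            # 200% scaling

        logical = (physical[0] / scale, physical[1] / scale)
        print(logical)                         # (1920.0, 1080.0) -> laid out like a 1080p desktop

        # A hypothetical 10x16 "logical" glyph cell rasterizes into a 20x32 physical cell,
        # i.e. four times as many pixels drawing the same character at the same point size.
        glyph_logical = 10 * 16
        glyph_physical = (10 * scale) * (16 * scale)
        print(glyph_physical / glyph_logical)  # 4.0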

    • by Sir_Sri ( 199544 ) on Friday May 31, 2013 @06:41PM (#43879337)

      Um...

      You realize there are lots of multi-monitor setups that support 3, 4, 6, or even more 1080p displays, right?

      If you are trying to power 6 displays in the new Tomb Raider or Crysis 3 with a single GTX 680, you're going to have a rough time, no doubt. But you can certainly build a Titan SLI configuration or run AMD 7990s in a CrossFire setup. It is not cheap by any means, but it's certainly possible.

      I would expect to see the PC space start to adopt 'retina' displays or 4K or something else as we go forward. 4K in TVs is only for really big screens or ones viewed up close, and those are astronomically expensive. If you're spending $5k on a monitor and then complaining that your $500 GPU isn't fast enough, you should probably have thought of that expense first, or you shouldn't care about the money.

      I saw a (1080p) 120Hz 60-inch TV for 800 bucks this week. New. I'm sure there are better deals in the US. We're not too many years away from an 80-inch or bigger TV being in the 1000-dollar range, and at that size 4K is worth it.

      Now yes, the PS4 and XB3 trying to do 4K might be... troublesome. We'll have to see the exact specs on the GPU, and then there's a tradeoff between higher resolution at lower quality and lower resolution at higher quality.

    • Yes, but with difficulty. Rendering generally scales linearly with the number of pixels, so rendering at 3840x2160 takes four times as much processing power as rendering at 1920x1080.

      Games rarely have to specifically support a resolution. Most will query the system to see what resolutions are possible - they may have to upscale UI elements that are normally 1:1 or downscaled, and they may only support certain aspect ratios, but they rarely "break". Even games that use hardcoded resolution lists tend to work
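
      Both points are easy to sanity-check. The pixel-count arithmetic is trivial, and most engines really do just enumerate whatever modes the system reports; the snippet below uses pygame purely as one example of a library that exposes that list, not as something any particular game actually uses:

        # Rough sketch: rendering cost vs. pixel count, and querying available fullscreen modes.
        import pygame

        # 1) Fill-rate cost scales roughly with the number of pixels:
        print((3840 * 2160) / (1920 * 1080))      # 4.0 -> about 4x the per-frame work of 1080p

        # 2) Engines typically just ask the system what modes exist and offer those:
        pygame.init()
        for mode in pygame.display.list_modes():  # e.g. [(3840, 2160), (2560, 1440), ...]
            print(mode)
        pygame.quit()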

      • Games rarely have to specifically support a resolution. Most will query the system to see what resolutions are possible - they may have to upscale UI elements that are normally 1:1 or downscaled

        There are also some games that don't scale the UI elements at all.

        I remember experiencing this years back with the original Quake (DOS version). It was designed for something like 320x200, and if you cranked it up to 800x600 or, worse, 1024x768, the UI became unusably small (I presume this has been fixed by now, either by id or by third parties).

        A lot of "builder" and "rts" type games have also had UIs and sometimes content too (though that is rarer now that contents is 3D rendered) in fixed pixel sizes. In recent

    • As far as content goes, couldn't give a damn. But more resolution means more viewing area for multiple windows. I have 1080p on a 15.5 inch screen, and the text isn't too small. Not sure what ppi that comes to, but it's probably about the same as these screens.

      What to do with all that real estate? Firefox takes up ~1/3 of the screen while IRC is open in the background. If I'm writing LaTeX documents (not sure why I chose to capitalize that properly), I keep my editor open in one corner, keep the pdf/dvi
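
      For the curious, the pixel-density guess above is easy to check with a couple of lines of Python (16:9 panels assumed; the 31.5" figure is the announced ASUS size mentioned elsewhere in the thread):

        # Pixels per inch from resolution and diagonal size.
        import math

        def ppi(h_px, v_px, diagonal_in):
            return math.hypot(h_px, v_px) / diagonal_in

        print(round(ppi(1920, 1080, 15.5)))   # ~142 PPI -- the 15.5" 1080p laptop panel
        print(round(ppi(3840, 2160, 31.5)))   # ~140 PPI -- the 31.5" 4K ASUS panel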

    • The question is... what content will take advantage of this? Most consumable content is at 1080p, and I've yet to see a game that can run at these resolutions, let alone the newest CryEngine.

      If you can't hear your card screaming for air right now, you probably don't own a video card that can handle it; but most reasonably modern engines are flexible on resolution. The drop-down menu may not present the option if it's something odd, but some bodging around with .ini files or command-line options can usually make it happen.

      I'm sure some games just don't ship with the texture assets to fully do justice; but unless the textures the engine uses even for right-in-your-face distances are truly dre

    • You know some people work on computers, right?

  • ajax.googleapis.com (Score:5, Informative)

    by Anonymous Coward on Friday May 31, 2013 @06:21PM (#43879153)

    FFS, why do I need to enable ajax.googleapis.com in NoScript just to view Asus's website?

    I'm sick of creepy Google gathering info on me.
    Then, when I later email someone with a Gmail mailbox, Google will link my IP address (contained in the email's header) with my unique email address and add that intel to their already overflowing collection of 'big data'.

    You know what? Stuff it, I won't enable it. Asus just lost me as a website visitor.

    • Re: (Score:3, Informative)

      by Anonymous Coward

      Google operates a JavaScript CDN that many sites use. It doesn't use cookies, and it means you don't have to load common libraries like jQuery from every website individually.

    • ajax.googleapis.com isn't a tracking domain and your IP shouldn't be in any emails you send unless you run your own mail server.

      Next?

      • ajax.googleapis.com isn't a tracking domain and your IP shouldn't be in any emails you send unless you run your own mail server.

        Erm, that seems like a bit of a failure of imagination. Why wouldn't that be a "tracking domain?" Do you have some specific proof that it's somehow impossible for Google to use normal logging functionality on the web server for that domain? And that this will be true forever? Obviously, the idea that any particular domain can't be used for tracking is just silly. So, if google

        • The googleapis.com domain exists purely to be a cookie-free domain. Helps with caching.
          It's a pretty common technique for static content.

          Without cookies, they have an IP and the website that loaded the script. Hardly useful for advertising, and it cannot be tied to your personalised advertising profile.

  • Weak! (Score:4, Informative)

    by tysonedwards ( 969693 ) on Friday May 31, 2013 @06:24PM (#43879183)
    $5000 for a 31.5" monitor with a 3840x2160 resolution?
    $800 gets a 30" monitor with a 2560x1600 resolution.
    $1400 gets a 50" TV with a 3840x2160 resolution.
    $2200 gets a 15" laptop with a 2880x1800 resolution.

    Sure, none of these are directly comparable, but at the same time it's disappointing to see Asus at such an extreme price point.
    • Re:Weak! (Score:5, Informative)

      by Tagged_84 ( 1144281 ) on Friday May 31, 2013 @07:48PM (#43879881)
      It's ExtremeTech, and they admit to making up the price in the article. That site is extremely opinionated, and I wouldn't trust it with my bookmarks!
    • > $1400 gets a 50" TV with a 3840x2160 resolution.

      That was my thought as well. Rather than getting a 2x2 1920x1080 monitor array, using the $1400 50" Seiki 4K TV as a monitor will give you the same real estate, seamlessly. You only need one Radeon 7970 (or better) to drive it, simplifying the configuration. $1800 for that configuration is not bad at all.

    • In all honesty, what's the real difference between a TV and a monitor? I wanted a monitor as well as something to use with my Wii (composite output). I ended up buying a TV and use DP -> HDMI when I want to use it as an external display (which I never do, because that thing is a piece of crap, but that's an aside).

      Is it just a matter of ports or something? Is it the fact that a TV also comes with a tuner? My conclusion is that TVs are a superset of monitors in the functionality that they provide, and often come with

    • by harrkev ( 623093 )

      $800 gets a 30" monitor with a 2560x1600 resolution.

      Really? Where? I have seen budget Korean 2560x1200 monitors for around $400 or so, and name-brand at the same resolution for around $700 or so. I have yet to see a budget 2560x1600. That extra 400 pixels of vertical resolution really raises the price.

  • 4k Computer (Score:5, Funny)

    by John Marter ( 3227 ) on Friday May 31, 2013 @06:28PM (#43879225) Homepage
    The monitor for my 4k computer (a TRS-80 Color Computer) was just an ordinary television.
    • The monitor for my 4GB media PC is just an ordinary (plasma) television, but if VDUs had kept pace with computing-power advancements we would be looking at 655360000p screens...

  • by MSRedfox ( 1043112 ) on Friday May 31, 2013 @06:34PM (#43879281)
    Why spend $5,000 for a 32" when you can get a 50" 4K for under $1,500? http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=7674736 [tigerdirect.com] (Groupon and a few other places have had it down to around $1,100 over the past few months.) I know, some people probably find 50" way too big. But it seems a bit silly that the 32" is so much more expensive.
    • by DragonWriter ( 970822 ) on Friday May 31, 2013 @06:41PM (#43879339)

      Why spend $5,000 for a 32" when you can get a 50" 4K for under $1,500?

      Well, presumably, because your use case isn't appropriate for a 50" display.

      • For some people it isn't. For me, I'd have no issue replacing my triple-monitor Eyefinity setup with a single 4K 50". It'd be about the same real estate side to side. And for photo editing and video work, it would be quite nice. But I'm probably just in a small niche of people that would find it useful.
      • Re: (Score:2, Interesting)

        by Anonymous Coward

        Why spend $5,000 for a 32" when you can get a 50" 4K for under $1,500?

        Well, presumably, because your use case isn't appropriate for a 50" display.

        Just sit further back then. If you're constrained by space, then it's probably because you're in an office environment, meaning they're targeting the enterprise with this size and price-point.

        For home users, the 50" screen at a lower price-point makes way more sense.

        • If you're constrained by space, then it's probably because you're in an office environment, meaning they're targeting the enterprise with this size and price-point.

          I don't see this, except taking "office" in the broadest possible sense; I mean I could just see moving from a 24" to a 32" monitor for the desktop in the extra bedroom that serves as my home office/library/miscellaneous storage room, but a 50" display would be enormous.

          For home users, the 50" screen at a lower price-point makes way more sense.

          Fo

    • by Nemyst ( 1383049 )
      That TV has only HDMI, which limits full resolution to 30Hz. Sorry, that's an instant pass for me.
    • Who wants to stare at 30Hz on their computer all day? Is this 1992? That's the last time I saw an interlaced display on a computer. That's the best you'll be able to do at 3840x2160 on the HDMI connection on that 50" Seiki. There's currently no way to run them at 60Hz using the available connections on the computer and display. At best, they'll get Nvidia and AMD to support using dual connections to treat the single monitor as dual monitors with no bezel correction.

      Check the bandwidth of various video

      • > Who wants to stare at 30Hz on their computer all day? Is this 1992? That's the last time I saw an interlaced display on a computer.

        30Hz is perfectly acceptable on a computer display - especially if you are staring at it all day. If you want to play video games, that is another issue, but for work like photo editing or software development or spreadsheets, word processing, email, or even just web browsing, 30Hz is plenty. You won't even notice the difference.

        I speak from experience, I used to have o

      • DisplayPort 1.2 is what's actually needed; it's found on the GeForce Titan and GTX 780, and for others such as the GTX 680 and GTX 660 I plain don't know. Radeons, same deal: you'd have to check.

        DP 1.2 is said to be available on Haswell motherboards. It should do 3840x2160 and 3840x2400 at 60Hz, and 2560x1440 or 2560x1600 at 120Hz (of course, good luck finding a 120Hz monitor - a real one, not fake as on TVs; they only make them as crappy 1080p TN panels).

        So the connection problem is not that bad except for HDMI 2.0 not being there.
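
        To see why HDMI 1.4 tops out at 30Hz for 4K while DP 1.2 is fine, a payload-only estimate is enough (24 bpp pixel data, blanking intervals ignored, so real figures run a bit higher; the ~8.16 and ~17.28 Gbit/s numbers are the commonly quoted post-8b/10b usable rates):

          # Rough link-bandwidth check, payload only.
          def payload_gbps(w, h, hz, bpp=24):
              return w * h * hz * bpp / 1e9

          HDMI_1_4 = 8.16    # approx. usable Gbit/s
          DP_1_2   = 17.28   # approx. usable Gbit/s (4 lanes of HBR2)

          print(payload_gbps(3840, 2160, 30))   # ~6.0  -> fits HDMI 1.4, hence the 30Hz-only 4K TVs
          print(payload_gbps(3840, 2160, 60))   # ~11.9 -> too much for HDMI 1.4, fine on DP 1.2
          print(payload_gbps(2560, 1440, 120))  # ~10.6 -> also fine on DP 1.2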

    • by Z34107 ( 925136 )

      With computer monitors, you're generally paying a premium for better input latency, refresh rates, and color reproduction, and for less ghosting. $5,000 is still on the high side, but I'd be extremely wary about replacing my monitor with a television, sight unseen.

      • With computer monitors, you're generally paying a premium for better input latency, refresh rates, and color reproduction, and for less ghosting.

        And not fucking with the input signal.

        I've learnt the hard way that some TVs are incapable of taking an input signal at their advertised native resolution and displaying it without fucking it up through the application of inappropriate processing that smears single-pixel lines, making the desktop a blurry mess.

        I've also learnt the hard way that some TVs have terrible VGA inputs that take ages to lock, can't lock properly to the low resolutions seen during bootup/BIOS, and sometimes mis-lock even at their native reso

        • TVs are also... renowned... for the quality and accuracy of the EDID data they provide the hapless device attempting to drive them.

  • by houbou ( 1097327 ) on Friday May 31, 2013 @06:35PM (#43879291) Journal
    Would love to have a 4K monitor... Cheez... the Photoshop experience alone...
  • It is sickening that an iPad can have a better resolution than a 27-inch display that costs about the same price - that is all.

  • by JDG1980 ( 2438906 ) on Friday May 31, 2013 @06:37PM (#43879317)

    First of all, the alleged price of $5000 is pure speculation. None of the other sources reporting on the Asus 4K monitor have mentioned it, and the Extreme Tech article describes the price as "our guess".

    Secondly, the article is flat-out wrong when it says that Sharp's 4K monitor "doesn't seem to have been released" so far. In fact, the PN-K321 has been released and you can buy one on Amazon [amazon.com] for $4900. A few other online retailers have it, too, for slightly lower prices. There is one weird caveat; you currently need an AMD card for it to work properly, because it uses DisplayPort 1.2 with MST and basically shows up to the OS as two 1920x2160 monitors. You have to use Eyefinity to get the OS to treat it as one large screen. This YouTube video [youtube.com] (not mine - I only wish I could afford this thing!) shows how it's done.
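
    For anyone puzzled by the two-monitor arrangement, the arithmetic is straightforward (payload-only numbers at 24 bpp and 60Hz, same caveats as any back-of-the-envelope bandwidth estimate):

      # Why the MST trick works: two 1920x2160@60 tiles stitched back into one 4K surface.
      tile_w, tile_h, hz, bpp = 1920, 2160, 60, 24
      per_tile_gbps = tile_w * tile_h * hz * bpp / 1e9

      print(round(per_tile_gbps, 2))       # ~5.97 Gbit/s per tile
      print(round(2 * per_tile_gbps, 2))   # ~11.94 Gbit/s total -- within DP 1.2's ~17.28 Gbit/s
      print((2 * tile_w, tile_h))          # (3840, 2160) once Eyefinity joins the tiles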

    The Sharp monitor isn't even the cheapest 4K device currently on the market. That distinction belongs to a 50 inch Seiki Digital TV [amazon.com] which costs $1,399.99 on Amazon. But this device can only take a 30 Hz input, due to the limitations of the HDMI protocol. I've also heard some criticisms of the panel quality.

    What I and many others are hoping is that the Asus 4K monitor can lower the price point on this technology. If it sells for the same $5000 as the Sharp monitor, it's a non-event since it does nothing to advance the state of the art. But if they can get it down to $2500 or lower, then we'll start to see it show up in "extreme" gaming rigs and some professional workspaces, and maybe in a year or two they will be affordable for mainstream power users.

  • Aspect Ratio (Score:5, Insightful)

    by scarboni888 ( 1122993 ) on Friday May 31, 2013 @06:48PM (#43879423)

    On my computer monitor I need more height!! Please bring back 16:10 for computer monitors! 16:9 is for TVs only.

    • Just treat it as two 8:9 screens. With a larger display, the usability of each half increases tremendously.

      • by Rich0 ( 548339 )

        The problem is that you're generally constrained to placing those two screens directly side-by-side horizontally.

        If I switched from my current 4:3 monitor to a 16:9 one I'd end up getting a smaller screen, because I couldn't expand the width of the monitor until it had an equal area. Above my monitor there is nothing until you reach the ceiling, but next to it I have other stuff on my desk.

        Sure, you can place stuff side-by-side with a widescreen, but that doesn't change the fact that you have far less vert

    • by pne ( 93383 )

      On my computer monitor I need more height!! Please bring back 16:10 for computer monitors! 16:9 is for TVs only.

      I still have my 16:12 (aka 4:3) for pretty much this reason.

  • by Gordo_1 ( 256312 ) on Friday May 31, 2013 @06:56PM (#43879505)

    And my eyes can barely make out the width of a pixel as it is. What is it going to do for me if you increase pixel density such that pixels are a quarter of their current size? Give us 40" or more, and it might start to get interesting, but then you're constantly bending your neck to read what's on different parts of the screen.

    • I recently purchased a Dell 30" screen with 2560x1600 resolution. It's really nice with IPS and the ability to display 12 bit color with the right software and graphics card.

      I think the pixel density for text is about as high as I would want on a screen. For a 4K screen I'd want at least 40".

    • Good for you. Not everyone would want it. My use case is different.

      I sit 2-3ft away from a 27" screen, and would absolutely love to have a 4K screen in a 27-32" form factor in front of me. I have a second monitor and a laptop to form a 3 screen setup, but the single large screen in the center is my preference for primary tasks. Even though there's limited video content for 4k right now, being able to display more information on the screen would be awesome. And I'm sure 4K content will become more prevale
    • And my eyes can barely make out the width of a pixel as it is.

      I think that's the point. Things will look better when you can't make out the width of a pixel at all.

    • Give us 40" or more, and it might start to get interesting, but then you're constantly bending your neck to read what's on different parts of the screen.

      Yeah, I actually prefer the size of my 22" display over that of my 24" display for this reason - the 22" fits perfectly into my field of view. But my 24" is 1920x1200 IPS and the 22" is an abysmal 1080 panel, so I do a bit of neck turning.

      I'd love to have a 4K 22" display. The pixels would be small enough that I'd rarely notice them and everything would

    • Ultimately, we shouldn't be able to see pixels. It would be ideal if they were so small they were below human perception. That's the idea. You don't then make everything microscopic; rather, you increase the number of pixels used to render elements so they look smoother.
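
      As a rough guide to where that threshold sits, the usual rule of thumb is that normal vision resolves about one arcminute; the numbers below assume the announced 31.5" 3840x2160 panel and are illustrative only:

        # Viewing distance past which a single pixel subtends less than one arcminute.
        import math

        ARCMIN = math.radians(1 / 60)

        def pixel_pitch_mm(h_px, v_px, diagonal_in):
            return diagonal_in / math.hypot(h_px, v_px) * 25.4

        pitch = pixel_pitch_mm(3840, 2160, 31.5)
        print(round(pitch, 3))                       # ~0.182 mm per pixel
        print(round(pitch / math.tan(ARCMIN) / 10))  # ~62 cm -- roughly normal desktop distance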

  • and Asus is a shakedown operation.
    Go Samsung!

    • What?

      I've owned a lot of ASUS products; this is probably the first item I've seen that I'd consider overpriced.

  • Well, these make great monitors... somebody has already mentioned the 50" sub-$1500 TV.

    I would rather make the case that 4K, while great for PC monitors, is not compelling for consumer TVs. I realize there are charts that demonstrate, scientifically, that 4K is visibly better than 1080p in a living room with a large screen, but I don't buy it, at least not for motion video (games and shows). We are reaching the pivot point of vastly diminishing returns.

    I do that by dropping these pictures fro referenc

  • I would love one for doing retina-destined design. Presently, if I don't scale the iPad simulator on my screen, it is 8 feet high. That doesn't quite give me the right sense of proportion. I suspect that more and more mobile devices are going to go with higher-density displays, and thus it would be nice to get into at least the same density ballpark on my desktop. The sad part is that most if not all of these monitors will be really BIG. Personally, for development I don't like going much over 22" per monitor. I'd j
  • by Virtucon ( 127420 ) on Friday May 31, 2013 @08:11PM (#43879989)

    Like I had a few years ago. I'm also wondering about how to drive a 4K monitor with current graphics cards. I mean, content and driving the thing will be problematic, so if you buy one now you may be buying early first-generation hardware when, by the time the second gen comes out, you'll have content and hardware that can take advantage of it.

  • BTW, the "4K" TVs have more than JUST the resolution as technical advances over existing HDTV.

    Today's HD & Home Theater podcast episode covered it. The only one I can remember at the moment is expanded color space.

    I'm not trying to completely promote it; heck, I record mostly SD (for disk-space reasons) even though I have an HDTV. I am interested in the technology, however.

  • I'd settle for 2048x1536 on a 32 inch screen

    I have a 4-5 year old 28-inch 1920x1200 at the moment.

    The biggest (affordable) monitors you can get these days are 27-inch with 1920x1080.

    You can get an HDTV that's bigger but still only 1920x1080 (which is fair enough, I suppose, since TV resolution is 1080p).

    Sooner or later this old monitor will fail (probably the fluorescent illuminating tube) and obviously I will need a replacement, and I will have to buy it before the Marketplace Fairness Act comes into force.

  • by Cowclops ( 630818 ) on Friday May 31, 2013 @10:42PM (#43880809)

    Here's the real benefit I see to 3840x2160 (or 3840x2400). Whatever; I'll call it 4K like everybody else does.

    The real benefit is that you can start treating your monitor like a CRT again, feeding it arbitrary resolutions. First off, 1080p would work fine on a 3840x2160 panel, and with any luck the monitor would just display it pixel-doubled, so it wouldn't be any blurrier than a native 1080p monitor. That would be awesome. You can also run 1280x720 natively, as 3840x2160 is triple that, just like it's double 1080p.

    But here's the real kicker: say you have some old game that tops out at 1280x1024 or something. You'll have to accept the black bars on the sides for games that aren't widescreen, but given that, you can upscale 1280x1024 to 2700x2160 or whatever. It'll still look good because there are so many excess pixels - more than double. Back when we were switching from CRTs to 15- and 17-inch LCDs (or maybe a 19 if you were lucky), we had the issue that 800x600 looked like junk on a 1024x768 monitor and 1024x768 looked like junk on 1280x1024. At 3840x2160, we can display 1080p and 720p with literally no artifacts, and anything in between with minimal artifacts.

    In fact, the dot pitch of a 3840x2160 24" monitor is smaller than that of a typical 21" fine-dot-pitch aperture-grille CRT: 3840x2160 at that size is only a .13mm dot pitch. Remember when we thought .25mm dot pitch was awesome? Obviously we've got that beat, and that's why 3840x2160 is worth it even when not displaying native 3840x2160 images.
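
    Checking those numbers takes only a couple of lines of Python (a 24" diagonal and a 16:9 panel are assumed; "exact" scaling means integer multiples in both directions):

      # Dot pitch of a 24" 3840x2160 panel, and which resolutions scale by integer factors.
      import math

      def dot_pitch_mm(h_px, v_px, diagonal_in):
          return diagonal_in / math.hypot(h_px, v_px) * 25.4

      print(round(dot_pitch_mm(3840, 2160, 24), 2))   # ~0.14 mm, vs ~0.25 mm on a good CRT

      for w, h in [(1920, 1080), (1280, 720), (1280, 1024)]:
          print((w, h), (3840 / w, 2160 / h))
      # (1920, 1080) -> (2.0, 2.0)       exact pixel-doubling
      # (1280, 720)  -> (3.0, 3.0)       exact pixel-tripling
      # (1280, 1024) -> (3.0, 2.109375)  not an integer vertically, so it gets interpolated (plus pillarboxing)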

  • by istartedi ( 132515 ) on Friday May 31, 2013 @11:14PM (#43880917) Journal

    No matter how many pixels you have, trendy web guys and even OS UI designers will design as if they don't exist. You'll have to move your mouse pointer to the side to make a menu appear, or click "More" to access more than six options on a horizontal menu. You'll probably have to drop your morning Danish and smudge the monitor with your fingers too.

  • I got the Seiki 4K TV from TigerDirect not long ago. I hooked it up as a 4th (!) monitor. It dwarfs the three 30" Dells I have next to it since, well... it's friggin' 50"!

    Despite being a lot bigger, its pixel density is roughly the same as the 30" Dells', which are only 2560x1600. The Seiki is, obviously, rocking the 4K resolution of 3840x2160.

    So is it cool?

    Kind of.

    The fundamental problem, of course, is that the refresh rate is only 30 hertz. This is driven by the fact that the current HDMI 1.4 spec can't push 4K any faster than that. So the screen has a soft pulsing. It also tears badly on fast-moving things, but this may be a separate issue not related to the TV; not sure. I've been messing with my video card to try and solve that. VSync doesn't seem to help, so maybe it is the TV.

    Color reproduction is just ... meh. You have to switch modes to get things to look right depending on what you are doing... say work vs. play. Games do look spectacular at the high resolution and the big size. I have the monitor at a normal seated distance, so it's ... immersive. Much like the Rift in that way, but without the nausea and fatbits.

    The bottom line is, don't get this TV unless you are a crazy early adopter who just likes cool toys and throws money away to do it. Wait until next year when HDMI 2.0 comes out and more monitor-class 4K units come onto the market. Then, yes... if you are a resolution junkie like I am, get one! Because even in this early form, the promise is quite clear.

    Oh, and it impresses friends. Very important point. :)
