4K Computer Monitors Are Coming (But Still Pricey)
First time accepted submitter jay age writes "When TV makers started pushing 4K screens on an unsuspecting public that had only recently upgraded to 1080p, many doubted what value they would bring consumers. A fair thought: at the screen sizes and viewing distances commonly found in homes, 1080p is good enough. PC users like me, however, have watched this development with great hope. TV screens must have something to do with the market being littered with monitors stuck at a puny 1080p resolution. If 4K TVs push PC makers to offer 4K screens too, wouldn't that be great? Well, they are coming. ASUS has just announced one!"
You could hook a computer up to one of the available 4K displays, but you will generally be paying a lot more for the privilege; this one is "only" about $5,000, according to ExtremeTech.
But can you play Crysis on it? (Score:2, Interesting)
The question is... what content will take advantage of this? Most consumable content is at 1080p, and I've yet to see a game which can run at these resolutions, let alone the newest CryEngine.
Re:But can you play Crysis on it? (Score:5, Insightful)
The question is... what content will take advantage of this?
Video? Content? None of it will take advantage of this. Text. Text is the #1 driver of high density displays. Smooth text is pleasing to the eye. Developers and photo editors will buy this.
Re: (Score:3)
CAD users too. When you have lots of overlapping layers and fine detail the extra resolution really helps separate them.
Re: (Score:2)
The question is... what content will take advantage of this? Most consumable content is at 1080p, and I've yet to see a game which can run at these resolutions, let alone the newest CryEngine.
Well, since currently (granted, the next gen of consoles is around the corner) all the games on the market are made for the Xbox 360 & PS3 and then ported to the PC, and those games are barely 720p on the consoles, I wouldn't want to see any of them on a 4K monitor.
Maybe if the next gen games are made at 1080p, then yes, they'll probably look nice on the new 4K monitors. Of course, we don't have any of those next gen games out, so we really do NOT know if the new consoles can even push
Re: (Score:2)
Well, as currently (granted the Next Gen of consoles is around the corner) all the games on the market are made for the Xbox 360 & PS3, and then ported to the PC...
This right here is one of the saddest things of the last 15 years or so. How much has gaming been held back and stunted by this one fact. :(
Re: (Score:2)
Look at the bright side: even a modest PC could run the newest games well, because those games were designed for the modest hardware of consoles.
Re:But can you play Crysis on it? (Score:5, Informative)
The question is... what content will take advantage of this?
Anyone who edits (or views) photos should appreciate the higher resolution. A 3840x2160 panel is only about 8.3 megapixels; even a cheap modern digital camera takes pictures with more pixels than that.
But the biggest advantage is in smooth text (and vector UI elements where available). You aren't supposed to run this at standard DPI and squint at tiny boxes; you're supposed to run it at 200% scaling and get far smoother text than usual, since it gets 4x the number of pixels at the same point size.
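A quick back-of-the-envelope sketch (Python, with assumed round numbers: a 96 DPI "standard" desktop versus a 192 DPI panel at 200% scaling) of why the same point size gets 4x the pixels:

# Rough sketch of why 200% scaling gives smoother text (assumed DPI values).
# At the same physical point size, a glyph on a 2x-density panel covers
# twice as many pixels in each dimension, i.e. 4x the samples.

def glyph_pixels(point_size, dpi):
    # Approximate pixel height of a glyph: 1 pt = 1/72 inch.
    return point_size / 72.0 * dpi

standard = glyph_pixels(12, 96)    # ~16 px tall at a typical desktop DPI
hidpi    = glyph_pixels(12, 192)   # ~32 px tall on a 2x-density panel

print(f"12pt glyph: {standard:.0f}px vs {hidpi:.0f}px tall "
      f"({(hidpi / standard) ** 2:.0f}x the pixels per glyph)")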
Re:But can you play Crysis on it? (Score:5, Insightful)
Ok, so 4K is marketable as a PPI gambit. This makes a lot more sense with your application. The problem is that 4K has to be mass market to drive down the price of such a thing, and as we saw with '90s Apple hardware, the application alone won't drive it.
Why are you citing incidents from the 1990s? Look at the last couple of years. Apple already has driven high-DPI "Retina" displays into the mainstream. Yes, they are currently a premium product on laptops, but on tablets and smartphones, DPI far higher than the desktop norm is now standard across the industry. And Samsung is preparing a 3200x1800 laptop display [engadget.com] – clearly they think there is some demand here.
I think portable devices really have changed the game. Once you've used an iPad 4 for a while, the low DPI on a PC monitor really looks blurry and crappy in comparison. I don't think it's a stretch that desktop and laptop users going forward will want the same high display quality they have gotten used to on their smartphones and tablets.
Re:But can you play Crysis on it? (Score:4, Interesting)
Um...
You realize there are lots of multi-monitor setups that support 3, 4, 6, or even more 1080p displays, right?
If you are trying to power 6 displays in the new Tomb Raider or Crysis 3 with a single GTX 680, you're going to have a rough time, no doubt. But you can certainly build a Titan SLI configuration or run AMD 7990s in CrossFire. It is not cheap by any means, but it's certainly possible.
I would expect to see the PC space start to adopt 'retina' displays or 4K or something else as we go forward. 4K in TVs is only for really big displays or ones viewed up close, and they're astronomically expensive. If you're spending $5k on a monitor and then complaining that your $500 GPU isn't fast enough, you should probably have thought of that expense first, or you shouldn't care about the money.
I saw a (1080p) 120Hz 60 inch TV for 800 bucks this week. New. I'm sure there are better deals in the US. We're not too many years away from an 80 inch or bigger TV being in the 1000 dollar range, and for that 4k is worth it.
Now yes, the PS4 and XB3 trying to do 4K might be... troublesome. We'll have to see the exact specs of the GPU, and then there's a tradeoff between running at higher resolution with lower quality settings or at lower resolution with higher quality.
Re: (Score:3)
Yes, but with difficulty. Rendering generally scales linearly with the number of pixels, so rendering at 3840x2160 takes four times as much processing power as rendering at 1920x1080.
Games rarely have to specifically support a resolution. Most will query the system to see what resolutions are possible - they may have to upscale UI elements that are normally 1:1 or downscaled, and they may only support certain aspect ratios, but they rarely "break". Even games that use hardcoded resolution lists tend to work
Re: (Score:2)
Games rarely have to specifically support a resolution. Most will query the system to see what resolutions are possible - they may have to upscale UI elements that are normally 1:1 or downscaled
There are also some games that don't scale the UI elements at all.
I remember experiencing this years back with the original Quake (DOS version). It was designed for something like 320x200, and if you cranked it up to 800x600, or worse 1024x768, the UI became unusably small (I presume this has been fixed by now, either by id or by third parties).
A lot of "builder" and "RTS" type games have also had UIs, and sometimes content too (though that is rarer now that content is 3D rendered), in fixed pixel sizes. In recent
Re: (Score:3)
I wouldn't be so sure about that - I've been gaming recently on a 1440p monitor, which isn't exactly something people design for. And yet I haven't really noticed major issues with texture resolution - unless I'm rubbing my face in 512x512 textures, they're usually still over one texel per pixel, meaning they're being downscaled. Yes, certain console ports have problems (Call of Duty being the most prominent), but even many console games look fine at quadruple the resolution they were designed for. A bigger
Re: (Score:2)
You DO realize that there are uses for monitors besides games. This is a monitor for professionals, not gamers.
At work, I have a 2560x1600 surrounded by a pair of 1200x1600 (portrait mode). No games at work. Terminal windows, emacs windows, simulation waveforms, and schematic windows. For this, the more pixels the better. I loved my 30" monitor so much, I purchased one for home use on those days when I telecommute.
Now, it was painful to throw down $1200 for 30" of glass for home use, and I cannot see
Re: (Score:2)
As far as content goes, couldn't give a damn. But more resolution means more viewing area for multiple windows. I have 1080p on a 15.5 inch screen, and the text isn't too small. Not sure what ppi that comes to, but it's probably about the same as these screens.
What to do with all that real estate? Firefox takes up ~1/3 of the screen while IRC is open in the background. If I'm writing LaTeX documents (not sure why I chose to capitalize that properly), I keep my editor open in one corner, keep the pdf/dvi
Re: (Score:3)
The question is... what content will take advantage of this? Most consumable content is at 1080p, and I've yet to see a game which can run at these resolutions, let alone the newest CryEngine.
If you can't hear your card screaming for air right now, you probably don't own a video card that can handle it; but most reasonably modern engines are flexible on resolution. The drop-down menu may not present the option if it's something odd, but some bodging around with .ini files or command line options can usually make it happen.
I'm sure some games just don't ship with the texture assets to fully do it justice; but unless the textures the engine uses even for right-in-your-face distances are truly dre
Work? (Score:2)
You know some people work on computers, right?
Re:But can you play Crysis on it? (Score:4, Insightful)
Re: (Score:2)
Really? Isn't that over 120Hz? That's a genuine question -- I'm pretty ignorant of display tech.
Re: (Score:3)
The rate described in ms for a display isn't the frequency with which new data can be presented, but how long it takes a set of data to be presented. In this case, it takes the pixels 8ms (on average) to get from where they were to a new state. At a 60Hz refresh rate (about 16.7ms per frame), that leaves you 8-9ms to perceive the image before it starts changing again. As the portion of each frame that is spent getting the pixels into position increases, fast-changing scenes will begin to look muddy (because the pixels that have t
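A rough sketch (Python, with assumed response times; real panels quote gray-to-gray figures that vary by transition) of the frame-time arithmetic above:

# Share of each refresh interval spent with pixels still in transition,
# for a few assumed response times. Purely illustrative arithmetic.

def transition_share(response_ms, refresh_hz):
    frame_ms = 1000.0 / refresh_hz
    return min(response_ms / frame_ms, 1.0)

for refresh in (60, 120):
    frame_ms = 1000.0 / refresh
    for response in (2, 8, 16):
        share = transition_share(response, refresh)
        settled = frame_ms - min(response, frame_ms)
        print(f"{refresh}Hz, {response}ms response: "
              f"{share:.0%} of the frame in transition, ~{settled:.1f}ms settled")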
Re:But can you play Crysis on it? (Score:4, Interesting)
If the shutter spends too much of its time closed, the illusion of motion is lost.
Nope, the image simply gets progressively darker. And this analogy doesn't apply to monitors, which usually don't blank the LED backlight while the pixels change state. Now, they obviously could blank the backlight, since LEDs are more than fast enough. You'd trade reduced perceived image intensity for a crisper, less "muddy" image, as you wouldn't be seeing the desired pixel values averaged with values from the transition.
Re: (Score:2, Insightful)
People routinely play on 3-4 monitor setups, and I've never heard anyone complain about texture quality on a proper PC game. A shitty console port? Those don't even look good at 1080p. Besides, there are too many variables to even answer your question. If you want to know about M:LL specifically, I guess you have to buy the game and look at the resource files, or maybe you could just settle for looking at one of the many, many 4K screenshots that are out there.
Re: (Score:3)
Depends how far away you sit.
At 10 feet from a 36" screen, you can't tell the difference between DVD (576p) and 720p.
At 10 feet from a 55" screen, you can't tell the difference between 720p and 1080p.
At 10 feet from a 110" screen, you can't tell the difference between 1080p and 4k.
So, given that this monitor is less than 36", if you sit at TV viewing distances from it, no, a DVD won't look awful on it.
* All measurements assume normal, good eyesight.
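A back-of-the-envelope check of those rules of thumb (Python; assumes 20/20 vision resolves roughly 1 arcminute and 16:9 panels, so the numbers are approximate):

import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute, a common 20/20 acuity figure

def pixel_pitch_inches(diagonal_in, horiz_px, aspect=(16, 9)):
    # Physical width of one pixel for a screen of the given diagonal.
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    return width_in / horiz_px

def resolvable_at(distance_in):
    # Smallest feature (inches) a 20/20 eye can resolve at this distance.
    return distance_in * math.tan(ARCMIN)

limit = resolvable_at(10 * 12)   # 10 feet, as in the examples above
for diag, px, label in [(55, 1280, "55in 720p"), (55, 1920, "55in 1080p"),
                        (110, 1920, "110in 1080p"), (110, 3840, "110in 4K")]:
    pitch = pixel_pitch_inches(diag, px)
    verdict = "pixels visible" if pitch > limit else "below acuity"
    print(f"{label}: pixel {pitch:.4f}in vs eye limit {limit:.4f}in at 10ft -> {verdict}")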
Re: (Score:2)
AA is a hack because of insufficient resolution.
Tell me, which is better: playing on an old SVGA 800x600 with 4x AA, or playing on 1600x1200 with no AA?
Or with higher AA, but I'm willing to bet that 1600x1200 with a quarter of the AA will look better than that 800x600. I'd bet 3200x2400 would look even better.
More resolution is better than more AA. More AA is better than nothing, but it's still a hack.
ajax.googleapis.com (Score:5, Informative)
FFS, why do I need to enable ajax.googleapis.com in NoScript just to view Asus's website?
I'm sick of creepy Google gathering info on me.
Then, when I later email someone with a Gmail mailbox, Google will link my IP address (contained in the email's header) with my unique email address and add that intel to their already overflowing collection of 'big data'.
You know what? Stuff it, I won't enable it. Asus just lost me as a website visitor.
Re: (Score:3, Informative)
Google operates a javascript CDN that many sites use. It doesn't use cookies, and means you don't have to load common libraries like jquery from every website individually.
Re: (Score:2)
ajax.googleapis.com isn't a tracking domain and your IP shouldn't be in any emails you send unless you run your own mail server.
Next?
Re: (Score:3)
Erm, that seems like a bit of a failure of imagination. Why wouldn't that be a "tracking domain?" Do you have some specific proof that it's somehow impossible for Google to use normal logging functionality on the web server for that domain? And that this will be true forever? Obviously, the idea that any particular domain can't be used for tracking is just silly. So, if google
Re: (Score:3)
The googleapis.com domain exists purely to be a cookie-free domain. Helps with caching.
It's a pretty common technique for static content.
Without cookies, they have an IP and the website that loaded the script. Hardly useful for advertising, and it cannot be tied to your personalised advertising profile.
Re: (Score:2)
Nope. I've used webmail, commercial email solutions and ISP email.
The latter two *should* always say the username you authenticated to the SMTP server with, not your IP.
I just did a test to demonstrate this. If your email provider doesn't do this then it might be an idea to use one who knows what they are doing.
Received: from localhost (localhost.localdomain [127.0.0.1])
by smtp29.relay.dfw1a.emailsrvr.com (SMTP Server) with ESMTP id 02066398EA5
for
Weak! (Score:4, Informative)
$800 gets a 30" monitor with a 2560x1600 resolution.
$1400 gets a 50" TV with a 3840x2160 resolution.
$2200 gets a 15" laptop with a 2880x1800 resolution.
Sure, none of these are directly comparable, but at the same time it's disappointing to see Asus at such an extreme price point.
Re:Weak! (Score:5, Informative)
Re: (Score:2)
> $1400 gets a 50" TV with a 3840x2160 resolution.
That was my thought as well. Rather than getting a 2x2 1920x1080 monitor array, using the $1400 50" Seiki 4K TV as a monitor will give you the same real estate, seamlessly. You only need one Radeon 7970 (or better) to drive it, simplifying the configuration. $1800 for that configuration is not bad at all.
Re: (Score:2)
In all honesty, what's the real difference between a TV and a monitor? I wanted a monitor as well as something to use with my Wii (composite output). Ended up buying a TV and using DP -> HDMI when I want to use it as an external display (which I never do because that thing is a piece of crap, but that's an aside).
Is it just a matter of ports or something? Is it the fact that a TV also comes with a tuner? My conclusion is that TVs are a superset of monitors in the functionality that they provide, and often come with
Re: (Score:2)
Really? Where? I have seen budget Korean 2560x1200 monitors for around $400 or so, and name-brand at the same resolution for around $700 or so. I have yet to see a budget 2560x1600. That extra 400 pixels of vertical resolution really raises the price.
Re: (Score:2)
Re: (Score:3, Insightful)
I'm still using my 2048x1536@100Hz screen from *ten years ago*. Flat panels are the worst thing ever to happen to display technology.
Re: (Score:2)
You'd better hope that thing lasts until 4K screens are mass-market!
Re: (Score:3)
But what size? A 31.5" CRT isn't exactly easy to place on an average size desk, and can't compete on power consumption.
For 99% of people flat panels are a huge improvement over their old 17" CRTs that were set to 60Hz by the IT department.
Re: (Score:3)
Because you know everything about what I need, don't you?
What if I'm watching 25 FPS TV shows from the BBC? Hint: 60 is not evenly divisible by 25, but 75 sure is. Displaying 25 FPS material on a 60Hz display is always either messy or broken, or both.
What about editing film-sourced material? I'll take 72Hz over 3:2 pulldown any fucking day.
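A quick sketch (Python; it simply assumes each refresh shows the most recent source frame) of the cadence argument above:

# How many refreshes each source frame occupies when 25fps content is shown
# on a 60Hz vs 75Hz display. Uneven patterns are what show up as judder.

def cadence(source_fps, refresh_hz, frames=10):
    counts = []
    for f in range(frames):
        start = round(f * refresh_hz / source_fps)
        end = round((f + 1) * refresh_hz / source_fps)
        counts.append(end - start)
    return counts

print("25fps on 60Hz:", cadence(25, 60))   # uneven 2/3 pattern -> judder
print("25fps on 75Hz:", cadence(25, 75))   # even 3,3,3,... -> smooth
print("24fps on 60Hz:", cadence(24, 60))   # the 3:2-pulldown-style cadence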
Re: (Score:3)
Television in the US is not 29.97 FPS, but 30000/1001 FPS (about 29.9700300).
Analog is fun: It doesn't care about decimal places.
Re: (Score:2)
Two monitors, one 16x9 and one 9x16 is what most people at my work use. I'm pretty happy with the 1920x1080 laptop next to the 2560x1440, but yeah another 160 pixels vertically would be nicer.
4k Computer (Score:5, Funny)
Re: (Score:2)
The monitor for my 4GB media PC is just an ordinary (plasma) television but if VDUs had kept pace with computing power advancements we would be looking at 655360000p screens....
50" 4k costs 1/4 the price of the 32" (Score:4, Interesting)
Re:50" 4k costs 1/4 the price of the 32" (Score:4, Insightful)
Well, presumably, because your use case isn't appropriate for a 50" display.
Re: (Score:2)
Re: (Score:2, Interesting)
Well, presumably, because your use case isn't appropriate for a 50" display.
Just sit further back then. If you're constrained by space, then it's probably because you're in an office environment, meaning they're targeting the enterprise with this size and price-point.
For home users, the 50" screen at a lower price-point makes way more sense.
Re: (Score:2)
I don't see this, except taking "office" in the broadest possible sense; I mean I could just see moving from a 24" to a 32" monitor for the desktop in the extra bedroom that serves as my home office/library/miscellaneous storage room, but a 50" display would be enormous.
Fo
Re: (Score:2)
Re: (Score:3)
Who wants to stare at 30Hz on their computer all day? Is this 1992? That's the last time I saw an interlaced display on a computer. That's the best you'll be able to do at 3840x2160 on the HDMI connection on that 50" Seiki. There's currently no way to run them at 60Hz using the available connections on the computer and display. At best, they'll get Nvidia and AMD to support using dual connections to treat the single monitor as dual monitors with no bezel correction.
Check the bandwidth of various video
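A rough version of that bandwidth check (Python; assumes 24 bits per pixel and ~20% blanking overhead, so the numbers are ballpark rather than exact CVT/CEA timings):

# Required link bandwidth for 4K at 30Hz vs 60Hz, against the usable video
# payload of HDMI 1.4 and 4-lane DisplayPort 1.2 (after 8b/10b coding).

def video_gbps(width, height, hz, bits_per_pixel=24, blanking=1.20):
    return width * height * hz * bits_per_pixel * blanking / 1e9

links = {
    "HDMI 1.4 (max video payload)": 8.16,
    "DisplayPort 1.2 (4 lanes)":   17.28,
}

for w, h, hz in [(3840, 2160, 30), (3840, 2160, 60)]:
    need = video_gbps(w, h, hz)
    print(f"{w}x{h}@{hz}Hz needs ~{need:.1f} Gbps")
    for name, cap in links.items():
        print(f"  {name}: {'OK' if need <= cap else 'too slow'} ({cap} Gbps)")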
Re: (Score:3)
> Who wants to stare at 30Hz on their computer all day? Is this 1992? That's the last time I saw an interlaced display on a computer.
30Hz is perfectly acceptable on a computer display - especially if you are staring at it all day. If you want to play video games, that is another issue, but for work like photo editing, software development, spreadsheets, word processing, email, or even just web browsing, 30Hz is plenty. You won't even notice the difference.
I speak from experience, I used to have o
Re: (Score:2)
DisplayPort 1.2 is what's actually needed. It's found on the GeForce Titan and GTX 780; for others such as the GTX 680 and GTX 660, I plain don't know. Radeons, same deal, you'd have to check.
DP 1.2 is said to be available on Haswell motherboards. It should do 3840x2160 and 3840x2400 at 60Hz, and 2560x1440 or 2560x1600 at 120Hz (of course, good luck finding a 120Hz monitor - a real one, not fake as on TVs. They only do them as crappy 1080p TN).
So the connection problem is not that bad except for HDMI 2.0 not being there.
Re: (Score:2)
With computer monitors, you're generally paying a premium for better input latencies, refresh rates, color reproduction, and ghosting. $5,000 is still on the high side, but I'd be extremely wary about replacing my monitor with a television, sight unseen.
Re: (Score:2)
With computer monitors, you're generally paying a premium for better input latencies, refresh rates, color reproduction, and ghosting.
And not fucking with the input signal.
I've learnt the hard way that some TVs are incapable of taking an input signal in their advertised native resolution and displaying it without fucking it up through the application of inappropriate processing that smears single-pixel lines, making the desktop a blurry mess.
I've also learnt the hard way that some TVs have terrible VGA inputs that take ages to lock, can't lock properly to the low resolutions seen during bootup/BIOS, and sometimes mis-lock even at their native reso
Re: (Score:2)
TVs are also... renowned... for the quality and accuracy of the EDID data they provide the hapless device attempting to drive them.
Re: (Score:3)
Bad review because of a lack of "wifi, Internet connectivity, and 3D", and poor quality upconversion from 1080 content.
None of which applies to one used as a computer monitor. They did mention motion blur, so it might not be appropriate for the latest FPS games.
wow! (Score:3)
About damn time! (Score:2)
It is sickening that an iPad can have a higher resolution than a 27 inch display that costs about the same price - that is all.
Article is misleading and incomplete (Score:3)
First of all, the alleged price of $5000 is pure speculation. None of the other sources reporting on the Asus 4K monitor have mentioned it, and the Extreme Tech article describes the price as "our guess".
Secondly, the article is flat-out wrong when it says that Sharp's 4K monitor "doesn't seem to have been released" so far. In fact, the PN-K321 has been released and you can buy one on Amazon [amazon.com] for $4900. A few other online retailers have it, too, for slightly lower prices. There is one weird caveat; you currently need an AMD card for it to work properly, because it uses DisplayPort 1.2 with MST and basically shows up to the OS as two 1920x2160 monitors. You have to use Eyefinity to get the OS to treat it as one large screen. This Youtube video [youtube.com] (not mine - I only wish I could afford this thing!) shows how it's done.
The Sharp monitor isn't even the cheapest 4K device currently on the market. That distinction belongs to a 50 inch Seiki Digital TV [amazon.com] which costs $1,399.99 on Amazon. But this device can only take a 30 Hz input, due to the limitations of the HDMI protocol. I've also heard some criticisms of the panel quality.
What I and many others are hoping is that the Asus 4K monitor can lower the price point on this technology. If it sells for the same $5000 as the Sharp monitor, it's a non-event since it does nothing to advance the state of the art. But if they can get it down to $2500 or lower, then we'll start to see it show up in "extreme" gaming rigs and some professional workspaces, and maybe in a year or two they will be affordable for mainstream power users.
Aspect Ratio (Score:5, Insightful)
On my computer monitor I need more height!! Please bring back 16:10 for computer monitors! 16:9 is for TVs only.
Re: (Score:2)
Just treat it as two 8:9 screens. With a larger display, the usability of each half increases tremendously.
Re: (Score:2)
The problem is that you're generally constrained to placing those two screens directly side-by-side horizontally.
If I switched from my current 4:3 monitor to a 16:9 one I'd end up getting a smaller screen, because I couldn't expand the width of the monitor until it had an equal area. Above my monitor there is nothing until you reach the ceiling, but next to it I have other stuff on my desk.
Sure, you can place stuff side-by-side with a widescreen, but that doesn't change the fact that you have far less vert
Re: (Score:3)
On my computer monitor I need more height!! Please bring back 16:10 for computer monitors! 16:9 is for TVs only.
I still have my 16:12 (aka 4:3) for pretty much this reason.
I'm sitting 24" away from my 24" monitor... (Score:5, Insightful)
And my eyes can barely make out the width of a pixel as it is. What is it going to do for me if you increase pixel density such that pixels are now a quarter the size they are now? Give us 40" or more, and it might start to get interesting, but then you're constantly bending your neck to read what's on different parts of the screen.
Re: (Score:2)
I recently purchased a Dell 30" screen with 2560x1600 resolution. It's really nice with IPS and the ability to display 12 bit color with the right software and graphics card.
I think the pixel density for text is about as high as I would want on a screen. For a 4K screen I'd want at least 40".
Re: (Score:2)
I sit 2-3ft away from a 27" screen, and would absolutely love to have a 4K screen in a 27-32" form factor in front of me. I have a second monitor and a laptop to form a 3 screen setup, but the single large screen in the center is my preference for primary tasks. Even though there's limited video content for 4k right now, being able to display more information on the screen would be awesome. And I'm sure 4K content will become more prevale
Re: (Score:2)
And my eyes can barely make out the width of a pixel as it is.
I think that's the point. Things will look better when you can't make out the width of a pixel at all.
Re: (Score:2)
Give us 40" or more, and it might start to get interesting, but then you're constantly bending your neck to read what's on different parts of the screen.
Yeah, I actually prefer the size of my 22" display over that of my 24" display for this reason - the 22" fits perfectly into my field of view. But my 24" is 1920x1200 IPS and the 22" is an abysmal 1080 panel, so I do a bit of neck turning.
I'd love to have a 4K 22" display. The pixels would be small enough that I'd rarely notice them and everything would
Make pixels invisible (Score:2)
Ultimately, we shouldn't be able to see pixels. It would be ideal if they were so small they were below human perception. That's the idea. You don't then make everything microscopic; rather, you increase the number of pixels used to render elements so they look smoother.
Re: (Score:2)
The reason you can tell the difference between looking outside and your monitor has more to do with your monitor's limited color gamut, dynamic range and contrast ratio than pixel density. Beyond a certain density, you need a much bigger screen to appreciate the extra pixels.
a buck a pixel (Score:2)
and Asus is a shakedown operation.
Go Samsung!
Re: (Score:2)
What?
I've owned a lot of ASUS products; this is probably the first item I've seen that I'd consider overpriced.
Pixels and the real world (Score:2)
Well, these make great monitors... somebody has already mentioned the 50" sub-$1500 TV.
I would rather make the case that 4K, while great for PC monitors, is not compelling for consumer TVs. I realize there are charts that demonstrate, scientifically, that 4K is visibly better in a living room, with a large screen, over 1080p, but I don't buy it, at least not for motion video (games and shows). We are reaching the pivot point towards vastly diminishing returns.
I do that by dropping these pictures for referenc
Retina design (Score:2)
Just give me 1200P (Score:3)
Like I had a few years ago. I'm also wondering how to drive a 4K monitor with current graphics cards. Content and driving the thing will both be problematic, so if you buy one now you may be buying early first-generation hardware; by the time the second gen comes out, you'll actually have content and hardware that can take advantage of it.
4K's improvements are not JUST resolution (Score:2)
BTW, the "4K" TVs have more technical advances over existing HDTV than JUST the resolution.
Today's HD & Home Theater podcast episode covered it. The only one I can remember at the moment is expanded color space.
I'm not trying to completely promote it, heck, I record mostly SD (for disk space reasons) even though I have a HDTV. I am interested in the technology, however.
I'd settle for (Score:2)
I'd settle for 2048x1536 on a 32 inch screen.
I have a 4-5 year old 28 inch 1920x1200 at the moment.
The biggest (affordable) monitors you can get these days are 27 inch with 1920x1080.
You can get an HDTV that's bigger but still only 1920x1080 (which is fair enough, I suppose, since TV resolution is 1080p).
Sooner or later this old monitor will fail (probably the fluorescent illuminating tube) and obviously I will need a replacement, and I will have to buy it before the Marketplace Fairness Act comes into force (
Arbitrary Resolutions (Score:5, Insightful)
Here's the real benefit I see to 3840x2160 (or 3840x2400). Whatever. I'll call it 4K like everybody else does.
The real benefit is that you can start treating your monitor like a CRT again, feeding it arbitrary resolutions. First off, 1080p would work fine on a 3840x2160 panel, and with any luck the monitor would just display it pixel-doubled, so it wouldn't be any more blurry than a native 1080p monitor. That would be awesome. You can also run 1280x720 natively, as 3840x2160 is triple that, just like it's double 1080p. But here's the real kicker: say you have some old game that tops out at 1280x1024 or something. You'll have to accept the black bars on the sides for games that aren't widescreen, but given that, you can upscale 1280x1024 to 2700x2160 or whatever. It'll still look good because there are so many excess pixels - more than double. Back when we were switching from CRTs to 15" and 17" LCDs, or maybe a 19" if you were lucky, we had the issue that 800x600 looked like junk on a 1024x768 monitor and 1024x768 looked like junk on 1280x1024. At 3840x2160, we can display 1080p and 720p with literally no artifacts, and anything in between with minimal artifacts. In fact, the dot pitch of a 3840x2160 24" monitor is smaller than that of a typical 21" fine-dot-pitch aperture grille CRT: 3840x2160 at that size works out to roughly a 0.14mm dot pitch. Remember when we thought 0.25mm dot pitch was awesome? Obviously we've got that beat, and that's why 3840x2160 is worth it even when not displaying native 3840x2160 images.
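A quick check of those numbers (Python; assumes an exactly 16:9 24" panel, so real panel dimensions will differ slightly):

import math

def dot_pitch_mm(diagonal_in, horiz_px, aspect=(16, 9)):
    w, h = aspect
    width_mm = diagonal_in * 25.4 * w / math.hypot(w, h)
    return width_mm / horiz_px

print(f"24in 3840x2160: {dot_pitch_mm(24, 3840):.3f} mm per pixel")
print("Typical 'fine' CRT-era aperture grille pitch for comparison: ~0.25 mm")

# Integer scaling factors onto a 3840x2160 grid: exact means no scaling blur.
for w, h in [(1920, 1080), (1280, 720), (1280, 1024)]:
    sx, sy = 3840 / w, 2160 / h
    exact = sx.is_integer() and sy.is_integer()
    print(f"{w}x{h}: scale {sx:.2f}x{sy:.2f} -> {'exact' if exact else 'fractional'}")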
Don't worry (Score:3)
No matter how many pixels you have, trendy web guys and even OS UI designers will design as if they don't exist. You'll have to move your mouse pointer to the side to make a menu appear, or click "More" to access more than six options on a horizontal menu. You'll probably have to drop your morning Danish and smudge the monitor with your fingers too.
I already have a 4K monitor on my computer (Score:3)
I got the SEIKI 4K TV from TigerDirect not long ago. I hooked it up as a 4th (!) monitor. It dwarfs the three 30" Dells I have next to it since, well... it's frikkin' 50"!
Despite being a lot bigger, the pixel density is roughly the same as the 30" Dells, which are only 2560x1600. The Seiki is, obviously, rocking the full 4K resolution of 3840x2160.
So is it cool?
Kind of.
The fundamental problem, of course, is that the refresh rate is only 30 hertz. This is driven by the fact that the current HDMI 1.4 spec can't push 4K any faster than that. So the screen has a soft pulsing. It also tears badly on fast-moving things, but this may be a separate issue not related to the TV, not sure. Been messing with my video card to try and solve that. VSync doesn't seem to help, so maybe it is the TV.
Color reproduction is just ... meh. You have to switch modes to get things to look right depending on what you are doing... say work vs. play. Games do look spectacular at the high resolution and the big size. I have the monitor at a normal seated distance, so it's ... immersive. Much like the Rift in that way, but without the nausea and fatbits.
The bottom line is, don't get this TV unless you are a crazy early adopter who just likes cool toys and throws money away to do it. Wait until next year when HDMI 2.0 comes out and more monitor-class 4K units come onto the market. Then, yes... if you are a resolution junkie like I am, get one! Because even in this early form, the promise is quite clear.
Oh, and it impresses friends. Very important point. :)
Re: (Score:3)
DisplayPort 1.2 can already do 4K @ 60 Hz. What's so special about HDMI?
Re: (Score:2)
1. So far it's been marketed more successfully and therefore exists on most HD devices
That'd be about it.
Re: (Score:2)
1. So far it's been marketed more successfully and therefore exists on most HD devices
That'd be about it.
On the plus side, the devices that are HDMI only are also unlikely to be sufficiently new and powerful to provide 4k output, and the scaling from the 1080 they do provide is a trivial 1->4, so there should be no unpleasant artifacts.
The only real losers are the (relatively thin) slice of HDMI 1.4 capable PC video cards, which are capable of pushing 4K pixels, but only at low refresh rates. Anything earlier than 1.4 won't handle that resolution at all, and anything that isn't a PC (home theatre type device
Re: (Score:2)
A ~100 seems a good count over 2013 stock.
Re: (Score:2)
Try to find a DisplayPort KVM switch. I got a 30" 2560x1600 at home for those days when I telecommute. I don't have the real estate for a 2nd whole computer setup, so I use a KVM switch. To use 2560x1600, you need a digital connection -- analog won't cut it. KVMs with digital inputs are invariably DVI.
Fortunately, my home laptop had HDMI, which connects to DVI with a simple cable. Unfortunately, HDMI won't go past 1080.
Re: (Score:2)
But this is more than just a PC problem. It's also a hurdle in the home theater space as well.
Not really. Virtually all 4K video content will be sourced from film. This is true of 1080p content on current Blu-Rays; even most TV shows in the HD era are shot on film, not video. This means the frame rate will be 23.976 frames per second, which the current version of HDMI can handle at that resolution just fine. It's only PCs that really need 4K @ 60 Hz.
Re: (Score:2)
New movies will be 4K/8K ready. Film is 4K/8K scannable, given Hollywood budgets and quality original stock.
4K will become the plaything of many low budget productions - ones with well under $50-100k production costs.
People spent $10K on early projectors, plasmas, and media collections "back in the day".
So have the back catalogue content, the GPUs, the computer connections, the display, the codecs, the CPUs, the computer games and OS.
Price gouging DR
Re: (Score:2)
Actually, it is an issue. I don't see DisplayPort on a GTX560 or HD6770, nor on a lot of other recent graphics cards that could drive such a resolution. So yeah, it's still an issue.
At this point, the technology is still cutting-edge. Even if Asus manages to pull a rabbit out of their hat and releases this monitor at a $999 price point, that's still a premium product, and if the buyer doesn't already have a current-generation video card, he/she probably won't balk too much at spending an extra $100-$200 for one w
Re: (Score:2)
So we're going to need a 30 to 50 Mbps internet connection to drive this beast? Something tells me most in the USA will be screwed.
Re: (Score:2)
Not every use case is gaming. There are plenty of uses where more resolution would be worth trading off refresh rate. Obviously, if you can have both together, that's better, but refresh rate isn't always the key feature.
Re: (Score:2, Interesting)
There is nothing wrong with plain old VGA. It could easily handle these resolutions on CRTs. It can do the same on today's flat panels.
Re: (Score:2)
Re:Why? (Score:4, Interesting)
The thing with "VGA" is there really isn't too much to it, three analog video signals and two sync signals with some loose agreements on timings.
That means that there is very little theoretical limit on resolution* but it also means that.
1: All components in the chain have to actually have sufficient analog bandwidth. Lack of strong standards and gradual failure (rather than the brick wall failure you get with digital systems) if the analog circuitry is skimped on encourages skimping on the analog components. This is particually bad with TVs (monitors seem to make an effort to give acceptable performance on VGA at their native resoloution).
2: When driving a screen with discrete pixels the receiver has to guess where each line starts and ends. They are generally pretty good at it but again poor implementations, unhelpful content (completely black screen, screen with black bars from the source) or just plain bad luck can cause mis-locks which are annoying.
3: The individual pixels will inevitably not be completely isolated from each other.
* The connector probably imposes some limit, but using the rule of thumb that structures less than a tenth of a wavelength can be regarded as negligible in size, it should be usable up to a few gigahertz with careful termination.
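A rough estimate of the analog bandwidth actually involved (Python; assumes ~25% blanking overhead as with typical CVT-style timings, so the figures are ballpark only):

# Approximate pixel clock needed for various modes over an analog VGA-style link.

def pixel_clock_mhz(width, height, hz, blanking=1.25):
    return width * height * hz * blanking / 1e6

for w, h, hz in [(1920, 1080, 60), (2048, 1536, 85), (3840, 2160, 60)]:
    print(f"{w}x{h}@{hz}Hz: ~{pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock")

# High-end CRT-era RAMDACs topped out around 400 MHz; 4K@60 needs well over
# 600 MHz, so every analog stage (DAC, cable, connector, ADC in the panel)
# would have to be engineered for that bandwidth -- possible, but rarely done well.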
Re: (Score:2)
If you're buying a $5000 4K monitor, upgrading your video card to something that has a decent interconnect isn't going to bother you too much. Your point seems to be along the lines of "4K monitors are impractical because my POS video card with HDMI out can't drive one!" So what? 4K monitors are obviously in early adopter territory currently, but DisplayPort and Thunderbolt are already widely deployed, just on a platform you have some kind of emotional dislike of. You can buy Wintels with DisplayPort as we
Re: (Score:2)
DisplayPort or Thunderbolt is on every Mac and is available from every major manufacturer. If you want one, you can get one. They're a lot more widely deployed than 4K monitors.
Re: (Score:2)
It would be "2K" if we kept naming resolutions by their vertical pixel count, as we have since the beginning of TV. This move to naming by horizontal resolution is pure marketing hype, and it sucks.
Re: (Score:2)
Re: (Score:2)
1920x1200 displays are back, though. More expensive than 1080p, but then again, in the days of CRT monitors, a display that could do 1280x960 at 85Hz or 1152x864 at 100Hz was quite a bit more expensive than one that topped out at 1024x768 at 85Hz (unless you wanted to burn your eyes).
Re: (Score:2)
Re: (Score:3)
I hate coders like you: I have to jump all over the place to see what's happening in some function you call that isn't located right where the call is. So now I end up having to use 3 monitors and a couple dozen windows just to see all the logic of what is happening.