4K Computer Monitors Are Coming (But Still Pricey)
First time accepted submitter jay age writes "When TV makers started pushing 4K screens on an unsuspecting public that had only recently upgraded to 1080p, many doubted what value they would bring consumers. A fair thought: 1080p is, at the screen sizes and viewing distances commonly found in homes, good enough. However, PC users such as myself have watched this development with great hope. TV screens must have something to do with the market being littered with monitors sporting puny 1080p resolutions. What if 4K TVs push PC makers to offer 4K screens too? Wouldn't that be great? Well, they are coming. ASUS has just announced one!"
You could hook a computer up to one of the available 4K displays, but you'll generally pay a lot more for the privilege; this one is "only" about $5,000, according to ExtremeTech.
But can you play Crysis on it? (Score:2, Interesting)
The question is... what content will take advantage of this? Most consumable content is 1080p, and I've yet to see a game that can run at these resolutions, let alone the newest CryEngine.
50" 4k costs 1/4 the price of the 32" (Score:4, Interesting)
Re:But can you play Crysis on it? (Score:0, Interesting)
By "support", what are the texture sizes like?
If I play a DVD on a 4K display, it's going to look terrible because the source data is awful. I'm guessing that if my source textures are small, I'm going to experience the same sort of issues?
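Back-of-the-envelope, just to show the scale of the problem (a Python sketch; the resolution figures are standard, the 512px texture is a hypothetical chosen for comparison):

```python
# Rough upscale factors when stretching low-res sources to a 4K (3840x2160) frame.
sources = {
    "DVD (NTSC)": (720, 480),
    "1080p video": (1920, 1080),
    "512px texture": (512, 512),  # hypothetical small game texture, for comparison
}
TARGET_W, TARGET_H = 3840, 2160

for name, (w, h) in sources.items():
    area_ratio = (TARGET_W * TARGET_H) / (w * h)
    print(f"{name}: ~{TARGET_W / w:.1f}x linear stretch, ~{area_ratio:.0f}x the pixels")
```

A DVD frame gets stretched over roughly 24x as many pixels, which is why the softness is so visible; small textures get magnified the same way.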
Re:But can you play Crysis on it? (Score:4, Interesting)
Um...
You realize there are lots of multi-monitor setups that support 3, 4, 6, or even more 1080p displays, right?
If you are trying to power 6 displays in the new Tomb Raider or Crysis 3 with a single GTX 680, you're going to have a rough time, no doubt. But you can certainly build a Titan SLI configuration or run AMD 7990s in CrossFire. It is not cheap by any means, but it's certainly possible.
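Quick pixel-count arithmetic to put those setups side by side (just a Python sketch over standard resolutions, not benchmark numbers):

```python
# Raw pixels a GPU must fill per frame for various display setups.
setups = {
    "single 1080p": 1920 * 1080,
    "6x 1080p surround": 6 * 1920 * 1080,
    "single 4K": 3840 * 2160,
}
base = setups["single 1080p"]
for name, px in setups.items():
    print(f"{name}: {px / 1e6:.1f} MP per frame ({px / base:.1f}x 1080p)")
# A 4K panel is ~4x the pixels of 1080p; a 6-display surround rig is ~6x.
# If multi-GPU rigs can drive the latter today, 4K is a smaller load.
```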
I would expect to see the PC space start to adopt 'retina' displays, or 4K, or something else as we go forward. 4K in TVs only makes sense for really big displays or ones viewed up close, and those are astronomically expensive. If you're spending $5,000 on a monitor and then complaining that your $500 GPU isn't fast enough, you should probably have thought of that expense first, or you shouldn't care about the money.
I saw a (1080p) 120Hz 60-inch TV for 800 bucks this week. New. I'm sure there are better deals in the US. We're not too many years away from an 80-inch or bigger TV being in the $1,000 range, and at that size 4K is worth it.
Now, yes, the PS4 and XB3 trying to do 4K might be... troublesome. We'll have to see the exact specs on the GPUs, and then there's a tradeoff between higher resolution at lower quality and lower resolution at higher quality.
Re:50" 4k costs 1/4 the price of the 32" (Score:2, Interesting)
Well, presumably, because your use case isn't appropriate for a 50" display.
Just sit further back, then. If you're constrained by space, it's probably because you're in an office environment, meaning they're targeting the enterprise with this size and price point.
For home users, the 50" screen at a lower price point makes way more sense.
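Rough numbers behind the sit-further-back advice, using the standard diagonal-PPI formula (a sketch; the panel sizes are the ones from this thread):

```python
import math

# Pixels per inch for a panel: diagonal pixel count / diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for size in (32, 50):
    print(f'{size}" 4K: {ppi(3840, 2160, size):.0f} PPI')
# ~138 PPI at 32" vs ~88 PPI at 50": the bigger panel only looks as sharp
# from proportionally further away, fine for a living room, not for a desk.
```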
Re:Why? (Score:2, Interesting)
There is nothing wrong with plain old VGA. It could easily handle these resolutions on CRTs. It can do the same on today's flat panels.
Re:Why? (Score:4, Interesting)
The thing with "VGA" is that there really isn't much to it: three analog video signals and two sync signals, with some loose agreement on timings.
That means there is very little theoretical limit on resolution*, but it also means that:
1: All components in the chain have to actually have sufficient analog bandwidth. The lack of strong standards, combined with the fact that skimped analog circuitry degrades gradually (rather than with the brick-wall failure you get with digital systems), encourages skimping on the analog components. This is particularly bad with TVs (monitors seem to make an effort to give acceptable performance on VGA at their native resolution).
2: When driving a screen with discrete pixels, the receiver has to guess where each line starts and ends. Receivers are generally pretty good at it, but poor implementations, unhelpful content (a completely black screen, or black bars from the source), or just plain bad luck can cause mis-locks, which are annoying.
3: The individual pixels will inevitably not be completely isolated from each other.
* The connector probably imposes some limit, but using the rule of thumb that structures less than a tenth of a wavelength can be regarded as of negligible size, it should be usable up to a few gigahertz with careful termination.
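To put numbers on the bandwidth point in (1), a rough pixel-clock estimate (a sketch; the ~25% blanking overhead is an assumed ballpark for CRT-style timings, not a spec value):

```python
# Approximate pixel clock needed to drive a resolution over an analog link.
def pixel_clock_mhz(w, h, refresh_hz, blanking_overhead=1.25):
    # Blanking overhead covers sync pulses and porches around the active
    # picture; ~25% is an assumed ballpark for classic CRT-style timings.
    return w * h * refresh_hz * blanking_overhead / 1e6

for w, h, name in [(1920, 1080, "1080p"), (3840, 2160, "4K")]:
    print(f"{name} @ 60 Hz: ~{pixel_clock_mhz(w, h, 60):.0f} MHz pixel clock")
# 4K lands around ~620 MHz, so every analog stage in the chain needs serious
# bandwidth, and any skimped component shows up as visible blur.
```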
Re:But can you play Crysis on it? (Score:4, Interesting)
If the shutter spends too much of its time closed, the illusion of motion is lost.
Nope, the image simply gets progressively darker. The analogy doesn't apply to monitors, which usually don't blank the LED backlight while the pixels change state. They obviously could blank the backlight, since LEDs are more than fast enough. You'd trade reduced perceived image intensity for a crisper, less "muddy" image, as you wouldn't be seeing the desired pixel values averaged with values from the transition.
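Rough sketch of that intensity tradeoff (the duty-cycle and brightness values below are illustrative assumptions, not measured panel figures):

```python
# Perceived brightness of a strobed LED backlight scales roughly with duty cycle,
# since the eye integrates light over the frame.
PEAK_NITS = 300  # hypothetical steady-state panel brightness

for duty in (1.0, 0.5, 0.25):
    print(f"backlight on {duty:.0%} of the frame: "
          f"~{PEAK_NITS * duty:.0f} nits perceived")
# Shorter strobes hide more of the pixel transition (crisper motion), but
# brightness drops in proportion, unless the LED is driven harder during
# the on-phase to compensate.
```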