4K Monitors: Not Now, But Soon
An anonymous reader writes: 4K monitor prices have fallen into the range where mainstream consumers are starting to consider them for work and for play. There are enough models that we can compare and contrast, and figure out which are the best of the ones available. But this report at The Wirecutter makes the case that absent a pressing need for 8.29 million pixels, you should just wait before buying one. They say, "The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we're used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you'll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you'll be locked to a maximum of 30 frames per second for your games—it's playable, but not that smooth. ... Most people don't own a system that's good enough for gaming on a 4K display—at least, not at highest-quality settings. You'll be better off if you just plan to surf the Web in 4K: Nvidia cards starting in the 600 series and AMD Radeon HD 6000 and 7000-series GPUs can handle 4K, as can systems built with integrated Intel HD 4000 graphics or AMD Trinity APUs. ... There's a light on the horizon. OS support will strengthen, connection types will be able to handle 4K displays sans digital tricks, and prices will drop as more 4K displays hit the market. By then, there will even be more digital content to play on a 4K display (if gaming or multitasking isn't your thing), and 4K monitors will even start to pull in fancier display technology like Nvidia's G-Sync for even smoother digital shootouts."
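[Editor's note] The bandwidth arithmetic behind those HDMI 1.4a limits can be sketched in a few lines. This is a rough estimate only: it assumes 24-bit color, roughly 10% blanking overhead, and the commonly published effective payload rates for each link (HDMI 1.4 ≈ 8.16 Gbps after 8b/10b coding, DisplayPort 1.2 HBR2 ≈ 17.28 Gbps); real CVT-RB timings differ slightly.

```python
# Back-of-the-envelope check of why HDMI 1.4 tops out around 2160p30.
# Assumptions: 24 bpp, ~10% blanking overhead, effective payload rates.

def required_gbps(width, height, hz, bpp=24, blanking=1.10):
    return width * height * hz * bpp * blanking / 1e9

HDMI_1_4_GBPS = 8.16    # effective payload after 8b/10b coding
DP_1_2_GBPS = 17.28     # effective payload (HBR2, 4 lanes)

for hz in (24, 30, 60):
    need = required_gbps(3840, 2160, hz)
    print(f"2160p{hz}: {need:.1f} Gbps needed "
          f"(fits HDMI 1.4: {need <= HDMI_1_4_GBPS}, "
          f"fits DP 1.2: {need <= DP_1_2_GBPS})")
```

With these numbers, 2160p30 needs about 6.6 Gbps and squeaks under HDMI 1.4's ceiling, while 2160p60 needs about 13.1 Gbps, which only DisplayPort 1.2 (or HDMI 2.0) can carry, matching the summary's claim.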
Get a TV (Score:3, Informative)
Why pay $1000+ for a 4K monitor tomorrow when you can pay $500 for a TV today?
http://tiamat.tsotech.com/4k-i... [tsotech.com]
Re:Get a TV (Score:5, Insightful)
Frame rate is for gamers. Programmers need pixels.
That's why TFA is missing the right angle.
4K is great for programming
1 - You can see more lines of code
2 - it doesn't require silly refresh rates
4K for gaming is silly. It doesn't meet the basic requirements
1 - your card can't drive it
2 - the framerate is low
Arguing that 4K is bad because it's no good for gamers is like arguing mobile phones are bad because you can't program on one effectively.
Bread, eggs, breaded eggs (Score:3)
Frame rate is for gamers. Programmers need pixels.
What do game programmers need?
Re:Bread, eggs, breaded eggs (Score:5, Funny)
What do game programmers need?
Sleep, generally.
Re: (Score:2)
Multiple displays that work well for the task at hand.
Re: (Score:2)
What do game programmers need? Multiple displays that work well for the task at hand.
A 39-inch 4K TV is the equivalent of four 20-inch 1080p monitors together.
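[Editor's note] That equivalence checks out arithmetically: 3840x2160 is exactly a 2x2 grid of 1920x1080 panels, and each quadrant of a 39-inch diagonal is 19.5 inches (the "20 inch" above rounds up slightly). A quick sketch:

```python
import math

def ppi(diag_inches, w_px, h_px):
    """Pixels per inch, computed from the diagonal."""
    return math.hypot(w_px, h_px) / diag_inches

# 4K is exactly a 2x2 tiling of 1080p...
assert 3840 == 2 * 1920 and 2160 == 2 * 1080

# ...so a 39" 4K panel has the same pixel density as a 19.5" 1080p panel.
print(round(ppi(39, 3840, 2160), 1))    # ~113.0 ppi
print(round(ppi(19.5, 1920, 1080), 1))  # ~113.0 ppi
```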
Re: (Score:2)
But useless for when you need to run the debugged program full-screen while watching what happens in a debugger, network sniffer, etc. at the same time. There really are times when you need multiple displays not just for the added screen area, but because each display is being used for something different.
Re: (Score:3)
You can still run a cheap 20" 1080p 2nd monitor while using a 4k as your primary.
I say this as a developer, who until recently used a 4-headed machine for most of my work - I haven't bothered to turn
Re: (Score:2)
I use 3 x 24 inch 120hz monitors. The source code goes on my middle screen, the right screen has a browser open to whatever information I need to be looking at while writing the code and the other monitor usually has a mix of things open e.g. another copy of VS 2013 with another (dependant or co-related) solution open to the code I need to be viewing, designer screens parts of the game (when running the editor for that), etc.
My three screens have a combined resolution of roughly 6000 x 1080, allow me to hav
Re:Get a TV (Score:5, Insightful)
Frame rate is for gamers. Programmers need pixels.
That's why TFA is missing the right angle.
4K is great for programming
1 - You can see more lines of code
2 - it doesn't require silly refresh rates
4K for gaming is silly. It doesn't meet the basic requirements
1 - your card can't drive it
2 - the framerate is low
Arguing that 4K is bad because it's no good for gamers is like arguing mobile phones are bad because you can't program on one effectively.
Are you kidding me? Staring at 30 Hz console output is maddening, and plenty of GPUs can handle 4K @ 60 fps for modern games. I'm sorry if you're trying to run Ubisoft's latest gimped turd, but that's an issue with the game, not a modern flagship GPU. Beyond that plenty of monitors can handle 4K 60 Hz. I have no idea why the fuck this shit got front paged. HDMI 2.0. WELCOME TO THE PRESENT. DisplayPort 1.2. WELCOME TO THE YEAR 2010.
Re: (Score:2)
Are you kidding me? Staring at 30 Hz console output is maddening
Huh? How can you tell the difference? It's not like it's a CRT only scanning lines at 30hz.
Re: (Score:2)
Not maddening?
http://www.jfedor.org/aaquake2... [jfedor.org]
How about that then, eh?
Re: (Score:3)
Instead we should be encouraging movement the other way - towards 120fps which allows for much more lifelike smoother motion. Youtube stuck at 30fps is a thorn in the whole online video sector.
Re: (Score:2)
Of course, you do realize all major motion pictures are shot at 24fps with the exception of a handful.
Re: (Score:2)
Scaling rightfully got a bad name when it was upscaling 800x600 content to a 1024x768 or 1280x1024 17" monitor. It looked blurry. Scaling 1920x1080 to 2560x1440 on a 27" monitor looks really good. On the gaming side, I'm more interested in whether these 4K TVs will take 1920x1080 or 2560x1440 at
Re: (Score:2)
And graphics programmers need both frame rate and pixels. 120Hz seems perfect, but once you try using 3D vision glasses, those LCD shutters bring back the flicker.
The problem with the higher resolutions is that application developers just seem to think they can then make their application main window even bigger so it still fills the entire screen. Then they have to use bigger fonts to maintain compatibility with past versions of the same application.
Re: (Score:2)
And graphics programmers need both frame rate and pixels. 120Hz seems perfect, but once you try using 3D vision glasses, those LCD shutters bring back the flicker.
We're not using CRTs anymore, LCD panels don't flicker with the refresh rate so 24hz, 30hz, 60hz, 120hz will all be just as steady.
Re: (Score:2)
Frame rate is for gamers. Programmers need pixels.
Basically, if you display 80 or even 120 lines of code it does not matter whether the monitor is 1080 or 2160 pixels in height. Sure, the higher resolution will display a well-designed, highly detailed font better, but that is all; programmers normally use a monospaced font like "Courier", so a finely detailed font is pointless.
Displaying more than 100 lines of code in the window/screen is IMHO stupid because the human eye
Re: (Score:2)
Basically, if you display 80 or even 120 lines of code it does not matter whether the monitor is 1080 or 2160 pixels in height. Sure, the higher resolution will display a well-designed, highly detailed font better, but that is all; programmers normally use a monospaced font like "Courier", so a finely detailed font is pointless.
Spoken by one who hasn't done much programming on a HiDPI monitor. I can tell you from first-hand experience that the higher resolution display significantly reduces eye fatigue. I have two 24" 1920x1080 external displays connected to my 15" rMBP. I always put my main window on the small 15" screen because the text is much easier on the eyes at 220 dpi. As a matter of fact, text is the only thing that looks dramatically better on the retina display than a standard display. Images and icons may be more detaile
Re: (Score:2)
Frame rate is for gamers. Programmers need pixels.
The mouse lag on a 24 or 30Hz display will drive you nuts when you are trying to select a block of text.
If you are a keyboard-only editor, it's not as bad, but even highlighting text or trying to page down quickly will likely send you back to a high-speed multi-monitor setup.
Re: (Score:2)
The article was talking about 4K for mainstream consumers, which most likely would be closer to gamers than programmers.
Re: (Score:2)
I don't understand why 4K TVs exist before 4K monitors do. Firstly, TVs simply don't need a crazy resolution like that. Look at how long it took before HD finally took hold. Is anything actually being broadcast in 4K? And if it's impossible to get a decent signal to it, how do those 4K broadcasts end up on the TV?
Re: (Score:2)
The VA panels used in TVs prioritize cost, color fidelity, brightness and viewing angles. These panels have horrid pixel precision and huge response times; pixels blend with each other on a regular basis. Using TVs as monitors is painful to the eye.
Most TVs are TN, not VA.
Oculus Rift (Score:3, Insightful)
Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.
Why spend a shitload of money on a new 4K screen and the video card necessary for an acceptable game experience when I'll be able to do VR at a fraction of the cost and with my existing hardware setup?
Obviously that's a gamer perspective - I'm sure plenty of people will find 4K useful for what they are doing.
Re: (Score:2, Interesting)
In its present iteration, the Oculus Rift might very well fit your current hardware, but the requirements for getting a decent number of pixels per view-angle in VR are brutal. Michael Abrash's post on the matter is very enlightening: http://blogs.valvesoftware.com/abrash/when-it-comes-to-resolution-its-all-relative/. In short, you'll most likely need ultra-responsive, insanely dense mini-displays, each boasting a 4k x 4k resolution per eye. This kind of resolution plus the latency requirements for VR will
Re: (Score:2)
It's not entirely clear that VR is going to displace PC gaming to that significant of a degree.
As a fairly avid gamer, most games I play are not in the first person perspective and I don't want them to be. I don't like FPS, and that's a huge portion of all first-person games (though I do like the sort of FPS-stealth-subgenre that encompasses Hitman, Dishonoured, Deus Ex, etc., and I can see how VR would be an asset there).
Platformers, most RPGs (the Elder Scrolls series is a popular exception, but I have n
Re: (Score:2)
It's not entirely clear that VR is going to displace PC gaming to that significant of a degree.
As a fairly avid gamer, most games I play are not in the first person perspective and I don't want them to be. I don't like FPS, and that's a huge portion of all first-person games... and VR almost implies a first person perspective.
Only if you've got no imagination. What this iteration of VR is bringing is head tracking, and that allows massive virtual screens. I think the Rift and similar products are going to break into the non-gaming market as a cost-effective way of getting giant flat displays.
Re: (Score:2)
Head tracking has been around in toy-grade VR helmets like the Rift since the '90s... those serial ports on them were not there for the sound.
Re: (Score:2)
The word "flat", it doesn't mean what you mean it means.
Re: (Score:2)
Some will call me a troll, but as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.
As a gamer I'm not really concerned about 4k either. I'm much more interested in better support for 3-view type setups. And 4k 3-view is just all the gamer problems of 4k times 3 :)
Oculus... I'm not sold on it. I see it as niche at best. Very cool in that niche though.
I would like to see head tracking go mainstream though.
Re: (Score:3)
You have been able to do that for two decades, so the question is why haven't you.
I will give you a hint: there is a reason for that, and that reason is that strapping a thing to your face gets old really fucking quick.
Re: (Score:2)
Strapping something on your face may get old, but today it is better than the helmet you had to wear before that, and the fixed device before that; the current trend is smaller, faster, lighter. How long before the VR solution is only slightly more uncomfortable than a pair of glasses?
Re: (Score:2)
The last model I had, from the early 2000s, actually weighed less than the Rift (8 whole ounces total), had head tracking, and had displays on par with the then-current resolutions (not to mention they only cost like 200 bucks), so the current trend seems to be a rubber band and not predictable by anyone.
http://www.mindflux.com.au/pro... [mindflux.com.au]
Re:Oculus Rift (Score:5, Informative)
You're making a fundamental error many people make when it comes to display resolution. What matters isn't resolution or pixels per inch. It's pixels per degree. Angular resolution, not linear resolution.
I've got a 1080p projector. When I project a 20 ft image onto a wall 10 ft away, the pixels are quite obvious and I wish I had a 4k projector. If I move back to 20 ft away from the wall, the image becomes acceptable again. It's the angle of view that matters not the size or resolution. 20/20 vision is defined as the ability to distinguish a line pair with 1 arc-minute separation. So within one degree (60 arc-minutes) you'd need 120 pixels to fool 20/20 vision.
This is where the 300 dpi standard comes from. Viewed from 2 ft away, one inch covers just about 2.5 degrees, which is 150 arc-minutes, which can be fully resolved with 300 dots. So for a printout viewed from 2 ft away, you want about 300 dpi to match 20/20 vision. If it's not necessary to perfectly fool the eye, you can cut this requirement to about half.
In terms of the Oculus Rift, a 1080p screen is 2203 pixels diagonal, so this corresponds to 18.4 degrees to fool 20/20 vision, 39 degrees to be adequate. If you want your VR display to look decent while covering a substantially wider angle of view than 39 degrees, you will want better than 1080p resolution. I'm gonna go out on a limb and predict that most people will want more than a 39 degree field of view in their VR headset.
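[Editor's note] The parent's numbers can be reproduced in two lines, assuming the stated 120 pixels per degree to fully fool 20/20 vision and half that density as "adequate" (with exactly half, the relaxed figure works out nearer 37 degrees than 39; the parent's 39 uses a slightly looser cutoff):

```python
import math

# Diagonal pixel count of a 1080p panel, ~2203 px as the parent says.
diag_px = math.hypot(1920, 1080)

print(round(diag_px / 120, 1))  # ~18.4 deg of FOV before pixels show at 20/20
print(round(diag_px / 60, 1))   # ~36.7 deg at the relaxed "adequate" density
```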
Re: (Score:2)
So within one degree (60 arc-minutes) you'd need 120 pixels to fool 20/20 vision.
That is a common misunderstanding. 20/20 vision is the ability to distinguish two lines with 1 arc-minute separation from a single thicker line. Beyond that human eyes can still distinguish a 0.5 arc-minute wide line from a 1 arc-minute wide line, and can tell if a 0.5 arc-minute line is jagged or smooth.
That's why there is a noticeable difference between 300 PPI and 450+ PPI phone displays at normal viewing distances. It's why people with normal vision can differentiate 1080p and 4k on a 127cm screen from
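[Editor's note] One way to sanity-check that claim is to compute the angular width of a single pixel at a phone-like viewing distance. The 12-inch distance below is an assumption for illustration, not a figure from the thread: at 300 PPI a pixel subtends roughly 0.95 arc-minute, still well above the ~0.5 arc-minute features the parent says the eye can pick out, while at 450 PPI it drops to roughly 0.64.

```python
import math

def pixel_arcmin(ppi, distance_in):
    """Angular size of one pixel, in arc-minutes, at a given viewing distance."""
    return math.degrees(math.atan(1.0 / (ppi * distance_in))) * 60

print(round(pixel_arcmin(300, 12), 2))  # ~0.95 arc-min per pixel at 12"
print(round(pixel_arcmin(450, 12), 2))  # ~0.64 arc-min per pixel at 12"
```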
Re: (Score:2)
Then you need better eyes. Unless you can actually get cybernetic implants you're stuck with the choice of fine detail or large quantities of information at the same time.
Even the GP is overestimating the capabilities, because 20/20 resolution itself is limited to a very narrow field of view; the human eye's capability of even resolving text at all is pretty much nonexistent outside a 6-degree arc.
Now, if we could get monitors where you'd have to look away from an image of the sun because it's too bright
Re: (Score:2)
"Why spend a shitload of money of a new 4K screen and the video card necessary for an acceptable game experience when I'll be able to do VR with a fraction of the cost and with my existing hardware setup?"
Oh boy, somebody is going to get very disappointed.
Re: (Score:3)
Some people like to compute without headaches and vomiting.
Re: (Score:2)
as a gamer I'm no longer interested in 4K video since I know the Oculus Rift (and competing VR sets) are coming.
Same here. I've been due for a monitor upgrade for a while (was running triple 19" monitors), but it doesn't make sense to do that now since the budget can be used to snatch up Oculus CV1s when they come out (for less money)!
display port (Score:5, Interesting)
DisplayPort doesn't have the same limitations that HDMI has at those resolutions, and it is available now.
Nvidia 6xx and ATI 7xxx (not to mention Intel HD 4000) are not exactly brand new, and available now.
If anything, this sounds like "HDMI is showing its age; use DisplayPort."
Re: (Score:2)
Honestly, they didn't plan to replace it, at least not on any appreciable timescale. People don't buy TVs like they buy games consoles or phones or iPods, so the sensible thing at the time was to roll out yet another twenty-year standard and get around to thinking about succession later. Of course, if you're a Sony or a Samsung looking at your briefly revitalised TV business tailing off again as people finish upgrading, maybe you're regretting this.
Re: (Score:2)
DisplayPort did not support 2160p60 out-of-the-box either; it needed v1.2 to get there.
HDMI can do 2160p60 too, just needs v2.0.
Re: (Score:2, Informative)
Oh you mean v1.2 which came out in 2009, and virtually every DP capable graphics card and monitor supports?
Re: (Score:2, Insightful)
Building on that, is HDMI 2.0 even shipping yet?
2009 vs 2015, maybe?
Re: (Score:2)
Totally agree. Nvidia 6xx has been out for a long time, and a 660 costs like $150. Anyone who buys a 4k monitor for $1000+ is not going to think twice about getting a matching video card. For gamers, in all likelihood, they probably already have one. The article claims a hardware barrier that is simply not an issue.
The real issue here is the price point. 2560x1440 27" monitors have been around for a long time, but it wasn't until it dropped under $400 that gamers started chomping them up. When they ge
Over 30yo+ you won't see the difference anyway. (Score:2)
But there is such a thing as too much. After 720p... From over 2 meters away from the television set, despite having air-pilot-approved eyes, I still could not HONESTLY see the difference between a 50-inch 720p and a 50-inch 1080p set. Honestly, I could not!
I
Re: (Score:2)
If you watched something with high resolution and a clean picture, like Disney's "Frozen," on a high-quality display, like a Samsung 55", then you should be able to tell the difference b/w 720p and 1080p easily. For many things, it is hard to tell the difference at a reasonable distance. Monitors are different in that you're usually much closer to one. At 24", 720p monitors look like crap compared to 1080p. 4K, however, seems like overkill at anything below 30".
For gaming, I'm totally with you. For com
Re: (Score:2)
I regularly use a 1080p monitor in the 24" range and I can tell you I would *definitely* like the resolution to be higher. I do a lot of text-based work, and I can see the letters start to get blocky as I reduce the text size, while I know for a fact I could easily read even smaller text printed on a decent laser printer.
Try it one day. Use a word processor to print "the quick brown fox jumped over the lazy dog" in steadily reduced font size down the page. Print that page and hold it next to the computer
What?! (Score:5, Interesting)
I'm typing this on a monitor with 3840x2160 resolution, at 60hz right now. I posted about it weeks ago:
Clicky [slashdot.org]
It's like $600 when on sale, and it works superbly for coding and playing games. Skyrim and Saints Row 4 play fine on a GTX 660 at 4k resolution; you just disable any AA (not needed) but enable vsync (tearing is more visible at 4k, so just use that). Perhaps that's just me - but things seem fine at 4k res on a medium-cost graphics card.
A few generations of video cards, and everything will be > 60-FPS smooth again anyway (partially thanks to consoles again), so I don't really need to wait for a dynamic frame smoothing algorithm implementation to enjoy having a giant screen for coding now.
I don't see any reason why you'd want to wait - it's as cheap as two decent monitors, and if you're slightly near-sighted like me, it's just really great. See my previous post for a review link and an image of all the PC Ultima games on screen at once.
Ryan Fenton
Single-tile too (Score:3)
The other nice thing about the Samsung UD590, apart from 4K @ 60Hz, is that it presents itself as a single 4K monitor, rather than two half-size monitors tiled next to each other. That can make a big difference to some uses, like running games at lower resolutions. The Asus PB287Q is another such single-tile 4K monitor.
Re: (Score:2)
I don't see any reason why you'd want to wait - it's as cheap as two decent monitors, and if you're slightly near-sighted like me, it's just really great.
I'll wait until I can get as great a deal as I did on this 120% color 25.5" 1920x1200 IPS, which was fifty bucks. Hooray for storage lockers, I guess. And also for flea markets, so I don't have to buy storage lockers myself. Also, I just bought a used video card and it doesn't have displayport. IOW, the obvious reason is that one is a cheap bastard.
I did buy an Eye-One Display LT for fifty bucks to go with my fifty buck monitor. And the color is still spot on, amazingly. It wanted calibration, but it still
Ack! I'll take muh 1080p monitor thank you. (Score:2, Funny)
But all I really need is a LCD running 720p.
Truthfully all I really need is a super vga CRT.
In all honesty I could live with the warm glow of an ega screen.
Net net I miss a nice monochrome to get me through.
All things considered, teletype handles 99% of my day to day needs.
Actually, I feel like anything more than a single blinking indicator light is pretty decadent.
Re: (Score:2)
To be fair, a teletype would solve 80% of what I need, with a video-capable tablet providing the rest...
Wait for G-Sync vs. FreeSync to finish (Score:2)
This seems to be a time when monitor features are growing fast. I'm personally going to stick with my 1440p screen until it stabilizes a bit.
The G-Sync/FreeSync battle is going to start. For gamers, this is going to be big. Right now, G-Sync only works with Nvidia cards, and FreeSync will probably only work with AMD cards. FreeSync is much better licensed, and I expect it will probably win eventually, but I tend to prefer Nvidia cards so I'm willing to wait until we get a clear winner.
Basically, my dream mo
Re: (Score:2)
I agree with your ideal choice of monitor btw! Apart from the size which should be bigger, but further away. This way your eyes would be more relaxed.
Re: (Score:3)
I live in a rather small apartment and would really like a triple monitor setup. So I prefer smaller hardware. I'm also nearsighted and usually take my glasses off when computing for a long period, so smaller, closer displays are actually more relaxing. But to each his own.
As far as which is technically better, I haven't seen any solid comparisons. G-Sync does use proprietary hardware in the display, which means it has the potential to do a lot more. FreeSync works with existing panels provided they support
BUY NOW because you have to be ready (Score:2)
when the 4K content starts coming out
because you know, they will stop selling these soon and you will never be able to buy one to view all the 4K content coming out soon
or they will drop in price to the point where kids can afford them on their allowance, but you have to buy it NOW and Before this happens just to be the first one to watch 4K content
Who needs HDMI? (Score:2)
I got a UHD @ 60 Hz single-stream transport here in the Samsung U28D590D. There's not much video content yet except for a few porn sites, but for stills it's brilliant. Software support for increasing font size is mediocre in many apps, but they're usually functional, just ugly. I wish there was some way to just tell Windows to draw a window at 200% size instead. Gaming is cool, though my graphics card is choking on the resolution when it gets heavy; I guess it needs an upgrade now that it's pushing 4x the pi
Panasonic with HDMI-2.0 (Score:2)
I can confirm that the Panasonic TC-L65WT600 [panasonic.com] 65" 4K UHDTV can play 60 fps 4K over its HDMI 2.0 connector (yes, I actually have access to 4K/60p content and a 4K/60p video server). I have seen it for as low as $3500 on BestBuy.com.
Future is here for some displays on OSX (Score:2)
4K displays @ 60Hz with Retina pixel doubling = fantastic coding display [1]
Of course, I don't have this at work - I have two separate 24" monitors, but I spend most of my time on my 15" retina screen.
[1] http://support.apple.com/kb/ht... [apple.com]
Seiki 4k for $500 (Score:2)
I've been considering one of these bad boys for awhile now. Cheap and for what I intend to use it for (software dev and video editing where the 30Hz refresh isn't a big deal), good enough. It's not something I'd use for gaming, at least at 4K, but hey... $500.
Re: (Score:3)
Really, it's fine for anything this side of gaming. Even Youtube and local media plays just fine. Very little out there has a framerate over that 30hz mark. The only real downside is that you can only fit one of them on your desk at a time.
4K is nice but... (Score:2)
Having a full color gamut is important too. And a really good contrast ratio.
So I'm saving my pennies for a OLED 4K display. At 80". And none of that curved bullshit.
Re: (Score:2)
Having a full color gamut is important too. And a really good contrast ratio.
Check out the reviews of the Asus PB287Q. Very nearly full color gamut. These ain't your daddy's TN panels.
Yeah OLED would be nice, but I'd be surprised if an UltraHD or 4K OLED display is affordable this decade.
Re: (Score:2)
I know I don't want to upgrade my TV until I can get a 50" 4k OLED for about $1K. My crystal ball says that will happen in 2018 :)
I'd settle for (Score:2)
A 30-inch monitor, 16:10 aspect ratio, and 2560 x 1600.
The only reason I would want much higher resolution than that is to overcome the problem of scaling on digital displays; in the old days of analog monitors we could run different resolutions without it looking like shit.
I currently have a 28 inch 1920 x 1200 monitor, but they don't make those anymore,
Re: (Score:2)
I used to have a 24" 1920x1200 display, but I upgraded to a 27" 2560x1440. It's 16:9, but I find I can just about fit three browser windows side by side (or two for wider-layout pages). It's not the aspect ratio that's important, it's the number of vertical pixels you have available. I put the main KDE tool bar on the left-hand side of the screen to make the most of the vertical resolution; it's working pretty well for me.
I've finally reached the point where I don't feel like I have to vertically maximiz
Re: (Score:2)
This isn't what you're asking for exactly, but it's close:
Apple thunderbolt and whatever they call the normal display port ones are 2560x1440@27", of course they cost $999 too :(
I don't know that they are "worth the money". But I definitely approve of mine.
Ow, the ignorance (Score:5, Informative)
Was that summary written by someone who's never used a 30Hz 4k display?
A 30Hz feed to an LCD panel is not like a 30Hz feed to a CRT. The CRT phosphors need to be refreshed frequently or the image fades. That's why 30Hz was all flickery and crappy back in the 90s. But 30Hz to an LCD isn't like that. The image stays solid until it's changed. A 30Hz display on an LCD is rock solid and works fine for a workstation. I know. I've seen me do it. Right now. There are no "transition" issues, whatever that is supposed to mean. Nothing weird happens when I switch between applications. Multitasking works fine. I'm playing multiple HD videos without a hitch. Same way the 30hz 1080 programming from cable and satellite plays just fine on LCDs. Gaming's not great but turn on vertical sync and it's not terrible. I'd rather be running at 60Hz but I got my 4k panel for $400. It'll hold me over until displays and video cards with HDMI 2 are common.
Re: (Score:2)
30Hz is terrible for scrolling. It's not so bad in a text editor but for web pages it's annoying. At 30Hz you can't read text while it is moving, at 60Hz you can and you don't have to pause while the page moves up. It's a small thing perhaps, but was one of the reasons why I switched away from Firefox to Chrome originally.
My computer can but no interest right now (Score:2)
I am not saying that 4K is a stupid idea, or that I hate 4K, if it turned out that one of my present monitors had a switch on the back that would switch it to 4K I would be delighted, but when it comes to budgeting my money there are a huge numb
Re: (Score:2)
A 4K TV on the other hand would be pretty cool and I think that Netflix has some programming 4K ready so I would probably make that leap long before a monitor.
You have that pretty backwards. UltraHD is immediately useful for a monitor, if you actually do work with a computer and aren't one of these people who think work can only be done in a maximized window. There's not much video in that resolution yet and at any distance it's not immediately obvious what resolution a TV is, but you can put all the text you want on screen at that resolution and you sit within arm's length of your monitor.
multiple inputs for 4k? (Score:2)
Computers can handle multiple monitors at 60Hz, so why not 4K with dual inputs? Is that feasible, and are there some models on the horizon that have multiple HDMI, dual-DVI, or dual-DisplayPort (pre-Thunderbolt-2 DisplayPort - I don't know the version numbers)? It seems it could be possible.
Re: (Score:2)
Computers can handle multiple monitors at 60Hz, so why not 4K with dual inputs? Is that feasible, and are there some models on the horizon that have multiple HDMI, dual-DVI, or dual-DisplayPort (pre-Thunderbolt-2 DisplayPort - I don't know the version numbers)?
The Asus PB287Q has two HDMI and one DisplayPort and supports dual simultaneous input from any two of them. They call it Picture-by-Picture mode. They put two HD displays side by side, with black bars above and below, from two different machines. It's slightly silly, since it's not exactly convenient to switch to that mode, but it's available. It will also do Picture-In-Picture mode, displaying one input across the full screen and the other in a window up in the corner, all rescaled in software transpar
AMD A10-7850K - DP - Samsung S28D590D (Score:2)
I just bought this setup last week and it works perfectly well at 3840 x 2160 @ 60 Hz with Debian Jessie and the latest fglrx-14.20 driver. The monitor ships with a DP cable (and an HDMI cable, by the way), so it worked out of the box with an FM2A88X Extreme6+ motherboard. I have seen 30 Hz displays before, but sorry, I can't enjoy them for testing 1080p60 applications in a window while keeping code and debug windows around them.
Electronics schematics and hardware routing are a pleasure on a UHD monitor, as coding side by
High dpi isn't necessarily better (Score:2)
So that shiny new 4K monitor may end up delivering an inferior desktop experience and requires a GPU working 4x as hard. That might change as more desktop apps become high dpi aware but obviously any le
Re: (Score:2)
The DPI in some tablets / laptops is so high that applications running on desktop operating systems (Windows, OS X and Linux) render like postage stamps with tiny fonts, toolbars and other buttons. To counter this the OS can upscale any non-high-dpi-aware app's window, but that makes everything look blurry.
I'm with you on Windows and Linux. OSX has been doing this natively, with no blurriness, since they first started shipping retina laptops. It really is amazingly nicer than the old low-res screens. Source: been using it personally since 2012.
Will 4k give us better GUIs and windowing? (Score:2)
The problem I have with super high res displays is the limitations of window management. I have yet to find a decent tool for Windows that allows for virtual monitors that lets me subdivide a very large display into multiple displays. You end up with maximized windows that make poor use of screen real-estate, like this dinky box on a mostly empty window I'm typing in.
And what about window content scaling? It'd be nice to scale the content of a window so that I could display more in the same window or make
Just give me back 16:10 (Score:2)
I don't really have any pressing need for 4k. I mean, I'd take it, though I feel like it would require not only sufficient hardware, but also an OS with a UI better designed for it (I imagine there'd be a lot of times, with a screen that large, that you would want to tell windows to "maximize" onto only one quadrant, for instance.)
What I would really like is for monitors to just not have *regressed*. My laptop's about 3 1/2 years old. I'd be tempted to buy a new one sometime kinda-soon (was looking, and dro
Re:Display Port (Score:5, Informative)
Why is there no mention of Display Port? Current 4K LCD all accept this, and with the right GPU, you can most certainly drive at 60Hz, full resolution.
This is more about HDMI being a broken standard to me. I just don't like DisplayPort because it's sort of Apple's thing.
DisplayPort is AMD's thing, through VESA. It's not Apple's thing.
Re: (Score:2)
AMD and Apple picked it up because it's the only replacement for DVI, which on a single link is capped at around 1600x1200 at 60 Hz (or 1200p at 60 Hz with reduced blanking)
Not to mention that it's royalty free. You have to pay a license for every HDMI port, you can stick any number of DisplayPort ports on a machine without paying a royalty for implementing the standard.
Re: (Score:2)
Why is there no mention of Display Port? Current 4K LCD all accept this, and with the right GPU, you can most certainly drive at 60Hz, full resolution.
This is more about HDMI being a broken standard to me. I just don't like DisplayPort because it's sort of Apple's thing.
Nope [amazon.com]
Re: (Score:2)
It is fortunately only the silly Mini DisplayPort connector that is Apple-specific. I still have nightmares of trying to buy a DisplayPort cable at a computer store and being sent to the horror that is the Apple section of the store, which was rows and rows of incompatible crap.
Re: (Score:3)
... clear giant work area for multiple windows.
All this.
There are way too many applications I use that fail to do anything useful for multi-monitor setups. There's a few useful features like being able to resize window panels to customize my view better, but I want to be able to tear panels off and put them on a different monitor. To me, that is so vastly more important than just increasing resolution.
I currently use two monitors. One in landscape and one in portrait and I use them exactly how you'd expect, documents on the portrait screen, video/gam