4K Displays Ready For Prime Time
An anonymous reader writes "After the HD revolution, display manufacturers rolled out gimmick after gimmick to try to recapture that burst of purchasing (3-D, curved displays, 'Smart' features, form factor tweaks, etc). Now, we're finally seeing an improvement that might actually be useful: 4K displays are starting to drop into a reasonable price range. Tech Report reviews a 28" model from Asus that runs $650. They say, 'Unlike almost every other 4K display on the market, the PB287Q is capable of treating that grid as a single, coherent surface. ... Running games at 4K requires tons of GPU horsepower, yet dual-tile displays don't support simple scaling. As a result, you can't drop back to obvious subset resolutions like 2560x1440 or 1920x1080 in order to keep frame rendering times low. ... And single-tile 4K at 30Hz stinks worse, especially for gaming. The PB287Q solves almost all of those problems.' They add that the monitor's firmware is not great, and while most options you want are available, they often require digging through menus to set up. The review ends up recommending the monitor, but notes that, more importantly, its capabilities signify 'the promise of better things coming soon.'"
Where's The Content? (Score:3, Interesting)
So I can get a 4k display for less than $700. Where can I get content worth watching on that display? Not only worth watching, but is somehow made better by all those extra pixels.
All that aside, seems like it would make for a really nice PC monitor.
Re:Where's The Content? (Score:4, Interesting)
I'm sure David Attenborough's voice would sound even better in 4k
Re: (Score:2)
All shot on HD video as far as I understand it, so no 4k.
Re:Where's The Content? (Score:5, Insightful)
I did the calculations once and don't care to repeat them, but depending on your use case, it might help... or it might be totally imperceptible. For a medium-large TV on the other side of a good-sized living room, your eyes shouldn't be able to see the difference. On the other hand, with a large computer monitor right in front of you, in many situations you will be able to see the difference. Note that human eyesight isn't a simple matter of resolution comparisons; it gets kind of complicated... there are basic measures of how far apart you can see two black dots or lines separated by white before they merge into one, but the lower the contrast, the greater the distance they have to be separated (absolute brightness matters too, as does distance from the center of your field of vision and all sorts of other stuff), and of course your ability to perceive fine detail drops tremendously when viewing moving objects. But in relatively static, high contrast images, on a large screen near the viewer (say, a nice computer monitor), most people shouldn't have trouble seeing the difference in a side-by-side comparison.
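For the curious, here is a back-of-the-envelope version of that viewing-distance calculation, using the common ~1 arcminute figure for 20/20 visual acuity (the specific sizes and distances below are just illustrative examples, not the poster's numbers):

```python
import math

def pixels_resolvable(diagonal_in, w_px, h_px, distance_m, acuity_arcmin=1.0):
    """Can a viewer with ~20/20 vision (about 1 arcminute of angular
    resolution) make out individual pixels at this distance?"""
    diag_px = math.hypot(w_px, h_px)
    pitch_m = (diagonal_in / diag_px) * 0.0254  # pixel pitch in metres
    pixel_arcmin = math.degrees(math.atan2(pitch_m, distance_m)) * 60
    return pixel_arcmin > acuity_arcmin

# 55" 4K TV across a living room: individual pixels merge together
print(pixels_resolvable(55, 3840, 2160, 3.0))   # False
# 28" 4K monitor at desk distance: pixels are just barely resolvable
print(pixels_resolvable(28, 3840, 2160, 0.5))   # True
```

Which matches the comment's point: across a living room the extra pixels vanish, but at desk distance they're right at the threshold.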
The only problem with this gimmick is that we're basically running into a resolution dead-end here; there's only so far you can go before the improved detail becomes meaningless. I hope for their sake that they come up with true (non-stereoscopic) 3D or something of that nature, or they're going to run out of TV-sales gimmicks.
Hmm, I just thought of something that I heard about a good while back but haven't seen any movement on - "peripheral vision" TVs. I seem to recall reading years ago about a type of TV that used lights around the edges to dimly shine the peripheral colors on the TV image around the room parallel to the TV, giving the illusion to your peripheral vision of an expansive screen. I could envision improving that with a video format that includes a lower-resolution peripheral video stream and side projectors instead of simple side lights. Maybe that could be the next gimmick. ;)
Re: (Score:2)
Roll your own:
https://learn.adafruit.com/ada... [adafruit.com]
Re: (Score:2)
"Hmm, I just thought of something that I heard about a good while back but haven't seen any movement on - "peripheral vision" TVs. I seem to recall reading years ago about a type of TV that used lights around the edges to dimly shine the peripheral colors on the TV image around the room parallel to the TV, giving the illusion to your peripheral vision of an expansive screen."
Philips Ambilight.
Re:Where's The Content? (Score:5, Funny)
The only problem with this gimmick is that we're basically running into a resolution dead-end here, there's only so far you can go before the improved detail becomes meaningless.
Why would you even discuss this now? We are NOWHERE near the types of resolutions that my eyes are happy with. Yes, I am an elitist snob who couldn't tell a pixel from a hole in the ground. I do not care. Stop whining about how none of us can tell the difference. I can tell the difference, and even if I cannot, I believe I can tell the difference.
I do NOT want to see even a hint of blockiness or fuzziness at the edge of a font. I want curves that appear to be perfect curves. As it stands now, I can clearly see blockiness in all fonts. With hinting turned on, some aspects of the blockiness go away, but it is still there... and now the fonts are fuzzy too. Will 4K solve that? Not even close. Will it be much, much better than what we have now? Yes!
Stop blocking progress with your negative whining about arcs and distinguishability. I may not be able to argue against your science and maths, but science always loses out to reality. Look at the blocks and fuzziness in this message and dare to tell me that I am wrong.
Re:Where's The Content? (Score:5, Funny)
Inch. Away. From. The. Screen.
Slowly.
Re: (Score:2)
lol
Re: (Score:2)
Maybe you need glasses, it looks fine to me. Or you're being a douche because this is Slashdot. Do you have aerodynamically square Monster cables too?
Re:Where's The Content? (Score:5, Interesting)
Nah. I am not trying to be a douche but I am undoubtedly succeeding. *sigh* Such is life. Here is where I am coming from:
While the Commodore Amiga 500 was not my first computer, it was the one that brought me very deeply into computing. I first hooked it up to an NTSC television set. The fonts were extremely jagged and the images were extremely blurry. I should probably add that the color was absolutely terrible too. But it worked. I fell deeply in love with my Amiga 500. It was the most awesome computer on the planet. It had a flat 32 bit memory space and preemptive multitasking. It was god compared to the standard IBM PC and Microsoft DOS.
I eventually was able to afford to buy a used "real" monitor for it. Essentially the same resolution but much higher quality. The fonts were still jagged though.
Through the years, I have upgraded my monitors continuously, with one of the best monitors being the Apple 30 inch Cinema Display running at something like 3560x1600 or somesuch. A _very_ nice monitor. Currently, I am using a Samsung 48 inch 1920x1080 screen as a display.
One thing that was common across ALL of the displays is that curves never looked like continuous curves and fonts always looked blocky. It is possible that problem may not be resolution, but I doubt it. I look at photographs of handwriting, images that should show continuous curves, and it still does not look "right". Either it is fuzzy or the pixels intrude.
Maybe I put my face too close to the monitor. Maybe I just expect too much. Maybe I notice things that other people do not notice. Regardless, no matter how much anti-aliasing I use in Grand Theft Auto IV, lines that are not perfectly vertical or horizontal have a staircase effect. No matter what type of font hinting I use, fonts seem blocky and/or fuzzy. Perhaps 1920x1080 is enough and I just want too much.
4K screens look gorgeous. I look at them at the Sony store in the mall. My eyes are still drawn to the imperfections in the red headed girl's hair (in the demo) despite the fact that it is mathematically and scientifically impossible to see them. *shrug*
Re: (Score:2)
... with one of the best monitors being the Apple 30 inch Cinema Display running at something like 3560x1600 or somesuch
2560x1600.
Re: (Score:2)
Yep. You are exactly correct. Thank you for posting. I made a mistake. :)
Re: (Score:2)
medium-large on the other side of a good-sized living room, your eyes shouldn't be able to see the difference
That's simply not true. While you won't notice it in level of detail, you will notice it due to the increased dithering and smoothness of color gradients. Things will look better at all normal viewing distances. Although my real hope for the future is in ultra-ultra-ultra high definition displays (think something like the equivalent of a 32K 46" monitor). With that, new possibilities actually open up; tie that to a lenticular lens system and you'll have multiple-angle high definition viewing. Imagine a tele-
Re: (Score:2)
there's basic measures of how far apart you can see two black dots or lines separated by white before they merge into one
This is a really common misunderstanding of how human vision works. While it's true you might not be able to distinguish two dots, you can distinguish varying line widths and the sharpness of high contrast edges like text.
That is why text looks sharper on a 4k display, even at some distance. It's why people can distinguish a 4k display from a 2k display at normal viewing distances. It's why people can tell the difference between a Nexus 7 and an iPad Retina (264 ppi vs 323 ppi) even though both are seemingly
Re: (Score:3)
All that aside, seems like it would make for a really nice PC monitor.
It probably seems that way because it is a PC monitor.
Re: (Score:3)
Netflix? House of Cards and all of their new original series are shot and displayed in 4K if your device and display support it.
Also, there's a much higher quality Samsung 4K display [techreport.com] for $50 more, that is probably the model you want.
It's a PC monitor, so.. (Score:2)
There's tons of trivial sources of content either dynamically generated, made by others or made by yourself.
Of course, text and PDF, and Unix-like terminals. Pictures and photographs; there's nothing special to do, open a picture that comes out of the camera and look at it.
For games, you can probably play old RTS titles and such even on a low/mid-end graphics card. Otherwise this monitor should allow you to play at non-native resolutions.
There's even new kinds of content that such a high res display would make possible.
Re: (Score:2)
Who cares about content? The good news is that finally display makers are getting off their collective asses and producing computer displays at higher resolutions than 1920x1200.
Re: (Score:2)
Note: that's not a monitor, technically, but a TV.
Still good for regular desktop applications, though.
Re: (Score:3)
Besides the addition of a tuner, is there really a difference in this day and age? Some TVs come with higher and lower refresh rates, resolutions, etc., as do some monitors.
Re: (Score:2)
Typically monitors have a lower pixel pitch since they're meant to be viewed up much closer than a TV you'd be watching from several feet away on your couch.
Also in my experience monitors tend to have superior firmware in terms of reliability and adjustments whereas hooking up a PC to a TV sometimes has unpredictable results in the way the TV will display the image with little control to correct it even when using VGA or HDMI.
Re: (Score:3)
Erm, isn't pixel pitch a function of resolution and physical size?
So your 24" 1080p monitor has a smaller pixel pitch than your 50" 1080p TV, but if you got a 50" 1080p 'monitor' it would have the exact same pitch as the TV.
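A quick back-of-the-envelope check of that relationship (the sizes here are the ones from the comment):

```python
import math

def pixel_pitch_mm(diagonal_in, w_px, h_px):
    """Pixel pitch follows directly from diagonal size and resolution."""
    diag_px = math.hypot(w_px, h_px)       # diagonal length in pixels
    return diagonal_in * 25.4 / diag_px    # inches -> mm per pixel

print(round(pixel_pitch_mm(24, 1920, 1080), 3))  # 24" 1080p monitor: 0.277 mm
print(round(pixel_pitch_mm(50, 1920, 1080), 3))  # 50" 1080p TV: 0.577 mm
```

Same resolution, same aspect ratio: the pitch scales exactly with the diagonal, as the comment says.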
Re: (Score:2)
The distinction has become largely meaningless. The only real difference between the two is that it has a TV tuner while a "monitor" would not. And as a bonus, it has halfway decent sound capabilities by default, which most (but not all) monitors do not.
Re: (Score:2)
Apart from pitch size, panel quality, color accuracy, setting fine-tuning and viewing angle size? Not really, no :)
high end TVs have all those things (Score:2)
It's just the ubiquitous crappy ones that don't.
Re: (Score:3)
On color, I have to agree with you. The Seiki 4k panels have horrible color (though as you also point out, that comes mostly from the lack of fine tuning capabilities). If you want a 4k display for doing graphic design work, yeah, you'll want to blow $3000+ on it; for programming, not so big of a deal.
For the rest - At 4k, pitch size means almost nothing. Two feet away (my typical dis
Please quit conflating TV's and monitors. (Score:3, Insightful)
I may have use of a 4k monitor. I doubt I will ever need a 4K tv, even if source material were readily available. My rarely watched 1080p does just fine. Most consumers would likely agree. For TV/Movie viewing 4k IS a gimmick.
Re: (Score:2)
I've seen Costco's demos too and it's a great picture but I don't get the feeling it's any better than 1080p on the 55" demo. I guess coming from 720p you probably see a bigger difference, but I can't see the value in jumping from 1080p.
I can see it being worthwhile if you buy one of those badass 90" motherfuckers they have for sale there, but not at anything in the 60" or lower range.
Certainly I can't see the value on a 28" screen.
Re: (Score:2)
It's not a 28" TV, it's a monitor.
I have a 4k monitor at work now, and would really like that at home as well - but until now it's been way too expensive.
At $650 it's below what a decent 1440p monitor costs, and will be within budget for most home-office users.
Re: (Score:2)
I had been waiting for this as well. Now that Star Citizen DFM is delayed a few days (yes, I expect everyone to be addicted to all things RSI and know what I am talking about), maybe I can get one on my desk pronto.
Re: (Score:2)
alas... not a gaming monitor, must wait
Re: (Score:2)
yes i think the geforce 980 gtx black edition Ti in quad sli or something like that... unless they optimize their code :)
Re: (Score:2)
What benefits does it offer you at work? What type of work do you do on it?
Honestly, I'll probably end up getting one myself at some point too, largely because I find 1920x1080 a step backwards from my 1920x1200 monitors so the only way I can prevent my resolution taking a regressive backwards step is to go that way, but I'm not sure what benefit I'll see from it when mostly all I do is play games, browse the web, and write code. I don't do any graphics design or watch movies for example on my PC but I'm in
Re: (Score:2)
It's great for coding, webdesign or anything where you are interacting with two or more windows at the same time.
My coding days are more or less over, but I still find it very useful in almost all tasks. Keeping everything you're working on visible is a great productivity booster. I've had my monitor for roughly 3 months, and it has already paid for itself.
Re: (Score:2)
It's probably a good display to look at boring numbers and documents.
Text fonts are vector graphics, after all. So they'll scale and be a bit easier to read.
At $650 that means you can spend $1000 on the whole system (tower or mini-PC plus display). One option is to use an Intel NUC; it comes to mind because it's a cheap and tiny PC with a DisplayPort output. Another is to use an AMD AM1 system in mini-ITX or micro-ATX (same reason, as long as you choose the right motherboard).
Don't care about the updaters. A low en
Re: (Score:2)
There IS a lot of difference in quality but the key is you need to get up close enough to see the difference. If you watch a smallish TV from 3 meters back maybe you don't see a difference. The other thing is you actually need 4k content which is lacking now.
Give it 5 years and when all movies are released in 4k mode and iTunes / Netflix / Amazon Firetv all have a 4k download option you may change your mind a bit....
Re: (Score:2)
Well I was stood about 2 feet from the Costco 55" one on display the other day.
I wonder perhaps if the problem is that the benefit of 4k HD really needs a reference 1080p alongside it to make the difference clear. It's just not like 480p SD to 1080p HD - that was obvious even without an SD TV alongside the 1080p HD (though 720p to 1080p is less obvious I suppose). In this respect I think it's not necessarily that it's better, but that to a casual observer it's not obviously and immediately recognisable as b
good grief, why not wooden pixels?! (Score:2)
50" 720p plasma tv
Fred Flintstone, is that you? How's things down at the quarry?
Re: (Score:3)
Granted, for now the bitrate rather than resolution is the limiting factor, since cable/satellite/broadcast signals aren't even 1080p, they're 1080i or 720p with inadequate bitrates. But the quality of video through Netflix / Amazon Prime ri
Re:Please quit conflating TV's and monitors. (Score:5, Informative)
Football is shot at close angles specifically to tell a narrative; they will not and are required not to show the full field during a play. This is the view that coaches get on a closed loop. It is available to the public, but only 4 hours after the game ends and you have to pay a special subscription to get it.
Re: (Score:2)
There's a whole bunch of 27" 1440p monitors out there, and they sell a lot.
At the office we've not bought one single 1080p since starting up in 2010. We started off with 1200p when that was still possible to get, and moved on to 1440p and now 4k for those who want it.
Re: (Score:2)
There's still 1200p sold, the res made a little "come back" semi-recently.
1920x1080 has effectively become the successor of 1280x1024. So, the monitor situation sucks in some ways but it is not that bad (the worst is that the suckage of a lowest end 21.5" 1080p monitor makes your eyes bleed more than that of a 1280x1024 17", because there's so much more surface of light bleeding, terrible color and bad blacks/grays)
Re: (Score:2)
I don't see how 3D gets permanent "gimmick" status while 4k doesn't... there are times when seeing things in 3D give you a completely different perspective, feel, immersion, and experience than something not in 3D. There are times when higher resolution does the same. And there are times when both actually seem to make things worse. Curved TVs as well... I run three monitors on my desktop and I'd be ecstatic if I could get the same resolution in a single curved display. If it weren't curved, though, the
Re: (Score:2)
Honestly, it's not a gimmick. 4K does look MUCH better than 1080p. It's really pretty spectacular. Of course, it's hugely dependent on screen size and viewing distance. If you're sitting 20 feet away from a 32" screen, 4K/1080/720/480 are going to look exactly the same.
What 4K gives you is some combination of better picture and/or larger screen size, with a comparable picture quality.
3D capable models (Score:2)
But when are the 140Hz or 120Hz 3D-capable models going to be available? Even if 3D is limited to 1440p or 1080p, I want the capability for 3D gaming and watching 3D movies on my PC. Right now the best I can get is a 1080p, or very soon, a 1440p monitor, and will have to buy separate 4K 2D monitors for 4K. :-(
Re: (Score:2)
Then you start looking at bus speeds... 4k 120hz is a LOT of data.
Not really, 4k 4:2:2 @ 120Hz is only ~16Gbps, which requires only 2 PCIe 3.0 lanes (8GT/s per lane).
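As a rough sanity check of that figure (this counts only raw pixel data; a real display link also carries blanking intervals and encoding overhead, so actual link budgets run somewhat higher):

```python
def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate, ignoring blanking and encoding overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K @ 120 Hz with 4:2:2 chroma subsampling at 8 bits per component
# averages 16 bits per pixel (8 luma + 8 shared chroma).
print(round(video_bandwidth_gbps(3840, 2160, 120, 16), 1))  # ~15.9 Gbps
```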
Re: (Score:2)
I don't care about 120Hz for 4K resolution, but 140Hz and 120Hz for 1440p and 1080p for 3D (did you not read what I posted?) Either Displayport or dual link DVI, or HDMI 2.0, all of which allow 4K at 60Hz.
Re: (Score:2)
They will probably be available in a year or two. We moved from hackish 30 Hz split-input panels to native 60 Hz single-input panels in about a year. However, anything beyond 60 Hz is pretty much useless except for bragging rights, as you can't see it anyway. Broadcast TV and movies are shot at 29.97 and 24 Hz, respectively. The lack of benefit of higher refresh rates is especially true on a display that is capable of displaying static images, like an LCD.
Re: (Score:2)
120Hz is great for 3D because then you get 60Hz for both the right and left eye. Remember for a real desired refresh rate in 3D, you need to double the framerate because most screens use shuttered glasses rather than prismatic displays.
Re: (Score:2)
There are multiple medical studies that prove that the ability of the human eye to discriminate between images over approximately 30 Hz is limited and discrimination over 60 Hz just doesn't happen. There is a lot of evidence in the environment to back that up. Movies recorded on film are shot at 24 Hz (fps). Most digital video is 29.97 Hz. The frequency of AC electricity was set at 50-60 Hz due to that being more than enough to cause a tungsten filament to appear to be constantly glowing to the human eye i
Re: (Score:2)
By your logic, so are PCs, and even smartphones, cars, aircraft - so let's not ask for anything better today because someday generations away something will come along to make all of it totally obsolete. Let's not bother with space exploration with current tech either because someday teleportation or wormholes will be practical.
Or, we can demand better products today and enjoy them in the meantime.
Samsung UD590 is nice... (Score:5, Interesting)
I got it recently, and it's got 4k at 60FPS, in a 28" size - great for programming.
Review link [ubergizmo.com]
Just to try it, I was able to get all the single-player PC Ultima games running in about half the screen real estate:
ALL THE ULTIMAS [imgur.com]
It's around $600 when it's on sale, so I think it just about matches the model slashvertised here.
Ryan Fenton
Re: (Score:2)
I bought one of these for my office:
http://www.amazon.com/Seiki-SE... [amazon.com]
The pixel density is perfect for the opposite side of my desk, and since it's the office, being restricted to 30Hz doesn't hurt anything. At $400 it's a great place to get started with 4k computing. I agree that programming in 4k changes your entire way of working.
Re: (Score:2)
ALL the ULTIMAS.... love it
TN Panels (Score:2)
Just a note to others, this one and the Slashvertized one are TN panels. TFA talks about how it's not too bad of a TN panel, but it's not IPS (which some of us buyers are waiting for).
Re: (Score:2)
Say what? He's either running Windows Vista or Windows 7.
Re: (Score:2)
For god's sake why? My question is on the level. Why would you want to emulate some mickey mouse multi-monitor setup when you have a superb single monitor with one homogeneous pixel address space? Nothing out of the ordinary is required of the window manager. You simply size and place your windows completely as desired. As many windows as you want, each of the size you want.
I have seen your wording expressed several times
fullscreen? (Score:2)
The only advantage I can see would be that you could "fullscreen" the app within the virtual monitor area.
Some apps (media players, for example) behave differently when run fullscreen vs within a window. The "virtual monitor" window manager would let you trick the app into thinking it was running fullscreen.
OSX is not ready (Score:4, Interesting)
What surprised me is the poor OSX support for 4K. Windows can scale everything (although I had to manually add a display mode to the NVidia advanced settings to even get 1080p!?), but OSX cannot. I am running it on a recent MacBook Pro 15" with discrete graphics.
The problem is that you cannot choose to run at a lower resolution. Display preferences lists ONLY the native resolution. Using QuickRes (a 3rd-party add-on for more resolution choices), none of the lower resolutions work, at least until you go all the way down to 1080p.
In particular, you cannot use HiDPI on an external display (where the application sees a lower resolution, but the OS renders fonts at full resolution). (No, it does not help to enable HiDPI with Quartz Debug, nor with the QuickRes "Enable HiDPI" button). So the menus and all applications are absolutely tiny.
You could adjust the size of everything on a per-application basis, but then they won't look right when you're working on the laptop display, unless you use something like QuickRes to run the laptop display at its native resolution. I guess I will try that for a few days. So I mainly use my older, power-hungry 2560x1600 30" displays.
If I could just select the highest of the HiDPI resolutions available for the laptop display in the System Preferences, and mirror *exactly* that to this display, I would be a happy camper. You can't do that.
I understand an upcoming release will improve support with HiDPI on external displays. But as it stands, I could not recommend a 4K display for a Mac - or a Mac for a 4K display.
Re:OSX is not ready (Score:5, Informative)
I thought that 10.9.3 addressed this (Not quite two weeks old). Might be time for you to try again.
Re: (Score:2)
Not sure if it depends on the model (artificially) - but someone here is showing it as working:
https://discussions.apple.com/... [apple.com]
Old Bugger (Score:2)
Maybe I'm getting old, but a 24" monitor running 1080p about 40cm from my face seems pretty damn good. About as good as I will ever need. My eyesight is not likely to improve, and despite the fact it is pretty good for my age, I don't really see any gains to be had from doubling the resolution of my monitors (x3).
I'm the sort of guy who buys the 42" TV because...he knows he can just fucking sit a few feet closer to it if he wants the pixels and screen to appear larger!
The day I need a 4k monitor for progra
I have an application for one... (Score:2)
I want a 4K 40" OLED display for photo work. This would be something that would come a lot closer to the capability of the sensor in my camera than anything than I can buy now.
In addition, the high resolution would be great for displaying large amounts of text, that is, for programming.
28" with a crappy color gamut and a ridiculous dot pitch isn't close to what I want.
joke is on the consumer (Score:2)
While 4k is technologically cool, the joke again is on the consumer. As in Blu-ray "Mastered in 4K", which isn't really 4K but "re-mastered" and downscaled to 2K. IIRC they are still having difficulties getting true 4K onto disc? Then there are the few US streaming services (available in US only TM). Sounds like another hype from the content providers to make even more money, unfortunately.
Bit too small still. (Score:2)
At that res I really want at least a 32~37 inch screen.
only if we get better OS support for hi res (Score:2)
Assuming current OS behaviour, I'd be happier with a ~42-46" 4K screen. Just a bit higher pixel density than my current 24" 1920x1200 monitor, but almost 4x the number of pixels.
With current OS behaviour, a 32" 4K screen would have teeny-tiny text in some places.
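The rough math behind that preference (assuming 16:9 4K panels and the poster's 24" 16:10 monitor as the baseline):

```python
import math

def ppi(diagonal_in, w_px, h_px):
    """Pixels per inch along the diagonal."""
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(24, 1920, 1200)))  # 24" 16:10 monitor: ~94 ppi
print(round(ppi(42, 3840, 2160)))  # 42" 4K: ~105 ppi, only slightly denser
print(round(ppi(32, 3840, 2160)))  # 32" 4K: ~138 ppi, much tighter text
```

At ~105 ppi a 42" 4K screen renders unscaled UI elements about the same physical size as today's desktop monitors; at ~138 ppi a 32" one shrinks them noticeably without OS scaling.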
Heavily compressed 4K (Score:2)
will look awful on cable...
They're already compressing the hell out of regular 1080P...
I would much prefer to have uncompressed 1080P than compressed 4K.
But is the end near for 1366x768 laptop garbage (Score:4, Insightful)
I really don't understand how retailers and manufacturers are still getting away with selling $700-800 laptops with those awful 1366x768 or 720p displays. A few times I was looking for a basic laptop with entry level CPU and memory, and a 14-15 inch screen with nice resolution at an affordable price at Fry's or BestBuy. But the sales people always direct me at loaded models that cost +1000 to get that screen.
Re: (Score:3)
This! I'm in the market for a new laptop at the moment. I don't have many requirements, but 1080p or better is one of them. You can get a really high-res external display using an iPad screen for under $100. Why can't I get that option on a laptop?
Re: (Score:2)
Click here [slashdot.org]. Don't delete the cookie. Was that really THAT hard to do?
Re:Sweet, now we just need to wait for OS Support (Score:4, Informative)
I believe Apple just pushed a patch to mavericks with better 4k support. http://au.ibtimes.com/articles... [ibtimes.com]
Re: (Score:2)
What the heck is wrong with a normal Intel HD3000 or higher? Plenty of motherboards with both DP and HDMI are available.
Re: (Score:3)
Both Linux and Windows 8.1 deal just fine with 4k.
Requires a decent graphics card with drivers that actually work, but other than that there's no problem.
Re: (Score:3)
Which if it follows the support for multiple monitors means that windows 14 and Ubuntu 24 should have good support for it. ETA for Mac is unknown.
I have a 4K monitor and Windows 8.1 handles it just fine. The only problem I have is that I have one 4K and one 1080p and Windows seems to have issues getting the DPI correct. 8.1 supports per-monitor DPI but it won't let you set the DPI manually for each monitor; it tries to figure it out and fails. It will either get the 1080p right, and the 4K will make me feel like I need to get my eyes examined, or the 4K looks perfect and the 1080p looks like I already had my eyes examined and was found legally blind
Re: (Score:2)
PCB layers in Allegro are basically 2-dimensional; you'd need more than just monitor support. Also, given a typical stackup for a PCB, I'm not sure the "depth", unless exaggerated, would be so helpful. A motherboard is so many inches wide, but a PCB layer is somewhere around what, 8 thousandths of an inch thick? Then if the fiber or prepreg is modelled it would appear to be "floating".
A tool from Ansys called SiWave does this (including allowing layers to be exaggerated), I didn't find it to be useful for much
Re: (Score:3)
Bah. You PC board wusses. Try doing physical design on a custom ASIC (note my sig).
More pixels definitely helps. I have been using a 30" 2560x1600 (Dell for about $1200), but more pixels for half the money seems like a great deal! The down side is less
Re: (Score:2)
Not sure about that. If you create a 70" 4k panel substrate, and part of it is defective, you can cut that down into a 2560 panel at a smaller screen size. The bigger the screen, the easier it is to hit 4k, thanks to lower pixel densities. There are only a few pixel densities being manufactured at any given time and they are cut to size for production.
Re: (Score:2)
My monitor is so black that I can't see any light at all (Dell IPS). Don't get me wrong, it's not black - at night, with all the lights off, it's clearly not completely black. However, since about 99.999% of my time in front of the computer is with some other form of illumination, or with some portion of the screen being lit (and fouling my night vision), I find it hard to get really worked up over "good" blacks that aren't "perfect."
Re: (Score:2)
Even Plasma "blacks" aren't truly "TV Off" blacks, but they're a good bit less luminous than any LCD technology I've seen thus far - and when watching tv in a darker room, really does make it look nicer.
Re: (Score:2)
I'm with you on the term "4K". I can't believe Slashdotters are complicit in this marketing nonsense. There's nothing "4,000" about it. We've been using lines as shorthand for display resolution for quite some time now, and it makes zero sense to switch to columns now, and it isn't even 4K columns.
Resolution: 3840x2160
Pixels: 8.3M
1280x720 is 720p/HD
1920x1080 is 1080p/FHD (Full HD)
2160x1440 is QHD (Quad HD)
Therefore 3840x2160 should be 2160p, QFHD, UHD, or 2K. 4K is utter nonsense. Calling it "Mega 8.3"
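For what it's worth, the pixel arithmetic behind those names checks out:

```python
# 3840x2160 ("UHD"/"2160p") versus 1920x1080 ("1080p"/FHD)
w, h = 3840, 2160
total = w * h
print(total)                   # 8294400 pixels, the "8.3M" figure
print(total / (1920 * 1080))   # 4.0: exactly four times 1080p, hence "Quad FHD"
```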
Re: (Score:2)
Correction, QHD is 2560x1440.