
LG To Show Off New 55-Inch 8K Display at CES

MojoKid writes One of the most in-your-face buzzwords of the past year has been "4K," and there's little doubt that the forthcoming CES show in early January will bring it back in full force. As it stands today, 4K really isn't that rare, or expensive; you can even get 4K PC monitors at an attractive price. One issue does remain, however: a lack of 4K content. Things are beginning to improve, but it's still slow going. Given that, you might imagine that display vendors would hold off on pushing the resolution envelope further. But you just can't stop hardware vendors. Earlier this year, both Apple and Dell unveiled "5K" displays that nearly doubled the pixel count of 4K displays. 4K already brutalizes top-end graphics cards and lacks widely available video content, and yet here we are looking at the prospect of 5K. Many jaws dropped when 4K was first announced, and likewise with 5K. Now? Well, yes, 8K is on its way, and we have LG to thank for that. At CES, the company will be showing off a 55-inch display that boasts a staggering 33 million pixels, derived from a resolution of 7680x4320. That's far more pixels than 4K, which suggests this whole "K" system of measuring resolutions is a little odd: on paper you might imagine that 8K has twice the pixels of 4K, but because both dimensions double, it's actually 4x.
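
The "K" arithmetic in the summary is easy to verify; here is a minimal, illustrative Python sketch (the resolutions are the common marketing definitions, not from the article):

    # Pixel counts for common "K" resolutions. "K" tracks the rough
    # horizontal pixel count, so doubling the "K" number doubles BOTH
    # dimensions and therefore quadruples the total pixel count.
    RESOLUTIONS = {
        "1080p": (1920, 1080),
        "4K":    (3840, 2160),
        "5K":    (5120, 2880),
        "8K":    (7680, 4320),
    }

    four_k = 3840 * 2160
    for name, (w, h) in RESOLUTIONS.items():
        px = w * h
        print(f"{name:5} {w}x{h} = {px / 1e6:4.1f} MP ({px / four_k:.2f}x 4K)")

    # Output: 5K works out to ~1.78x the pixels of 4K ("nearly doubled"),
    # and 8K is ~33.2 MP, exactly 4x the pixels of 4K.
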
  • by 91degrees ( 207121 ) on Friday December 12, 2014 @09:13AM (#48580495) Journal
    That's useful for technical matters like bandwidth calculation but the user cares about clarity.

    8K can display a line half the thickness of 4K. That's what matters.
    • by fuzzyfuzzyfungus ( 1223518 ) on Friday December 12, 2014 @10:01AM (#48580701) Journal
      Arguably, that depends on how large the display is. In the tablet and smartphone market (and, at least to an extent, smaller TVs and monitors), higher resolutions are mostly about aesthetic improvements. Between the limits of human dexterity on input devices and the limits of human vision, you can't use the extra pixels to actually make UI elements smaller (an element still has to be a certain minimum size for the user to see it, click it, or touch it, regardless of how many pixels fit in that physical area); but you can use them to make things look buttery smooth and more or less eliminate visible jaggies.

      In larger panels, there is still a good deal of room (at least for users with decent eyes) to use additional pixels to add additional effective 'space' to a monitor of the same size. No longer being able to see the nasty huge pixels that result when some terrible person smears 1920x1080 over a 27-inch screen is nice (seriously, guys, WTF is up with the increasing sizes of 1920x1080 monitors? It used to be you could get 19.5/20-inch ones quite easily; now the market is rotten with 22 inch and higher); but it's the increase in work room that really makes the difference.
      • What's with the decreasing size of 1080p monitors? I've seen a 15.6" 1080p laptop, and it is fairly hard to use with Windows 7's file manager. I have not tried scaling yet, as the owner is the kind to jump at me if I do or change anything on that laptop.
        That is annoying for young (enough) users, especially as the thing is used with a touchpad.
          27" 1080p has "too big" pixels, but it's what users want. I had a monitor with 20" visible running at 1280x960 and it was pretty sweet.

    • by AmiMoJo ( 196126 ) *

      8K can display a line half the thickness of 4K. That's what matters.

      It's more complex for 8k video, although I agree that's what counts for mostly static computer displays. Details are thin on the ground, but I doubt this monitor supports true 8k video.

      8k video was pioneered by NHK in Japan, and I was lucky enough to see a demo of it a few years ago. As well as 8k resolution, the frame rate has been increased to 120Hz native, the colour depth expanded to 12 bits per channel, and the colour gamut is much wider than either HDTV or current digital cinema. The result is incredibly lifelike.

      • Why not? HDMI can't handle the bandwidth, but DisplayPort supports 8k at 60Hz and 24bpp - any problems with video will be in finding a video card to drive it and content that can use it. Though as with watching DVDs at 1080p, good upscaling should make for dramatic improvements with existing content.

        • by AmiMoJo ( 196126 ) *

          So half the frame rate and only 8 bits per channel instead of 12... In other words DisplayPort doesn't support true 8k video yet.

          • "8k" refers to resolution only. I could say whatever magical system you're thinking of doesn't support "true 8k video" because it doesn't support 240 Hz because that's what you need for 3D.

          • by Immerman ( 2627577 ) on Friday December 12, 2014 @01:57PM (#48583539)

            You do realize HDMI maxes out at 60Hz as well, right? All that "120Hz" bullshit is just marketers selling you crappy post-processing filters in the TV. And while HDMI did technically upgrade from a maximum of 24bpp to 48bpp in 2006, there's essentially zero content that actually uses that capacity, and very few people can clearly distinguish the additional slight variations in color anyway, except in the most contrived tests. Compression artifacts are *far* more visible, and going to 48bpp tends to make those much worse.

    • That's useful for technical matters like bandwidth calculation but the user cares about clarity. 8K can display a line half the thickness of 4K. That's what matters.

      When we're talking about this level of resolution, perhaps the upper limits of human eyesight are what truly matter.

      Of course, that assumes that consumers use common sense when purchasing TVs the size of drywall sheets. Highly unlikely, especially during Superbowl season.

  • by fuzzyfuzzyfungus ( 1223518 ) on Friday December 12, 2014 @09:15AM (#48580501) Journal
    Perhaps if you are buying your LCDs just to watch TV the 'content' argument is a serious problem; but c'mon, essentially all modern 'TVs' are just big monitors with built-in ATSC/DVB-T tuners and severely questionable EDID data.

    Especially when the resolution is an integer multiple of what the existing 'content' was designed for, and a PC with a suitably punchy GPU can drive a seriously enormous screen, who cares? (The required punch actually isn't much these days unless you are gaming, where things can admittedly get damned expensive at high resolutions; this isn't the bad old days when you had to buy some freaky Matrox unit to get a VGA out that didn't turn into blurryvision when it met a real monitor.)

    Quit carping about how Sony hasn't yet graced us with Premium Ultra HD Content on Blu-Ray 2.0 and embrace the fact that you can buy a terrifying pixel-battery of your very own at surprisingly attractive prices. There are still a few kinks to work out at very high resolutions that currently available DisplayPort or HDMI standards can't drive properly; but that's really the only remaining issue.
    • by AmiMoJo ( 196126 ) *

      TVs are very different to computer monitors in one important aspect: image processing. Computer monitors go for accuracy, but even basic TVs do a fair bit of processing to make the image look good. Some of it is to make up for limitations of the TV itself, like enhancing motion clarity, but a lot of it is to make up for limitations of the source material. Broadcast HD is actually fairly crap if you watch it on a normal computer monitor without any processing.

      Many TVs have a "game mode" which disables processing to get the latency down. Try switching to it with an ordinary broadcast HD feed to see how awful it looks with minimal enhancement. Even game mode doesn't disable everything though.

      • >Even game mode doesn't disable everything though.

        Don't I know it, and the "enhancement" wreaks havoc on a pixel-perfect input.

        "PC mode" though seems to (usually) be a substantial further improvement, but it isn't always obvious how to enable it. For example on my older Samsung it is engaged by changing the video source name to "PC" - despite the fact that nothing in the documentation makes any suggestion that the name is anything other than a user convenience - and it's changed in a completely differen

      • TVs are very different to computer monitors in one important aspect: image processing. Computer monitors go for accuracy, but even basic TVs do a fair bit of processing to make the image look good. Some of it is to make up for limitations of the TV itself, like enhancing motion clarity, but a lot of it is to make up for limitations of the source material. Broadcast HD is actually fairly crap if you watch it on a normal computer monitor without any processing.

        Many TVs have a "game mode" which disables processing to get the latency down. Try switching to it with an ordinary broadcast HD feed to see how awful it looks with minimal enhancement. Even game mode doesn't disable everything though.

        Horse shit.
        The first thing I do on any TV is disable every single fucking enhancement.
        They make games, pc output, broadcast, cable, dvd, bluray, even fucking vhs look like absolute trash.

    • Actually DisplayPort 1.3 (released this past September) supports 8K @ 60Hz, if only at 4:2:0 subsampling. Now, finding actual hardware that supports 1.3 might be a challenge, but the standard itself is available.
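
The arithmetic behind this back-and-forth is straightforward to check. A rough sketch, assuming the published effective payload rates (roughly 14.4 Gbit/s for HDMI 2.0 and 25.92 Gbit/s for DisplayPort 1.3 after 8b/10b line coding) and ignoring blanking overhead, so real requirements run somewhat higher:

    def gbit_per_s(w, h, hz, bits_per_pixel):
        """Raw uncompressed video data rate in Gbit/s (no blanking)."""
        return w * h * hz * bits_per_pixel / 1e9

    # 8K at 60 Hz: full 24-bit RGB vs. 4:2:0 subsampling (12 bpp average).
    print(gbit_per_s(7680, 4320, 60, 24))   # ~47.8 -- over DP 1.3's ~25.9
    print(gbit_per_s(7680, 4320, 60, 12))   # ~23.9 -- fits DP 1.3 with 4:2:0
    # NHK-style 8K video: 120 Hz at 12 bits per channel (36 bpp).
    print(gbit_per_s(7680, 4320, 120, 36))  # ~143 -- beyond any current link

This is consistent with both sides of the thread: 8K @ 60Hz only fits DisplayPort 1.3 with 4:2:0 subsampling, and full 120Hz, 12-bit 8K video is far out of reach.
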

  • At 55" and average viewing distances of 8ft you're not going to notice all the detail of even 1080p. [carltonbale.com] You literally need to be sat a couple of feet away to get the full benefit of 4K on a 55" display.
    • by MrL0G1C ( 867445 ) on Friday December 12, 2014 @09:39AM (#48580587) Journal

      THAT CHART IS WRONG.

      Seriously, I can easily see jaggies where it says I shouldn't be able to, IT IS WRONG by a large factor.

      • Quite so - such numbers typically refer to the limit at which the typical human eye can resolve individual pixels, but it's well known that the eye can still perceive visual texture (such as jaggies) at much higher resolution.

      • by bondsbw ( 888959 )

        Not sure about your eyes, but the chart appears to be pretty close to the values I get when calculating whether a resolution is better than "retina" [isthisretina.com] for most people.

        Of course, video compression can alter your results. And sub-pixel motion can cause moiré patterns [wikipedia.org] that are quite noticeable even on retina displays.

        • by MrL0G1C ( 867445 )

          "better than "retina" [isthisretina.com]"
          Supposedly my screen is like retina at 6 feet.

          "the limit of the human retina to differentiate the pixels" This is what needs clarifying.

          I can clearly see a line of white pixels between lines of black pixels at 8ft.

          Most of all, at 7 feet some web fonts look atrocious to me; I block web fonts on a site-by-site basis because of this. The font letters I am looking at as I type this are constructed of lines 1 pixel wide, and if I turn on 'font smoothing' that looks really bad at a distance beyond t

      • Perhaps some of it could be due to videos vs. static images. For a resolution (or angular pixel density if you prefer) with clearly visible jaggies on, say, text or line art, I might not notice anything if it's a good video. Of course, if you're watching the news or something then you might have static text anyway...
    • Some people don't put their TV in the living room. I have my TV mounted relatively low and I sit only a few feet away from it. Mine is only 42", but I can definitely see individual pixels on non-antialiased text or other sharp graphics. It's a dedicated home theater area, not a general-purpose room, and I'm not the only one. I would still like something larger for immersion, but I will unfortunately be able to see even more pixel edges at that point.

    • by MrL0G1C ( 867445 )

      PS, a simple test of whether an upgrade to 4k would be beneficial to your enjoyment is the 'sharpness' setting. If you can tell the difference at normal viewing distance when altering the god awful sharpness thing then you would notice the difference between 1080p and 4k (I turn sharpness off, it is an abomination of an 'enhancement' that doesn't belong on HD screens.)

      Or draw a couple of diagonal lines in a graphics editor - one with anti-aliasing and one without - and if you can tell which line is which at your normal viewing distance, then an upgrade would benefit you. (A quick way to generate such a test image is sketched below.)
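
For anyone who wants to run that test, a sketch using the Pillow imaging library; its ImageDraw module has no native anti-aliasing, so the smooth line is drawn oversized and downsampled, which is a common workaround:

    from PIL import Image, ImageDraw  # pip install Pillow

    W, H, S = 400, 200, 8  # canvas size and supersampling factor

    # Left half: a plain aliased diagonal line, drawn at target resolution.
    img = Image.new("RGB", (W, H), "white")
    ImageDraw.Draw(img).line((20, H - 20, W // 2 - 20, 20), fill="black")

    # Right half: the same line drawn 8x oversized, then Lanczos-downsampled,
    # which approximates proper anti-aliasing.
    big = Image.new("RGB", (W * S, H * S), "white")
    ImageDraw.Draw(big).line(
        ((W // 2 + 20) * S, (H - 20) * S, (W - 20) * S, 20 * S),
        fill="black", width=S)
    img.paste(big.resize((W, H), Image.LANCZOS).crop((W // 2, 0, W, H)),
              (W // 2, 0))
    img.save("aa_test.png")  # view from couch distance and compare the halves
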

    • That chart must be for the middle-aged or older not wearing glasses or refusing to get them. Or maybe it's using Comcast's 1080p that's compressed to hell and back.

    • by David_Hart ( 1184661 ) on Friday December 12, 2014 @11:16AM (#48581359)

      At 55" and average viewing distances of 8ft you're not going to notice all the detail of even 1080p. [carltonbale.com] You literally need to be sat a couple of feet away to get the full benefit of 4K on a 55" display.

      The people who are replying that the viewing distance charts are wrong need to understand what the recommendations apply to.

      First, they apply to the average person, whoever that may be. Since we all have slightly different eyesight, there are people who will see jaggies at the recommended range and people who will not.

      Secondly, the vast majority of the distance recommendations refer to televisions and video, not computer monitors and text or still images. Computer monitors tend to have more precise pixel color and lighting control which makes them sharper but also makes it easier to see jaggies.

      The point is that the charts were developed for TVs playing video and they tend to be accurate for this usage. Any application beyond that is pretty much out of scope.

      • Also - they typically refer to the distance at which the eye can resolve individual pixels, but it's well understood that it can still perceive visual "texture" at much smaller angular resolution.
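
For anyone wanting to check the charts against their own setup, the underlying math is simple trigonometry. A minimal sketch assuming the usual criterion that a pixel stops being resolvable once it subtends less than one arcminute of visual angle (the 20/20 benchmark, which, as the replies above note, ignores hyperacuity and better-than-average vision):

    import math

    def max_resolvable_distance_ft(diagonal_in, horiz_px, aspect=16 / 9):
        """Distance at which one pixel subtends 1 arcminute of visual angle."""
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)
        pixel_in = width_in / horiz_px
        return pixel_in / math.tan(math.radians(1 / 60)) / 12

    for name, px in (("1080p", 1920), ("4K", 3840), ("8K", 7680)):
        print(f'{name} on a 55" panel: pixels resolvable inside '
              f"{max_resolvable_distance_ft(55, px):.1f} ft")
    # 1080p: ~7.2 ft, 4K: ~3.6 ft, 8K: ~1.8 ft -- matching the usual charts'
    # "8 ft for 1080p, a couple of feet for 4K" figures for a 55" screen.
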

  • Maybe this will drive some faster video cards. I run 3 30" monitors (7680x1600), and while 2D and work productivity is no problem (and, believe me, if you have the means I highly recommend picking them up), 3D surround gaming, even with SLI'd current-generation cards, is a challenge.

    What's even more impressive is how fast the 4K panels are dropping in price. Manufacturing FTW.

    • You will pay for the privilege (and your ears won't thank you); but is the scaling of SLI/Crossfire good enough to save you? A quick look shows that you can get motherboards with up to 7 GPU slots without recourse to any terribly specialist vendors (they aren't cheap; but they are perfectly normal motherboard brands that you could have from Newegg by Monday, not some specialist display-wall vendor who might get you a quote in the same amount of time), so you can throw a lot of GPUs at the problem; but that on
  • I have a 46-inch 1080p screen; at normal viewing distance I can make out jaggies, and I hate font smoothing because it looks blurry. I've seen that chart that says at what distances different native resolutions are effectively discernible and IT IS WRONG by 50% - I can see the difference clearly where the chart says I shouldn't be able to.

    • by mwvdlee ( 775178 )

      Any other super-human abilities you feel like sharing twice?

      • by MrL0G1C ( 867445 )

        I wasn't aware that the ability to see was super-human.

      • Any other super-human abilities you feel like sharing twice?

        My superpower is that I can tell what all the people on the TV are going to say before they say it.

        • Your superpower is subtitles?
          • Actually, I have a friend who is partially deaf and they leave the subtitles on all the time. His kids are the best readers in their grades at school by a good margin thanks to always having those subtitles up.
          • in his head .. oOOooOoOoOoOoO

        • I've had that problem - try watching something other than your own home videos, that will usually clear it up, though you may have to avoid Hollywood blockbusters as well.

    • by Kjella ( 173770 )

      Well, one shortcoming of that chart is that it assumes 20/20 vision. That's the threshold for "normal" sight that doesn't need glasses, but many people have better than that - 20/16 at least is not unusual - or can see better than that once they wear glasses/contact lenses. I think the most extreme cases are something like 20/8, meaning they can see from 20 feet what a normal person would have to be at 8 feet to see. I think it depends on source material and compression though, I've got a 28" UHD monitor (38

      • by fnj ( 64210 )

        Yeah, plenty can see better than 20/20, but a whole lot can't even see 20/20 with correction. I would guess maybe 1/3 the population is significantly below 20/20.

    • by AmiMoJo ( 196126 ) *

      Such claims seem to be based on a flawed understanding of how the human eye works. It's the same reason the iPhone "retina" display was claimed to have pixels too small for human vision to discern, even though it clearly looks worse than the newer "retina" screens, which themselves look worse than QHD phone screens.

      The idea was that if you put thin parallel lines next to each other at some point they blur into one because the human eye can't pick the individual ones out. Unfortunately the human

  • About time (Score:4, Funny)

    by Overzeetop ( 214511 ) on Friday December 12, 2014 @09:32AM (#48580553) Journal

    5.5" phone screens are at 2560x1440, with 4k on the way. 8k on a monitor...what's the hold up?

    Phones seem perfectly able to light the screen and drive the pixels at less than 4W TDP. Seems odd that 8k is such a large challenge given volume, mass and power budgets 20-100x that of a phone.

    • by Higaran ( 835598 )
      It is easy to get high resolution on a small screen. The problem with larger screens is that you have a much higher rate of defective screens as you increase the size. Once that happens your costs go up, because you're basically throwing a bunch of them away.
      • Not throwing them away. Last I knew, if a panel has a defective quadrant, you get 3 quarter-size panels out of it after it's cut. It doesn't get put into a TV until after QA. And this means that the 8K display is just a perfectly defect-free panel that was probably intended to be 3 or 4 4K panels. Large 4K TVs are part of the natural progression of ramping up production of smaller 1080p TVs. And 8K is coming from smaller 4K production ramping up.

        • by AmiMoJo ( 196126 ) *

          It's not that simple though. A 50" 8k display only works out as 4x 25" 4k displays, and there isn't a huge market for 25" 4k displays at the moment. On top of that, 4k yields are not particularly brilliant either, because while there is a market for 27" 4k monitors, there isn't much of a market for 13.5" 1080p displays.

          • there isn't a huge market for 25" 4k displays at the moment either

            6" phones and tablets

            there isn't much of a market for 13.5" 1080P displays.

            That's becoming a fairly standard laptop screen size.

      • Re:About time (Score:4, Insightful)

        by Jaime2 ( 824950 ) on Friday December 12, 2014 @10:52AM (#48581121)
        I'd believe you, but computer monitor resolution was on an upward trajectory until 2005, when consumer HD took off. Since then, it's been nearly stagnant for nine years. This is simply the progress we should have seen a long time ago.
        • However, during that time they let the technology stagnate, so the processes to make better hi-res screens weren't in the R&D pipeline.

    • I am not a subject matter expert; but the swift divergence of typical resolutions on small-screen devices from typical resolutions on larger monitors makes me suspect that manufacturing technique has improved substantially at fabricating very small pixels, but less dramatically at keeping flawed pixels from cropping up often enough to hurt yields of large, high-resolution screens.

      I mean no disrespect to the (likely substantial) engineering effort and cleverness that goes into cramming 2560x1440 into some
    • by AmiMoJo ( 196126 ) *

      The yields on 5.5" screens are much better. If you have a 1m^2 substrate you can either make it into a single large monitor or many smaller 5.5" displays. A single serious defect writes off your entire large monitor, but only one of your many 5.5" panels.

      In addition, because a 60" 8k monitor costs a very large amount of money people expect perfection from it. Accurate colours, even backlighting, no visible dead pixels. That reduces your yields even further.

    • I want 8K phone displays. Entirely so we can use them as VR displays.

    • by sribe ( 304414 )

      5.5" phone screens are at 2560x1440

      Or 1280x1440 if you don't believe that an RGBG quad should be counted as 2 pixels just because it's twice as wide as it is tall ;-)

  • Do we need this? Is there really a sizable market for people who must have the latest even if the current stuff is good enough?

    • I use a 40" 1080p monitor - I promise you it's not remotely "good enough", except in the sense that I wanted a big-screen monitor and that was the best available. 4K will at least bring its resolution in line with my almost adequate 20" monitor. 8K will be the first actual improvement in "standard" screen resolution in what, a decade or so?

      Oh... you're talking about TV. Yeah, I don't see much point there either - 1080p already lets me see way more skin imperfections and bad makeup jobs than I have any de

  • If you double the length and the width of a rectangle you get four times the area. There is nothing odd about it; quadratic (and cubic) relationships are very common. The mass of human beings typically follows a cubic relationship with their height. Urban sprawl distance and the area of a city follow a quadratic relationship. It is not odd. It is just math.
  • by codewarren ( 927270 ) on Friday December 12, 2014 @09:45AM (#48580613)

    You can even get 4K PC monitors for an attractive price

    Citation needed (...please!)

  • by mentil ( 1748130 ) on Friday December 12, 2014 @09:50AM (#48580639)

    8k won't be ready for anything any time soon. HDMI 2.0 doesn't even support 8k at 30Hz, and few TVs have DisplayPort. 4k Blu-rays are taking their time arriving to market, and 50GB arguably won't be enough for 8k without a codec upgrade, which would itself require a new disc player. What portion of existing Blu-ray players have old HDMI ports or processors that can't handle 4k content? It's not like 4k TVs are high-margin items anymore -- I saw a nice 50" one at Walmart for $699 a few weeks ago, and there were cheaper ones online. The price has hit rock bottom before there's even the demand for them. Unlike 4k cameras, there are only a couple of prototypes of 8k cameras, so almost all content will be rendered CG for a while.

    I'd read countless arguments on Slashdot that human eyes can't discern resolution higher than 1080p in a 50" TV over 10 feet or so, before I actually watched a demo 4k TV running 4k content, for about 15 minutes. If you have a 50" TV in your bedroom, 5 feet away from where you're sitting, you can definitely notice a huge improvement in detail. I stepped about 15 feet away and in most scenes it was still usually an obvious, substantial improvement over 1080p.
    An electronics retailer in Europe held a contest, setting a cordon that people had to stay behind, more than 10 feet away from two televisions, and were asked which was the 4k tv and which was the 1080p. 98% of people correctly guessed which was which. Maybe people asked others who cheated, but it suggests that "most people can't tell" is bullshit. I seem to recall when the Apple retina display claims first came out, a scientist mentioned that humans' actual acuity was about 50% better than what Apple was claiming. It's also worth noting that while a single still retina image may be at a certain DPI, there are psychovisual effects (like depth perception) that can improve the resolution inside the brain, beyond what one retina picks up at one time. The eyes also saccade all the time, which I seem to recall can be interpolated to improve resolution.

    • by Kjella ( 173770 )

      An electronics retailer in Europe held a contest, setting a cordon that people had to stay behind, more than 10 feet away from two televisions, and were asked which was the 4k tv and which was the 1080p. 98% of people correctly guessed which was which. Maybe people asked others who cheated, but it suggests that "most people can't tell" is bullshit.

      This electronics retailer wouldn't happen to be in the business of selling people expensive new 4k TV sets, by any chance? There are a lot of ways you could configure a 4K and a 1080p TV to get that result, like contrast and color, and Netflix 4K probably has as many compression artifacts as an upscaled BluRay. I have a UHD monitor for gaming and such, but TVs are way ahead of the content; I've no idea why 4K TVs are actually selling.

    • I guess the more expensive one was 4k?

    • A 50" 1080p TV has a dot pitch of approximately 0.58 mm. That's huge.

      My 27" 1440p monitor has a dot pitch of 0.23 mm. I can clearly see pixels jaggies 2' away. It's not capable of producing fonts smaller than 8 px without collapsing the whitespace in and between letters. I can clearly read an 8 px font on that display from 7' away.

      The pixels in the 50" TV would be discernible at 5'. I would have to be 18' away from that TV before I couldn't read an 8 px font on it. I would discern detail three times farther

    • Sure, not much content to justify it as a TV in the near term, but...

      DisplayPort 1.3 was released a few months ago, and supports 8K@60Hz.

      Upscaling to 1080 did wonderful things to DVDs for years before HD content became more than a fringe novelty.

      A 4K 40" monitor has barely the same resolution as a decade-old 20" HD monitor, and that was a step down from the high-def CRT monitors of 20 years ago, which had plenty of people were anxiously awaiting the final end of the clearly visible pixel which seemed to be

  • by gstoddart ( 321705 ) on Friday December 12, 2014 @09:51AM (#48580645) Homepage

    Earlier this year, both Apple and Dell unveiled "5K" displays that nearly doubled the number of pixels of 4K displays. 4K already brutalizes top-end graphics cards and lacks widely available video content, and yet here we are looking at the prospect of 5K.

    Fuck it, we're going to 5K ...

    I predict that this technology will be adopted for computers FAR before it is adopted for TV in any meaningful sense.

    Know why? Consumers got raped in the last HD format war. People bought gear which subsequently wasn't supported.

    I have no intention of lining the electronics industry's pockets by replacing my TV, my amp, and my DVD player. The stuff I own is relatively new, and works just fine.

    The reason 4K content is slow catching on is that consumers are all thinking "why the hell would I switch to yet another format?" I expect we'll see 5K, 6K, 8K, 10K ... and all before the vast majority of consumers give a damn.

    My view of 4K for TV is a big "I don't care, because it's expensive, pointless, and pretty removed from being a need".

    I won't be surprised if it flops.

  • Come on, Apple, when are we going to see our 5" 8K displays? Imagine how clear that will be!!

  • and then some. I do hope they bring this to displays in the mid-30" range. But I have to wonder if scaling issues are going to be a big concern.

  • Visual hyperacuity [sinauer.com] is one factor that often gets ignored in "how much resolution do you need" calculations. You'll see those "bumps" in nearly-flat diagonal lines much more readily than the simple calculations would suggest. Anti-aliasing everything tends to take care of that problem, but it's still pretty unusual to anti-alias everything. For example, does your system allow fractional-pixel cursor movements?

    • For example, does your system allow fractional-pixel cursor movements?

      Do any? That would be very nice. I'm just glad that it gets updated at 60Hz. For that matter, if you get your monitor to hit 120Hz, you'll see smoother movement just for the higher temporal resolution.

  • It doesn't seem to be in Windows 8.1, going from my experience on a Surface Pro 2 -- it's a nice display and very high resolution, but its scaling options leave a lot to be desired.

    I can only imagine the same phenomenon would be true on super-high-resolution screens; a lot of people seem to like 4k monitors, but it's hard to know what these would be like in day-to-day usage.

    Incredible pixel density is nice, but it seems (IMHO, anyway) that UIs and applications need to have a lot more flexibility about

    • Windows 7 introduced "Zoom" under the advanced display settings. It works well enough, essentially scaling everything (except improperly designed programs) perfectly. And a lot of the Windows UI elements scale well - most of Windows 8's new UI appears to be vector-based anyway. Similar to how Mac OS X handles "Retina" displays, but with more fine-grained control.

  • Only losers have 4K... 8K is the way to go!

    Sadly you can't get 4K content yet, although a 50 inch 8K display on my desk would be a wonderful thing for my work computer.

  • by tverbeek ( 457094 ) on Friday December 12, 2014 @11:17AM (#48581383) Homepage

    The problem isn't that people don't understand the difference between linear and area measurement scales (so 8K is four times the number of pixels of 4K), but that anyone lets these marketing drones get away with calling 7680 pixels "8K". 8K should be either 8192 in binary terms or 8000 in decimal terms.

  • I, for one, am looking forward to watching my DVDs with 10x10 pixels per pixel.
