
iPad 3 Confirmed To Have 2048x1536 Screen Resolution

Posted by timothy
from the small-package dept.
bonch writes "After months of reporting on photos of iPad 3 screen parts, MacRumors finally obtained one for themselves and examined it under a microscope, confirming that the new screens will have twice the linear resolution of the iPad 2, with a whopping 2048x1536 resolution. Hints of the new display's resolution were found in iBooks 2, which contains hi-DPI versions of its artwork. The iPad 3 is rumored to be launching in early March."
This discussion has been archived. No new comments can be posted.


  • 4:3 comes back! (Score:5, Insightful)

    by Bobtree (105901) on Friday February 17, 2012 @10:21PM (#39082069)

    I'm looking forward to desktop displays getting increased resolution and 4:3 aspect ratios back some day. It's mildly ridiculous that we'll have the mobile device market to thank for it.

    • Re:4:3 comes back! (Score:4, Insightful)

      by qxcv (2422318) on Friday February 17, 2012 @10:28PM (#39082127)

      Why is 4:3 such a useful aspect ratio? Just curious, because I tend to prefer wide-screen monitors that I can flip onto their sides or use in landscape orientation depending on what I'm doing, and it seems to me that the monitor market is going that way. I'd have thought that square-ish monitors tend to be less comfortable given that humans have a greater horizontal than vertical field of vision (I feel a bit boxed in when using 4:3 CRTs, but that may just be the low resolution).

      • Re:4:3 comes back! (Score:5, Insightful)

        by Shadow of Eternity (795165) on Friday February 17, 2012 @11:01PM (#39082389)

        Eleven years ago you could buy a 24" monitor that could do this resolution, and 21" monitors that did 1600x1200 were commonplace. Inch for inch a 4:3 monitor will have more usable space than an equivalent widescreen display, they got popular because companies figured out they were cheaper to make and gave more panels for a given investment. Marketing convinced people that instead of getting an inferior display with less usable space they were getting the Next Big Thing.

        I've been waiting for resolutions and refresh rates to catch up to what they were a decade ago ever since we made the switch to widescreen flatpanels.

        • Re: (Score:3, Interesting)

          by Lumpy (12016)

          It's why I have not upgraded my 5-year-old MacBook Pro. You can't get a 1920x1200 laptop screen anymore. WTF is that?

        • Re:4:3 comes back! (Score:5, Informative)

          by tlhIngan (30335) <<ten.frow> <ta> <todhsals>> on Saturday February 18, 2012 @03:24AM (#39083641)

          Eleven years ago you could buy a 24" monitor that could do this resolution, and 21" monitors that did 1600x1200 were commonplace. Inch for inch a 4:3 monitor will have more usable space than an equivalent widescreen display, they got popular because companies figured out they were cheaper to make and gave more panels for a given investment.

          I hated CRT monitors - they always got blurry when you ran them at super-high resolutions. Of course, I never bought the $2000 high end ones... (and having to run at 85Hz meant the monitor really only did 800x600).

          The real reason cheap screens are 720p or 1080p is because the processing electronics is trivially cheap. It's basically the same as a regular HDTV. And that gives you a VGA and DVI/HDMI input "for free". To do 1920x1200 requires a different video processing chip for the monitor, which costs a lot more money because of the limited market (one reason why a 24" 1080p is available for under $200, while a 24" 1920x1200 is $500+).

          Apple can do this because they're making these things by the millions, so they can buy in such huge quantities that high res stuff is cheap for them.

      • Re:4:3 comes back! (Score:5, Insightful)

        by Bobtree (105901) on Friday February 17, 2012 @11:43PM (#39082663)

        > Why is 4:3 such a useful aspect ratio?

        I don't know, but I agree with the question's implied premise (4:3's high utility).

        It's a good question and I wish I knew the answer to it. I couldn't find any historical reference as to why 4:3 was originally chosen for televisions (the details behind the NTSC format are brilliant, but that's a separate topic). I don't feel anything like "boxed in" when computing on a 21" 1600x1200 CRT, and I don't want to give up vertical resolution for a widescreen of the same size. Let's speculate.

        The closer the ratio is to square, the more usable area you have for the size of the device. If wider screens were better, why wouldn't we keep making them wider, why not 3:1 or 4:1 or 5:1 ratios? Maybe 4:3 is just a sweet spot for compromise between high area and our forward-facing binocular vision. It's a mistake to even call them wider than conventional displays, as aspect ratio is independent of physical size. Have laptops really gotten wider, or have they gotten shorter? I think wider ratios are actually mis-marketed short-screens, with their prevalence reflecting cost (smaller area) in pushing HDTV sales, and not quality.
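To put rough numbers on the area point (a quick sketch in Python; the 21" diagonal is just an example figure, not anything from the thread):

```python
import math

def screen_area(diagonal, w, h):
    """Area in square inches of a screen with the given diagonal and w:h aspect ratio."""
    # diagonal^2 = width^2 + height^2, with width/height fixed at w/h
    height = diagonal / math.hypot(w / h, 1.0)
    return height * (w / h) * height

for name, (w, h) in {"4:3": (4, 3), "16:10": (16, 10), "16:9": (16, 9)}.items():
    print(f"{name:>5}: {screen_area(21, w, h):.1f} sq in")
```

At a fixed 21" diagonal this gives roughly 212, 198, and 188 square inches respectively, so the squarer ratio really does buy usable area per marketed inch.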

        I know newspapers print in short columns for readability, as it's easier to keep your place with short lines than with very wide ones, and computer screens were dominated by text long before graphics. Books too are mainly tall rather than wide ratios. Wider aspects are preferred for landscapes and juxtapositions of people in films, but whatever we gain in video game FOV we're losing in visible detail under our feet (and performance is lost to render peripheral objects you barely see, at increasingly skewed projection angles, versus more sky and ground in a taller ratio, which are virtually free performance-wise).

        The bottom line is always usability. Do you really want to squeeze every vertical pixel out of an interface (browsers, for instance), to deal with displays that are just too short? I sure don't, and I don't care to move a physical setup around when resizing display elements is sufficient. It may even just be tribalism or convention, but I know I like it. Long live 4:3!

        • Re:4:3 comes back! (Score:5, Interesting)

          by macshit (157376) <miles.gnu@org> on Saturday February 18, 2012 @01:48AM (#39083319) Homepage

          > Why is 4:3 such a useful aspect ratio?

          I don't know, but I agree with the question's implied premise (4:3's high utility).

          It's a good question and I wish I knew the answer to it. I couldn't find any historical reference as to why 4:3 was originally chosen for televisions (the details behind the NTSC format are brilliant, but that's a separate topic).

          I suspect it was less because it was "optimal", and more because it was an acceptable compromise between a desirable aspect ratio and technical limitations. Remember, back then, with primitive CRTs, the closer the tube face was to a perfect circle, the easier it was to manufacture, and the most efficient rectangular shape was a square. But humans with their two eyes generally want something wider than it is tall (note movie aspect ratios, which were less constrained by technology). A 4:3 aspect ratio provides something which is close enough to a square to efficiently use the technology of the time, but wide enough to provide a somewhat comfortable shape for viewing.

          With non-CRT tech, and modern manufacturing technique, there's a lot more freedom to choose a shape which is good for viewing, so it makes sense there's a lot of experimentation with aspect ratios these days.

          Personally I love the "medium-wide" aspect ratios like 16:10 for my main hacking monitor; 4:3 feels constraining. Note that I tend to have multiple windows open (multiple editor windows, an editor and some terminal windows, etc) at the same time, and side-by-side windows are vastly preferable to vertically adjacent windows when the windows are tall (typically true of editor windows). A wide aspect ratio fits this usage pretty well. People whose main mode is the MS-style "one-app-window-always-maximized" may have different preferences.

          In the case of the ipad, of course, the main style does seem to be "one app visible", and they strongly want a shape which is viable when used either vertically or horizontally. Given those factors, 4:3 does seem a reasonable choice.

  • Confirmed by who? (Score:5, Insightful)

    by mkraft (200694) on Friday February 17, 2012 @10:22PM (#39082083)

    Apple sure as hell didn't confirm anything. So basically we have someone who looked at a screen that may or may not be for the iPad 3, under a microscope, and "counted the pixels".

    Again, Slashdot titles are redefining words in the English language.

    • Re:Confirmed by who? (Score:4, Interesting)

      by artor3 (1344997) on Friday February 17, 2012 @10:29PM (#39082137)

      Counting the pixels is a pretty good way to figure out how many there are. How else would you do it? The only matter in question is whether or not the screen they were looking at is actually going to be in the iPad 3. That seems likely to be the case, unless this is just some prototype screen that isn't going to go into any device.

  • by Jmanamj (1077749) on Friday February 17, 2012 @10:22PM (#39082089)

    Before the flames rise and Slashdot begins to slash the dots, I'd like to thank Apple for helping break the "HD = 1950x1080" fixation the market has. Hopefully monitor tech will get some advances soon.

    • Re: (Score:3, Insightful)

      I think operating systems have some work to do as well. Higher DPI often just means smaller widgets. Hopefully this makes its way to laptops soon.

      • Re: (Score:3, Informative)

        by Anonymous Coward

        Windows has supported changing the DPI (so widgets use more pixels) since Windows 3.1. Talk to the application developers.

    • by martin-boundary (547041) on Friday February 17, 2012 @10:38PM (#39082199)

      I'd like to thank Apple for helping break the "HD = 1950x1080" fixation

      You big meanie! For every extra pixel over 2106000, a young Chinese worker cries himself to sleep every night.

    • Ummmm (Score:5, Insightful)

      by Sycraft-fu (314770) on Friday February 17, 2012 @10:55PM (#39082367)

      There have long been higher res displays. However there's some serious limits to their usefulness, which is why they aren't widespread.

      One big one is that until recently OSes didn't have good resolution independence, and still to this day many apps don't. Windows Vista got top notch resolution scaling but if apps don't support it they can break badly, or just fail to scale.

      Another is video memory. More pixels = more VRAM, particularly when you talk 3D. Now this is not a big deal, we have lots, but it wasn't long ago that 256MB was considered "high end" and 64MB was common for cheaper stuff.

      Along those lines there is GPU power. If you are just fiddling with 2D stuff this isn't a big deal, but if you are pushing 3D, more pixels means more strain. Double the res in each direction and you need 4 times the ROP throughput to get the same framerate at a given detail level.
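The scaling is easy to make concrete (a trivial sketch in Python; 60 fps is just an example figure):

```python
def fill_rate(width, height, fps):
    """Pixels per second the GPU must render at a given resolution and framerate."""
    return width * height * fps

base = fill_rate(1024, 768, 60)     # iPad 1/2-class resolution
retina = fill_rate(2048, 1536, 60)  # doubled in each direction
print(retina / base)  # 4.0 -- 2x per axis means 4x the pixels per frame
```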

      Then there's interface bandwidth. Gets to be a bit of a trick to push lots of data through inexpensive connectors. Dual link DVI was the only way to go, and that capped out at not all that high of a rez. DP 1.2 and HDMI 1.4 solve this, but are quite new.
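The raw numbers behind the bandwidth problem are simple to estimate (a sketch in Python; this ignores blanking intervals and link encoding overhead, so real links need somewhat more):

```python
def video_bandwidth_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel-data rate in Gbit/s, ignoring blanking and encoding overhead."""
    return width * height * fps * bits_per_pixel / 1e9

print(video_bandwidth_gbps(1920, 1200, 60))  # ~3.3 Gbit/s of pixel data
print(video_bandwidth_gbps(2560, 1600, 60))  # ~5.9 Gbit/s -- beyond a single DVI link
```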

      Of course then to all that there is the cost. Pixels mean transistors and more transistors mean more cost. You can't just increase pixel density and expect pricing to be the same.

      So it is a situation that only now are all the pieces falling in to place. Only once you have an OS (and apps) that support it, a readily available interface that can push the data, a GPU that can produce the data and has the memory to hold it and costs are low enough to make it economically feasible does it make sense to start pushing it on a larger scale.

      However for all that, if you want higher rez displays you can have them. There are 2.5k 27" and 30" displays that aren't too bad price wise. You can have 4k displays too, but they are extremely expensive.

      • GPU power becomes less of a problem at extremely high resolutions. Upscaling actually looks decent if the output resolution is an integer multiple of the input resolution, or several times greater. Back in the CRT days it was standard practice to adjust your video resolution to the highest value that gave acceptable 3D performance.
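The integer-multiple point is visible in a toy nearest-neighbour upscaler (a sketch in Python, not anything from the thread): when the factor is a whole number, each source pixel becomes a crisp block instead of being resampled.

```python
def upscale_nearest(image, factor):
    """Nearest-neighbour upscale: each source pixel becomes a factor x factor block."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in image for _ in range(factor)]

for row in upscale_nearest([[1, 2], [3, 4]], 2):
    print(row)  # edges stay sharp: [1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], ...
```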
  • hmm (Score:3, Interesting)

    by buddyglass (925859) on Friday February 17, 2012 @10:23PM (#39082091)
    If they could get away with it, it seems like 1920x1080 would be ideal. That's a lot longer/skinnier (or shorter/wider) than 2048x1536, but still an incremental improvement over the iPad 2 resolution.
  • whoa (Score:2, Insightful)

    by amoeba1911 (978485)
    2048x1536? My 21" monitor isn't even that high resolution and I can barely see the pixels. You're trying to tell me a 10" ipad is going to have higher resolution than my 21" monitor? Seems like a waste, especially on an iPad.
    • Re:whoa (Score:5, Informative)

      by MobileTatsu-NJG (946591) on Friday February 17, 2012 @10:27PM (#39082119)

      Nobody with a smartphone using a 200+dpi display would agree with you.
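The densities in question are easy to compute (a sketch in Python; the 9.7" diagonal is the rumoured iPad panel size, and 1600x1200 at 21" stands in for the parent's desktop monitor as an assumption):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the screen diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2048, 1536, 9.7)))  # ~264 ppi for the rumoured iPad 3 panel
print(round(ppi(1600, 1200, 21)))   # ~95 ppi for a typical 21" 4:3 desktop monitor
```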

    • 2048x1536? My 21" monitor isn't even that high resolution and I can barely see the pixels. You're trying to tell me a 10" ipad is going to have higher resolution than my 21" monitor? Seems like a waste, especially on an iPad.

      "640x480 is more resolution than anyone will ever need." - amoeba1911

    • Re:whoa (Score:4, Interesting)

      by nomel (244635) <`turd' `at' `inorbit.com'> on Friday February 17, 2012 @10:44PM (#39082259) Homepage Journal

      You've obviously never played angry birds or plants vs zombies.

      I think the real beauty and usefulness will be in properly anti-aliased text. The desktop monitor I'm looking at has only a few useful font sizes for the capital letter "I": either one pixel wide, two pixels wide, or three pixels wide... anything in between is blurry. I would absolutely love to see a TrueType font that didn't look blurry and didn't require some barely tolerable sub-pixel tricks.

    • by Darinbob (1142669)

      You could put a big magnifying glass in front of it ala Brazil.

    • We've already been through this with the doubling of the iPhone to a "Retina Display". So we already know for sure it substantially improves the quality of the graphics. Your gut feel is irrelevant.

  • by Frogbert (589961) <frogbertNO@SPAMgmail.com> on Friday February 17, 2012 @10:36PM (#39082183)

    Unfortunately users at my company will still find a way to run them at 800x600

    • Re:Unfortunately (Score:5, Informative)

      by JDG1980 (2438906) on Saturday February 18, 2012 @12:31AM (#39082967)

      Unfortunately users at my company will still find a way to run them at 800x600

      You laugh, but this is actually a serious reason why we don't have high-DPI displays on the mainstream desktop.

      Not everyone has perfect 20/20 vision, or the same tolerance for small print. Many users already have problems reading text on existing displays when set to the default of 96 DPI. Unfortunately, the art of DPI scaling on mainstream OSes is still stuck in the dark ages. There are a LOT of poorly-written applications that assume 96 DPI and display badly broken output if anything else is set. Windows 7 is better than XP in its DPI scaling, but even so, it's far from perfect. Windows doesn't even support vector icons! The best you can do is to create a high-quality raster icon at 256x256 and hope it looks OK when downscaled.

      This is why so many users run a LCD monitor at less than the recommended resolution. The slight blurriness is better for them than crystal-clear text too small to read, or various graphical nastiness from broken DPI scaling. Just today, in fact, I dealt with such a situation at work. One of our librarians said that some icons in the library management software were appearing all-black. I'd seen this issue before and knew it was due to the software not supporting 120 DPI, which this librarian had set for easier reading. I tried a few different things to see if I could get it to work – I set the "Disable display scaling" option in Compatibility properties, and also tried XP-style DPI scaling as well as the native Windows 7-style scaling. None of this worked. Ultimately, the only fix was to switch back to 96 DPI and run the monitor at a non-native resolution.

      As long as this situation continues, monitor makers see no advantage in higher resolutions than 1080p, since so many users will just sacrifice that resolution for readability anyway.

      • Re:Unfortunately (Score:4, Informative)

        by shutdown -p now (807394) on Saturday February 18, 2012 @02:28AM (#39083465) Journal

        I'd seen this issue before and knew it was due to the software not supporting 120 DPI, which this librarian had set for easier reading. I tried a few different things to see if I could get it to work – I set the "Disable display scaling" option in Compatibility properties, and also tried XP-style DPI scaling as well as the native Windows 7-style scaling.

        What you should do is, find the developer of that app, and punch them in the face. Let me explain why.

        Windows could do "DPI scaling" for ages - I think it was there in Win95 already? definitely before XP, anyway. But, the way it did it, it was really just a global setting that all apps could read. Some Windows APIs respected it also - e.g. CreateDialog and friends, where you had to specify sizes of widgets in "dialog units", and said units would change according to DPI. VB6 also measured everything in "twips" rather than pixels, also DPI-aware. But many other APIs, even stock Win32 ones, dealt in physical pixels; and so did most apps in practice. At best you'd get correct scaling for stock Windows dialogs and Office...

        That was the way it all worked up until Vista. In Vista, the status quo was found to be too broken to maintain, and they've decided to break things a bit so that the defaults would be more palatable. So they've introduced a concept of DPI-aware app [microsoft.com] - meaning that its author would have to make an explicit API call to tell the OS that, yes, he knows what DPI is, and, yes, he can do proper vector scaling where possible.

        Obviously, none of the existing apps made that API call, in which case the OS assumed that they did not know how to properly render themselves at a DPI other than the default, and performed bitmap scaling on their top-level windows after they were rendered. The result is far from perfect, of course, since what you get are huge ugly upscaled pixels. But at least it was consistently ugly, and it actually made things bigger - which is kinda important for people with bad sight. The assumption was that, for unmaintained legacy apps, it's "good enough", and for maintained ones the authors would get complaints from their users, finally figure out the whole DPI thing, fix their apps, and opt out of bitmap scaling via the aforementioned API call.

        Also, when you enable "XP-style DPI scaling", you're basically just disabling bitmap scaling and preventing the OS from lying to the app to pretend that DPI is always at 96 - so even if app did not declare itself as DPI-aware, it would still see the real value and try to handle it the best it can. It's mainly there for old apps that were never updated for Vista, but which were written correctly to begin with.

        For the most part, the scheme works - as you note yourself, Win7 is much better than XP in that regard. Unfortunately, there's still no shortage of idiots who call SetProcessDPIAware [microsoft.com] (or set the equivalent in their app manifest) without actually being aware of what it means, and what their obligations are when they do it. From your description, it sounds like you've run into one of those cases.

        Now, since all this stuff that I've explained above is clearly spelled out in MSDN, and since DPI-aware is not the default setting even for new apps - you have to actually know how to enable it, which implies that you've read at least the summary of what the setting does on MSDN. So clearly, for any developers who did so and still managed to go away without understanding what they do, the only recourse is a face punch - since any attempt to gently educate was lost on them already.
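For reference, the manifest-based equivalent of that API call looks roughly like the fragment below. This is an illustrative sketch of the relevant piece of a Win32 application manifest as documented for Vista/7, not a complete manifest:

```xml
<asmv3:application xmlns:asmv3="urn:schemas-microsoft-com:asm.v3">
  <asmv3:windowsSettings
      xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">
    <!-- Declares the process DPI-aware: the OS will not bitmap-scale its
         windows, and the app sees the real DPI instead of a faked 96. -->
    <dpiAware>true</dpiAware>
  </asmv3:windowsSettings>
</asmv3:application>
```

Setting this carries exactly the obligation described above: the app must then scale its own UI correctly at non-default DPI.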

  • by perpenso (1613749) on Friday February 17, 2012 @10:39PM (#39082205)
    This is not surprising at all. Most iOS developers expected no change in screen resolution until 2x was possible. The repositioning of screen widgets and the scaling of bitmap images works better with whole number multiples. If 2x is the multiple then the iPad 3 could automatically recycle the 2x bitmap images found in iPhone 4 aware apps.
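The whole-number-multiple point can be seen in two lines (a toy sketch in Python; the coordinates are arbitrary): at 2x every widget coordinate and bitmap pixel lands exactly on the new grid, while a fractional factor forces sub-pixel positions and resampling.

```python
def scale_point(x, y, factor):
    """Scale a widget coordinate by a display scale factor."""
    return (x * factor, y * factor)

print(scale_point(10, 21, 2))    # (20, 42): still on exact pixel boundaries
print(scale_point(10, 21, 1.5))  # (15.0, 31.5): half-pixel position, needs resampling
```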
  • by wisebabo (638845) on Friday February 17, 2012 @10:43PM (#39082247) Journal

    So the most commonly used format for digital cinema is 2048x1080 (4K is not widely used, yet). Notice that it is just a little bit wider than 1080p (128 pixels). So either cinematographers have had to scale down the outputs from their digital cameras/post production workstations to use "standard" HD displays (and suffer scaling artifacts), throw away the pixels on the side, or use very expensive professional equipment.

    Could the iPad 3 display be used instead? If the iPad 3 has Thunderbolt (now THAT would be interesting), could it be used as a (very) portable display?

    I am such an Apple Fanboi you wouldn't believe, but if Samsung came out with a tablet that, at the flip of a switch, could be used as a portable, digital-cinema-ready display, I would buy it so fast it would make Steve Jobs spin. (Hope that wasn't too morbid or disrespectful.)

    • by Sycraft-fu (314770) on Friday February 17, 2012 @11:36PM (#39082611)

      There are a number of 27" and 30" displays that are 2.5k. The NEC PA271W and PA301W, the HP ZR2740w and ZR30w, the Dell U2711 and U3011, the DoubleSight DS-277W and DS-307W, and so on.

      They are 2560x1440 for the 27s, 2560x1600 for the 30s.

      It isn't hard to find for regular old computers. However I imagine anyone shooting in the digital cinema 2k format is probably not concerned about having to get pro gear because they already have it. You have to step up to some pretty expensive cameras before you start talking that. Everything even remotely prosumer is 1920x1080 max since that is what you are targeting for home, of course. If you have to get expensive cameras, an expensive display isn't likely to be a show stopper.

      However as I said, plenty of computer displays that do 2k (and more) no problem.

  • by Patron (2242336) on Friday February 17, 2012 @11:12PM (#39082469)
    - "Hey, John. Stop playing around with your tablet and get out in the real world."
    - "But moooom, this is the iPad 3!, it has BETTER resolution than the real world!"
  • desktop resolution (Score:4, Interesting)

    by e**(i pi)-1 (462311) on Friday February 17, 2012 @11:47PM (#39082697) Homepage Journal
    I'm still waiting for a reasonably priced desktop monitor with resolution beyond 1920x1080 (2560x1440 would be nice). Let's hope that this changes now that tablets will have 50 percent more pixels than standard desktop monitors.
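The "50 percent more" figure checks out (trivial arithmetic, shown here as a Python sketch):

```python
ipad3 = 2048 * 1536    # 3,145,728 pixels
full_hd = 1920 * 1080  # 2,073,600 pixels
print(ipad3 / full_hd) # ~1.52, i.e. roughly 50 percent more pixels than 1080p
```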
  • so the last one (Score:3, Interesting)

    by nimbius (983462) on Saturday February 18, 2012 @12:01AM (#39082811) Homepage
    came out March 2011, and the one before that in 2010 at about the same time, give or take a month.

    As a slashdotter who's never used a "tablet computer" I sincerely want to ask mac users: why do you keep buying these? The average iPad is six hundred dollars and has a one-time battery that cannot be replaced (or not that Apple is willing to inform on their website). By this March you will have sunk about $1800 into an appliance that is guaranteed manufactured-obsolete in one year. You don't do this with cars, televisions, stereos, homes, desktops, laptops, or clothes (presumably they last longer than a year), or any other major consumer purchase, so why do it with tablet computers?
    • Re:so the last one (Score:5, Insightful)

      by WiiVault (1039946) on Saturday February 18, 2012 @12:21AM (#39082911)
      Hmm, perhaps it's because (most) people don't upgrade to a new model every time one comes out. Do you know that cars usually come out every year too? What about GPUs and CPUs? Heck, those come out all the time! By your logic everybody must upgrade those annually as well to avoid being "obsolete". Not to mention iPads, like most devices, are usually supported well past their discontinuation date. I can only hope you've been drinking this weekend to say something so bizarre.
    • The average iPad is six hundred dollars and has a one-time battery that cannot be replaced (or not that Apple is willing to inform on their website).

      The basic models are $500. They are perfectly usable; it's only if you use them heavily that you'd really need more than the base storage (especially now that you can simply load some apps on demand and delete them when finished, and play most music off iCloud).

      The battery can be replaced by Apple, I think for a $99 fee. However the first iPad I

  • Citation (Score:4, Informative)

    by whisper_jeff (680366) on Saturday February 18, 2012 @12:38AM (#39083011)

    Confirmed? Really? AWESOME!! Uh, just because I want to see, could someone post a link to Apple's announcement confirming it?

    Yeah.

    Confirmed. I think you're using that word without knowing what it means...
