Hardware

22" 9.2-Million Pixel Display 165

chrisd writes: "Just noticed this article over on Yahoo News. It describes a research project on next-gen displays developed by Intel and Stanford University. The result? A 22-inch display that shows 9.2 million pixels (they use the odious 'megapixel' descriptor in the article), needs 16 processors and 2 GB of RAM to run it, and costs $200,000 US. So it's a little spendy. This is a big step up from my first 12" amber screen though, that's for sure." Ah, the march of progress ... I'm happy with anything that will help drive down the cost of 17" and 18.1" LCD displays, no matter how indirectly.
This discussion has been archived. No new comments can be posted.

22" 9.2 million Pixel Display

Comments Filter:
  • Wow... so a few dozen widgets need to be recreated in a larger size. This ain't a big deal (in the case of OS X).
  • by Anonymous Coward
    Right. I hate that unit as well, but assuming the display is 4:3, you'd get:

    3x*4x = 9.2e6
    x = 876
    4x = 3504
    3x = 2628

    3504x2628
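    For anyone who wants to redo that for other aspect ratios, here is a quick sketch in Python (the 9.2e6 pixel count and the aspect ratios are just the guesses above, not anything from the article):

    import math

    def dims_from_pixels(total_pixels, aspect_w=4, aspect_h=3):
        # Solve (aspect_w * x) * (aspect_h * x) = total_pixels for the scale x,
        # then round to get an approximate width and height in pixels.
        x = math.sqrt(total_pixels / (aspect_w * aspect_h))
        return round(aspect_w * x), round(aspect_h * x)

    print(dims_from_pixels(9.2e6))        # ~ (3502, 2627) at 4:3
    print(dims_from_pixels(9.2e6, 3, 2))  # ~ (3715, 2477) at 3:2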
  • by Anonymous Coward
    My friend Daron and I did something very similar a few summers ago.

    We exploited the fact that existing LCD technology utilizes a semi-transparent membrane that produces no light of its own and is illuminated usually by way of a fluorescent light. We were able to introduce a very high resolution to these screens by actually arranging them one on top of the other, five LCDs deep. The whole setup relied on some very precise arrangement (each panel needed to be approximately 1/10 of a pixel width offset from the one beneath!). We ended up milling an enclosure with a high-tolerance CNC machine to even make the thing feasible. In addition, we eventually found that a fluorescent light source simply couldn't put out enough lumens to sufficiently illuminate the grid so we switched to a mercury vapor setup. The energy efficiency in these things is atrocious but it really was the only thing feasible at the time.

    Surprisingly enough the physical setup was relatively simple compared to the interpolation and digital signal processing required to composite a standard VGA signal across an array of dozens of interlaced LCD displays! I attribute most of our success in this to the fact that Daron and I are both whizzes in QBasic. For those of you who don't know, QBasic is a low-level systems language specific to the windows API. Its one tremendous advantage is that it is native to the OS (operational system) and hence runs at what we call real-time kernel speed. I think this is really where MacOS and linux tend to fall down--they mostly rely on cross-platform languages like C, which, unfortunately must perform to the "lowest common denominator." QBasic afforded us a modularity and granularity of performance such that I don't think a similar feat could have been pulled off in any other environment.
  • thank you apple marketing department...
  • Light emitted by current LCDs is polarized, as is inherent to the design, but in the same direction for every pixel on the display. With LCDs reaching 200-300dpi across large areas, if one could alternate or dither the output polarization of pixels, 3D display would be possible with simple polarized glasses (vertical polarization for the left eye, horizontal for the right, like most modern 3D films).

    This would certainly be a big improvement over CRT displays and relatively bulky electronic shutter-glasses currently used for a lot of visualization work, across many industries.
    I think such a technology, if made cost effective, could ultimately make a huge splash in the computer and consumer electronics industry (e.g. games, 3d visualizations, stereophotographs).

    Anybody in the field around to comment?

    -Isaac

  • Then we're talking roughly 3500 x 2625 (9,187,500) (3502 x 2626 = 9,196,252 is the closest you can come without exceeding 9.2 million).

    Close enough. And probably accurate, since nobody in the industry actually posts real numbers anymore.


    Chas - The one, the only.
    THANK GOD!!!

  • Also, there really isn't much need for it. I mean this monitor is nice and all, but with a 20K pricetag and no real applications for it (for the average user, anyway - maybe this would be useful in some 3d modeling scenarios, perhaps), I doubt we'll be seeing technology to support this kind of thing in even high end workstations for at least a few years.

    I could see this used for film resolution work as well. It's close enough to film rez specs that I tend to think that's the market the researchers are targeting in the near term.

  • They're probably counting the separate red, green, and blue elements. These are separate pixels, as they are offset from each other. A special filter, known as a "demosaicing" filter, is required to correctly merge these offset images into the Y/Cb/Cr planes that are encoded in the final JPEG. The ratio isn't a factor of 3 as you might expect, as mosaic patterns tend to use more green than red or blue. (The patterns I've seen have 5 green to 2 red and 2 blue.)

    This page [stanford.edu] describes mosaicing to some extent, in the context of reverse engineering a Kodak digital camera. This page [stanford.edu] at the same site offers some other details. And Ron Kimmel's page [technion.ac.il] has some neat pictures showing different artifacts due to poor de-mosaicing.
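    For the curious, here is a minimal bilinear-demosaicing sketch in Python (illustration only: it assumes a plain RGGB Bayer layout rather than the 5-green patterns mentioned above, and it needs NumPy and SciPy):

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_bilinear(raw, pattern="RGGB"):
        # `raw` is a 2-D array of sensor samples arranged in a 2x2 Bayer tile.
        h, w = raw.shape
        layout = np.array(list(pattern)).reshape(2, 2)
        kernel = np.array([[0.25, 0.5, 0.25],
                           [0.5,  1.0, 0.5],
                           [0.25, 0.5, 0.25]])
        out = np.zeros((h, w, 3))
        for i, c in enumerate("RGB"):
            mask = np.zeros((h, w))
            for dy in range(2):
                for dx in range(2):
                    if layout[dy, dx] == c:
                        mask[dy::2, dx::2] = 1.0
            # Interpolate each colour plane from its own sparse samples
            # (normalized convolution: weighted sum divided by the weights).
            vals = convolve(raw * mask, kernel, mode="mirror")
            wts = convolve(mask, kernel, mode="mirror")
            out[..., i] = vals / wts
        return out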

    --Joe
    --
  • "But it seems like no one (besides MS) is working on resolution-independent GUI frameworks."

    Hey, buddy, you misspelled "Apple," there. OS X is thoroughly vector-based.

    --
  • by hpa ( 7948 )
    What is wrong with the term "megapixel" (when spelled properly, of course)? It is descriptive and makes perfect sense.
  • High resolution displays like this one are going to require a graphics card that is a generation beyond the current cards in 2-D computing power.

    Even though video cards have gotten more powerful in recent years, it has generally been in the third dimension, rather than the first two where most of us spend our lives working. For instance, most people who work with Intel-based hardware will only consider nVidia video cards. But guess what? My ATI Radeon card is FASTER in 2D display. It is nice that you no longer need an expensive fast video card to work in Photoshop. But lack of competition in the world of computing... is a strange thing.

    Apple's Display PDF (Quartz?) standard in Mac OS X may actually be the first thing in a while to push 2D acceleration forward.
  • A company called CopyTele has been developing a flat panel display technology they call E-Paper for something like the last 15 years and they have filed well over 200 patents [164.195.100.11] for it on the way. It looks very promising, but unfortunately it's not here yet.

    It is based on the principle of electrophoresis - moving microscopic particles with electric charge while suspended in opaque fluid. When the particles are moved to the front they are visible, when they are pulled back they are obscured by the fluid.

    Since the particles have the same density as the fluid they don't drift. The image stays even after you turn off the power. Power is only required to modify the image, resulting in extremely low power consumption. No backlight is required - the display is reflective and has very high contrast, supposedly comparable to ink on paper. It is clearly visible in full sunlight.

    Since the image remains without refreshing it is not necessary to update the image tens of times per second just to get a stable image. This makes it easier to reach very high resolutions (>200DPI). This technology may not be appropriate for video, but it should be ideal for e-books.

    If you look at their site [copytele.com] you will see that they now sell encryption devices. This move is relatively recent. In their press releases [copytele.com] you can see that they are still working on their display technology. I first heard about them in a Popular Science article in 1985(!). I'm still waiting for them to bring this display to market...


    -
  • I must add that I am not too impressed with their marketing so far. And neither are their investors [buckeybuzz.com].

    Their Magicom 2000 product is a clever screen phone/paperless fax device that allows faxing a document and scribbling on it while talking. It's a nice product, but is that the right product to launch a revolutionary display technology? Consider the fact that both parties need to have these devices for it to be useful.

    -
  • For a tech article, I was kind of disappointed that they got DVD wrong. DVD doesn't stand for "Digital Versatile Disc" (though it may have at one point). DVD doesn't stand for anything.

  • by jonbrewer ( 11894 ) on Tuesday June 05, 2001 @07:34PM (#173319) Homepage
    I have a Matrox G200MMS Quad-head that will do 5.2 megapixel over four monitors, and IIRC, it was only $800. I'm using it now to drive a pair of Samsung 770TFT panels. At $950 each for a 1280*1024 flat panel, they're a bit more reasonable than most.

    I'm a bit disappointed in their manufacturing though. I have one born in Oct 2000 and another born in March 2001. Unfortunately the older one has a different white point and neither has hardware color temperature adjustment. But it was the low cost that allowed me to get away with having a 2560*1024 (soon to be 2560*2048!) flat panel desktop, so I can't complain too much. :-)

    Now if you want to display a bunch of DVDs on those monitors, you're going to need a heck of a lot more bandwidth than the PCI bus can provide you. (G200MMS is a PCI card.) And you'll need something to decode with, too. But even if you didn't read the article, you should know that you don't need that kind of cash to get 9.6 million pixels up on a monitor, as long as the pixels aren't doing much.
  • Apple doesn't make exclusive manufacturing arrangements to sell sexy flat panel displays to Windows users, they do it to sell Macs.

    Just wait -- similar displays at similar prices will be mass-market within a year.
    --
  • Is a mexapixel anything like those Taco Bell Mexi-Fries? I hate those things...
  • Nitpick: 600mm by 700mm is less than half a square meter. 0.42 square meters, actually.
  • So, assuming for the sake of argument the monitor has a roughly 16"x12" viewable area, that gives (16*17000) * (12*17000) = 55,488,000,000 points, or 55.5 gigapoints, as the limit of human eye resolution for a screen of that size. That's several orders of magnitude over the announced 9.2 megapixel display, assuming that pixels are roughly equivalent to points.

    I guess I really shouldn't be surprised at Slashdot readers having no understanding of basic optics. I note that the parent post has been modded 'Flamebait' as well.

    I don't know if the unit is 17000 point sources, but I would read that as per 'square inch', so your mathematics should be (16 * 12 * 17000) = 3,264,000 - or less than 9.2 megapixels.

    I remember reading a few years ago (can't remember the cite) that about 3000 by 3000 pixels was the maximum that a human eye could discern (no matter what the distance) while still seeing the entire screen at once. Of course a bigger screen close up would look blocky, but you couldn't view it all without getting further away anyway.

    This was based on calculations which said that (from memory again) humans can see an arc of about 10 degrees at once, and have a certain number of photo-receptors per degree of arc in the eye. Once you get one pixel per photo-receptor, that's about the limit. Make it one and a half for smoother blending - and take into account that imperfections in the lens of the eye will blur it more anyway.

  • Well the Onyx2 at the Hayden Planetarium pumps out 7 1280x1024 images which are projected and blended on a large dome.

    It can play synchronised movies at that resolution, or generate real time graphics at that resolution.

    http://www.sgi.com/features/2000/feb/hayden/index.html
  • The article says it "can divide its screen to run 16 DVDs (digital versatile discs) simultaneously in 720 by 480 pixel mode." The number 16 can only mean a 4x4 or 2x8 partitioning, with the latter being a rather ridiculous aspect ratio. Therefore the display is around 2880x1920. So I'd guess they refer to around five million pixels. This is further reinforced by the fact that the article is down on 2 megapixel displays while portraying 10 megapixel ones as many years into the future.
  • by Mike Schiraldi ( 18296 ) on Tuesday June 05, 2001 @08:48PM (#173326) Homepage Journal
    I had the Mexapixel at Taco Bell this week.

    --

  • One has to wonder where we're going to get bitmapped images anywhere near that resolution. Obviously, if something has that many pixels it's going to be generated from vector art. Then comes the question, why do we rasterize the vectors in the computer and then send gigabytes a second to the monitor!?!

    I would be much more inclined to use a monitor like this if it used Display PostScript. They should really incorporate a custom DPS RIP (normal PS RIPs already exist, so this isn't much of a stretch). A mass-produced DPS RIP would certainly cost less than 16 P4s and it wouldn't even require a bus the size of Utah between the computer and the display.

    My 2 cents.
  • After just getting DSL, this monitor is the last thing I need. C'mon, do the math, 9.2 megapixels is about 3500x2625, which on a 22" monitor (9.6" by 7.2") would be around 360 dpi. That means that even with DSL, all that high resolution porn I'm downloading would only be a 2 inch thumbnail. Might as well stick with my low res monitor and get a 14.4k modem, it'd be cheaper. =P
  • Why is the term 'megapixel' odious to the author?

    Surely you don't think we should go around talking about how our floppies can hold one million four hundred forty thousand bytes of data, do you?

    Kevin Fox
    --
  • As the article points out, displays are getting cheaper at about 8% per year. That's a lot better than almost any other non-IC product. Be patient.

    I try, but 8% per year is glacial compared to the rates for ICs and for magnetic/optical (take your pick) storage technology. Believe it or not, your average "good" PC system used to be equal parts (in cost) of RAM, mobo, disk, and display. Now, for systems that we'd really want, it's better than 50% display (thinking along the lines of a big Apple flat panel and even the most tricked out consumer-level PC).

    Under these conditions, patience is hard.

  • Careful with the word "superior"-- not everyone has the same opinions of what's superior and inferior.

    For example, some folks might find a 22" monitor more handy than a 17" monitor, regardless of the dpi.
  • not quite sure what your reasoning is there.

    all available microsoft oses use bitmap graphics, which are resolution dependent.

    what you're looking for is vector graphics, which are used to a large extent in apple's mac os x, and to a minor degree in gnome.

    i would say microsoft are actually lagging in this race, unless they've got some vector tricks up their sleeves in windows xp (which i haven't played with yet, so can't comment on).

    matt
  • The old Tektronix 4014 was 4096 x 3072 pixels.
    That's 12 megapixels something like 20 years ago.
    Of course it was monochrome storage tube technology.
  • Actually, I've been rather disappointed in the advancements in display technologies (and prices) over the years. Yes, I too started with that 12" Amber screen (with a Hercules "high resolution" mono graphics card).

    At the same time, we've seen incredible increases in processing power, from a 10 MHz to a 1 GHz CPU. In graphical terms, we should have a 3D holographic display on our desktops by now.

    I personally blame the CRT manufacturers for this extreme lag in display technology. They've really controlled the prices and their technology hasn't increased significantly.

    Then there is the industry that placed all its bets on CRT technology for the longest time. But, considering what we've seen in the recent past, I think we're going to get a surge in display technology. CRT technology is increasing in quality, and alternative technologies like LCD are becoming mainstream.

    Here's to graphics display research.
  • The paper on the Lightning-2 hardware that drives the monitor is here [stanford.edu].

    See also WireGL [stanford.edu] and its associated paper [stanford.edu], which was used to run Quake3 on the IBM display.

  • I've seen that page but thanks for the link :-) I guess what I'm asking is a little "hands on" experience - I figure someone out there in slashdot land must own one of these babies......
  • So if it's sitting around doing nothing and i am interested in one..... ;-)
  • yeah yeah - I know - offtopic - moderators act accordingly.

    Anyways - now we are on the topic of monitors - has anyone in the slashdot crowd had any experience with the sgi flatscreen monitor? http://www.sgi.com/flatpanel/ - I am very interested in picking one up.... but I am running Windows 2000 and am not too sure of support for it / video cards? Anyone able to shed some light on how good / incompatible this sexy monitor is?
  • In all reality, I don't really understand why such ultra-high resolution displays would ever take root in home consumer applications. Nobody has eyes good enough to utilize this invention.

    Current monitor technology (particularly LCD) has certainly not reached a pixel density which surpasses human perception. People tend to run current (CRT) displays between 80ppi and 150ppi (and 150 is being generous...it takes a pretty darn high-quality CRT to be able to not have fuzzy pixels at that size).

    Doing a little math (assuming a wide-screen 3:2 aspect ratio and 22 viewable inches) this screen would be roughly 18.3"x12.2", with a resolution of 3715x2477. This yields a density of around 203ppi...certainly not excessive.

    To prove to yourself that this is a necessary improvement, go create a bitmap (not grayscale) image in Photoshop of some text and print it out on your company's laser printer at 100dpi. Then create the same-height text and print at 200dpi. Stick these side-by-side on your monitor. Unless your monitor is farther away than most, you will find that you can certainly tell the difference, that the text at 200dpi is much easier to read, and that even then it doesn't quite look nice and smooth.

    Text on a 200ppi monitor does have the advantage that it can use anti-aliasing to help simulate greater resolution, but in general more pixels are better (until you reach the limit of human perception). 1200dpi laser printers are needed because their dots are single-shade, and also the resulting printed output tends to be held closer to the face.

    Exercise for the reader--search the 'net to find the smallest-possible viewing angle that the average person can discern a dot. Combine this with the average distance of a monitor (a little over an arm's length?), sprinkle with trigonometry, and you should come up with the pixel density our displays should reach for.
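    A rough version of that exercise in Python, assuming about one arcminute of visual acuity and a 28-inch viewing distance (both numbers are assumptions, so adjust to taste):

    import math

    acuity_deg = 1.0 / 60.0      # ~1 arcminute, a common figure for normal vision
    distance_in = 28.0           # a little over an arm's length, in inches

    # Size of the smallest resolvable feature at that distance, then pixels per inch.
    dot_in = 2 * distance_in * math.tan(math.radians(acuity_deg) / 2)
    print(f"~{1.0 / dot_in:.0f} pixels per inch")   # roughly 120 ppi at 28 inches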

  • I wrote:
    To prove to yourself that this is a necessary improvement, go create a bitmap...

    At the risk of being slashdotted, I've created such a file for you. It's a 2400x900@400ppi PNG file with the same non-anti-aliased text at 100ppi, 200ppi, and 400ppi. Photoshop knows the file is saved at 400dpi, but before you print it from another program, make sure it's not about to come out at 72dpi.

    http://phrogz.net/tmp/res.png [phrogz.net] : (48.3k)

  • by nakaduct ( 43954 ) on Tuesday June 05, 2001 @07:35PM (#173341)
    Where do I get 9.2 megapixel porn?

    What's that you say? It doesn't exist? Well, what's this contraption good for then?

    cheers,
    mike
  • why would you need to display up to 16 DVDs on one screen?

    It's a demo. Make the product look good by having it do something outrageous. Think of a sports-car commercial, they show the car going very fast through tight curves and slaloms and what not. They're not suggesting you buy their car to do the same thing (in fact they usually have disclaimers at the bottom: "Closed track, professional driver" etc), but they're making the car look good by driving it outrageously.

  • Comment removed based on user account deletion
  • The upper theoretical limit for conventional LCD production is probably something like 80%.


    So what, theoretically, would limit yield to 80%?

  • Did you know that the Apple Cinema Display also works just fine on a PC, or with Linux?

    The Hercules Prophet II adapter with DVI out and a DVI monitor are a great combination in Windows or your favorite free OS! Haven't tried a Prophet III yet, but it should work.

    Even if you can only get the ADC version (current model) Dr.Bott have an adapter [drbott.com] made for people with older G4 Macs that will turn it back into a DVI/USB/Power device. Or, Gefen [gefen.com] have a DVI-ADC box that includes the power supply.
  • by joq ( 63625 )

    The human eye can only interpret a certain number of pixels in the first place, so this is complete overkill for anyone in the world. While some scientists may beg to differ, in the end no one would benefit from this whatsoever. Well, actually Intel would, since they'd be suckering someone into spending 200k on this monitor. I'll pass, sirs.
    a typical person has a maximum resolution of about 17000 point sources per inch. This doesn't really equate to pixels, but pixels can be converted into pixels per inch, and that should be close enough.


    The Joy of Visual Perception [yorku.ca]

  • Just because that's a perceived maximum doesn't mean you'd be able to determine this unless you sat in front of your monitor with a magnifying glass. These arguments are always being thrown in the loop with CCD's and crap like that.
  • Actually, the current good 3D cards just need support for higher 2D resolutions as newer displays become available, not "more 2-D computing power". Basically, 2D performance is primarily limited by 2D fill rate, which in turn is determined largely by (graphics-card) memory bandwidth.

    3D cards have continued to push the graphics memory bandwidth curve upwards, so I'm not too concerned that 2D cards "won't be fast enough." Cards nowadays can draw 2D graphics to the frame buffer about 10-1000x faster than the displays can support them using BitBlt and similar 2D fill operations. (If you don't believe me, go look at the megapixel fill rates of 2D and 3D cards.) Which brings me to my next point.

    There's no competing on 2D because it's irrelevant to the consumer. How so? Well, what's the difference between 500 fps 2D and 1000 fps to the naked eye? Answer: nothing, both are faster than any display device (monitor) can support.

    Sometimes marketeers try to promote (with the help of ignorant journalists) some artificial 2D performance benchmark comparisons, but I haven't seen any in the last 5 years that made a smidgen of difference in ordinary or even extraordinary human usage scenarios.

    I'll agree that Display PDF (and its predecessor Display PostScript) does soak up lots of horsepower, since to display anything means you have to execute an arbitrary program written in a variant of the PostScript language. It'll be interesting to see if any graphics cards attempt to accelerate it rather than just tossing that job to the CPU. Display PDF could make it easier to write graphics applications since it provides a nice library of 2D vector and fill operations, and I'm glad Apple is licensing rather than reinventing the wheel, but you'll never see it take off on the PC because Microsoft is intent on retaining control of those operations within ActiveX.

    --LP, formerly a graphics hardware analyst
  • It's generally accepted that when the monitors get to about 200 dpi, you won't be able to tell the difference with your eye. (Technically speaking, it depends how far your eye is from the display.) Right now, most monitors display at around 75-90 dpi however. So there's still reasonable room for improvement, although I'd agree that today's CRTs are "good enough" aka "just great for many applications". But then I think that my straight lines on my monitors are straight enough too... ;)

    Making geometries better with CRTs is an incredibly difficult technical challenge, but trivial with LCDs. If you are really picky about geometry and the newer flat-screen CRTs still aren't good enough, I'd recommend moving to an LCD display.

    --LP
  • I knew I'd seen this before... IBM announced this [ibm.com] back in November. They've been demoing prototypes of this thing since 1998.

    Having seen the demo I can tell you this thing is amazing. Here's the scene whenever they demo the system at a trade show:

    IBM Guy stands by display smiling as a slideshow of photographs, maps, and newsprint flashes across the screen.

    Techies walking by are inexplicably drawn to the display, much drooling ensues.

    IBM Guy explains that it takes tons of processing power to drive the damn thing.

    Techies ask how much it costs.

    IBM Guy says something like "This particular prototype cost $2 zillion."

    Techies drool

    IBM Guy prepares for the next question, which everyone, without fail, asks...

    Techies ask, "Can I have it?"

    IBM Guy smiles and says "No" instead of something clever like, "Gee, that's the first time I've heard that one. Have you considered a career in comedy?"

    Techies go home with visions of megapixels dancing in their heads, and eventually, one special techie posts his observations. Finally overcoming the embarrassment of realizing that his clever line, "Can I have it?" was not so clever after all.

  • Pictures of that resolution will have to be highly compressed and stored in local mass storage or downloaded from off-site servers.

    1) How long will it take to see I/O bandwidth improve to where it can handle real-time streaming of such huge pictures from storage?

    2) Is there any hardware out there right now that can handle such high I/O bandwidths?

  • I think the point is that you will shortly be seeing this term in every Best Buy, Computer City and local computer store flyer for the next five years.

    I'm already sick of it and it's been only 15 minutes.
  • If I wanted that much resolution, I would want it on an SGI Reality Center [sgi.com]. 15 foot screens and larger, now you're talking. :)
  • Just to help out those who don't find 'megapixel' intuitive. So, given the 4:3 aspect ratio of 1600x1200, this monitor would be running at a res of about 3502x2629. So, that's what? A 17.6 x 13.2 inch monitor? That's 198 pixels per inch horizontal, and 199 per inch vertical. Compared to a 1600x1200 res on the same sized monitor... which would be 90 pixels per inch horizontal and 91 vertical. So, we're looking at about twice the clarity. :) At least, unless I screwed this up :)
  • ... their homework.

    Indeed, the other kids have finished theirs long ago, and are in the streets playing ball.

  • Well, a megapixel isn't even necessarily the number of pixels. When digital cameras have their resolution advertised, the megapixel rating is much larger than the product of the largest resolution. So it's a made up term that seems like it means one thing, but really means something else, for the purposes of deceptive advertising.
  • You could simply do the mixing in software. If you wanted to do it in hardware, just get a many-channel sound card, like this one [soundscape-digital.com]. Windows only, of course, but high-end sound support is one of the areas in which free OSes are most lacking.
  • but I am running Windows 2000 and am not too sure of support for it / video cards? Anyone able to shed some light on how good / incompatible this sexy monitor is?

    For a while we had one in the lab where I work running off a P-III / G400 running Linux. So I suspect pretty much any PC will work fine. But the machine is now a server so we've got this really nice monitor just sitting around doing nothing. :(

    But yes, those things are very pretty (and very expensive, last I looked).
  • So if it's sitting around doing nothing and i am interested in one..... ;-)

    Yeah, right dude. If they ever just toss that thing out, I'm getting dibs. :)
  • by randombit ( 87792 ) on Tuesday June 05, 2001 @07:38PM (#173360) Homepage
    1) How long will it take to see I/O bandwidth improve to where it can handle real-time streaming of such huge pictures from storage?

    Quite a while - a gig per second is way more than even 66 MHz 64-bit PCI can handle (I believe that will only go up to ~600 (?) MB/s). Also, there really isn't much need for it. I mean this monitor is nice and all, but with a 20K pricetag and no real applications for it (for the average user, anyway - maybe this would be useful in some 3d modeling scenarios, perhaps), I doubt we'll be seeing technology to support this kind of thing in even high end workstations for at least a few years. Both Intel and AMD are working on new buses to replace PCI so maybe that problem (the bus) will go away fairly soon. Actually getting a gig per second off a disk - that could take a while.

    2) Is there any hardware out there right now that can handle such high I/O bandwidths?

    Surely. But once you're looking for that, you're talking about big machines with a lot of hardware RAID. I'd bet a E10K or S/390 could handle that kind of I/O. If you meant something you could actually afford - probably not. ^_^
  • by Ryu2 ( 89645 ) on Tuesday June 05, 2001 @07:59PM (#173361) Homepage Journal
    is here [stanford.edu], known as the FLASH graphics system. I worked closely with those folks, not on FLASH itself, but on an ancillary project to visualize various parts and parameters of the computer system (bus utilization, cache latency, etc).

    Very cool stuff that's just starting to get commercialized -- this is what you'll be seeing in your GeForce 4s or whatever.

  • Berlin [berlin-consortium.org]

    -----
    "Goose... Geese... Moose... MOOSE!?!?!"
  • It's used by people who use that odious 'spendy' word.
  • Thank You. I was actually considering making my next computer a Cube, mainly because I wanted one of those displays. I've never owned a mac, and it never occurred to me to ask if that display was PC compatible. I mean, who would have thought?

    Although, I still might migrate to the cube, because I like the idea of a totally silent system.
  • Last time I looked, Win32 display contexts also let me work in independent, transformable (rot/scale/shear etc) world coordinates [microsoft.com], and supported beziers [microsoft.com] & advanced clipping [microsoft.com].

    GDI+ (in WinXP) adds better alpha support, amongst other things, which Quartz already does. Haven't programmed in Quartz so I can't compare directly though.

  • My friend's computer is called mxyzptlk. Try hacking that!

    $ nmap mtzplyk
    Failed to resolve given hostname/IP: mtzplyk.
    $ ping mztlplkt
    ping: unknown host mztlplkt

    --

  • displays 9.2 million pixels [...] needs [...] and 2 GB of ram

    With a colour depth of 24 bit/pixel, 9.2 Mpixel require below 28 MB. In 32 bit mode, it's well below 40 MB.

    So the conclusion would be that this beast must have something like 1780 bits per pixel - WOW, that's a hell of a lot of colours :-)

    (Yes I know what e.g. texture memory is good for.)
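    A quick back-of-the-envelope check of those figures in Python (plain arithmetic, nothing assumed beyond the 9.2 million pixel count):

    pixels = 9.2e6

    fb_24bit_mb = pixels * 3 / 2**20       # 24 bit/pixel framebuffer, in MB
    fb_32bit_mb = pixels * 4 / 2**20       # 32 bit/pixel framebuffer, in MB
    bpp_if_2gb = 2 * 2**30 * 8 / pixels    # bits/pixel if all 2 GB were framebuffer

    print(f"{fb_24bit_mb:.1f} MB, {fb_32bit_mb:.1f} MB, {bpp_if_2gb:.0f} bits/pixel")
    # -> about 26.3 MB, 35.1 MB, and ~1870 bits per pixel
    #    (or ~1740 bits per pixel if you count a GB as 10^9 bytes)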
  • The 1600SW on its own isn't very compatible - there's the 2 cards a previous poster mentioned (N9 RevIV - now dead so hard to find; 3DLabs - no personal experience). This is all 'cos of the connector on the end of the 1600SW's cable ...

    If you want compatibility, go for the multi-link adapter (MLA) - then the range of cards available goes up ... ignore the SGI superwide-savvy page, it's useless and out-of-date. And the MLA lets you enjoy fullscreen gaming, irrespective of the resolution used - it just scales the image accordingly (I don't even notice that Q3A looks a little stretched anymore ;-)

    Browse - comp.sys.sgi.hardware plenty of user experience in there.

    Briefly, the following seem to work best (assuming you want DVI output - if not, pretty much anything will do) ....
    Asus v7100DVI (GeForce2MX)
    Herc ProphetII Ultra DVI (GeForce2 Ultra)
    Herc ProphetIII DVI (GeForce3 Ultra)
    Matrox G400 DVI
    Matrox G450 VGA
    ATI Radeon AIW DVI
    And prolly others I've forgotten .....

    Pretty much all of these work in any flavour of windows. XFree support is available, but you'll need to work on the config file to get the timings right (ask on the newsgroup, someone will help you out).

    Key thing is 1600x1024 resolution support in the drivers; and the bandwidth of the TMDS for DVI output (look for an external SIL164 on NVidia parts - otherwise forget it).

    Cheers, Carl.

  • One cost advantage of very high resolution LCDs is that they hide some flaws such as dead pixels. Hot pixels will still stand out, but they'll be tiny.
  • by Animats ( 122034 ) on Tuesday June 05, 2001 @09:08PM (#173378) Homepage
    The display conference was this week, and everybody is hyping their favorite technology. Actually, I was expecting a "plastic transistor" article, that being one of the hyped technologies.

    Incidentally, the reason displays don't obey Moore's Law, even though they're made using photolithography, is that making transistors smaller doesn't help displays.

    As the article points out, displays are getting cheaper at about 8% per year. That's a lot better than almost any other non-IC product. Be patient.

    If you're near SF, incidentally, visit the Sony Metreon, which is being used to show off Sony flat panel displays of various flavors. They're everywhere. Little ones. Big ones. LCDs. Plasma panels. No Jumbotron, though.

  • I think it's supposed to be a joke. DivX is easily counted as 'the Bose of video codecs; most people think it's great, but anybody who knows better knows it sucks.' No highs, no lows, must be Bose.
  • My CRT Sony 17 inch monitor is just great for my many applications. At some point all of those "megapixels" just don't matter; the human eye can only resolve detail to about a tenth of a millimeter, and after that it is pointless to have higher resolutions. Instead of working on the pixel number, companies like Sony need to figure out how to make the geometry better with these monitors. I've got 3 Sony monitors here and each one has flaws when it comes to making a straight line actually "straight" on the screen. Anybody else with the same troubles?

    Nathaniel P. Wilkerson
    .biz and .info for $13
  • Are they? I haven't done any OS X programming, so I don't have any first-hand experience with the OS X widgets, but I do know the icons scale very nicely when you twiddle the icon size slider.

    In any case, as http://developer.apple.com/quartz [apple.com] says, "Quartz supports PostScript®-like drawing features such as resolution independence, transformable coordinates (for rotation, scales, and skews), Bézier paths, and clipping operations."

    That's a lot more than what Windows offers. So while I don't know for a fact that the widgets are resolution-independent, the underlying graphics API certainly is.

  • You mean no one besides Apple? OS X's Quartz is vector-based.
  • by istartedi ( 132515 ) on Tuesday June 05, 2001 @09:05PM (#173386) Journal

    It's annoying to those of us who realize that the "megapixel" number rises faster than the dimensions of the display. The dimensions are linear, and increase linearly. The megapixel number is a product, and thus increases at a rate proportional to the square. It's pure marketspeak and psychological manipulation. They figured that idiots would go "ooooohhh look at all those megapixels". Meanwhile, those of us who know better have to guess the aspect ratio, and back it out to find out what we really want to know.

    The display in question, were it square, would be roughly 3033 pixels on a side because that's the square root of 9.2 million.

    Now... what's the aspect ratio... umm... the article doesn't say. So... Let's say the horizontal resolution is 4096, then the vertical could be 2246. 4096*2246 = 9,199,616. Close enough for government work.... But... We just don't know. That, my fellow Slashdotter, is why "megapixel" is annoying.


    "Hoarders... cannot help their neighbors" --RMS
  • The fact that I can discern individual pixels on my monitor is a problem. What we need is smooth curves without anti-aliasing - let our eyes blur the lines for us.

    When these things go consumer level, I'll be the first to pay up (Even if it's up to $10k) - I don't think you really understand how fucking incredible this would look.


  • by tylerh ( 137246 ) on Wednesday June 06, 2001 @10:10AM (#173388)

    At 9 megapixels, you are at 35 mm photography resolution (assuming enough color depth - between 16 and 24 bit, if memory serves). With this display, the need for chemical photography evaporates for all but niche applications. True, for motion, this is complete overkill. The human eye/mind would be hard pressed to absorb this many pixels at 24 frames/second.

    Think big: put two dozen in your house, network them, and every day you could live in a different world-class museum. Or have your photo album available via voice command instead of having to get it off a dusty shelf. Or be surrounded by stunning, high-production-value porn. Or whatever.

    The researchers reached for 9 megapixels for a definite reason: 150 years of chemical photography says this is a resolution people really like for still pictures.

  • Um, this monitor sounds like it's about 200-300 pixels per inch, which seems like it would be far below the limit of perception.
  • Somewhere in MSDN I read an article about how to make Windows apps work on high-dpi monitors; I haven't seen such docs for any other platform.
  • Yes, Quartz 2D supports all those whiz-bang features; too bad most OS X apps are Carbon and are using QuickDraw instead of Quartz 2D.
  • But it's not resolution-independent. For example, all the widgets are bitmaps, so on a high-dpi monitor, they would become tiny.
  • You kinda forgot the small detail that the retina doesn't have a 16"x12" area.

    The maximum resolution of the human eye has no relation to the size of the viewed object, but instead is related to the size of the retina (where the image of what a person sees is projected and transformed into nervous impulses).

    Now, if you get close enough to the monitor so that the distance between it and your eye is the same as the distance between the front of your eye and your retina (about 1"), then you have a 1:1 relation and you can apply that rule. On the other hand, even if you could focus your eye at such a short distance, having a 16"x12" display would still be incredibly useless ...

  • by homer_ca ( 144738 ) on Tuesday June 05, 2001 @07:56PM (#173397)
    The answer is eyestrain. It's much easier to read 300dpi text than 72dpi text. Just like it's easier on the eyes to read a printed book than a monitor screen.
  • by Coulson ( 146956 ) on Tuesday June 05, 2001 @07:34PM (#173398) Homepage
    For comparison, Apple's 22" Cinema display, the best consumer flat-panel on the market (it's gorgeous!), has only 1600x1024 pixels, or about 1.6 megapixels.... then again, the cinema display only requires 1 CPU, and costs about $2500.

    One of the largest problems facing manufacturers of large-dimension LCD screens is the high rate of failure during the manufacturing process. This means that each batch of fabricated monitors yields a low number of functional units, driving up the cost per unit. I wonder how the researchers were able to combat this, while at the same time increasing the pixel density by 7x?

    Can someone who's more familiar with the industry give approximate numbers on the failure rate of LCD manufacturing? Are we talking 1 bad screen in 20, 1 bad in 2000, or...?

  • making sure you don't make a mess on the opposite wall. glass is easy to clean.
  • by been42 ( 160065 ) on Tuesday June 05, 2001 @07:36PM (#173401) Homepage

    If you can somehow trick them into saying "mexapixel" backward, they'll be sent back to their own dimension. Maybe we can grab their $200,000 display as they're fading away!

    Them: "Lexipaxem...Oh, crap!"
    Us (grabbing display): "See you at the pawn shop...SUCKER!"

  • by Xylantiel ( 177496 ) on Tuesday June 05, 2001 @08:59PM (#173402)
    Just so you know, printers need those high resolutions so they can DITHER to get colors other than Cyan, Magenta, Yellow, Black (CMYK) or white. Monitors don't have to dither and can actually display the color value needed with coincident illuminated elements whose brightness can be adjusted. For example, on a printer, if you assume that you need an area of about 6x6 = 36 pixels to actually get to any color, your effective resolution is 1200/6 = 200 dpi. This 6x6 value is actually dependent on the picture you're trying to display, where you are in that picture, and probably your dithering algorithm.

    It's useless going past about 300 dpi on a black and white (two-color, no-grayscale) printed page as that is the limit of the human eye's resolution at about 10 inches. The printer needs that extra resolution to dither for grayscale and color. With a monitor, 10 inches is pretty close, back off to 15 inches and you only need 200 dpi. No dithering necessary so that's it, you're done, optimal display. (Actually this is generally overkill because the eye's resolution gets worse for things that aren't just black-on-white, but if the display will be used a lot to just read text like the black on white here on slashdot it's better to be safe.)

    Just clearing up some misconceptions.
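    To make the dithering point concrete, here is a minimal ordered-dither sketch in Python (a classic 4x4 Bayer threshold matrix; assumes NumPy and a grayscale image with values in [0, 1]):

    import numpy as np

    # 4x4 Bayer ordered-dither matrix, thresholds normalized to [0, 1).
    BAYER4 = np.array([[ 0,  8,  2, 10],
                       [12,  4, 14,  6],
                       [ 3, 11,  1,  9],
                       [15,  7, 13,  5]]) / 16.0

    def ordered_dither(gray):
        # Compare each pixel against a tiled threshold matrix to get 1-bit output.
        # A printer trades spatial resolution for apparent gray levels in roughly this way.
        h, w = gray.shape
        thresh = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
        return (gray > thresh).astype(np.uint8)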

  • The claim that it requires 16 Pentium 4's to drive seems a bit dubious; either producing images on the screen requires a lot of math, or the media is just confused again.
    The article's not perfectly clear but it does seem to mean that the 16 DVDs playing simultaneously require 16 P4s. Not the display itself.
  • by RedWizzard ( 192002 ) on Tuesday June 05, 2001 @08:33PM (#173405)
    I think they're talking about a setup that would run those 16 DVDs at once costing that much. Normal use as say, a computer monitor would only require the one PC - and the cost would be a LOT cheaper... Although they don't mention the cost of the monitor itself.
    They mention in the article that the display itself is made by IBM. It is probably one of the ones mentioned in this /. article [slashdot.org] last year. What Intel have done is the 16 simultaneous DVDs, which sounds like a waste of time to me. They're not using all the display either: 4x720 by 4x480 is 2880 by 1920, only 5.5 million of the 9.2 million pixels available. The monitor's resolution is 3840x2400 (info here [ibm.com]) so they should have been able to display 25 DVDs at 720x480.
    From the wording, I'd say it'll be in the neighborhood of at least $2000-4000:
    More than that. IBM's T210 [ibm.com] (20.8", 2048x1536) is around $6000. I'd say this monitor would be at least $20,000-$30,000. If you can buy it at all.
  • Doesn't anyone do the slightest bit of research anymore? The claims for this display needing 2GB RAM and god-knows-how-many processors are completely spurious.

    What is the point of running stories that are factually completely wrong? Just post the link to the research papers and move on.

    What this moron seems to have picked up on is the Lightning 2 project, which basically is a video crossbar switch. They used this to combine the DVI outputs from a network of eight slaved 1.5 GHz P4 machines each with 256 MB RAM and a GeForce2 Quadro video card (starting to see where some of the numbers come from yet?) plus another machine as master running their own parallel rendering code over (probably) gigabit Ethernet. They tested Lightning 2 with the IBM "Big Bertha" 9.2 million pixel display.

    They achieved frame rates of 12 Hz and 60 Hz (with and without depth buffer capture which requires the Z-buffer to be transferred from each PC to Lightning 2 as well as the actual rendered image segment) on a 1.16 million triangle model.

    Lightning 2 itself has 512 MB of RAM and 23 Xilinx FPGA devices.

    None of this has got anything to do with the actual display itself, which as someone else pointed out "only" needs about 40 MB of framebuffer storage; i.e. a 60 Hz data rate of about 19 Gbps, or 2.5 GB/s.

    Now can you PLEASE try and get things RIGHT in future.

  • a typical person has a maximum resolution of about 17000 point sources per inch.

    So, assuming for the sake of argument the monitor has a roughly 16"x12" viewable area, that gives (16*17000) * (12*17000) = 55,488,000,000 points, or 55.5 gigapoints, as the limit of human eye resolution for a screen of that size. That's several orders of magnitude over the announced 9.2 megapixel display, assuming that pixels are roughly equivalent to points.

    Incidentally, putting that 9.2 megapixel value into more easily understandable terms gives a display size of roughly 3500x2600 (assuming a 4:3 display ratio). Good? Yes. Perfect? No.

    --
    BACKNEXTFINISHCANCEL

  • Correct you are; my apologies.

    Even so, I'm not convinced that a 3500x2600 display exceeds the limit of human eye resolution--though that may just be because I'm used to sitting a foot from my monitor...

    --
    BACKNEXTFINISHCANCEL

  • I don't know where this Douglas Gray guy got his information, but he got a lot of it wrong, wrong.

    As far as I can figure, the display he's talking about is IBM's Big Bertha [ibm.com]. It was custom-built for Lawrence Livermore National Lab, and it runs at a native resolution of 3840x2400.

    I saw a prototype of this display at Supercomputing 2000 in Dallas last year. It was running off of an IBM-brand Wintel system-- can't recall which one, an Intellistation, I guess-- with four 1920x1200 graphics cards. The monitor was stitching the four images together seamlessly.

    According to rumor they hooked it up to their bigger iron from time to time, but when I saw it, it was running NT.

    So I don't know *where* the author got his "it takes 16 CPUs and costs $200,000" stuff. Hell, LLNL only paid $80,000 for the prototype-- see this [fcw.com] Federal Computer Week article. According to the IBM guy I talked to at SC2000-- although I can't seem to find a confirmation of this in writing anywhere-- when the monitor is commercialized sometime this year, they're expecting to sell it initially for about $20,000. One too many zeros, Doug. ;-)

    And the obligatory remark: yeah, it's an incredible display. Like reading a newspaper-- effectively about 200 ppi. But for any traditional computer application, it's not really practical. Once you get past the "wow" factor, this thing really lives up to its nickname: the IBM Squintron 2000.

  • (Yeah, I know. -1, Offtopic.)

    Actually, the not-uncommon 1080p resolution (1920x1080@60p) is an awfully good compromise for film work. Traditional film work is done at 2K resolution-- 2048 pixels across by however-many (depending on your aspect ratio). Some 4K work is done sometimes, but very, very rarely.

    With high-definition telecines and recorders becoming more common, many post houses are doing their film finishing in 1920x1080, 24-frames-per-second progressive-scan. The data rate is a lot lower than 2K, and a HELL of a lot lower than 4K, and you can display your work on commercially available HDTV broadcast monitors-- the 32" model from Sony comes in well under $100,000.

  • I'd bet a E10K or S/390 could handle that kind of I/O. If you meant something you could actually afford - probably not.

    Oh, you'd be surprised. You can do HDTV editing-- 1080i resolution-- with just 32 Seagate fibre channel disks on four FC loops. That's enough bandwidth to do RGB HDTV (186 MB/s, more or less) times two. (Gotta do two for real-time dissolves and such.)

    To do more I/O is really just a matter of buying more disks.

    What's that you say? You need your operating system to be able to efficiently balance multi-gigabit-per-second I/Os across several I/O channels on several busses? Oh, that's a software problem. I believe the question was originally about hardware. ;-)

  • by foobar104 ( 206452 ) on Wednesday June 06, 2001 @01:44AM (#173416) Journal
    3840x2400. (Or four 1920x1200s, if you prefer.)

    I would say RTFA, but the FA completely failed to link to any relevant info. I had to search myself; IBM's page is here [ibm.com].

  • by Preposterous Coward ( 211739 ) on Tuesday June 05, 2001 @08:53PM (#173419)
    Displays are really used for two quite different things (even though they happen on the same screen): Displaying images and displaying text. For images it's arguable that current displays have sufficient pixel density for normal use, but this is definitely not the case for text.

    Think about print. Early laser printers were 300-dpi resolution, which is pretty good -- good enough for entry-level, mass-market desktop publishing -- but is still quite low by traditional printing standards. The professional (print) graphics shop I used to work for used to run its phototypesetters ($100,000 behemoths) at 1,200 and 2,400 dpi. (And that's in both dimensions, so 1,200 dpi output has 16 times the information content of 300 dpi output.)

    600 dpi print is pretty high-quality and acceptable to most applications, particularly with the availability of resolution enhancement, which is roughly the printer equivalent of anti-aliasing. And personally I couldn't tell any difference going past 1,200 dpi.

    Anyway, my point is that even 200-300 dpi isn't as good as we might really hope for. Still, it's a vast improvement over the resolution of today's displays. I hope this will have an impact on the well-documented fact that people read (current) computer screens more slowly than printed material and find them more uncomfortable. There's actually a lot of subtle but very helpful detail contained in type that gets lost at lower resolutions such as those used by most displays today.

    In a way, graphics are more forgiving than text, because the human eye-brain combination is pretty good at interpolating what's intended if there is sufficient resolution or sufficient color depth. Of course, artifacts can be pretty nasty too -- we've all seen bad cases of the jaggies. But the bottom line is, I think if you ever get a chance to actually see a display this good you won't doubt that the extra pixel density is wasted.

  • "The display measures 22 inches across the diagonal, has a 9.2-megapixel display and can divide its screen to run 16 DVDs (digital versatile discs) simultaneously in 720 by 480 pixel mode. There's one drawback for end users: it takes 16 of Intel's Pentium 4 processors to run it, and would cost about $200,000."

    I think they're talking about a setup that would run those 16 DVDs at once costing that much. Normal use as say, a computer monitor would only require the one PC - and the cost would be a LOT cheaper... Although they don't mention the cost of the monitor itself.

    From the wording, I'd say it'll be in the neighborhood of at least $2000-4000:

    "In two years, we want to see a 27-inch monitor able to display two pages side by side for $2,000," Leglise said. And in five years, we want to see a 10 megapixel display for $2,000."

  • This is so wrong, I don't even know where to start. First of all, SOMETHING has to RASTERIZE the image. Putting it into the monitor or into something that sits next to the monitor is only a matter of cable length. It doesn't matter where the rasterizer is. Second, the assumption that Display PS is the most efficient meta language is suspect as well. For monitor usage it is too much and too slow. This is why OS X uses PDF rather than PS as its metalanguage.

    There are many meta languages available, from GDI to X, DPS and PDF, and others that are fine. The dependency comes on the rasterizers and the languages. Most of the time the work is moved to the video card and rasterized there. That is what a video driver does... (it takes a machine-side meta language and converts it to hardware rasterization, sometimes taking pixels, sometimes taking vector data). Normal PS RIPs that do high resolution are BIG and EXPENSIVE, and SLOW (for video purposes). Damn, I just used up all my mod points, so I was forced to respond.
  • Didn't look very hard, did you [sgi.com]? ;-)

    (No, I don't know anything about its goodness, but that link should tell you all you need to know about compatibility)
  • by Migelikor1 ( 308578 ) on Tuesday June 05, 2001 @08:53PM (#173450) Homepage
    Like with every new technology reviewed on Slashdot, the real question about it is, what does this mean for Quake? Since its resolution functions are nonexistent, wouldn't the Quake screen just shrink to about two square inches (maybe less)? In all reality, I don't really understand why such ultra-high resolution displays would ever take root in home consumer applications. Nobody has eyes good enough to utilize this invention. Maybe it comes with extremely high magnification glasses or eye gene manipulation tools! It could be useful for wearable technologies, though. The 1/4 inch display in your glasses would look a lot better if it could display at a normal resolution. In current technology, the number of pixels isn't nearly as important as the refresh rate anyway. Unless you're a graphics pro, it's the headache-giving flicker that needs the $200,000 research projects, not the number of tiny dots, and good LCDs take care of that anyway.
  • by MOSSey0T0 ( 412568 ) on Tuesday June 05, 2001 @11:00PM (#173453)
    LCD manufacturing yields must be over 40% in order to make a profit. Volume is generally not a problem because most of the existing fabs are overbooked. The recent introduction of Taiwanese manufacturers into the Japanese-dominated LCD manufacturing industry was seen as being successful (read: profitable) when they reported yields of 50%. Japanese manufacturers typically produce devices at 70% yield efficiency. Since the glass substrates used to produce LCD devices are a little under a metre squared (600*700 mm), a single substrate should be able to produce 4 17" screens or 6 14" screens. It is generally accepted that LCD yields will never approach IC yields, which are typically around 90%. The upper theoretical limit for conventional LCD production is probably something like 80%.

    A 1 in 20 failure rate would be a 95% yield. So, in short, we're looking at a 3 in 10 failure rate in real life with a 70% yield.
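    To put those yield numbers in perspective, here is a tiny illustration in Python (the substrate cost is purely hypothetical; the panel count and yield figures are from above):

    panels_per_substrate = 4      # 17" panels cut from one 600x700 mm substrate
    yield_rate = 0.70             # typical Japanese fab yield cited above
    substrate_cost = 1000.0       # hypothetical processing cost per substrate, in dollars

    good_panels = panels_per_substrate * yield_rate      # expected good panels per substrate
    print(f"{good_panels:.1f} good panels, ${substrate_cost / good_panels:.0f} per good panel")
    # At 50% yield the same substrate gives only 2 good panels, so cost per panel rises to $500.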


    If you want to read the rest of my long-winded post, go on. If you've had enough, I suggest you go play your favorite video game.


    People don't realize how complex an LCD is. The traditional example is two glass substrates surrounding a layer containing the liquid crystals, which is itself sandwiched between two polarized layers. Then there is a backlight and an active matrix of transistors used to address each pixel. Because contaminants will kill pixels in the display, this layer must be filled under ultra-clean, high vacuum conditions. The glass substrates (AND every single layer of the LCD) must be manufactured to precise planar dimensions to prevent dimensional variations across the screen. Since larger substrates approach a square metre in size, this is not an easy task. Advanced LCD technology has taken advantage of polymer coatings on each layer to separate them, prevent contamination, act as internal reflectors, etc. This adds more layers, and adds more complexity to the process.

    The active matrix of transistors itself consists of a grid which is vapour deposited under ultra-high vacuum. This layer consists of at least three layers itself: anode, cathode, and at least one active layer, since active matrix LCD screens rely on field effect to twist the LCDs.

    Perhaps it is not evident to the reader that high vacuum and fast manufacturing processes don't exactly mix. Even if you achieve vacuum, you are basically racing against time to complete your manufacturing/analysis before residual impurities hopelessly contaminate what you are working on.

    This is why plasma and OLED displays (hopefully soon to be my research topic) are being pumped up with research dollars. The accepted theoretical limit (economically) for current LCD active matrix displays is 30", and the market is clearly looking towards very massive wall-mounted units in the future. LCD will probably dominate the POS, business and computing display markets as people become more enamoured with flat panel units, but due to sheer complexity, I think it's only a matter of time before they are eclipsed by either OLEDs or plasma.

    This is long. whew. sorry, guys. pixel density was increased as a function of more efficient transistor drives for each LCD cell. It's relatively easy to pattern pixel densities that small (look at CRT monitors), but with better field effect transistors, the ghosting and cross talk present in early displays was eliminated. This was accomplished by adoption of amorphous silicon in the transistor layer as opposed to the original CdSe thin films. That in turn was enabled through advances in physical and chemical vapour deposition which allows manufacturers to pattern the transistor matrix more precisely.

    That's the best I can do right now. I mainly regard LCDs wrt comparisons with OLEDs, which I am more familiar with, so I apologize for any screw-ups in here.
  • I think the problem is that it only gives the number of pixels, it doesn't describe the size of the monitor.

    1280x1024 pixels is more descriptive than 1.3 megapixels, IMO.

    Also, there was no reason for them to mention DVDs at all -- they could have just given the maximum resolution, which would have meant a lot more. I can guess 2880x1920 based on that information, but that's only 5.5 megapixels.

    Or am I missing something completely?
  • This is a link to the display subsystem that was powering that 9.2 million pixel display

    http://www-graphics.stanford.edu/papers/lightning2/ [stanford.edu]

    You'll be able to see it with some hopefully very cool imagery at this year's SIGGRAPH. It really has to be seen to be believed... beautiful!
