22" 9.2-Million Pixel Display 165
chrisd writes: "Just noticed this article over on Yahoo News. It described a research project that Intel and Stanford University developed that concentrated on next-gen displays. The result? A 22 inch display that displays 9.2 million pixels (they use the odious 'megapixel' descriptor in the article), needs 16 processors and 2 GB of RAM to run it, and costs $200,000 US. So it's a little spendy. This is a big step up from my first 12" amber screen though, that's for sure." Ah, the march of progress ... I'm happy with anything that will help drive down the cost of 17" and 18.1" LCD displays, no matter how indirectly.
Re:High-DPI monitors need resolution-independent G (Score:1)
9.2 megapixels.... (Score:1)
If the display is 4:3, you'd get:
3x*4x = 9.2e6
x = 876
4x = 3504
3x = 2628
3504x2628
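A quick sketch of that arithmetic (exact rounding varies by a pixel or two):

```python
import math

# Solve 3x * 4x = 12x^2 = 9.2e6 for a 4:3 display
pixels = 9.2e6
x = math.sqrt(pixels / 12)
width, height = round(4 * x), round(3 * x)
print(x)                 # ≈ 875.6
print(width, height)     # 3502 2627
print(width * height)    # 9199754 — just under 9.2 million
```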
Been there, done that (Score:2)
We exploited the fact that existing LCD technology utilizes a semi-transparent membrane that produces no light of its own and is usually illuminated by way of a fluorescent light. We were able to introduce a very high resolution to these screens by actually arranging them one on top of the other, five LCDs deep. The whole setup relied on some very precise arrangement (each panel needed to be approximately 1/10 of a pixel width offset from the one beneath!). We ended up milling an enclosure with a high-tolerance CNC machine to even make the thing feasible. In addition, we eventually found that a fluorescent light source simply couldn't put out enough lumens to sufficiently illuminate the grid, so we switched to a mercury vapor setup. The energy efficiency in these things is atrocious but it really was the only thing feasible at the time.
Surprisingly enough, the physical setup was relatively simple compared to the interpolation and digital signal processing required to composite a standard VGA signal across an array of dozens of interlaced LCD displays! I attribute most of our success in this to the fact that Daron and I are both whizzes in QBasic. For those of you who don't know, QBasic is a low-level systems language specific to the windows API. Its one tremendous advantage is that it is native to the OS (operational system) and hence runs at what we call real-time kernel speed. I think this is really where MacOS and linux tend to fall down--they mostly rely on cross-platform languages like C, which, unfortunately, must perform to the "lowest common denominator." QBasic afforded us a modularity and granularity of performance such that I don't think a similar feat could have been pulled off in any other environment.
Re:Comparison with apple 22" cinema display (Score:2)
LCD polarization and 3D? (Score:1)
This would certainly be a big improvement over CRT displays and relatively bulky electronic shutter-glasses currently used for a lot of visualization work, across many industries.
I think such a technology, if made cost effective, could ultimately make a huge splash in the computer and consumer electronics industry (e.g. games, 3d visualizations, stereophotographs).
Anybody in the field around to comment?
-Isaac
Assume standard 4:3 (Score:1)
Then we're talking roughly 3500 x 2625 (9,187,500) (3502 x 2626 = 9,196,252 is the closest you can come without exceeding 9.2 million).
Close enough. And probably accurate, since nobody in the industry actually posts real numbers anymore.
Chas - The one, the only.
THANK GOD!!!
Re:Two questions (Score:2)
I could see this used for film resolution work as well. It's close enough to film rez specs that I tend to think that's the market the researchers are targeting in the near term.
Re:Megapixel (Score:1)
They're probably counting the separate red, green, and blue elements. These are separate pixels, as they are offset from each other. A special filter, known as a "demosaicing" filter, is required to correctly merge these offset images into the Y/Cb/Cr planes that are encoded in the final JPEG. The ratio isn't a factor of 3 as you might expect, as mosaic patterns tend to use more green than red or blue. (The patterns I've seen have 5 green to 2 red and 2 blue.)
This page [stanford.edu] describes mosaicing to some extent, in the context of reverse engineering a Kodak digital camera. This page [stanford.edu] at the same site offers some other details. And Ron Kimmel's page [technion.ac.il] has some neat pictures showing different artifacts due to poor de-mosaicing.
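As an aside, here's a minimal demosaicing sketch. It assumes the standard RGGB Bayer layout (2 green per red/blue, rather than the 5:2:2 pattern mentioned above) and simple neighbor averaging rather than a production-quality filter:

```python
def bayer_channel(x, y):
    """Which color the sensor records at (x, y) in an RGGB mosaic."""
    if y % 2 == 0:
        return 'R' if x % 2 == 0 else 'G'
    return 'G' if x % 2 == 0 else 'B'

def demosaic(mosaic):
    """Fill in missing color samples by averaging same-color neighbors."""
    h, w = len(mosaic), len(mosaic[0])
    out = [[[0, 0, 0] for _ in range(w)] for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for ci, c in enumerate('RGB'):
                if bayer_channel(x, y) == c:
                    out[y][x][ci] = mosaic[y][x]
                else:
                    # average the neighboring sensels that did record c
                    vals = [mosaic[ny][nx]
                            for ny in range(max(0, y - 1), min(h, y + 2))
                            for nx in range(max(0, x - 1), min(w, x + 2))
                            if bayer_channel(nx, ny) == c]
                    out[y][x][ci] = sum(vals) // len(vals)
    return out

# A flat gray scene: every sensel reads 128, so demosaicing should
# reconstruct (128, 128, 128) everywhere.
flat = [[128] * 4 for _ in range(4)]
rgb = demosaic(flat)
print(rgb[1][1])   # → [128, 128, 128]
```

Real demosaicing filters are far more careful about edges; this only shows why the raw sensel count overstates the "true" pixel count.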
--Joe--
Re:High-DPI monitors need resolution-independent G (Score:3)
Hey, buddy, you misspelled "Apple," there. OS X is thoroughly vector-based.
--
Megapixel (Score:2)
Display and Video Card (Score:2)
Even though video cards have gotten more powerful in recent years, it has generally been in the 3rd dimension, rather than the first two where most of us spend our lives working. For instance, most people who work with Intel-based hardware will only consider nVidia video cards. But guess what? My ATI Radeon card is FASTER in 2D display. It is nice that you no longer need an expensive fast video card to work in Photoshop. But lack of competition in the world of computing....is a strange thing.
Apple's Display PDF (Quartz?) standard in Mac OS X may actually be the first thing in a while to push 2D acceleration forward.
Electrophoretic displays (Score:2)
It is based on the principle of electrophoresis - moving microscopic particles with electric charge while suspended in opaque fluid. When the particles are moved to the front they are visible, when they are pulled back they are obscured by the fluid.
Since the particles have the same density as the fluid they don't drift. The image stays even after you turn off the power. Power is only required to modify the image, resulting in extremely low power consumption. No backlight is required - the display is reflective and has very high contrast, supposedly comparable to ink on paper. It is clearly visible in full sunlight.
Since the image remains without refreshing it is not necessary to update the image tens of times per second just to get a stable image. This makes it easier to reach very high resolutions (>200DPI). This technology may not be appropriate for video, but it should be ideal for e-books.
If you look at their site [copytele.com] you will see that they now sell encryption devices. This move is relatively recent. In their press releases [copytele.com] you can see that they are still working on their display technology. I first heard about them in a Popular Science article in 1985(!). I'm still waiting for them to bring this display to market...
-
Re:Electrophoretic displays (Score:2)
Their Magicom 2000 product is a clever screen phone/paperless fax device that allows faxing a document and scribbling on it while talking. It's a nice product, but is that the right product to launch a revolutionary display technology? Consider the fact that both parties need to have these devices for it to be useful.
-
For once a post that's not angry and bitter (Score:1)
For a tech article, I was kind of disappointed that they got DVD wrong. DVD doesn't stand for "Digital Versatile Disc" (though it may have at one point). DVD doesn't stand for anything.
you don't need $200k to drive 9.6mp (Score:3)
I'm a bit disappointed in their manufacturing though. I have one born in Oct 2000 and another born in March 2001. Unfortunately the older one has a different white point and neither have hardware color temperature adjustment. But it was the low cost that allowed me to get away with having a 2560*1024 (soon to be 2560*2048!) flat panel desktop, so I can't complain too much.
Now if you want to display a bunch of DVDs on those monitors, you're going to need a heck of a lot more bandwidth than the PCI bus can provide you. (The G200MMS is a PCI card.) And you'll need something to decode with too. But even if you didn't read the article, you should know that you don't need that kind of cash to get 9.6 million pixels up on a monitor, as long as the pixels aren't doing much.
Re:Comparison with apple 22" cinema display (Score:1)
Just wait -- similar displays at similar prices will be mass-market within a year.
--
Mexapixel (Score:1)
Re:Comparison with apple 22" cinema display (Score:2)
Realistic Maths, Please (Score:1)
I guess I really shouldn't be surprised at Slashdot readers having no understanding of basic optics. I note that the parent post has been modded 'Flamebait' as well.
I don't know if the unit is 17,000 point sources, but I would read that as per 'square inch', so your mathematics should be (16 * 12 * 17000) = 3,264,000 - or less than 9.2 megapixels.
I remember reading a few years ago (can't remember the cite) that about 3000 by 3000 pixels was the maximum that a human eye could discern (no matter what the distance) while still seeing the entire screen at once. Of course a bigger screen close up would look blocky, but you couldn't view it all without getting further away anyway.
This was based on calculations which said that (from memory again) humans can see an arc of about 10 degrees at once, and have a certain number of photo-receptors per degree of arc in the eye. Once you get one pixel per photo-receptor, that's about the limit. Make it one and a half for smoother blending - and take into account that imperfections in the lens of the eye will blur it more anyway.
Re:Two questions (Score:1)
It can play synchronised movies at that resolution, or generate real time graphics at that resolution.
http://www.sgi.com/features/2000/feb/hayden/ind
Re:Megapixel (Score:2)
"16 DVDs (digital versatile discs) simultaneously in 720 by 480 pixel mode." The number 16 can only mean a 4x4 or 2x8 partitioning, with the latter being a rather ridiculous aspect ratio. Therefore, the display is around 2880x1920. So I'd guess they refer to around five million pixels. This is further reinforced by the fact that the article is down on 2 megapixel displays while portraying 10 megapixel ones as many years into the future.
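The tiling arithmetic behind that guess, as a sketch:

```python
# 16 DVD streams at 720x480, tiled 4x4 (the plausible partitioning)
tiles_x, tiles_y = 4, 4
w, h = 720 * tiles_x, 480 * tiles_y
print(w, h)       # 2880 1920
print(w * h)      # 5529600 — about 5.5 million pixels
```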
Yum. (Score:3)
--
Display PostScript Anyone? (Score:1)
I would be much more inclined to use a monitor like this if it used Display PostScript. They should really incorporate a custom DPS RIP (normal PS RIPs already exist, so this isn't much of a stretch). A mass produced DPS RIP would certainly cost less than 16 P4s and it wouldn't even require a bus the size of Utah between the computer and the display.
My 2 cents.
Aww shit... (Score:1)
Why 'odious'? (Score:2)
Surely you don't think we should go around talking about how our floppies can hold one million four hundred forty thousand bytes of data, do you?
Kevin Fox
--
Re:It's big, but not cheaper per pixel (Score:2)
I try, but 8% per year is glacial compared to the rates for ICs and for magnetic/optical (take your pick) storage technology. Believe it or not, your average "good" PC system used to be equal parts (in cost) of RAM, mobo, disk, and display. Now, for systems that we'd really want, it's better than 50% display (thinking along the lines of a big Apple flat panel and even the most tricked out consumer-level PC).
Under these conditions, patience is hard.
Re:Comparison with apple 22" cinema display (Score:2)
For example, some folks might find a 22" monitor more handy than a 17" monitor, regardless of the dpi.
Re:High-DPI monitors need resolution-independent G (Score:2)
all available microsoft oses use bitmap graphics, which are resolution dependent.
what you're looking for is vector graphics, which are used to a large extent in apple's mac os x, and to a minor degree in gnome.
i would say microsoft are actually lagging in this race, unless they've got some vector tricks up their sleeves in windows xp (which i haven't played with yet, so can't comment on).
matt
Re:Megapixel (Score:2)
That's 12 megapixels something like 20 years ago.
Of course it was monochrome storage tube technology.
Long overdue: doesn't follow Moore's law, for sure (Score:2)
At the same time, we've seen incredible increases in processing power, from a 10 MHz to a 1 GHz CPU. In graphical terms, we should have a 3D holographic display on our desktops by now.
I personally blame the CRT manufacturers for this extreme lag in display technology. They've really controlled the prices and their technology hasn't increased significantly.
Then there is the industry that placed all its bets on CRT technology for the longest time. But, considering what we've seen in the recent past, I think we're going to get a surge in display technology. CRT technology is increasing in quality, and alternative technologies like LCD are becoming mainstream.
Here's to graphics display research.
Re:Stanford project page (Score:2)
The paper on the Lightning-2 hardware that drives the monitor is here [stanford.edu].
See also WireGL [stanford.edu] and its associated paper [stanford.edu], which was used to run Quake3 on the IBM display.
Re:On the subject of monitors (Score:1)
Re:On the subject of monitors (Score:1)
On the subject of monitors (Score:2)
Anyways - now we are on the topic of monitors - has anyone in the slashdot crowd had any experience with the SGI flatscreen monitor? http://www.sgi.com/flatpanel/ - I am very interested in picking one up.... but I am running Windows 2000 and am not too sure of support for it / video cards? Anyone able to shed some light on how good / incompatible this sexy monitor is?
Re:Possible applications (Score:1)
Current monitor technology (particularly LCD) has certainly not reached a pixel density which surpasses human perception. People tend to run current (CRT) displays between 80ppi and 150ppi (and 150 is being generous...it takes a pretty darn high-quality CRT to be able to not have fuzzy pixels at that size).
Doing a little math (assuming a wide-screen 3:2 aspect ratio and 22 viewable inches) this screen would be roughly 18.3"x12.2", with a resolution of 3715x2477. This yields a density of around 203ppi...certainly not excessive.
To prove to yourself that this is a necessary improvement, go create a bitmap (not grayscale) image in Photoshop of some text and print it out on your company's laser printer at 100dpi. Then create the same-height text and print at 200dpi. Stick these side-by-side on your monitor. Unless your monitor is farther away than most, you will find that you can certainly tell the difference, that the text at 200dpi is much easier to read, and that even then it doesn't quite look nice and smooth.
Text on a 200ppi monitor does have the advantage that it can use anti-aliasing to help simulate greater resolution, but in general more pixels is better (until you reach the limit of human perception). 1200dpi laser printers are needed because their dots are single-shade, and also because printed output tends to be held closer to the face.
Exercise for the reader--search the 'net to find the smallest-possible viewing angle that the average person can discern a dot. Combine this with the average distance of a monitor (a little over an arm's length?), sprinkle with trigonometry, and you should come up with the pixel density our displays should reach for.
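A rough sketch of that exercise, assuming (rather than measuring) an acuity of about 1 arcminute for the smallest discernible dot, and treating the viewing distances as illustrative round numbers:

```python
import math

# Pixel density needed so one pixel subtends the smallest angle a
# typical eye resolves. 1 arcminute is an assumed round figure.
def required_ppi(distance_in, acuity_arcmin=1.0):
    pixel_in = distance_in * math.tan(math.radians(acuity_arcmin / 60))
    return 1 / pixel_in

print(round(required_ppi(28)))   # ≈ 123 ppi at a bit over arm's length
print(round(required_ppi(12)))   # ≈ 286 ppi held close, like a printed page
```

Sharper acuity estimates (half an arcminute, say) double those figures, which is roughly where the "200ppi monitor, 300dpi page" rules of thumb come from.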
Addendum: file for printing (Score:1)
At the risk of being slashdotted, I've created such a file for you. It's a 2400x900@400ppi PNG file with the same non-anti-aliased text at 100ppi, 200ppi, and 400ppi. Photoshop knows the file is saved at 400dpi when you open it, but before you print it from another program, make sure it's not about to print out at 72dpi.
http://phrogz.net/tmp/res.png [phrogz.net] : (48.3k)
So... (Score:5)
What's that you say? It doesn't exist? Well, what's this contraption good for then?
cheers,
mike
Re:Why not? (Score:1)
It's a demo. Make the product look good by having it do something outrageous. Think of a sports-car commercial, they show the car going very fast through tight curves and slaloms and what not. They're not suggesting you buy their car to do the same thing (in fact they usually have disclaimers at the bottom: "Closed track, professional driver" etc), but they're making the car look good by driving it outrageously.
Re: (Score:2)
Re:Comparison with apple 22" cinema display (Score:1)
So what, theoretically, would limit yield to 80%?
Re:Comparison with apple 22" cinema display (Score:1)
The Hercules Prophet II adapter with DVI out and a DVI monitor are a great combination in Windows or your favorite free OS! Haven't tried a Prophet III yet, but it should work.
Even if you can only get the ADC version (current model) Dr.Bott have an adapter [drbott.com] made for people with older G4 Macs that will turn it back into a DVI/USB/Power device. Or, Gefen [gefen.com] have a DVI-ADC box that includes the power supply.
FYI (Score:1)
The human eye can only interpret a certain amount of pixels in the first place so this is extremely obsolete to anyone in the world. While some scientist may beg to differ on end no one would benefit from this whatsoever. Well actually Intel would since they'd be suckering someone into spending 200k on this monitor. I'll pass sirs.
The Joy of Visual Perception [yorku.ca]
missing the point (Score:2)
Just because that's a perceived maximum doesn't mean you'd be able to determine this unless you sat in front of your monitor with a magnifying glass. These arguments are always being thrown into the loop with CCDs and crap like that.
Why faster 2D remains irrelevant (Score:2)
3D cards have continued to push the graphics memory bandwidth curve upwards, so I'm not too concerned that 2D cards "won't be fast enough." Cards nowadays can draw 2D graphics to the frame buffer about 10-1000x faster than the displays can support them using BitBlt and similar 2D fill operations. (If you don't believe me, go look at the megapixel fill rates of 2D and 3D cards.) Which brings me to my next point.
There's no competing on 2D because it's irrelevant to the consumer. How so? Well, what's the difference between 500 fps 2D and 1000 fps to the naked eye? Answer: nothing; both are faster than any display device (monitor) can support.
Sometimes marketeers try to promote (with the help of ignorant journalists) some artificial 2D performance benchmark comparisons, but I haven't seen any in the last 5 years that made a smidgen of difference in ordinary or even extraordinary human usage scenarios.
I'll agree that Display PDF (and its predecessor Display PostScript) does soak up lots of horsepower, since to display anything means you have to execute an arbitrary program written in a variant of the PostScript language. It'll be interesting to see if any graphics cards attempt to accelerate it rather than just tossing that job to the CPU. Display PDF could make it easier to write graphics applications since it provides a nice library of 2D vector and fill operations, and I'm glad Apple is licensing rather than reinventing the wheel, but you'll never see it take off on the PC because Microsoft is intent on retaining control of those operations within ActiveX.
--LP, formerly a graphics hardware analyst
Re:22 inch monitors (Score:2)
Making geometries better with CRTs is an incredibly difficult technical challenge, but trivial with LCDs. If you are really picky about geometry and the newer flat-screen CRTs still aren't good enough, I'd recommend moving to an LCD display.
--LP
Old news (Score:1)
Having seen the demo I can tell you this thing is amazing. Here's the scene whenever they demo the system at a trade show:
IBM Guy stands by display smiling as a slideshow of photographs, maps, and newsprint flashes across the screen.
Techies walking by are inexplicably drawn to the display, much drooling ensues.
IBM Guy explains that it takes tons of processing power to drive the damn thing.
Techies ask how much it costs.
IBM Guy says something like "This particular prototype cost $2 zillion."
Techies drool
IBM Guy prepares for the next question, which everyone, without fail, asks...
Techies ask, "Can I have it?"
IBM Guy smiles and says "No" instead of something clever like, "Gee, that's the first time I've heard that one. Have you considered a career in comedy?"
Techies go home with visions of megapixels dancing in their heads, and eventually, one special techie posts his observations, finally overcoming the embarrassment of realizing that his clever line, "Can I have it?", was not so clever after all.
Two questions (Score:2)
1) How long will it take to see I/O bandwidth improve to where it can handle real time streaming of such huge pictures from storage.
2) Is there any hardware out there right now that can handle such high I/O bandwidths?
Re:Megapixel (Score:1)
I'm already sick of it and its been only 15 minutes.
Such a small screen (Score:1)
quick calculation (Score:1)
No one (besides little Johnny) is working on... (Score:1)
Indeed, the other kids have finished theirs long ago, and are in the streets playing ball.
Re:Megapixel (Score:1)
Re:How would sound work? (Score:1)
Re:On the subject of monitors (Score:1)
For a while we had one in the lab where I work running off a P-III / G400 running Linux. So I suspect pretty much any PC will work fine. But the machine is now a server so we've got this really nice monitor just sitting around doing nothing.
But yes, those things are very pretty (and very expensive, last I looked).
Re:On the subject of monitors (Score:1)
Yeah, right dude. If they ever just toss that thing out, I'm getting dibs.
Re:Two questions (Score:4)
Quite a while - a gig per second is way more than even 66 MHz 64-bit PCI can handle (I believe that will only go up to ~600 (?) MB/s). Also, there really isn't much need for it. I mean this monitor is nice and all, but with a 20K price tag and no real applications for it (for the average user, anyway - maybe this would be useful in some 3d modeling scenarios, perhaps), I doubt we'll be seeing technology to support this kind of thing in even high end workstations for at least a few years. Both Intel and AMD are working on new buses to replace PCI so maybe that problem (the bus) will go away fairly soon. Actually getting a gig per second off a disk - that could take a while.
2) Is there any hardware out there right now that can handle such high I/O bandwidths?
Surely. But once you're looking for that, you're talking about big machines with a lot of hardware RAID. I'd bet a E10K or S/390 could handle that kind of I/O. If you meant something you could actually afford - probably not. ^_^
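For what it's worth, the theoretical peak of 64-bit/66 MHz PCI works out a bit below the ~600 MB/s figure quoted above:

```python
# Peak theoretical throughput of 64-bit / 66 MHz PCI: one 8-byte
# transfer per clock, with no protocol overhead counted.
bus_bits = 64
clock_hz = 66e6
peak_bytes = bus_bits / 8 * clock_hz
print(peak_bytes / 1e6)   # 528.0 MB/s — indeed well short of 1 GB/s
```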
Stanford project page (Score:5)
Very cool stuff that's just starting to get commercialized -- this is what you'll be seeing in your GeForce 4s or whatever.
Re:High-DPI monitors need resolution-independent G (Score:1)
-----
"Goose... Geese... Moose... MOOSE!?!?!"
Re:Megapixel (Score:1)
Thank You. (Score:1)
Although, I still might migrate to the cube, because I like the idea of a totally silent system.
Re:High-DPI monitors need resolution-independent G (Score:1)
GDI+ (in WinXP) adds better alpha support, amongst other things, which Quartz already does. Haven't programmed in Quartz so I can't compare directly though.
Re:Superman said... (Score:1)
My friend's computer is called mxyzptlk. Try hacking that!
$ nmap mtzplyk
Failed to resolve given hostname/IP: mtzplyk.
$ ping mztlplkt
ping: unknown host mztlplkt
--
Why 2 GB? (Score:2)
With a colour depth of 24 bit/pixel, 9.2 Mpixel require below 28 MB. In 32 bit mode, it's well below 40 MB.
So the conclusion would be that this beast must have something like 1780 bits per pixel - WOW, that's a hell of a lot of colours!
(Yes I know what e.g. texture memory is good for.)
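The framebuffer arithmetic above, spelled out (bare byte counts only; a real system also burns memory on textures, Z-buffers, scene data, and so on):

```python
# Framebuffer memory for 9.2 million pixels at common color depths
pixels = 9.2e6
mb_24 = pixels * 3 / 2**20   # 24 bits = 3 bytes per pixel
mb_32 = pixels * 4 / 2**20   # 32 bits = 4 bytes per pixel
print(round(mb_24, 1), round(mb_32, 1))   # ≈ 26.3 MB and 35.1 MB

# And the joke calculation: 2 GB spread evenly over every pixel
bits_per_pixel = 2 * 2**30 * 8 / pixels
print(round(bits_per_pixel))   # ≈ 1867 bits per pixel
```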
Re:On the subject of monitors (Score:2)
If you want compatibility, go for the multi-link adapter (MLA) - then the range of cards available goes up ... ignore the SGI superwide-savvy page, it's useless and out-of-date. And the MLA lets you enjoy fullscreen gaming, irrespective of the resolution used - it just scales the image accordingly (I don't even notice that Q3A looks a little stretched anymore ;-)
Browse - comp.sys.sgi.hardware plenty of user experience in there.
Briefly, the following seem to work best (assuming you want DVI output - if not, pretty much anything will do) ....
.....
Asus v7100DVI (GeForce2MX)
Herc ProphetII Ultra DVI (GeForce2 Ultra)
Herc ProphetIII DVI (GeForce3 Ultra)
Matrox G400 DVI
Matrox G450 VGA
ATI Radeon AIW DVI
And prolly others I've forgotten
Pretty much all of these work in any flavour of windows. XFree support is available, but you'll need to work on the config file to get the timings right (ask on the newsgroup, someone will help you out).
Key thing is 1600x1024 resolution support in the drivers; and the bandwidth of the TMDS for DVI output (look for an external SIL164 on NVidia parts - otherwise forget it).
Cheers, Carl.
Re:Comparison with apple 22" cinema display (Score:2)
It's big, but not cheaper per pixel (Score:3)
Incidentally, the reason displays don't obey Moore's Law, even though they're made using photolithography, is that making transistors smaller doesn't help displays.
As the article points out, displays are getting cheaper at about 8% per year. That's a lot better than almost any other non-IC product. Be patient.
If you're near SF, incidentally, visit the Sony Metreon, which is being used to show off Sony flat panel displays of various flavors. They're everywhere. Little ones. Big ones. LCDs. Plasma panels. No Jumbotron, though.
Re:home theater system (OT reply) (Score:2)
22 inch monitors (Score:2)
Nathaniel P. Wilkerson
Re:High-DPI monitors need resolution-independent G (Score:2)
In any case, as http://developer.apple.com/quartz [apple.com] says, "Quartz supports PostScript®-like drawing features such as resolution independence, transformable coordinates (for rotation, scales, and skews), Bézier paths, and clipping operations."
That's a lot more than what Windows offers. So while I don't know for a fact that the widgets are resolution-independent, the underlying graphics API certainly is.
Re:High-DPI monitors need resolution-independent G (Score:3)
Re:Megapixel (Score:5)
It's annoying to those of us who realize that the "megapixel" number rises faster than the dimensions of the display. The dimensions are linear, and increase linearly. The megapixel number is a product, and thus increases at a rate proportional to the square. It's pure marketspeak and psychological manipulation. They figured that idiots would go "ooooohhh look at all those megapixels". Meanwhile, those of us who know better have to guess the aspect ratio, and back it out to find out what we really want to know.
The display in question, were it square, would be roughly 3033 pixels on a side because that's the square root of 9.2 million.
Now... what's the aspect ratio... umm... the article doesn't say. So... Let's say the horizontal resolution is 4096, then the vertical could be 2246. 4096*2246=9199616. Close enough for government work.... But... We just don't know. That, my fellow Slashdotter, is why "megapixel" is annoying.
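The guessing game described above can be sketched as a small helper; the aspect ratios tried here are assumptions, since the article gives none:

```python
import math

# Back a width x height pair out of a pixel count plus an assumed
# aspect ratio: pixels = (aw*u) * (ah*u), so u = sqrt(pixels / (aw*ah)).
def dimensions(pixels, aspect_w, aspect_h):
    unit = math.sqrt(pixels / (aspect_w * aspect_h))
    return round(aspect_w * unit), round(aspect_h * unit)

print(dimensions(9.2e6, 1, 1))    # square: (3033, 3033)
print(dimensions(9.2e6, 4, 3))    # 4:3:    (3502, 2627)
print(dimensions(9.2e6, 16, 9))   # 16:9:   (4044, 2275)
```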
"Hoarders... cannot help their neighbors" --RMS
Re:Possible applications (Score:2)
When these things go consumer level, I'll be the first to pay up (Even if it's up to $10k) - I don't think you really understand how fucking incredible this would look.
Possible application: 35 MM (Score:3)
At 9 megapixels, you are at 35mm photography resolution (assuming enough color depth - between 16 and 24 bit, if memory serves). With this display, the need for chemical photography evaporates for all but niche applications. True, for motion, this is complete overkill. The human eye/mind would be hard pressed to absorb this many pixels at 24 frames/second.
Think big: put two dozen in your house, network them, and every day you could live in a different world-class museum. Or have your photo album available via voice command instead of having to get it off a dusty shelf. Or be surrounded by stunning, high-production-value art. Or porn. Or whatever.
The researchers reached for 9 megapixels for a definite reason: 150 years of chemical photography says this is a resolution people really like for still pictures.
Re:FYI (Score:2)
Re:High-DPI monitors need resolution-independent G (Score:2)
Re:High-DPI monitors need resolution-independent G (Score:2)
Re:High-DPI monitors need resolution-independent G (Score:4)
Re:17000 * 17000 = 289,000,000 (Score:2)
The maximum resolution of the human eye has no relation to the size of the viewed object, but instead is related to the size of the retina (where the image of what a person sees is projected and transformed into nervous impulses).
Now, if you get close enough to the monitor so that the distance between it and your eye is the same as the distance between the front of your eye and your retina (about 1"), then you have a 1:1 relation and you can apply that rule. On the other hand, even if you could focus your eye at such a short distance, having a 16"x12" display would still be incredibly useless ...
Re:Why? (Score:4)
Comparison with apple 22" cinema display (Score:5)
One of the largest problems facing manufacturers of large-dimension LCD screens is the high rate of failure during the manufacturing process. This means that each batch of fabricated monitors yields a low number of functional units, driving up the cost per unit. I wonder how the researchers were able to combat this, while at the same time increasing the pixel density by 7x?
Can someone who's more familiar with the industry give approximate numbers on the failure rate of LCD manufacturing? Are we talking 1 bad screen in 20, 1 bad in 2000, or...?
Re:So... (Score:2)
Superman said... (Score:5)
If you can somehow trick them into saying "mexapixel" backward, they'll be sent back to their own dimension. Maybe we can grab their $200,000 display as they're fading away!
Them: "Lexipaxem...Oh, crap!"
Us (grabbing display): "See you at the pawn shop...SUCKER!"
Re:A good step, but not there yet... (Score:4)
It's useless going past about 300 dpi on a black and white (two-color, no-grayscale) printed page as that is the limit of the human eye's resolution at about 10 inches. The printer needs that extra resolution to dither for grayscale and color. With a monitor, 10 inches is pretty close, back off to 15 inches and you only need 200 dpi. No dithering necessary so that's it, you're done, optimal display. (Actually this is generally overkill because the eye's resolution gets worse for things that aren't just black-on-white, but if the display will be used a lot to just read text like the black on white here on slashdot it's better to be safe.)
Just clearing up some misconceptions.
Re:And this matters because? (Score:2)
Re:Read that article? (Score:4)
Unadulterated CRAP (Score:2)
What is the point of running stories that are factually completely wrong? Just post the link to the research papers and move on.
What this moron seems to have picked up on is the Lightning 2 project, which basically is a video crossbar switch. They used this to combine the DVI outputs from a network of eight slaved 1.5 GHz P4 machines each with 256 MB RAM and a GeForce2 Quadro video card (starting to see where some of the numbers come from yet?) plus another machine as master running their own parallel rendering code over (probably) gigabit Ethernet. They tested Lightning 2 with the IBM "Big Bertha" 9.2 million pixel display.
They achieved frame rates of 12 Hz and 60 Hz (with and without depth buffer capture which requires the Z-buffer to be transferred from each PC to Lightning 2 as well as the actual rendered image segment) on a 1.16 million triangle model.
Lightning 2 itself has 512 MB of RAM and 23 Xilinx FPGA devices.
None of this has got anything to do with the actual display itself, which as someone else pointed out "only" needs about 40 MB of framebuffer storage; i.e. a 60 Hz data rate of about 19 Gbps, or 2.5 GB/s.
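The refresh-rate arithmetic above, using the same round 40 MB frame figure:

```python
# Data rate to refresh a ~40 MB framebuffer at 60 Hz
frame_bytes = 40e6
refresh_hz = 60
byte_rate = frame_bytes * refresh_hz
print(byte_rate / 1e9)       # 2.4 GB/s
print(byte_rate * 8 / 1e9)   # 19.2 Gbps
```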
Now can you PLEASE try and get things RIGHT in future.
17000 * 17000 = 289,000,000 (Score:2)
a typical person has a maximum resolution of about 17000 point sources per inch.
So, assuming for the sake of argument the monitor has a roughly 16"x12" viewable area, that gives (16*17000) * (12*17000) = 55,488,000,000 points, or 55.5 gigapoints, as the limit of human eye resolution for a screen of that size. That's several orders of magnitude over the announced 9.2 megapixel display, assuming that pixels are roughly equivalent to points.
Incidentally, putting that 9.2 megapixel value into more easily understandable terms gives a display size of roughly 3500x2600 (assuming a 4:3 display ratio). Good? Yes. Perfect? No.
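That conversion is just solving (4x)(3x) = 9.2e6 for x; here's the arithmetic as a sketch (`resolution_for` is a hypothetical helper, not anything from the article):

```python
import math

def resolution_for(megapixels, aspect=(4, 3)):
    """Width x height for a given pixel count at a given aspect ratio:
    w = a*x, h = b*x, so w*h = a*b*x^2."""
    a, b = aspect
    x = math.sqrt(megapixels * 1e6 / (a * b))
    return round(a * x), round(b * x)

print(resolution_for(9.2))   # roughly 3500x2600 at 4:3
```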
--
BACKNEXTFINISHCANCEL
My apologies. (Score:2)
Correct you are; my apologies.
Even so, I'm not convinced that a 3500x2600 display exceeds the limit of human eye resolution--though that may just be because I'm used to sitting a foot from my monitor...
--
BACKNEXTFINISHCANCEL
so completely misinformed (Score:2)
As far as I can figure, the display he's talking about is IBM's Big Bertha [ibm.com]. It was custom-built for Lawrence Livermore National Laboratory, and it runs at a native resolution of 3840x2400.
I saw a prototype of this display at Supercomputing 2000 in Dallas last year. It was running off of an IBM-brand Wintel system-- can't recall which one, an IntelliStation, I guess-- with four 1920x1200 graphics cards. The monitor was stitching the four images together seamlessly.
According to rumor they hooked it up to their bigger iron from time to time, but when I saw it, it was running NT.
So I don't know *where* the author got his "it takes 16 CPUs and costs $200,000" stuff. Hell, LLNL only paid $80,000 for the prototype-- see this [fcw.com] Federal Computer Week article. According to the IBM guy I talked to at SC2000-- although I can't seem to find a confirmation of this in writing anywhere-- when the monitor is commercialized sometime this year, they're expecting to sell it initially for about $20,000. One too many zeros, Doug. ;-)
And the obligatory remark: yeah, it's an incredible display. Like reading a newspaper-- effectively about 200 ppi. But for any traditional computer application, it's not really practical. Once you get past the "wow" factor, this thing really lives up to its nickname: the IBM Squintron 2000.
Re:Two questions (Score:2)
Actually, the not-uncommon 1080p resolution (1920x1080@60p) is an awfully good compromise for film work. Traditional film work is done at 2K resolution-- 2048 pixels across by however-many lines, depending on your aspect ratio. Some 4K work is done, but very, very rarely.
With high-definition telecines and recorders becoming more common, many post houses are doing their film finishing in 1920x1080, 24-frames-per-second progressive-scan. The data rate is a lot lower than 2K, and a HELL of a lot lower than 4K, and you can display your work on commercially available HDTV broadcast monitors-- the 32" model from Sony comes in well under $100,000.
Re:Two questions (Score:2)
Oh, you'd be surprised. You can do HDTV editing-- 1080i resolution-- with just 32 Seagate fibre channel disks on four FC loops. That's enough bandwidth to do RGB HDTV (186 MB/s, more or less) times two. (Gotta do two for real-time dissolves and such.)
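The ~186 MB/s figure falls out of the frame geometry, assuming 8-bit RGB and 1080i's 30 full frames per second:

```python
# Uncompressed RGB HDTV data rate at 1080i (60 interlaced fields
# = 30 full frames per second, 8 bits per channel).
width, height = 1920, 1080
bytes_per_pixel = 3          # 8-bit R, G, B
frames_per_sec = 30

rate = width * height * bytes_per_pixel * frames_per_sec
print(rate / 1e6)            # ~186.6 MB/s; double it for real-time dissolves
```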
To do more I/O is really just a matter of buying more disks.
What's that you say? You need your operating system to be able to efficiently balance multi-gigabit-per-second I/Os across several I/O channels on several busses? Oh, that's a software problem. I believe the question was originally about hardware. ;-)
3840 x 2400 (was Re:Megapixel) (Score:4)
I would say RTFA, but the FA completely failed to link to any relevant info. I had to search myself; IBM's page is here [ibm.com].
not really -- compare vs. print (Score:4)
Think about print. Early laser printers were 300-dpi resolution, which is pretty good -- good enough for entry-level, mass-market desktop publishing -- but is still quite low by traditional printing standards. The professional (print) graphics shop I used to work for used to run its phototypesetters ($100,000 behemoths) at 1,200 and 2,400 dpi. (And that's in both dimensions, so 1,200 dpi output has 16 times the information content of 300 dpi output.)
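The "16 times the information content" factor comes from dpi applying in both dimensions, so information content scales with the square of linear resolution:

```python
# Dots per square inch scales as dpi squared, so quadrupling the
# linear resolution (300 -> 1200 dpi) gives 16x the dots.
def dots_per_sq_inch(dpi):
    return dpi * dpi

print(dots_per_sq_inch(1200) // dots_per_sq_inch(300))  # 16
```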
600 dpi print is pretty high-quality and acceptable to most applications, particularly with the availability of resolution enhancement, which is roughly the printer equivalent of anti-aliasing. And personally I couldn't tell any difference going past 1,200 dpi.
Anyway, my point is that even 200-300 dpi isn't as good as we might really hope for. Still, it's a vast improvement over the resolution of today's displays. I hope this will have an impact on the well-documented fact that people read (current) computer screens more slowly than printed material and find them more uncomfortable. There's actually a lot of subtle but very helpful detail contained in type that gets lost at lower resolutions such as those used by most displays today.
In a way, graphics are more forgiving than text, because the human eye-brain combination is pretty good at interpolating what's intended if there is sufficient resolution or sufficient color depth. Of course, artifacts can be pretty nasty too -- we've all seen bad cases of the jaggies. But the bottom line is, I think if you ever get a chance to actually see a display this good you won't doubt that the extra pixel density is wasted.
Read that article? (Score:2)
I think they're talking about a setup that would run those 16 DVDs at once costing that much. Normal use as, say, a computer monitor would only require the one PC-- and the cost would be a LOT cheaper... although they don't mention the cost of the monitor itself.
From the wording, I'd say it'll be in the neighborhood of at least $2000-4000:
"In two years, we want to see a 27-inch monitor able to display two pages side by side for $2,000," Leglise said. "And in five years, we want to see a 10 megapixel display for $2,000."
Re:Display PostScript Anyone? (Score:2)
Re:On the subject of monitors (Score:2)
(No, I don't know anything about its goodness, but that link should tell you all you need to know about compatibility)
Possible applications (Score:3)
Re:Comparison with apple 22" cinema display (Score:5)
A 1 in 20 failure rate would be a 95% yield. So, in short, we're looking at a 3 in 10 failure rate in real life with a 70% yield.
If you want to read the rest of my long-winded post, go on. If you've had enough, I suggest you go play your favorite video game.
People don't realize how complex an LCD is. The traditional example is two glass substrates surrounding a layer containing the liquid crystals, which is itself sandwiched between two polarized layers. Then there is a backlight and an active matrix of transistors used to address each pixel. Because contaminants will kill pixels in the display, this layer must be filled under ultra-clean, high vacuum conditions. The glass substrates (AND every single layer of the LCD) must be manufactured to precise planar dimensions to prevent dimensional variations across the screen. Since larger substrates approach a square metre in size, this is not an easy task. Advanced LCD technology has taken advantage of polymer coatings on each layer to separate them, prevent contamination, act as internal reflectors, etc. This adds more layers, and adds more complexity to the process.
The active matrix of transistors itself consists of a grid which is vapour deposited under ultra-high vacuum. This layer consists of at least three layers itself: anode, cathode, and at least one active layer, since active matrix LCD screens rely on field effect to twist the LCDs.
Perhaps it is not evident to the reader that high vacuum and fast manufacturing processes don't exactly mix. Even if you achieve vacuum, you are basically racing against time to complete your manufacturing/analysis before residual impurities hopelessly contaminate what you are working on.
This is why plasma and OLED displays (hopefully soon to be my research topic) are being pumped up with research dollars. The accepted theoretical limit (economically) for current LCD active matrix displays is 30", and the market is clearly looking towards very massive wall-mounted units in the future. LCD will probably dominate the POS, business and computing display markets as people become more enamoured with flat panel units, but due to sheer complexity, I think it's only a matter of time before they are eclipsed by either OLEDs or plasma.
This is long. Whew. Sorry, guys. Pixel density was increased as a function of more efficient transistor drives for each LCD cell. It's relatively easy to pattern pixel densities that small (look at CRT monitors), but with better field-effect transistors, the ghosting and crosstalk present in early displays were eliminated. This was accomplished by the adoption of amorphous silicon in the transistor layer as opposed to the original CdSe thin films. That in turn was enabled through advances in physical and chemical vapour deposition, which allow manufacturers to pattern the transistor matrix more precisely.
That's the best I can do right now. I mainly know LCDs with respect to comparisons with OLEDs, which I am more familiar with, so I apologize for any screw-ups in here.
Re:Megapixel (Score:2)
1280x1024 pixels is more descriptive than 1.3 megapixels, IMO.
Also, there was no reason for them to mention DVDs at all -- they could have just given the maximum resolution, which would have meant a lot more. I can guess 2880x1920 based on that information, but that's only 5.5 megapixels.
Or am I missing something completely?
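For what it's worth, the 2880x1920 guess presumably comes from tiling 16 DVD streams in a 4x4 grid at DVD's native 720x480 (my reconstruction, not anything stated in the article):

```python
# 16 DVD streams tiled 4x4 at DVD's native 720x480 resolution.
dvd_w, dvd_h = 720, 480
grid = 4
w, h = dvd_w * grid, dvd_h * grid

print(w, h, w * h / 1e6)   # 2880x1920, about 5.5 megapixels
```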
Lightning 2 (Score:2)
http://www-graphics.stanford.edu/papers/lightning
You'll be able to see it, with some hopefully very cool imagery, at this year's SIGGRAPH. It really has to be seen to be believed... beautiful!