New Display Technology to Compete with LCDs? 368
NetRanger writes "C|Net's News.com has a really interesting article about a new display technology that is based on interference of light patterns. The company, Iridigm, has a very compelling case for why their display method is far superior to LCD, including far brighter displays, far less power consumption... but the cool thing is that the display actually works like RAM (it retains its state until voltage is applied to reset it) -- so what do you see when the driver crashes?"
PSOD? (Score:2, Funny)
Porn Screen Of Death?
Re:PSOD? (Score:2, Funny)
We call it.... "Pause"
Re:PSOD? (Score:4, Funny)
More like Porn Screen Of Divorce in my case...
HMmm.... (Score:4, Funny)
That's gonna make shutting off the monitor real fast to hide the porn from your (wife/boss/Priest/Teacher) a lot more difficult.
Therefore, this tech will never fly.
Re:Power use (Score:3, Informative)
The lighting would have to come from the side, and would reflect off the display.
One major advantage of this tech is that it should look better as the light gets brighter!
3 times brighter... (Score:4, Funny)
Laptop Frame Buffers (Score:5, Interesting)
Re:Laptop Frame Buffers (Score:2)
Re:Laptop Frame Buffers (Score:2, Interesting)
Re:Static ram? (Score:3, Informative)
SRAM is basically something like 6 transistors per bit.
DRAM, on the other hand, not only requires power to maintain state, but also requires special refresh circuitry. This is because a bit in DRAM is effectively a transistor and a tiny capacitor.
Like Ram? (Score:3, Interesting)
Re:Like Ram? S vs D RAM (Score:5, Informative)
SRAM is pretty much static until changes are made; DRAM you'll hear described as a leaky capacitor. When you give it a charge it will slowly lose it, so you need to refresh it... many, many times per second.
Re:Like Ram? S vs D RAM (Score:5, Informative)
Both SRAM and DRAM require constant power to reliably store data.
SRAM differs from DRAM because the cells that hold bits are always charged [howstuffworks has a diagram; basically it's 5 logic gates in feedback]. As a result SRAM takes more power but has no refresh delays [and is bigger].
DRAM uses capacitors to store the data and requires refreshing. This makes DRAM smaller and less power-intense, but much slower.
For example, the cache inside processors is a version of SRAM. If SRAM were as cheap as DRAM we'd be seeing 2MB caches commonplace nowadays...
Anyways... Peace out.
Re:Like Ram? S vs D RAM (Score:5, Informative)
SRAMs can be designed for raw speed (CPU caches) or low power (CMOS memory in old PCs before flash). High speed SRAMs can suck down a lot of power due to all of the gates and frequent logic transitions.
OTOH, The low power SRAMs intended for nonvolatile storage use all CMOS FET transistors in their logic gates. These gates draw essentially zero current unless they are actually switching.
Thus, while low power SRAMs require a voltage (typically supplied by a battery) to retain their state, they draw no current when idle. Therefore, in a technical sense, they don't actually require "power" (voltage*current) to keep their state, just a static potential.
A hydraulic analogy would be rigging two toilet flush flap valves in series, then ensuring that they never open simultaneously. This setup could store one bit (1 - open/closed, 0 - closed/open) with just static water pressure and zero flow. (A little water would flow when the valves are actually flipped.)
(btw, IAAEE)
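The flap-valve analogy maps neatly onto a cross-coupled latch, which holds a bit through feedback alone. A minimal Python sketch of the gate logic (a toy model of the concept, ignoring all electrical detail):

```python
# Minimal model of an SR latch built from two cross-coupled NOR gates.
# Once set or reset, the feedback loop holds the bit with no further input,
# analogous to the two flap valves holding static water pressure.

def nor(a: int, b: int) -> int:
    return 0 if (a or b) else 1

def latch(s: int, r: int, q: int = 0, qn: int = 1):
    """Iterate both gates until the feedback loop settles."""
    for _ in range(4):
        q = nor(r, qn)
        qn = nor(s, q)
    return q, qn

q, qn = latch(s=1, r=0)               # set the bit
print(q)                              # 1
q, qn = latch(s=0, r=0, q=q, qn=qn)   # inputs idle: feedback holds the state
print(q)                              # still 1
```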
Re:Like Ram? S vs D RAM (Score:3, Informative)
ECL was fast, but it was just about as opposite of CMOS as you can get. It works by using bipolar transistors to continually shunt large currents through resistors even when the gate is idle. That single 1K chip I worked on probably drew several watts of power. Nevertheless, it was considered to be an SRAM.
(The mainframe CPUs put a hundred or more ECL chips on a ceramic substrate, then used the mother of all water cooled heatsinks to pull out the massive heat that was generated.)
Re:Like Ram? S vs D RAM (Score:3, Informative)
Not necessarily. There's an inherent slow-down associated with large address spaces, not to mention the heat dissipation. Heck, why else do we have 3 to 5 layers of caching? The practical approach is to have successive layers of cheaper, larger and slower memory.
Since we already have 8 meg caches (in some high-end machines), there's little value in doing away with multi-gig low-power, low-cost memories. Theoretically some apps would achieve noticeable performance gains, but at enormous cost (today at least).
Furthermore, DRAM with internally managed refresh logic is functionally identical to SRAM (but non-deterministically slower). For something like video memory, which regularly touches every byte, the refresh logic would be unnecessary, thereby speeding up the memory. Besides, DRAM is plenty fast enough to handle the traffic: 4MB * 80fps (for true color at 1280x1024) = 320MBps, and DDR can handle 2.1GBps alone. This doesn't even acknowledge the possibility of interleaving/banking/segmentation or whatever tricks they may utilize.
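Putting numbers on that claim (24-bit pixels, which is where the ~4MB frame comes from; the DDR figure is the one quoted above):

```python
# Back-of-envelope check of the video-memory bandwidth argument above.
width, height = 1280, 1024
bytes_per_pixel = 3                  # 24-bit true color -> the ~4MB frame
fps = 80

frame_bytes = width * height * bytes_per_pixel
needed = frame_bytes * fps           # bytes/second to redraw every pixel
ddr_peak = 2.1e9                     # ~2.1 GB/s DDR peak, per the comment

print(f"frame size: {frame_bytes / 1e6:.1f} MB")       # ~3.9 MB
print(f"bandwidth needed: {needed / 1e6:.0f} MB/s")    # ~315 MB/s
print(f"DDR headroom: {ddr_peak / needed:.1f}x")       # ~6.7x
```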
Re:Like Ram? (Score:2, Informative)
CRTs are sensitive because the electrons are moving with respect to the magnetic field, thus being deflected. This display works via a static charge... no way that'll be affected by a magnetic field.
EM? Well, that depends on how they build the thing... but if they know what they're doing, that DEFINITELY shouldn't be a problem.
Bad for gaming? (Score:3, Insightful)
It seems like it would *look* beautiful, but would be costly to operate.
Of course, if you're going to shell out the cash for this, then you're probably not going to be worried about the electric bill.
Still sticking to my CRT for now...
Re:Bad for gaming? (Score:4, Interesting)
Maybe once a third-party actually does a real comparison between the varying screen technologies, we can make an informed decision about the future of iMoD in the marketplace. Once again, PR's rule the day...
Re:Bad for gaming? (Score:3, Insightful)
Re:Bad for gaming? (Score:3, Funny)
Re:Bad for gaming? (Score:4, Insightful)
You do in most FPS games -- modern games have a lot of grayscale and textures, dynamic lighting, etc. Therefore if you turn even a tiny bit, practically every pixel needs to change, or potentially can do, anyway.
Frame rate vs. Refresh rate. (Score:3, Informative)
Framerate, at least when you're talking about gaming, is how fast the game engine and graphics card can update memory. The refresh rate is how fast the electron beam is swept across a CRT. LCDs don't have refresh rates, but they do have response times, and I would assume this thing would as well.
The "frame rate" on an LCD or one of these things is 1/response time.
Re:Bad for gaming? (Score:4, Informative)
60Hz refresh is ok-ish in places like Australia, New Zealand and anywhere else using 50Hz mains rather than North America's 60Hz. The flicker you see on a monitor is caused by the monitor and the room's lighting interfering with each other and causing beat frequencies: very much like two musical instruments that aren't quite in tune.
Re:Bad for gaming? (Score:4, Informative)
I usually have all the lights off when I work on a computer, and I can still see flicker whenever the refresh rate is under 85Hz. I've had cases where some unrelated change in my video driver settings caused (for whatever reason) the refresh rate to drop to 60Hz, and I had to go fix it because the flicker was bothering me so much. It has nothing to do with room lighting.
Uh, no... (Score:3, Insightful)
The problem is that you can see the image blinking on and off, and it's annoying. I can still see flicker at 70hz, and in general prefer something in the 80s.
Re:Bad for gaming? (Score:3, Informative)
The company's website reports microsecond response times [iridigm.com] for their iMOD elements. Ten microseconds per update would support far more than the 100 FPS you'd ever need, which should be fine for gaming (isn't interlaced TV 50 fields per second?)
Re:Bad for gaming? (Score:2)
Re:Bad for gaming? (Score:3, Informative)
They don't seem to have any data, though, on the framerate you can achieve or the power consumption when the complete screen is refreshed frequently.
Re:Bad for gaming? (Score:5, Informative)
Where an iMoD display wins isn't in framerate -- that's going to be driven by your graphics card, anyway -- but in the fact that it has no refresh per se, the way a CRT does. The problem with conventional CRTs is that the screen image is drawn in an essentially serial manner -- each pixel is displayed in scan-line order, scan line by scan line. If you update the screen image data faster than the monitor can draw the whole image on the screen, you can wind up drawing the top part of the screen with data from frame X, the middle from frame X+1, and the bottom from frame X+2. If the screen image data is changing rapidly, the visible objects on the screen may not line up correctly across the whole frame; this is the artifact commonly called tearing.
Because its pixels are randomly addressable, the same way an LCD's are, an iMoD display can 'back up' to the top of the display for each frame. The pixel update time is short enough that, unlike an LCD, you're not going to get 'trails' (and the pixels can be updated many more times per second than on either an LCD or a conventional monitor). The addressing electronics can also be designed to update more than one pixel at a time, making a whole-screen update even faster -- so it's not impossible that it could achieve an order-of-magnitude increase in screen redraw rate over a 60Hz (read: rock-bottom) CRT.
But the real advantage comes more from the fact that, without the screen redraw being tied to a fixed sweep rate, the actual display refresh rate can be exactly the same as the frame rate produced by your video card. With a CRT running at a refresh rate of 72Hz, no matter how many frames your video card can draw per second, you're only going to see 72 frames per second; having a video card that can draw 90 frames a second on the simple scenes only means that you can lose 18 fps due to scene complexity before you see any frame rate loss. With an iMoD display, if your video card can render 90 frames per second, you would be able to see all of them. On the other hand, since the display updates would be matched to the video card's frame rate, degradation of your frame rate due to scene complexity would be immediately visible (subject to the response of the human eye).
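The serial-scanout problem described above is easy to simulate: if the source frame changes mid-scan, the displayed image mixes two frames. A small sketch (pure simulation, no real video hardware):

```python
# Simulate serial CRT scanout while the renderer swaps frames mid-scan.
SCANLINES = 10

def scanout(frames, swap_at):
    """Read one scanline at a time; the source frame advances at swap_at."""
    shown = []
    frame = 0
    for line in range(SCANLINES):
        if line == swap_at:
            frame += 1            # renderer delivered a new frame mid-scan
        shown.append(frames[frame][line])
    return shown

frame_a = [f"A{n}" for n in range(SCANLINES)]
frame_b = [f"B{n}" for n in range(SCANLINES)]

print(scanout([frame_a, frame_b], swap_at=6))
# ['A0', ..., 'A5', 'B6', ..., 'B9'] -- top from frame A, bottom from B: a tear
```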
Re:Bad for gaming? (Score:2, Informative)
Granted, they don't give a lot of detail, and the graph doesn't even have a scale... but they claim that the technology uses little power even with moving images, and there is no basis to dispute that at this time.
I would have a tough time believing that it uses more power than a CRT. Even if it does use substantially more power than they claim, it would still be well within laptop territory.
Re:Bad for gaming? (Score:2)
Wonder what the useful lifetime of these things is (Score:5, Interesting)
Wonder if fractures would cause a failure, too.
I guess as long as it's at least as long as the expected useful life of an LCD backlight it's still a win.
Re:Wonder what the useful lifetime of these things (Score:2)
The problem is that each time the metal is bent it still goes through some changes on a microscopic level. If you only bend it X% then it will last for a long time, but that doesn't mean it will last forever. You will still get changes in the atomic matrix (migration of the atoms, the atomic structure changing from one form to another, micro-fractures, introduction of foreign materials) at the points of stress, enough so that eventually the metal will break at those points.
Re:Wonder what the useful lifetime of these things (Score:2)
Re:Wonder what the useful lifetime of these things (Score:5, Informative)
Most metals exist in more than one form of crystal matrix. These different types of crystals exist in almost every chunk of metal you find. You will usually end up with a small area of one form of crystal (with all atoms lined up in the same direction) which is surrounded by another form of crystal. These small areas are called grains. The smaller these grains are, the more easily the metal bends, due to the fact that the atoms on the edge of a grain do not bond well to the atoms outside the grain.
When you bend metal you tend to form more grains in it, due to the movement breaking up existing grains and splitting them into smaller pieces. The increase in grains causes the metal to weaken, even if only by a small amount each time. If the metal is allowed to "relax" for a period of time, there is the chance that two extremely close and aligned grains will convert the atoms between them into their crystalline form. This reduces the number of grains and re-stiffens the material. This re-conversion is very slow at normal temperatures and pressures and thus is a minor effect.
You can increase the grain size and lower the number of grains by heating the metal at a certain temperature for a period of time. If you then quickly cool the metal (quench it in water, for example) you will end up with a harder (but more brittle) material. This is how blades are made that hold an edge and stay sharp: the harder the blade, the better it will hold an edge. However, if you make the blade too hard then it will not bend at all and will be brittle.
Re:Wonder what the useful lifetime of these things (Score:4, Informative)
I originally said, "When you bend metal you tend to form more grains in it, due to the movement breaking up existing grains and splitting them into smaller pieces. The increase in grains causes the metal to weaken, even if it is a small amount every time."
This is not exactly true; it had been a while since I studied metallurgy and I didn't have any reference texts to consult. To clarify, the reason the metal weakens is not that the number of grains is increasing and making the material more ductile (easily bendable), but that the dislocations (areas of stress in the metal matrix) and impurities are getting moved to the edges of the grains and are collecting together. This means that less of the metal has flaws distorting its structure, and it is therefore harder. Since it is harder, it is now less flexible and more brittle. This causes micro-cracks to form during the bending. Eventually these cracks lengthen and the metal fails.
Work hardening occurs when the metal is plastically deformed. These deformations cause impurities and other strains to gather together and distort less of the metal's structure. Since more of the metal is ordered, it is harder than it was originally.
One thing you should know is that metallurgy is very complex. There are many factors which enter into the equation, such as grain size, alloys, impurities, the many different phases (crystal structures) of the metal, etc. Often, simply how the metal is composed, heated, cooled, or worked can vastly change its properties.
Here are some sites to study more about metallurgy:
PLANT MATERIAL PROBLEMS - a site on metal failure [tpub.com]
Metallurgical Terms Made Simple - a site on the basics of steel metallurgy [swordforum.com]
The Metallurgy Of Carbon Steel - a more in-depth analysis of steel metallurgy [btinternet.co.uk]
Promising vapor, but vapor nonetheless.... (Score:4, Interesting)
That said, I'm currently tied to CRT technology because a lot of the media I have to deal with is color matched. Since color on a CRT screen is unreliable... it changes if you look at your screen from a different direction... this could offer a great deal of help to people like me who are tied to heavy, bulky displays rather than sweet flat-panels.
Of course the key here is that they have to deliver everything they promise in the way of omni-directional viewing and color-correctness.
Re:Promising vapor, but vapor nonetheless.... (Score:2)
Don't you mean LCD? If that happens to you on a CRT, then spend your money on a GOOD CRT -- Trinitron or better. Fact is that colour is better on a CRT; if you do any DTP, Photoshopping or anything else where colour matters, a CRT is your only choice.
Re:Promising vapor, but vapor nonetheless.... (Score:5, Informative)
Check here [iridigm.com]
They have a Palm display side-by-side with a display using their technology (it's b&w). You can hardly see any individual pixels on their screen. Text is rather crisp, almost printed.
Etch-a-Sketch? (Score:5, Funny)
Re:Etch-a-Sketch? (Score:2)
Wasn't that the Laptop that Dilbert requisitioned for his clueless boss? ;)
nice, but (Score:2, Offtopic)
My idea for the use of this paper is for notebook computers to be like scrolls. The notebook is initially just a tube; you pull out the screen, which is rolled up inside (and has a rigid piece across the top), and unfold two braces (one on each side) to hold it in place.
They already have keyboards that you can roll up, why not screens? The scroll-book would do the same thing to store the keyboard as with the screen.
Persistence of images when the power goes off is a big requirement for digital paper. But I'm waiting for the scroll-book, which, please note, could double as a book and a notebook if you could write on it with a digital pen. Don't unfurl the keyboard if you don't want to type into it.
Some potential here... (Score:5, Interesting)
Re:Some potential here... (Score:3, Interesting)
This is the same concept that allows animated GIF's to have such small file size. Animated GIF's only redraw the portions of the frame which have changed since the last frame. While this doesn't work for hi-res color video, where just about every pixel changes every frame, it will be great for typical office applications, where all that is changing on your screen 90% of the time is the cursor, mouse, and whatever characters you are typing.
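The delta idea in miniature (a toy model of the concept, not actual GIF or display code):

```python
# Toy delta update: transmit only the pixels that changed between frames.
def delta(prev, curr):
    return {i: v for i, (old, v) in enumerate(zip(prev, curr)) if old != v}

prev = [0] * 16              # last frame, flattened
curr = list(prev)
curr[3] = 7                  # a cursor blinked
curr[12] = 1                 # a character was typed

print(delta(prev, curr))     # {3: 7, 12: 1} -- only 2 of 16 pixels sent
```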
not quite (Score:4, Insightful)
I don't quite think the poster understood the article. From the article:
Once a voltage has been applied to an iMoD element, it requires less power to hold the metallic layer in place than it does to move it.
Looks to me like *some* power is still required to keep the display going. If it loses power, the layers would go back to their default state (which the article does not state, but would appear to be white when off).
Likewise this statement:
but the cool thing is that the display actually works like RAM (it retains its state until voltage is applied to reset it)
I'm no RAM expert, but from my understanding (with current RAM), as soon as power is lost, so is the data. Unless you're talking about old magnetic core RAM from the '50s and '60s, or IBM's upcoming MRAM, but I seriously doubt you were thinking of those.
Yes, Quite...sort of... (Score:2, Informative)
Actually, the hysteresis [iridigm.com] in the MEMS position suggests that a residual image might be maintained if power is lost. It just won't retain the original colors.
LEP's (Score:3, Informative)
IIRC, isn't this a property of Light Emitting Polymers -- if not the first incarnations, at least the later revisions -- in that a charge is only needed to change the polymer state? So more power is used when viewing constantly changing images (i.e. multimedia), whereas spreadsheet/office use would be at the lower end of the power scale.
Re:LEP's (Score:3, Insightful)
Iridigm's displays, on the other hand, are reflective -- that is, not emitting (or generating) their own light. That's why they can claim zero power for a static display.
pretty cool for a framed picture of grandkids that gets updated once a week, I'd say!
Re:not quite (Score:2)
Re:not quite (Score:3, Interesting)
This is why it is possible to have motherboards that support STR -- Suspend To RAM -- wherein the system shuts off, but all data is still in memory because a very low voltage is used to refresh the values. It's kinda cool, 'cause when I turn my PC on, right after the BIOS is finished POSTing and the hard disk is spun up, I am instantly in Windows, with any programs up that I left running when I turned the PC off. If I turned the PC off mid-song, that song will instantly continue playing right where it left off. Maybe I'm just easily impressed.
Re:not quite (Score:2)
Something I think it would be interesting to do would be to use part of the screen to create assembler programs, and then, since the screen is technically very similar to RAM, you could use a section of the screen to mirror the registers, stack, etc. in the processor on a bit-by-bit basis. Would that be useful? Who knows, I sure don't program in assembler these days, but perhaps on a PDA...?
Re:not quite (Score:2)
I could foresee this being very useful for businesses and the like for 24-hour displays... little power and no burn-in.
Although at home I would love to have this because my current 19" takes up my whole desk and puts out a whole lot of heat. I've been reluctant to get an LCD because of dead pixels and because the refresh is not good (for games), not to mention cost.
Re:Billboards (Score:3, Interesting)
Hook it up to a cellular network and they can download new ads into it....or even better, the states could have an emergency warning/traffic system to take over the billboards when needed...endless possibilities.
Extra long BSOD's! (Score:3, Interesting)
Cool, some people will get to watch their BSOD's a few seconds more.
On a serious note, I wonder if this could actually cause video card makers to make cards that use memory that does not have to be dynamically refreshed, since the monitor pixels can hold the image. Might reduce memory latency for the frame buffers of the future.
Butterflies! Butterflies, man! (Score:5, Funny)
The power of Iridigm displays derives from the replication of some of Mother Nature's most beautiful creations: Butterflies.
I reckon this will be a success and here's why (Score:5, Interesting)
So, where do you have a CRT monitor and an application environment where high performance in the frame rate isn't an issue? Hmmm, how about every call centre in the world. If an IT manager sees the cost benefits of getting low power consumption monitors he or she will bite. If an accountant sees the numbers they'll bite the arm off the salesman. I can see these taking off in a big way with Call Centres and programming shops.
There's a market there for these things, I'd like to see how they do with CAD/CAM apps too.
Bad for games (Score:4, Informative)
This technology is great for displaying text (and pictures of butterflies) but it is very bad for games.
Look at the description [iridigm.com] of how it works. The colour is determined by the distance between the glass layer and the metal plate. Big gap = red. Small gap = blue.
This is fine for static images, but it means that it takes 5 times as long for a red pixel to change state as it does for a blue one.
When you have a quickly moving image, the result is severe ghosting for red objects. White objects will leave a rainbow trail -- red at the far end, blue near the object. Blue objects are relatively unaffected.
If you do use this for playing Quake 3, just make sure you're on the blue team.
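For a rough sense of scale, supposing each colour corresponds to a quarter-wave gap (a guess at the geometry, not Iridigm's actual numbers):

```python
# Hypothetical quarter-wave cavity gaps for each primary colour (assumption).
for name, wavelength_nm in (("blue", 450), ("green", 550), ("red", 700)):
    print(f"{name}: gap ~ {wavelength_nm / 4:.0f} nm")
# red vs blue travel difference ~ 62 nm -- tens of nanometers either way
```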
Re:Bad for games (Score:4, Insightful)
Besides, your logic is flawed. What happens if you put the rest state in the middle of the spectrum, say green? Then it has to move an equal distance to get to blue or red.
However, going from red to blue or blue to red would then be the transition with the greatest delay. But again, we are talking nanometers; how great can the delay be?
Re:Bad for games (Score:2)
Actually, you would want to be on the red team, because the blue team would have a harder time tracking you.
Re:Bad for games (Score:2)
Don't you mean that you should be on the red team? After all, with your description above, it would be really easy to see the blue team to target them using this monitor, while seeing red targets would be hard. Now if members of the blue team were using this monitor too, they would have a hard time seeing you, thereby increasing the red advantage even more!
Re:Bad for games (Score:3, Informative)
Actually, it is not bad for games. Read their specs at http://www.iridigm.com/ben_quality.htm; they clearly state "Fast response allows artifact-free video and gaming", which basically means fast frame rates for your Quake needs.
Re:Bad for games (Score:5, Insightful)
But what does that say about time? I don't think there is a real concern. As long as one of these babies can flip in less than 10 milliseconds (and it surely can), there will be no issue wrt speed. In fact, it can very likely be a LOT LOT faster than a CRT, because you merely need to change voltages on transistors, whereas a CRT has a scanning beam that has to traverse the whole screen.
The other thing I found REALLY interesting is that such a display could be run native in a HSB (hue-saturation-brightness) mode. Instead of three colors, each pixel could be ANY hue, since you only have to change a voltage to a new value to change the color. Way cool (they are planning initially for full RGB compat). But in the future it could be a new sort of color scheme entirely.
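To illustrate the difference, a speculative sketch: conventional RGB drive versus a hypothetical per-pixel-hue drive. Both drive functions and the hue-to-wavelength map are invented for illustration.

```python
import colorsys

def drive_pixel_rgb(r: float, g: float, b: float):
    """Conventional scheme: three fixed-colour subpixels, variable intensity."""
    return [("R", r), ("G", g), ("B", b)]

def drive_pixel_hue(r: float, g: float, b: float):
    """Hypothetical iMoD-style scheme: tune one element to the hue itself."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    wavelength_nm = 700 - h * 400   # very crude hue->wavelength map (assumed)
    return ("gap tuned near", round(wavelength_nm), "nm")

print(drive_pixel_rgb(1.0, 0.5, 0.0))  # orange via three subpixels
print(drive_pixel_hue(1.0, 0.5, 0.0))  # orange via a single tuned gap
```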
Of course, it's all vaporware until there are production models.
what about OLED (Score:2, Interesting)
Resisting LCDs until OLEDs or this Iridigm thing is like resisting the tape cassette and listening to vinyl until CDs came out.
Re:what about OLED (Score:2)
Huh? I've already got several CD-players, so I think it's safe to say, they are already out.
Good article.... (Score:4, Interesting)
You have to appreciate post-Dot.Com tech reporting:
provide brief overview of how new technology actually works -- consult glossy side of start-up's brochure/PowerPoint presentation
Thank you c|net for providing us all with that fine piece of tech journalism. Too bad Richard Shim couldn't fill more copy space by staring at Maria Bartiromo on CNBC, and had to resort to describing technology halfway through the article.
Why aren't *LED Displays bigger news?! (Score:2, Interesting)
Overview and demonstrations of these are available here ->
Universal Display Corporation [universaldisplay.com] and Kodak Research [kodak.com]
Re:Why aren't *LED Displays bigger news?! (Score:2)
Please send one to me, preferably one I can use in 1280x1024 on my desktop, and I'll tell you whether I like it...
Re:Why aren't *LED Displays bigger news?! (Score:2)
Maybe because you've been FOOLED!
Re:Why aren't *LED Displays bigger news?! (Score:3, Informative)
Actually, that stuff's electroluminescent tape.
http://www.3dxtreme.org/pcmodstape.shtml
Not quite the same as an OLED.
clearing the screen after power outage (Score:4, Funny)
Re:clearing the screen after power outage (Score:3, Funny)
Forget about laptops ... (Score:3, Insightful)
Hopefully, this will have fewer dead pixels (Score:2, Interesting)
Re:Hopefully, this will have fewer dead pixels (Score:2)
Paint it black!
more like "e-paper" (Score:2)
Re:more like "e-paper" (Score:3, Interesting)
PDA Screens (Score:4, Interesting)
Interesting that the site spouts off on touch screen technology. I've always loved the spontaneous change of LCD to LSD when you press on your LCD panel; with these, you might just semi-permanently change the pixel!
And they are showing progress, definitely beyond the "vaporware" that some commenters have alleged. It appears that they *have* a working product that they demoed in May of 2002.
Iridigm Demonstrates First Color iMoD Matrix(TM) Display
SAN FRANCISCO, Calif. - May 20, 2002 - Iridigm(TM) Display Corporation, a developer of flat panel displays for mobile devices, will demonstrate its iMoD Matrix(TM) technology at the Society for Information Display (SID) International Symposium in Boston, Massachusetts. During the Exhibition portion of the conference held May 21-23, 2002, Iridigm will demonstrate the color iMoD Matrix(TM) display in its booth #1805/1807. This is the world's first direct-view color flat panel display based on MEMS (Micro-Electro-Mechanical-Systems).
Continued here [iridigm.com]
Re:PDA Screens (Score:2)
3 Bit Color? (Score:3, Interesting)
Re:3 Bit Color? (Score:3, Informative)
not like RAM. (Score:2, Insightful)
From NPR plastic based alternative to LCD (Score:5, Informative)
It uses the fact that certain plastics, when charged with electricity, will emit light in certain colors. The screen would be flat and completely flexible.
Literally you would have a screen (a TV for example) that could be rolled up and put into your backpack.
Right now they are looking into small scale electronics applications of the technology in terms of putting in screens for car radios and such but they have the big plan of a flexible plastic tv or computer monitor.
Of course, the other thing to note is that it needs no backlighting and can be extremely thin. Very neat stuff.
Will it work in the dark? (Score:2)
Temperature Insensitivity? (Score:5, Insightful)
They claim [iridigm.com] that since the entire display is inorganic, it's insensitive to temperature variations. Looks like the marketing folks have gone a bit too far on this one. Metal and glass have very different coefficients of thermal expansion. That suggests that the metal layer will be under tension at cold temperatures and under compression at high temperatures. This should affect the interference layer thickness achieved at a particular voltage. I expect that this will, at the very least, affect the display colors since interference wavelength is very sensitive to the thickness of the interference layer.
Anyone care to do the math?
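A first-order attempt, with loudly assumed numbers: generic aluminum and soda-lime glass coefficients, a 50 K swing, and the naive assumption that the gap simply strains with the mismatch.

```python
# Back-of-envelope: how much does thermal mismatch perturb the gap?
# Coefficients are generic textbook values (assumed, not Iridigm's specs).
alpha_metal = 23e-6     # aluminum, per kelvin
alpha_glass = 9e-6      # soda-lime glass, per kelvin
delta_T = 50.0          # a harsh 50 K swing
gap_nm = 1000.0         # ~1 micron gap, per the article

mismatch_strain = (alpha_metal - alpha_glass) * delta_T
delta_gap_nm = gap_nm * mismatch_strain     # naive: gap scales with strain

wavelength_nm = 550.0
delta_wavelength = wavelength_nm * mismatch_strain

print(f"mismatch strain   : {mismatch_strain:.1e}")       # 7.0e-04
print(f"naive gap change  : {delta_gap_nm:.2f} nm")        # 0.70 nm
print(f"naive colour shift: {delta_wavelength:.2f} nm")    # 0.39 nm at 550 nm
```

On this naive model the direct colour shift is well under a nanometer; the real question is whether changing membrane tension amplifies the effect on the gap.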
End of screen savers (Score:2, Insightful)
(I might take this time to note that screen savers don't really have a place on a modern desktop other than eye-candy. But hey, I like eye-candy too.)
Photography Appliations? (Score:5, Interesting)
I'm a photographer myself and "amateur" would be an understatement. I've always been vexed by the inability of the camera to record what I see. For example, I went to the Boston Aquarium a few months back and while my shots were acceptable, the colors were nothing like what I was seeing in-person. Brilliant blues and yellows look painfully muted and boring in my results. I'm told that is a shortcoming of the photography medium and photographers have to use tricks to get those wonderful colors you see in mags like National Geographic, Photo, etc. Well
So I guess what I'm asking is: "can this technology be used to not only create and present colors in a 'natural' way, but possibly capture them that way as well?"
Re:Photography Appliations? (Score:5, Informative)
I used to work as an aquatic biologist, diving and photographing fishes from all over the globe. My photography skills are legendarily poor, but even the experts I worked with were continually frustrated with the inability of film to capture the brilliant metallic and iridescent colors we saw in person.
Alas, while it may be possible for this display technology to duplicate some of the bright colors, interference colors are usually dependent upon binocular viewing for most of their spectacular effects, and the monitor will definitely be mono.
Finally, while I wish it weren't so, this technology seems to be display-only. I see no ready bridge to adapt this technology to CCDs or film (our two existing image capture options) or to use it directly as a capture device. More's the pity.
Re:Photography Appliations? (Score:3, Interesting)
There was a great article years ago in the Wall Street Journal about a Japanese scientist who reproduced great French wines in his lab. He created exact chemical duplicates of Margaux and others. Through exacting objective chemical analysis, it was impossible to tell which was the original Margaux and which was his lab-created Margaux. The only problem was his "wine" tasted horrible.
Apart from technical issues, there is the problem of imagination. You look at your photos, and compare them not to the aquarium, but to your memory of the aquarium. Generally, you remember vivid perceptions, like strong smells, bright colours, loud or pleasing noises.
Lastly, professional photographers take great pains to create photographs rather than take them. They know their materials and what they do, or do not do, best. Low-speed slide film is great for brilliant colours where excessive contrast will not be a problem. Super-fast film is good where grain and lack of maximum colour saturation are not a problem. The pro makes the trade-offs to get the picture they want. The National Geographic photographers either light the subject, use an appropriate type of film, or just look for subjects that suit the kind of film they are using.
As an amateur, I would recommend you find the kind of pictures your equipment/film/tastes can best deal with. For example, I like my little Samsung point-and-shoot all-auto camera for snapshots. The Hasselblad takes better pictures, but is totally unsuited for the job. So I put people in front of bright colours, get close enough so that their belt-buckle, chest & head fill the frame, and use a fill flash. Also, take lots of pictures. If you are going to take a shot, take three variations as well. Film is cheap.
Sounds Good (Score:2)
Question though -- I may have missed this, but how efficient is the manufacturing process? Isn't the main problem with LCDs that the manufacturing process is incredibly inefficient?
Higher Pixel Density (Score:2, Insightful)
Looks like they might be giving up some of the lower voltage benefits in order to get higher pixel density [iridigm.com]. Hence their claims about glossy magazine appearance?
Viewable angle? (Overcoming shimmer) (Score:4, Interesting)
Light reflects off two surfaces, one just beneath the other. If the distance between the surfaces is such that the reflected light waves are perfectly out of phase, the waves will cancel each other out, making it look like the surface actually absorbs that frequency range, producing color. That means that the distance the light travels between the plates is absolutely crucial in producing the right color. That's why butterfly wings shimmer. Your eyes are each viewing the wing at a different angle, each seeing a different color.
When light hits the plates straight on, it travels a certain distance between the plates. When it hits at an angle, the geometry changes the phase condition (the standard thin-film relation has the optical path difference going as 2d·cosθ). So instead of 600nm light being out of phase, light at a slightly different wavelength will be, making a different color appear if you look from a different angle.
So unless I missed something, what we'll end up with is a display that "shimmers" like a butterfly wing. The hue of the display will shift when the screen is angled. That means that the effective viewable angle will suck a lot more than it does for LCDs, and it will be almost impossible to be perfectly sure what color you're looking at (particularly important for desktop publishing).
Perhaps someone who knows more about physics can explain how they intend to make this actually work. For now, though, I'm going to wait till I see a working prototype before I sell the farm to invest in their product.
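Not a full answer, but the standard thin-film relation -- optical path difference going as 2d·cosθ -- puts rough numbers on the shimmer (a simple model using no Iridigm-specific data):

```python
import math

# Thin-film interference: path difference ~ 2*d*cos(theta), so the
# cancelled wavelength scales with cos(theta) as you move off-axis.
normal_incidence_nm = 600.0   # wavelength cancelled head-on (illustrative)

for angle_deg in (0, 15, 30, 45):
    theta = math.radians(angle_deg)
    shifted = normal_incidence_nm * math.cos(theta)
    print(f"{angle_deg:>2} deg off-axis: cancellation at ~{shifted:.0f} nm")
# 0 -> 600, 15 -> 580, 30 -> 520, 45 -> 424
```

On this simple model the cancelled wavelength shifts shorter off-axis; either way, the hue moves with viewing angle unless the optics are designed to compensate.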
Not bad for a second worth of thought... (Score:2, Insightful)
Re:Light interference for display tech? (Score:2, Funny)
The iMoD elements are built upon two conductive layers--one a flexible metal membrane, the other is a thin film. These layers are held about 1 micron apart between two sheets of glass. When a voltage is applied to the element, the metal membrane layer becomes attracted to the thin film layer, turning the element black. Varying the voltage brings the layers closer and farther apart, and the distance between the layers determines what color--red, green or blue--the element displays.
Thus the only distance you have to control is between the membrane and the film. Then, unless you were moving at a significant fraction of the speed of light, the colors wouldn't change much with your motion.
Re:Light interference for display tech? (Score:5, Interesting)
The display uses two plates on each pixel that can get closer together or farther apart. The interference occurs in the reflective part of the monitor, only to create the right frequency -- just like a spinning black-and-white disc can take on any perceived color, depending on the rotation rate. In their case, the distance between the plates modulates the light color. Once a ray leaves the screen, it is of a given color and won't change anymore.
What I didn't see addressed is the issue of lighting the surface. This needs a front light. But the technology has one main advantage: it can emit any visible frequency. Hence, its gamut should be much larger.
Re:Light interference for display tech? (Score:2, Informative)
It needs a front light, but only in dark environments [iridigm.com]. Apparently, the reflectivity of the surface is sufficient for normally lit environments.
correction... (Score:2, Informative)
But it was good of you to think of the modulation-rate-based color method. BTW, did you know that modulation-based color perception is a genetic trait? Not all people perceive color from the spinning-disk experiment. I am one who does not, and I was very frustrated when trying to get the experiment to work, until I found out that some people are not sensitive in that way. Folks in my computer club were programming their B&W monitors to show color using the technique before there were any color TV interfaces.
Re:Light interference for display tech? (Score:5, Interesting)
You bring up an interesting point: it's not clear how a device like this can produce different saturation levels for a pure hue. In other systems, a single subpixel has a single color but variable intensity, and subpixels of different colors can be combined to produce a range of colors. In this system, each subpixel is capable of producing any color, but only at an intensity defined by ambient light. Consider a three-subpixel unit where each subpixel can be either white, red, or black. This gives only the following possibilities: white, black, two shades of grey (BBW, BWW), and six kinds of red (RRR, RRB, RRW, RBB, RBW, RWW). Now, a single subpixel could be cyan or indigo all by itself, creating a different kind of flexibility, but I'm not sure if that's as useful as what we get with variable-intensity RGB subpixels.
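That enumeration checks out mechanically (pure combinatorics, nothing display-specific):

```python
from itertools import combinations_with_replacement

# Enumerate unordered states of a 3-subpixel unit where each subpixel is
# black (B), white (W), or one fixed hue of red (R), as in the comment above.
states = list(combinations_with_replacement("BWR", 3))
print(len(states))                            # 10 distinct states
reds = [s for s in states if "R" in s]
print(len(reds), reds)                        # 6 kinds of red, as listed
```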
Re:Light interference for display tech? (Score:2)
Re:Only 8 colors? (Score:2, Interesting)
there's around 100 cells per pixel, so you might get significantly more than 8 colours...