High Dynamic Range Monitors

An anonymous reader writes, "We are seeing more and more about high dynamic range (HDR) images, where the photographer brackets the exposures and then combines the images to increase the dynamic range of the photo. The next step is going to be monitors that can display the wider dynamic range these images offer, as well as being more true-to-life, as they come closer to matching the capabilities of the ol' Mark I eyeball. The guys who seem to be furthest along with this are a company called Brightside Technologies. Here is a detailed review of the Brightside tech." With a price tag of $49K for a 37" monitor (with a contrast ratio of 200K to 1), HDR isn't exactly ready for the living room yet.
  • Medical Imaging (Score:5, Insightful)

    by BWJones ( 18351 ) * on Thursday October 12, 2006 @06:53PM (#16415377) Homepage Journal
    Of course one of the other principal arenas where monitors like this are valuable is medical imaging. One of the serious shortcomings in the migration of radiology to digital formats is the reduced quality of the images as compared to film. The dynamic range of film is simply so much greater than can be achieved with standard CRTs or LCD monitors that there is a real danger of missing very subtle changes in X-rays, for example. While it's true that image processing can make up for some differences, digital still can't quite compete with film for many purposes, including data density.

    • I beg to differ. (Score:5, Informative)

      by purduephotog ( 218304 ) <hirsch@inorbitSLACKWARE.com minus distro> on Thursday October 12, 2006 @07:22PM (#16415691) Homepage Journal
      Mammography has gone completely digital. Why? Because the quality of the imagery is light-years better than what you can get from film. Couple that with rapid processing from a laser scanner, throw in algorithms that contrast-enhance areas of nearly neutral density, and you have a recipe for catching growths that would otherwise escape detection.

      A good, excellent radiologist could detect subtle differences of about 80% that of a standard person. I'd give you the exact figure, but it's been a while since I last looked at the data; suffice it to say I was impressed at the level of detail (under controlled lighting conditions) they were able to see in film.

      A good medical display is a "peeled" LCD (all the color has been chemically removed from the surface), typically with a brighter backlight and another polarizer to knock the Lmin down even further. This gives you better dynamic range that can be adjusted far more quickly than film: want to zoom in? No problem, touch and zoom; with film you'd grab a loupe (or crane your head closer). Digital wins hands down.

      Yes, if you digitize a negative you get a data density that can't be matched very easily (I used to estimate this for a job, for large quantities of imagery at high quality ratios: 2 micron spot sizes). But frankly a lot of that information is useless; you don't need detail that isn't relevant.

      The most important aspect of digital imaging is proper viewing environment- something no one seems to get. Reduce the lighting of the area to 0.5 fc and remove any sources of glare off the monitor. Wear dark clothing. Have wall wash lighting appropriate to about 3-9 fc. Have surfaces neutral gray. Ceiling black.

      Digital definitely competes with film in many markets for medical X-ray; mammography was just the easiest to choose because it has been such a radical change in such a short time period.

      I should note I used to work for Eastman Kodak and did work with other individuals on these digital products (specifically, algorithms)... but I'm not biased because of that. Just the simple truth: from the raw data I've seen, I'll feel happy and safe knowing my wife gets a digital mammogram every year.

      • by mattkime ( 8466 )
        Further, digital imaging often requires less radiation exposure.

        I had an interesting conversation with my dentist about the new digital xray equipment.
      • by pz ( 113803 )
        You also forgot to add that exact film processing conditions (concentrations, age of the reagents, temperatures, humidity, etc.) can affect the image tremendously in often unreproducible ways. Digital imaging reduces many of these uncontrolled variables.
      • Re:I beg to differ. (Score:4, Informative)

        by BWJones ( 18351 ) * on Thursday October 12, 2006 @08:58PM (#16416979) Homepage Journal
        I think that you are missing the point of my argument. I was supporting the use and implementation of HDR monitors because of some of the current limitations of digital radiology. All of the things done to medical-quality LCDs, along with digital enhancement, are an attempt to narrow the difference in image quality between film and computer display, and HDR monitors will help this considerably.

        I am not arguing against digital radiology, rather I am all for it because of the inherent benefits (less rads, less time, less film processing variability, more convenient, etc....etc....etc...), but the reality is that digital radiology is still not all it could be. You said it yourself in that a well trained radiologist can detect about 80% of the differences present in digital representation. Well..... 20% is still a lot of potential misses on diagnoses.

        The reason digital has been so successful is not necessarily its inherent superiority in image quality. Rather, it has been successful because it is cheaper and more convenient, especially given the trend away from traditional medical records management.

        As to the density of information, I routinely take film images of electron microscopy captures and digitize them because of the convenience, and that is working on the nanoscale range. I am throwing information away by the conversion, but it is more convenient for all of the reasons we have already talked about.

      • Re: (Score:2, Insightful)

        For the answer on digital vs. film.

        How many silver halide molecules can you fit into any given area?

        Now, how many pixels can you fit into that same area?

        Exactly.
        • by Firehed ( 942385 )
          It's not that simple. Not by a long shot. For one, you're comparing a negative to a display. Extrapolate those molecules to what it looks like when you're seeing the enlarged print and you're getting close. Then consider innumerable other variables, such as film speed. Then consider how you can increase the physical size of the CCD in your new camera - it's not quite as simple to replace a decades-old film standard.

          Prosumer-level equipment already is comparable to 35mm film, and will probably exceed it
          • by Hast ( 24833 )
            DSLRs have already surpassed 35mm equivalents. Full-frame DSLRs are currently competing with medium format cameras.

            There are areas where film has a slight edge (HDR black and white) but that's also being challenged.

            For the record I also have experience with developing film manually. Considering the time and work (and cancer from the nasty chemicals) that you avoid with a digital workflow, there are just not many benefits to film anymore.
      • Just skimmed over your post quickly, this is what I got from it:
        Mammography: the surface, No problem- touch, hands, proper viewing environment,

        You are indeed taunting the teenage /.ers here.
    • Re: (Score:3, Informative)

      by ketamine-bp ( 586203 )
      I believe that in the interpretation of X-rays (chest or abdomen), most disease states/patterns are pretty obvious and do not require anything more than a careful eye on a 1000x1000 image of 8-bit grays to interpret. As for skeletal X-rays, you can usually see the lesions, or they are simply not there.

      For CT and MRI, however, the best thing about using a computer to read it rather than reading it on printed films, is that you can actually adjust the window (from the bone window to the soft-tissue window etc
    • by dfghjk ( 711126 )
      "The dynamic range of film is simply so much greater than can be achieved with standard CRTs or LCD monitors..."

      Nonsense. You can't compare a recording device (film) with a playback device (monitor). Digital sensors (the equivalent of film) can have just as great a dynamic range as film and are inherently more linear. Monitors have greater dynamic range than print output. Slide film is better than print.

      There are already specialized displays for medical imaging.
      • by BWJones ( 18351 ) *
        Nonsense. ......... slide film is better than print.

        Hello? Say what?

        Just for your edification though, it is generally accepted that, given current technology, film still has about 1.5 to 3 stops more dynamic range than digital. Clicky clicky [anandtech.com] for just one reference out of many.

        • by dfghjk ( 711126 )
          Please. If you're going to talk photography, at least have the decency to quote a photography site. Anandtech is no expert, and the link you provided had no data to back up their incorrect claim.

          Print has a maximum dynamic range of about 5.5 to 6 stops unless it uses a special process. Film itself can go up to about 11 stops, although typical films offer less than 10. Furthermore, film is not linear, and the resultant output is typically compressed to 8.5 stops or less. Dmax for slide film can be as great as 3.6.
          • by sgant ( 178166 )
            Think you should also differentiate between negative film and slide film. Slides don't have anywhere near the dynamic range of negative...which was why when I was shooting transparencies I would bracket to 1/3 of a stop to make sure I got correct exposure. When shooting negative film I could bracket almost a full stop and get good exposures...which is what I do now when shooting and using RAW in digital. Digital doesn't approach the latitude that negative film gave us, but it blows slide film out of the water....in terms of dyn
            • by dfghjk ( 711126 )
              Negative film has great latitude because, when it goes to print, it only needs to produce 6 stops of range. When a 9 or 10 stop film gets converted to 6, there's 3+ stops of exposure latitude to play with. When you shoot slide film, the slide itself is the final product, so you don't have that "luxury". It's all a matter of perspective, but I think a lot of the forgiveness that negative film provides is a result of the low dynamic range output (print).

              One of the great things about digital is seeing the imm
    • by grogo ( 861262 )
      I'm a radiologist (and a geek of course). I've been witnessing the shift from film to digital over the years of my practice. While the OP is correct in pointing out that the standard radiographic film posted on a bright backlight source has a terrific dynamic range, newer black and white medical monitors approach that quite nicely.

      One huge advantage of digital imaging is that it's quite easy to adjust the window and level (analogous to brightness and contrast) of any image to look at deep shadows and brig
  • Maybe you're not ready for one for your living room, but I'm looking for the order form! Who wouldn't want this? And if you think the price is outrageous, consider how expensive LCD TVs were 7-10 years ago.
  • Comment removed based on user account deletion
    • I was at Siggraph in 1997, and saw an SGI monitor that was amazing - I had to do a double-take, it looked so real. I don't know if it was an early HD monitor or what, but damn, I wish I had one!
  • This seems like a good candidate for high dynamic range if it is not vaporware:

    http://hardware.slashdot.org/article.pl?sid=06/10/11/0214254 [slashdot.org]

    • Yeah, I'll go with the laser TV. I've known about Brightside for a few weeks now and realized that it is not needed at all. I don't know how bright my 27" mid-2004 SD CRT gets, but I experience the bloom effect (images so bright there appears a kind of glare around the image) with it. TV sets are bright enough; it's the dynamic range (what Brightside calls contrast ratio) that's important. This laser TV has an infinite contrast ratio (contrast ratio: brightest part of the TV divided by the darkest point)
  • It's tres cool (Score:4, Interesting)

    by PhantomHarlock ( 189617 ) on Thursday October 12, 2006 @07:06PM (#16415525)
    I've been seeing these at Siggraph for years. They do look very nice. You basically need a very bright light source (not hard) that doesn't generate too much heat (a little harder) and a way to modulate that light over a very large range (harder). It would be fun to have a converter for DSLR RAW images to display in HDR, or the usual bracketed ones.

    The examples they usually use are things like light streaming through stained glass in a church, where normally you'd either only see the stained glass properly exposed, or the rest of the room, but not both. It does work to very good effect in those instances, and heightens the "window into the world" effect that high resolution displays have. If this were to be combined with 2X HD resolution 60P motion video (about 4,000 pixels across) it would kick serious ass as the next 'Imax' lifelike motion picture display.

    Oddly enough, the captcha for the post reply screen right now is 'aperture'.
    • Re: (Score:3, Informative)

      by squidfrog ( 765515 )
      Using dcraw [cybercom.net] and the Radiance (HDR) file format [lbl.gov], it should be trivial to convert any digicam or SLR's raw image to an HDR.

      For manually-captured bracketed images, there's AHDRIC [uh.edu] (disclaimer: I wrote this). As long as the EXIF info is intact and the only thing that changes between shots is the shutter speed, this should do the trick. A related tool (AHDRIA) lets you capture HDRs automatically by controlling a digicam via USB (Canon digicams only, sorry). This process can take 20-120 seconds, depending on the
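      For the curious, the bracketed-exposure merge described above looks roughly like this in code. This is only a sketch using OpenCV's HDR functions (not AHDRIA/AHDRIC themselves), and the file names and shutter speeds are invented for illustration:

          import cv2
          import numpy as np

          # Three bracketed shots of the same scene (tripod-mounted), darkest to brightest,
          # with their shutter speeds in seconds. Both lists are placeholders.
          files = ["under.jpg", "normal.jpg", "over.jpg"]
          times = np.array([1/250, 1/60, 1/15], dtype=np.float32)
          images = [cv2.imread(f) for f in files]

          # Recover the camera response curve, then merge into a linear float32 radiance map.
          response = cv2.createCalibrateDebevec().process(images, times)
          hdr = cv2.createMergeDebevec().process(images, times, response)

          # Save as Radiance RGBE (.hdr), the format mentioned above.
          cv2.imwrite("merged.hdr", hdr)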
    • by Apotsy ( 84148 )
      If this were to be combined with 2X HD resolution 60P motion video (about 4,000 pixels across) it would kick serious ass as the next 'Imax' lifelike motion picture display.

      It could also breathe new life into older motion pictures. A lot of movies shot during the 20th century look quite feeble on home video with today's display tech, but would look incredible if scanned and stored in a format that preserved the full dynamic range of the image. There is a tremendous amount of HDR info locked in Hollywood's va

  • HDR rules (Score:1, Interesting)

    by Anonymous Coward
    If it takes an expensive display system before people stop going nuts over excessively tonemapped HDR images, then so be it. It's still going to be different from viewing the real scene because bright highlights and dark shadows will be much closer together on a relatively small screen, so our eyes won't be able to adapt as easily. A nicely tonemapped picture, perhaps combined with a slightly higher dynamic range than on today's displays, will beat "1:1" recreations any day.
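    For context, the "tonemapping" being argued about is just a compression curve from scene radiance down to display range. A minimal sketch of a global Reinhard-style operator (simplified; real tone mappers also do local adaptation and proper luminance/chroma handling):

        import numpy as np

        def tonemap_reinhard(hdr, key=0.18, gamma=2.2):
            # hdr: float32 array of linear radiance values in BGR order.
            # Scale so the log-average luminance lands on middle grey ("key").
            lum = 0.0722 * hdr[..., 0] + 0.7152 * hdr[..., 1] + 0.2126 * hdr[..., 2]
            log_avg = np.exp(np.mean(np.log(lum + 1e-6)))
            scaled = hdr * (key / log_avg)
            # Reinhard curve: L / (1 + L) squeezes [0, inf) into [0, 1).
            ldr = scaled / (1.0 + scaled)
            return (255.0 * np.clip(ldr, 0.0, 1.0) ** (1.0 / gamma)).astype(np.uint8)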
  • by UnknownSoldier ( 67820 ) on Thursday October 12, 2006 @07:09PM (#16415557)
    Since "true" HDR consumera camera's don't exist (anyone know?), it can be faked [flickr.com], quite convincingly, I might add.
    i.e.
    "It's a feature in Photoshop CS2 or Photomatix or FDRTools."

    Even black and white can support HDR. This is a great B&W example [flickr.com] of why 8-bit greyscale just doesn't cut it.

    --
    "The difference between Religion and Philosophy, is that one is put into practise"

    • by spoco2 ( 322835 ) on Thursday October 12, 2006 @07:24PM (#16415709)
      The thing to me about the 'HDR' images produced by that technique is that they look far more 'unreal' than normal photos. They have this 'hyperreal' effect that reminds me of postcards from the, erm... I guess 1940s/1950s, that had some hand retouching done to them, or a foil look.

      They just, to me, look a little silly, and that's a result of having an image with more information in it than the medium they are displayed on can handle.

      Now, with a display that can ACTUALLY display the full spectrum of a HDR image. THAT I'm interested in.

      Why is this story only being posted now though? It's from last year!
      • Re: (Score:2, Informative)

        HDR images are not at their best on a computer monitor; they look much better in print. Side by side, a 3-stop HDR digital print generally looks better than a single exposure.
        • by dfghjk ( 711126 )
          Conventional printing offers lower dynamic range than any computer monitor.
          • While I agree with you I do not understand why you felt the need to point this out. For instance: it would have been just as topical for you to note that you think cucumbers taste better than pickles.
            • by dfghjk ( 711126 )
              Because you said "HDR images are not at their best on a computer monitor, they look much better in print."

              Why would a high dynamic range image be best presented on the lowest dynamic range output available? Any monitor is better for the job than a conventional print. You could just as easily have said that a full color image looks much better in black and white. The ideal output device would have enough dynamic range to display the image without compression.

              BTW, HDR imaging is about capturing high dynamic
      • Re: (Score:3, Informative)

        by Atario ( 673917 )

        Now, with a display that can ACTUALLY display the full spectrum of a HDR image. THAT I'm interested in.

        Me too! And I sure am glad they included some screenshots in TFA; I can see how they're much better-looking than what my regular old CRT can display! I sat there, dumbfounded, thinking how much wider a dynamic range they had than my actual monitor.

        Maybe they can set up a service where you can look at more great HDR photos at home on your regular monitor so you can at least get used to it...

    • DSLRs are limited to 12 or 14 bits. Merge to HDR in Photoshop CS2 will do as many images as you care to take.

      I have never seen a huge advantage in color prints but B&W, even with HDR, can't quite produce the results film can.
    • > Since "true" HDR consumera camera's don't exist [...]

      Depends on what you mean by HDR.

      For some people, HDR simply means "a very high dynamic range" (compared to competing products, or the "normal" standards). That's the case with these monitors.

      For other people, HDR means a dynamic range that is greater than your output medium can reproduce. By that definition, an "HDR" monitor can't qualify, although a monitor can certainly be compatible with "HDR input" if both it and the graphics card support it.

      An
      • by dthree ( 458263 )
        Would it be that hard for the manufacturer to make a camera that brackets the shots and creates fake HDR images internally? Seems like all it would take is adding a DSP chip.
        • The issue isn't the logic. The issue is the fact that CMOS and CCD chips have a limited dynamic range (as does film, as do our eyes, etc.).

          In fact, even our eyes' instantaneous dynamic range is far more limited than some people believe. What happens is our brain builds an "HDR" mental picture from multiple "exposures".

          Likewise, you can do multiple-shot exposure bracketing on a digital camera (using different shutter speeds), and then load those images into HDRShop (for example) and create an HDR file from t
    • Well, they do and they don't. My 20D gets about 5 stops of dynamic range, which looks to be about what this thing outputs. It's not at all rare, in fact, for my 20D's dynamic range to exceed that of the scene if it's relatively flat.

      A lot of the HDR images made with Photoshop have more dynamic range than the human eye does. The human eye can trick one into thinking otherwise, though, because of how quickly it adapts as it changes focus.
    • Fuji's S3Pro (and the upcoming S5Pro) do have an extended DR feature that works quite well in many cases. It's far from perfect, but it does improve the results. Basically, it uses adjacent photosites divided into two sets to capture at different intensities (I'm not really sure of the technical aspects, but I guess each set works at a different "ISO value").

      Of course this implies a loss of resolution, since you are using 2 captured pixels to create 1 pixel in the final image

      Just my 2 cents
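      A toy model of that two-sensitivity idea, just to make the principle concrete. This is a guess at how such readings could be combined, not Fuji's actual processing; the 12-bit depth and the sensitivity ratio are assumed:

          import numpy as np

          def combine_dual_pixels(high, low, ratio=16.0, clip=4095):
              # high: high-sensitivity photosites (clean shadows, clip early in highlights)
              # low:  low-sensitivity photosites (noisy shadows, hold highlight detail)
              high = high.astype(np.float32)
              low = low.astype(np.float32) * ratio      # bring both onto the same linear scale
              return np.where(high >= clip, low, high)  # use the low-ISO reading where clipped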
    • by Builder ( 103701 )

      Since "true" HDR consumera camera's don't exist (anyone know?),

      HDR is just a technique... you take three images at different exposures and blend those images into one. I personally use 1EV difference between each exposure.

      Mostly, this is done because digital camera makers are too focussed on the megapixel war and not focussed enough on real improvements. My new D200 has a dynamic range of only about 11 stops, which makes certain exposures with a lot of range between the shadows and highlights difficult or impossible.

      The H

  • I was pretty excited until I saw the 49K price tag. That really killed the ole puppy. At 5K I'd be highly interested, but 49K is about 44K out of my budget range. Strictly for ultra-high-enders.
    • I hear ya! Just like plasma (displays) when they first came out. Had to wait a while for the price to drop from $50k to hit below $5k, but it was worth it.

      It's interesting that in "graphics", resolution is being pursued first, instead of the bit-depth issue, when the latter is just as important.

      Cheers
      • Re: (Score:2, Insightful)

        Excellent point. Truth be told I'd much rather see the color depth approached first. They've gotten better, but for film-level work none of them display full color resolution. Frustrating that the software will handle 48 bit, three channels of 16, but the monitors won't. Mostly becomes an issue when you are working with a lot of gradient images, skies and such. You still get some pixelation that isn't in the actual image file. Then again, if you're doing TV, who cares. They call it NTSC: never the same color
        • Truth be told I'd much rather see the color depth approached first.

          Color depth is HDR. They are one and the same. A monitor which can faithfully display 48 bit color is an HDR monitor by definition.
          • by 4D6963 ( 933028 )

            Color depth is HDR

            I wouldn't say that. Mainly because a high dynamic range basically means that your range goes outside of the traditional 0.0-1.0 range, whereas color depth is mainly about staying in the 0.0-1.0 range but with better precision. That said, you can still use a greater color depth to display HDR.

            It's as if you had a 24-bit sound card: you could turn your 16-bit sounds into 24-bit sounds but 256 times quieter, crank up the volume of your speakers so that these sound normal, and with the norm

            • Re: (Score:3, Insightful)

              by Sparohok ( 318277 )
              There is no such thing as a 0.0 - 1.0 visual range. The human visual system is floating point, pretty much literally. You have your exponent, which is how well adapted your eyes are to the light, how dilated your pupil is, etc. You have your mantissa, which is the relative intensity within your current visual field. Physiologically, we have about 28 bits of exponent and about 10 bits of mantissa. So, proper HDR is floating point. But we're not quite there yet.

              In both audio and video, this whole idea of quant
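              Incidentally, the Radiance (.hdr) format mentioned further up stores pixels in roughly that mantissa-plus-exponent form: 8-bit mantissas per channel sharing one 8-bit exponent. A simplified sketch of the encoding (after Greg Ward's RGBE):

                  import math

                  def float_to_rgbe(r, g, b):
                      # Pack one linear RGB pixel into 8-bit mantissas plus a shared exponent byte.
                      v = max(r, g, b)
                      if v < 1e-32:
                          return (0, 0, 0, 0)
                      mantissa, exponent = math.frexp(v)  # v == mantissa * 2**exponent, mantissa in [0.5, 1)
                      scale = mantissa * 256.0 / v
                      return (int(r * scale), int(g * scale), int(b * scale), exponent + 128)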
              • by 4D6963 ( 933028 )

                There is no such thing as a 0.0 - 1.0 visual range.

                I was talking about the range on the computer/screen side. Your images are stored in the 0.0 - 1.0 range.

                you should also realize that the idea of "going outside" the 0.0 - 1.0 range is absurd.

                It's all about scaling. And as for the "incredible human senses", our vision may be incredible, but our hearing is hardly good enough to pick up the -93 dB (iirc) noise (with respect to the maximum sample value) that you have with a 16-bit sound.

                That's why HDR and

                • Human hearing covers about a 120dBA range. That's 20 bits. Not nearly as wide as our visual acuity range but it's considerably more than 16 bits. I do think that's pretty incredible. For example, it's quite a challenge to get a -120dB noise floor with room temperature electronics.

                  You can store an entire HDR image in one bit: the spotlight pointed into your eyes is either on or off. That seems to be the point you're making, and I don't think it's particularly relevant to any realistic HDR application.

                  If you
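                  The bits-to-decibels conversion behind those figures is just 20*log10(2), about 6 dB per bit of linear quantization:

                      import math

                      for bits in (16, 20, 24):
                          print(bits, "bits ~", round(20 * math.log10(2 ** bits)), "dB")
                      # 16 bits ~ 96 dB, 20 bits ~ 120 dB, 24 bits ~ 144 dB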
        • by 4D6963 ( 933028 )

          Frustrating that the software will handle 48 bit, three channels of 16, but the monitors won't

          That may be frustrating, but there's still a solution to make things look better if you've got 48-bit images to display in 24-bit: dynamic random dithering.
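          A sketch of that trick: add up to one output step of random noise before truncating, so smooth 16-bit gradients don't collapse into visible 8-bit bands (regenerate the noise every frame for the "dynamic" part):

              import numpy as np

              def dither_to_8bit(img16, rng=np.random.default_rng()):
                  # img16: uint16 image, 0..65535 per channel, reduced to 0..255 by
                  # randomized rounding instead of plain truncation.
                  noise = rng.uniform(0.0, 1.0, img16.shape)   # less than one 8-bit step
                  return np.clip(img16 / 257.0 + noise, 0, 255).astype(np.uint8)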

    • That's short-sighted. This technology seems highly amenable to economies of scale and low-cost mass production. I'm sure there are challenges, particularly efficiency and cooling, but those are really only a problem if you need high brightness. I could see a product based on this technology costing only 20% or 30% more than equivalent LDR displays within a year or two - predicated only on market penetration.

      Other HDR technologies I've seen involve far higher barriers to low cost production. Laser projection
  • BrightSide DR37-P HDR display

    Published: 4th October 2005 by Geoff Richards

    Um, yeah.

  • I saw an HDR display from BrightSide Tech either last year or the year before at Siggraph, a conference on computer graphics. I thought it was such a cool idea until I saw it in person. They had a few images up and one was of a sunset; it was like looking directly at a sunset, pretty amazing but ULTRA bright. So bright that I didn't want to look at it and my eyes couldn't adjust to it. Why buy a display so bright you have to have sunglasses to look at it?
    • by jonTu ( 839883 )
      Yep, and honestly, that's all this is: a bright-as-hell monitor. Which may be great, and may possess a "high dynamic range", but it doesn't display what we call HDR. HDR images are a series of exposures merged into one image such that a tonal range beyond the perception of a normal camera is captured. This can be a still or a still sequence (i.e. video). If you're a digital compositor, this is tremendously useful, as you can take HDR images and tweak the exposure to match the lighting of the rest of the scene

      • You are correct in your description of what is being labeled "HDR" currently. However, you are actually a bit backwards on the subject. HDR is really the gimmick here. It's a trick, a way of approximating reality.

        The term HDR is misleading. It's more accurate to describe it as a technique which uses dynamic range compression. Taking a real-life scene with a large dynamic range and compressing it into the limited range available on a monitor or in print. You are not increasing dynamic range, you are merely c
  • I always figured if you really looked at a real lightsaber, instead of being, say, pink in the middle [wikipedia.org], they'd really look intense burn-your-retinas red, like looking at the little red lights on the underside of your mouse.

    I'm not sure I'd pay $49,000 for a tv just for that purpose, but it's the best one I could think of. I'd certainly pay that much for a lightsaber though.
    • I haven't had a close look at that image, but I think lightsabres are white in the middle, and the corona is the only colourful thing about them. Only found that out a week ago, when I started fiddling with some photos of my own :)
      • by catbutt ( 469582 )
        I think you are wrong. You can tell by the way they look when they move quickly (bright, rich color smeared across the image), as well as from the light they shine on things.

        They basically look the way neon lights look when photographed....washed out where they are brightest. Not the way they look when you see them in real life.
  • If you read their site, they explain that the black level actually goes to 0, because a pixel on their screen can have 0 brightness.

    Apparently this actually breaks the industry equation for deriving contrast (divide by 0), so they had to bump it up to like .1 or .01.

    Pretty awesome technology.
    • by Eccles ( 932 )
      Nigel Tufnel: It's like, how much more black could this be? and the answer is none. None more black.
  • I was under the impression that SED monitors were set to be the new big thing starting in the next couple years and that they're also boasting very high contrast ratios too with very low power consumption. And given that they're supposed to be mass-produced by comparison, hopefully the price would be significantly lower.
    • The problem is that all they're doing to get their high contrast ratios is to make the darks very dark. The max brightness is pitiful, less than a standard LCD screen.
  • by pla ( 258480 ) on Thursday October 12, 2006 @07:23PM (#16415701) Journal
    The BrightSide DR37-R EDR display theoretically has an infinite contrast ratio. How? Because it can turn individual LED backlights off completely (see How It Works), it has a black luminance of zero. When you divide any brightness value by this zero black value, you get infinity.

    It goes from 0 to 4000cd/m^2. Their comparison model, the LVM-37w1, goes from 0.55 to 550cd/m^2.

    So this toy gets as close to true black as you can get - "off", thus constrained by the ambient light level. For white, they manage 4000cd/m^2, or comparable to fairly bright interior lighting.


    Consider me impressed, but realistically, this only amounts to roughly an 8x brightness improvement over the best normal displays, with true black thrown in as a perk (they suspiciously don't mention the next lowest intensity, no doubt because it goes back into the realm of a contrast ratio of only a few thousand).
    • Re: (Score:3, Insightful)

      by kidgenius ( 704962 )
      You missed what they said. If you kept reading just another sentence or two, you would've understood.

      Using the "next lowest intensity" as you described gets them to the 200k figure, not only a few thousand. The perfect black, "off", gets you to infinity.

      • Re: (Score:2, Funny)

        by pla ( 258480 )
        If you kept reading just another sentence or two, you would've understood.

        D'oh! My bad... I must have glazed over for that part, because I seriously didn't notice it. But yeah, I suppose that pretty much negates the bulk of my point.
      • How good the dynamic range is depends on the monitor as well as the ambient environment you are going to use it in. The material that makes up the thing also counts. If your room has anything that is light-emitting, it kind of defeats the purpose of turning the thing off.
  • I know slashdot always runs behind digg by a few days or even a week or two, but this is ridiculous.
  • We've all seen the scenes in movies (WarGames, Sneakers, 2001, etc...) where someone is looking at a monitor and we see the reflection of the image on the screen projected out onto their faces.

    Question: is the image showing up like that purely a function of the brightness of what the person is looking at? IOW, would an HDR monitor have the effect of "projecting" the image out as if one were staring into an overhead?

    Someone mentioned above that pictures/video of stained glass windows were often used as demos
    • To project an image on someone's face, you need to focus it somehow. Think of a movie projector, or the overhead you mentioned. Since there are no lenses involved in this screen, it would be impossible to focus the image from the monitor onto someone's face, or any surface for that matter. All you'd get with this monitor is a diffuse glow, just brighter. This is a silly 'effect' in a movie, as to create it, they would have to use some sort of a projector. The actors would have blinding lights in their ey
    • Re: (Score:3, Insightful)

      by SEMW ( 967629 )
      No. Think about it: unless you're really pressing your nose right up to the screen, for a monitor to display a reflection of the image on the face of whoever's looking at it, it would have to radiate at a single angle (probably perpendicular) only. You wouldn't be able to see the whole screen, only a few pixels per eye at any one time. Ever stood in front of a projector screen and looked at the projector? Like that. It would be utterly useless as a monitor.

      N.B. if you have something like the left si
  • I'm all for making monitors with better contrast but the BrightSide solution is a little silly.

    4000 cd/m^2 is their model's peak luminance. The nice thing about a standard 300 cd/m^2 monitor is that I can stare at a picture of the sun for as long as I want without blinding myself. I'm not sure I would want to do that with one of these... Not because it's enough to blind you or anything, but it could cause your pupils to dilate so that when you turn it off everything would be really dark.
  • These kinds of monitors are probably not worth it. For the purposes of mammalian vision, high dynamic range is a nuisance that needs to be gotten rid of, and that's exactly what the human eye is doing. You still notice that a high dynamic range is present, but you don't really perceive it.

    A little more dynamic range than what your average LCD monitor has would be nice, but aiming for reproducing anything resembling the full dynamic range of natural scenes is a waste of time and money.
    • by dfghjk ( 711126 )
      Exactly! The purpose of HDR photography is to capture all the dynamic range of the scene, not to display it in all its eye-blasting glory. Part of the HDR technique is to selectively compress the dynamic range to achieve interesting images.
  • ...having a sandwich of two identical LCD panels, glued one on top of the other and driven with essentially the same signal?

    One would probably need a somewhat stronger backlight, and maybe a special mask between the panes so that light from one pixel on one LCD could only pass through the same position on the other panel...
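    On paper the stacking idea does multiply contrast, since each panel attenuates the other's leakage. An idealized back-of-the-envelope model (ignoring moire, parallax between the panes, and the huge light loss):

        panel_contrast = 1000.0                    # a typical single LCD panel, roughly 1000:1
        stacked = panel_contrast * panel_contrast  # leakage multiplies, so contrast ratios do too
        print(f"{stacked:,.0f}:1")                 # 1,000,000:1 in theory for two identical panels
        # In practice the backlight would need to be far brighter, since the second
        # panel absorbs most of the light even when fully "open".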

  • Sweet!!!! (Score:4, Funny)

    by MoxFulder ( 159829 ) on Thursday October 12, 2006 @08:26PM (#16416559) Homepage
    I can't wait till this goes mainstream. Then I'll be able to watch a video of a solar eclipse and actually get blinded by the image. Coool.
  • Haven't seen anybody comment on this yet -- if you dig through the actual specs you'll see one reason why this technology hasn't already taken off. The power consumption of the display is 1680 watts. You basically wouldn't want to put anything else on a household circuit with it.

    -G
    • The end of the article mentioned some Moore's law equivalent for LEDs that would likely reduce the power consumption and heat output to reasonable levels by the time this technology became ready for consumer applications. However, if they really meant it was doubling light output per watt every 12-18 months, at some point that would become a perpetual motion machine, so I'm not sure I really understood this point. IANAEE.
        • Theoretically, LEDs can become 100% efficient at converting electricity into light. Perhaps they meant Moore's law until that physical limit?
  • Polaroid XR film (Score:5, Interesting)

    by dpbsmith ( 263124 ) on Thursday October 12, 2006 @09:18PM (#16417211) Homepage
    I can't seem to find a reference to it online... I'd appreciate one if someone has one... but circa 1960 the Polaroid company developed a film for recording nuclear tests, which was similar to three-layer color film except that the three layers, instead of being sensitized to different colors, were given emulsions with widely different sensitivities. The fastest emulsion was similar to Kodak Royal-X Pan, ISO 1600, and the slowest was similar to Kodak Microfile, and if I recall correctly had an ISO speed of something like 0.1

    The result was to extend the useful dynamic range of the film by a factor of 10000 or so--more than a dozen additional f-stops of latitude, or extra Ansel zones, if you like.

    The film was processed in regular Kodacolor chemistry (IIRC), each layer coming out a different color. In color, the result was a "false color" image displaying a huge dynamic range of light intensity; or, it could be printed as black-and-white using different filters to select different intensity ranges.

    In effect, the photographer was automatically bracketing every shot by a dozen F-stops, in a single shutter click.

    It was an incredibly neat hack. I wonder whatever became of it?
    • Right now, there are digital cameras (Fuji S3 & S5) that use an array of alternating high/low sensitivity photocells to capture an extended dynamic range. In one of the reviews I was reading, it said this is very similar to some high dynamic range films, which use an emulsion that contains a mix of high-sensitivity and low-sensitivity particles. I don't have any more specific info than that, but I guess that's kind of how the concept evolved.
  • > We are seeing more and more about high dynamic range (HDR) images,
    > where the photographer brackets the exposures and then combines
    > the images to increase the dynamic range of the photo.

    So instead of an image that goes from black to white, you have an image that goes from dark grey to light grey. Now you can see all the stuff that would've been hard to make out in a single photo. I think what has happened is that you've *decreased* the dynamic range in the photo (or, more properly speaking, you've
    • The point of high dynamic range is to get an image where there is much finer information (contrast). If I take a photo with a normal digital camera I have 0-255 greyscales and my monitor has pretty much the same. If I take the same image and collect information on a range 0-4096 then I can get my 0-255 monitor to show sub-sets (an area of the photo with brightness in the range 256-512 for example) or I can just see the whole range with the information averaged out (which will usually give lower noise). A HD
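      That sub-range trick is the same window/level adjustment the radiologists above were describing; a quick sketch of mapping a chosen slice of a 12-bit image onto an 8-bit display:

          import numpy as np

          def window(img12, low, high):
              # Everything below `low` goes to black, everything above `high` to white,
              # and the band in between gets the monitor's full 256 levels.
              scaled = (img12.astype(np.float32) - low) / float(high - low)
              return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)

          # e.g. window(img, 256, 512) shows only the 256-512 brightness band mentioned above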
  • Crysis [crysis-thegame.com] was shown on a BrightSide monitor at SIGGRAPH. These monitors hook up to commercial off-the-shelf hardware. The price should drop dramatically in the future when better production processes are in place, much like any new product. ;)

    (as long as the software supports HDR/their DLL)

  • Just think of all the great p0rN you can view with one of these things... you know, all those dark recesses and everything. And haven't we all wondered what was happening in the rest of the room, away from the lights??? And... well, uh... you know.... other stuff too :)

    Now all we need is smell-a-vision!

    (Uh... or maybe not!)
  • Does anyone know which part of the screen justifies such a high price?
    Isn't this screen basically a commercial LCD with a modified backlight (a couple hundred LEDs controlled by a special channel in the signal)?
  • referance?

    IS everyone on slashdot o.k.??
