
Pioneer Promises 400GB Optical Discs

schliz writes "Pioneer has developed a 16-layer read-only optical disc which it claims can store 400GB of data. The per-layer capacity is 25GB, the same as that of a Blu-ray Disc, and the multilayer technology will also be applicable to multilayer recordable discs."

  • by sexconker ( 1179573 ) on Tuesday July 08, 2008 @12:46PM (#24102093)

    Interestingly, many of these formats were bought by SONY and put out to pasture.

    FMD (Fluorescent Multi-Layer Disc) being the most promising (back in the day).

  • Re:Blu Ray (Score:5, Informative)

    by ergo98 ( 9391 ) on Tuesday July 08, 2008 @12:48PM (#24102129) Homepage Journal

    Good thing we all updated early to the blu-ray player, when something is about to come along to blow it out of the water

    There's always something better coming along. In this case it's pretty much just a research paper, not an actual product, so not all that exciting.

    And Blu-ray had burnable 4-layer (100GB) discs two years ago.

  • Re:Blu Ray (Score:5, Informative)

    by halsver ( 885120 ) on Tuesday July 08, 2008 @12:54PM (#24102213)

    From the article:
    "The huge capacity of these discs means that the new technology will be best suited for applications such large volume data archiving, rather than consumer use."

    The tech they are using to read so many layers of information is impressive. However, as the article states, this format is in no way intended for consumers.

    Your BluRay hardware is probably safe for another five years or so.

  • Re:Blu Ray (Score:5, Informative)

    by Lunix Nutcase ( 1092239 ) on Tuesday July 08, 2008 @01:03PM (#24102391)

    Artifacts which I would not have noticed on DVD are readily apparent on BluRay disk.

    Unless you are talking about film grain, I have no clue what "artifacts" you are referring to, as Blu-ray (outside of the early MPEG-2 releases) and HD DVD both use more efficient compression codecs than DVD does. If you are talking about film grain, yes, it is more apparent now because the higher resolution is able to resolve that detail, but it is supposed to be there.

  • Re:Blu Ray (Score:2, Informative)

    by tb()ne ( 625102 ) on Tuesday July 08, 2008 @01:21PM (#24102631)

    I wouldn't worry just yet. It looks like the discs may actually be 400 GB Bluray discs [blu-ray.com] that will be compatible with existing players.

  • Re:Blu Ray (Score:1, Informative)

    by Anonymous Coward on Tuesday July 08, 2008 @01:27PM (#24102719)
    STURDY, for fuck's sake
  • Re:Blu Ray (Score:3, Informative)

    by Anonymous Coward on Tuesday July 08, 2008 @01:28PM (#24102727)

    Blu-ray and HD-DVD support the same compression schemes (for video, at least). The difference was that some early Blu-ray discs used MPEG-2 (the same codec DVDs use), while HD-DVD movies often used one of the better codecs, like H.264 or VC-1, from the very beginning of the "war".

  • Re:Blu Ray (Score:3, Informative)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday July 08, 2008 @01:35PM (#24102843) Homepage Journal
    Small (say, 1.8") hard drives have VERY high G-shock resistance while turned off. (Shock-G resistance varies.)
  • Rubbish (Score:3, Informative)

    by encoderer ( 1060616 ) on Tuesday July 08, 2008 @01:56PM (#24103165)

    This, frankly, is rubbish.

    No matter how good the upscaling chipset is, it cannot divine information that's not on the disc.

    It's like taking a 640x480 picture, stretching it to 1920x1080, and calling it "nearly as good" (see the sketch at the end of this comment).

    All this talk of Blu-ray "not catching on" ignores that adoption is just a matter of time. I never gave Blu-ray a second thought until I bought an HDTV. Soon after, I bought a Blu-ray player.

    And before long, everybody will be buying HDTVs. Many will wait until their existing set bites the dust, but it will happen, just as everybody eventually switched to color, then to stereo.
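
    As a rough illustration of the no-new-information point above (a minimal sketch, assuming NumPy; the random array is a stand-in for an SD frame, not real video), naive upscaling only copies or interpolates existing samples:

        import numpy as np

        # A hypothetical 640x480 grayscale "frame"; real video would have three channels.
        rng = np.random.default_rng(0)
        sd = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)

        # Nearest-neighbour "upscaling" to 3x the size: every output pixel is a copy of
        # some input pixel, so the result carries exactly the same information.
        hd = sd.repeat(3, axis=0).repeat(3, axis=1)   # 1440x1920

        # Throwing the duplicates away recovers the original exactly - nothing was gained.
        assert np.array_equal(hd[::3, ::3], sd)
        print(hd.shape, "upscaled from", sd.shape, "- no new detail")

    Fancier scalers (bicubic, Lanczos) smooth the result, but the output is still a function of the same source pixels.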

  • Re:Blu Ray (Score:5, Informative)

    by nabsltd ( 1313397 ) on Tuesday July 08, 2008 @03:16PM (#24104375)

    Call me a snob if you like, that does not change the fact I can tell the difference very quickly on my 56" HDTV between HD content and DVD content, especially when the HD content was recorded with an HD camera, not upconverted from film.

    This statement basically says you don't know what you are talking about, as conversion from film to HD formats (1920x1080 or 1280x720) involves a loss of resolution, sometimes massive depending on exactly how the image is stored on the film.

    Film easily has a resolution of 4000x4000, so even using a film format where black bars are stored on the film, you end up with about 4000x2200 at the 16:9 HDTV aspect ratio. Film is then telecined to whatever HD resolution is required, which results in a loss of resolution, but you still have at least full HD quality at that point (the arithmetic is sketched at the end of this comment). Now, special effects aren't always rendered at full film resolution, so some movies (or TV shows) will not have full film resolution in every scene, but the lowest rendering these days is generally 2K, which is more than enough for 1920x1080.

    What's probably confusing you is that HDTV cameras have more depth of field than most lens/film combinations on 35mm film cameras. This gives the scene a much more "in focus" look for more of the image, and gives the illusion that it is sharper. Film can do this, but it is more difficult due to the complex interaction between the type of lens, the film speed, and the lighting for the scene.
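
    For the curious, here is that arithmetic spelled out (a minimal sketch in Python; the 4000x4000 scan size is the ballpark figure from the paragraph above, not a measured value):

        # Ballpark film scan size from the paragraph above.
        scan_w, scan_h = 4000, 4000

        # Keep only the 16:9 window; the rest of the frame is black bars on the print.
        crop_h = round(scan_w * 9 / 16)   # 2250 lines, close to the ~2200 quoted
        print(f"full scan: {scan_w}x{scan_h}, 16:9 window: {scan_w}x{crop_h}")

        # Telecine that window down to 1080p and see how much is thrown away.
        hd_w, hd_h = 1920, 1080
        print(f"downscale: {scan_w / hd_w:.2f}x per axis, "
              f"{scan_w * crop_h / (hd_w * hd_h):.1f}x fewer pixels overall")

    Even after discarding roughly four scanned pixels for every one kept, the telecined result still fills the full 1920x1080 raster, which is the point being made here.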

  • Re:Blu Ray (Score:3, Informative)

    by pushing-robot ( 1037830 ) on Tuesday July 08, 2008 @04:44PM (#24105677)

    It depends on how the monitor is configured; some monitors have poor color curves, so some colors are a bit "farther apart" than others. Worse, some monitors (even ones marketed as 8-bit) show less than 8 bits per channel due to cheap controllers or "dynamic contrast" systems. These displays show distinct banding on many images and should be avoided.

    But as long as the display is well designed and capable of outputting a solid 8 bits per channel, it's unlikely that anyone will notice banding outside of special test patterns, even those of us (myself included) who can detect 9-10 bits of color definition per channel. While I wouldn't mind the extra colors, and it's a relatively easy thing for display manufacturers to implement, it's not a feature I'd spend much extra for.

    The big advantage to >8 bits per channel color, though, is during the editing process.

    When working with raster-based programs like Photoshop, it's pretty normal to create a gradient, then compress or tweak the colors, then mess around some more, then adjust the color levels again, lather, rinse, repeat. What started out as a fine gradient gets compressed into a small range of values, then expanded back out, and you end up with very ugly, obvious color bands.

    With higher color depth (16 bits per color channel is the norm for good image editors) you have a ton more headroom, so you can mess with levels to your heart's content without losing any color definition (see the sketch at the end of this comment).

    But once the editing is complete, it's pretty normal to export the final distribution copy at 8 bits per channel. It saves space, and anything beyond 8 bits per channel is virtually imperceptible.

    It's kind of like lossy audio encoding: If you do it once, after the editing is complete, the music will still sound great. But if you compress, expand, and compress again twenty times you'll end up with crap.

    The one practical use for high color depth displays right now is for color profiles. With extra overhead, it becomes possible to tweak displays warmer or cooler, compress or expand the color gamut, or even slightly tweak certain patches of the display to compensate for uneven backlighting, all without any loss of definition. It's not something the average person would bother with, but it's a good reason to add a few extra bits per pixel to professional displays.

    Also, monitors today can't show extremely vivid or bright colors that the human eye is capable of perceiving. Backlight technology is improving this to some degree, but if there is ever a quantum leap in display technology we may *need* a lot more bits to describe all the new colors we can show. In fact, we would probably need to start using floating point color, which is already used for video editing and HDR video games.
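
    Here is a minimal sketch of the gradient round-trip described above (assuming NumPy; the specific level adjustments are invented for illustration):

        import numpy as np

        # A smooth gradient: 1024 evenly spaced values from 0.0 to 1.0.
        gradient = np.linspace(0.0, 1.0, 1024)

        def edit_roundtrip(signal, bits):
            """Squeeze the levels into 20% of the range, store at `bits` per channel,
            then stretch them back out - one cycle of the edit-and-readjust loop."""
            levels = 2 ** bits - 1
            squeezed = signal * 0.2
            stored = np.round(squeezed * levels) / levels   # what an N-bit buffer keeps
            return stored / 0.2

        # Count how many distinct steps of the original gradient survive the round trip.
        for bits in (8, 16):
            survivors = np.unique(edit_roundtrip(gradient, bits)).size
            print(f"{bits}-bit working space: {survivors} of 1024 steps survive")

    With an 8-bit buffer the gradient collapses into a few dozen bands; with 16 bits every step survives, which is the editing headroom described above.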
