Data Storage

Breakthrough In JPEG Compression 648

Kris_J writes "The makers of the (classic) compression package StuffIt have written a program that can compress JPEGs by roughly 30%. This isn't raw-image-to-JPEG compression; this is lossless compression applied to the JPEG file itself. Typical compression rates for JPEGs are 2% to -1%. If you read the whitepaper (PDF), they are even proposing a new image format: StuffIt Image Format (SIF). Now I just need someone to write a SIF compressor for my old Kodak DC260."
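
That baseline claim is easy to sanity-check: a generic lossless compressor gains almost nothing on a file that is already JPEG-compressed. A minimal sketch using only Python's standard library, assuming a local file named photo.jpg (the filename is just a placeholder):

    import zlib

    # Read an existing JPEG and run a generic DEFLATE pass over it.
    data = open("photo.jpg", "rb").read()
    packed = zlib.compress(data, 9)          # maximum effort
    saving = 100.0 * (1 - len(packed) / len(data))
    print(f"{len(data)} -> {len(packed)} bytes ({saving:.1f}% saved)")

On typical photos this prints a saving close to zero, and sometimes negative once archive overhead is added, which is why a claimed 25-30% lossless gain is notable.
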
  • Fractal image format (Score:5, Interesting)

    by nmg196 ( 184961 ) * on Wednesday January 12, 2005 @04:28AM (#11332726)
    I would have thought that rather than 'zipping' an existing image format to create a new one just to save 30%, they'd be better off improving the original image compression algorithm or coming up with a new one.

    Quite a while ago (years!) I had a program which could compress images into a fractal image format. It was amazing - the files were much smaller than JPEGs but looked a lot better. The only drawback was that it took ages to compress the images. But with the extra CPU horsepower we have today I'm surprised fractal image compression hasn't become more prevalent. It would still probably be useless for digital cameras though as it would probably be impossible to implement the compression in hardware/firmware such that it could compress a 6+ megapixel image within the requisite 1-2 seconds.

    Does anyone know what happened to fractal image format files (.fif) and why they never took off?
    • by Nik13 ( 837926 )
      For those of us who use DSLRs, 1-2 seconds is way too long. True enough, buffers help, but I wouldn't buy such a slow camera.
      • by nmg196 ( 184961 ) * on Wednesday January 12, 2005 @04:52AM (#11332836)
        Most existing DSLRs are at least this slow. They only seem faster because they have a frame buffer which can store 4-8 uncompressed images. The main cause of delay is writing out the images to the memory card rather than compressing the images. You can see this by looking at the flashing red light on the back of your EOS after you take 4 shots in sports mode... The camera is busy writing for several seconds after the last shot has been taken.

        If the image compression algorithm were more efficient, there would be less data to write to the card, and perhaps overall it would actually be faster - even though the compression itself might be slower.
    • So that would be the ticket for .fif.
      Optimized hardware, specifically designed to run .fif algorithms?

      I'd love to see something like this in a camera.
      No doubt future Mars rovers would benefit from smaller download sizes?

    • by mcbevin ( 450303 ) on Wednesday January 12, 2005 @04:43AM (#11332796) Homepage
      Maybe, but StuffIt is an archival program. If I have 10GB of existing JPEGs and want to archive them, then this is what's wanted. Re-encoding them as you suggest would be equivalent to converting, say, an MP3 to Ogg format - a surefire way to lose quality with little gain.

      Re fractal compression, people have been hyping it up for years but as far as I know it never really delivered. I'm dubious about any claims to some mysterious program which compresses anything amazingly well without strong evidence.

      Wavelet compression, however, is used in JPEG 2000, which is a bit better than JPEG, but even that isn't supported by any digital cameras.

      If StuffIt really does compress JPEGs 25-30%, that is a massive leap forward over the previous state-of-the-art compressors, which reached I think around 3-4% - http://www.maximumcompression.com/data/jpg.php . Here's hoping their claims pan out, and that they release at least some details regarding the methods they used.
      • by nmg196 ( 184961 ) *
        > I'm dubious about any claims to some mysterious program
        > which compresses anything amazingly well without strong evidence.

        It's hardly mysterious. You can download trial versions and try it yourself - it's a well-known compression technique that whole books have been written about [amazon.co.uk]

        There is near-infinite evidence that it works, so I don't know why you're doubting it. The issue isn't whether or not it works; it's why nobody has made an open-source implementation that we can all use.

        The problem is that the exi
      • by imsabbel ( 611519 ) on Wednesday January 12, 2005 @05:23AM (#11332966)
        Believe him, I used the same program. It was from Iterated Systems (they are long gone).
        It's not on their homepage anymore. I don't know if they really used IFS or just did some wavelets and faked it, but the compression was honestly much better than JPEG (though of course slower, too; IIRC, compressing a 1024x768 picture took about 40-50 seconds on my Pentium).
        What was unique was the viewer. It was "resolutionless", so you could zoom in farther than the original without pixelation. Shapes started to look painterly then, as if traced by outlines, which would actually be in favour of it really being a fractal compressor.
        No idea why it was canned.
        • And BTW: JPEG 2000 is MUCH better than JPEG; half the size is definitely possible. But of course space isn't really limited, while CPU power is (especially in cameras and other low-power appliances), so trading 50% of the file size for 10 times the computation doesn't seem so hot in the end.
      • by jridley ( 9305 )
        Re fractal compression, people have been hyping it up for years but as far as I know it never really delivered. I'm dubious about any claims to some mysterious program which compresses anything amazingly well without strong evidence.

        A program called Real Fractals has been in use for years by people who make very large blowups of images. It's pretty much standard practice to convert to Real Fractals before making a very big enlargement (like more than a few feet on a side).
      • If StuffIt really does compress JPEGs 25-30%, that is a massive leap forward over the previous state-of-the-art compressors

        I noticed the chart at the bottom of the whitepaper showed their "25-30%" figure was based on tests done on PNGs converted to JPGs at 50% quality. There is a lot of data degradation at 50% and I don't think many people regularly use anything below about 70%. I'd be much more interested in seeing what compression would be on a JPG straight out of a Digital Rebel or other camera set to

    • Interesting, never heard of fractal image compression: http://netghost.narod.ru/gff/graphics/book/ch09_09.htm Fractal Basics A fractal is a structure that is made up of similar forms and patterns that occur in many different sizes. The term fractal was first used by Benoit Mandelbrot to describe repeating patterns that he observed occurring in many different structures. These patterns appeared nearly identical in form at any size and occurred naturally in all things. Mandelbrot also discovered that these
    • by happyhippy ( 526970 ) on Wednesday January 12, 2005 @05:18AM (#11332950)
      I did my final year project in Fractal Image Compression about 5 years ago.

      I concluded that it isn't practical for general use: it took too much time to compress an image (alright, it was five years ago, so today that probably wouldn't matter), but most importantly there is no easy general compression solution for all images (for instance, one that compresses tree pics well won't do faces well, and vice versa).

      For a general dip into fractal image compression, try to get and read 'Fractal Image Compression' by Yuval Fisher. Damn good read.

      • by Corpus_Callosum ( 617295 ) on Wednesday January 12, 2005 @08:14AM (#11333754) Homepage
        I concluded that it isn't practical for general use: it took too much time to compress an image (alright, it was five years ago, so today that probably wouldn't matter), but most importantly there is no easy general compression solution for all images (for instance, one that compresses tree pics well won't do faces well, and vice versa).

        I did some basic experimentation with Genetic Algorithms and fractal compression and I can tell you, GA does solve the problem. Not only solves it, it obliterates it. With GA, fractal compression can achieve compression ratios and quality that are unheard of with other techniques.

        Of course, this is to be expected; after all, it is what nature does with us. Our genetic code is the compressed image, our bodies are the uncompressed results.

        Interesting thought food, huh...
    • Patents. (Score:5, Interesting)

      by xtal ( 49134 ) on Wednesday January 12, 2005 @05:47AM (#11333049)
      I worked in this field for a while, and the licensing and other issues sent the company I was with running in the other direction. JPEG was good enough, everyone was using it, so JPEG it was.

      Fractal compression is cool.. but encumbered by IP issues. Too bad.
    • by Ilgaz ( 86384 )
      It has already been done. I use it all the time on OS X instead of TIFF.

      http://www.jpeg.org/jpeg2000/ [jpeg.org]

      I really don't understand why the camera companies etc. don't adopt it.
    • by Anonymous Coward
      Does anyone know what happened to fractal image format files (.fif) and why they never took off?

      Hard to implement. Patent mess. CPU requirements. No better fidelity [they have just as many artifacts as JPEG, but the artifacts are 'nicer looking']. Massive increases in available bandwidth. Fractal weirdness with editing; you always have to convert back to raster anyway.

      The tech has found niche applications though, such as image scaling: Lizardtech's Genuine Fractals [lizardtech.com] is pitched as an image rescaling
    • by AeiwiMaster ( 20560 ) on Wednesday January 12, 2005 @06:34AM (#11333207)
      wikipedia
      http://en.wikipedia.org/wiki/Fractal_compression [wikipedia.org]

      Some sourceforge projects

      http://sourceforge.net/projects/mpdslow/ [sourceforge.net]
      http://sourceforge.net/projects/fractcompr/ [sourceforge.net]

      for audio
      http://sourceforge.net/projects/fraucomp/ [sourceforge.net]
    • by dcw3 ( 649211 )
      It would still probably be useless for digital cameras though as it would probably be impossible to implement the compression in hardware/firmware such that it could compress a 6+ megapixel image within the requisite 1-2 seconds.

      Why must we have it in 1-2 seconds? You could implement this a couple of ways without impacting the need for quick back-to-back shots. 1.) A compression button (menu option, or some such), allowing the owner to do a manual compress when running low on space. 2.) Do it automatica
  • What's the point? (Score:5, Interesting)

    by CliffSpradlin ( 243679 ) * <cliff.spradlin@nOSPAM.gmail.com> on Wednesday January 12, 2005 @04:29AM (#11332730) Journal
    The linked page shows average decompression times of 6-8 seconds for 600-800 KB files, rising with the size of the file. Who would benefit from this? It's obviously too slow to speed up web pages, and would be far too CPU intensive for consumer cameras. Professional photographers would have no use for this since they would use RAW images.

    I mean, it's cool and all to be able to compress JPEGs by that much more, but the size gains are negated by the time it takes to decompress them. This seems just like those super-high-compression algorithms that have rather amazing compression rates, but take -forever- to compress or decompress, making them unusable. The difference is that those are obviously labeled as being for scientific research into compression, but Aladdin seems to be trying to market this product for public consumption. The listed uses ( http://www.stuffit.com/imagecompression/ [stuffit.com] ) seem trivial at best.

    Who's gonna be buying this?

    -Cliff Spradlin
    • How about emailing your holiday snaps to your mum on 56k? Especially if she's paying by the minute, this will be a big advantage, as she can download your archive and then unpack it offline.
    • Archive a bunch of images sometime. Then it's useful. I needed to put several thousand images (scanlations of manga) onto a cd. It went over the 700mb limit. Using this, I could have saved $0.10 on cds and 2 minutes of time. Not a big deal, but if you do such things a lot, it could add up. So the program is probably worth about $5. Maybe $10, but that could be a bit much.
      • >Archive a bunch of images sometime. Then it's useful. I needed to put several thousand images (scanlations of manga) onto a cd. It went over the 700mb limit. Using this, I could have saved $0.10 on cds and 2 minutes of time. Not a big deal, but if you do such things a lot, it could add up. So the program is probably worth about $5. Maybe $10, but that could be a bit much.

        The 2 minutes time saving...remember it takes 6-8 seconds to compress as well, so multiply that by several thousand images. I don't k
    • by The Rizz ( 1319 ) on Wednesday January 12, 2005 @04:54AM (#11332848)
      The linked page shows average decompression times of 6-8 seconds for 600-800 KB files, rising with the size of the file. Who would benefit from this?

      Any websites with the primary purpose of hosting images would benefit greatly from this - such as art & photography sites. (Yes, and porn sites, too.)

      Why? Because 99% of the traffic they generate is for their images. Of those images, 99% of them are in JPEG format. So this compression would give a good savings in bandwidth on all those pictures.

      At large sites, a 30% cut in required bandwidth could mean a very large savings. Now, if they can take a large cut off their operating expenses, and all they need to do that is to make the users wait a few extra seconds for their pictures, I think we know what they'll do.

      As for the decompression time, one thing to remember is that with how slow internet connections are (even broadband), you're much more likely to be waiting for one of these images to transfer than you are to be waiting for it to decompress (so long as it allows you to start the decompression without waiting for the end of the file, of course).

      --The Rizz

      "Man will never be free until the last king is strangled with the entrails of the last priest." -Denis Diderot
      • but instead of using this, why not use JPEG 2000? It's even smaller, no more blocks at higher compression ratios, and it is well documented...
        (as opposed to this "patent pending" super-algorithm)
        BTW: the "whitepaper" should be called an "advertisement", because it contains zero technical information.
    • by Zhe Mappel ( 607548 ) on Wednesday January 12, 2005 @05:12AM (#11332920)
      The listed uses ( http://www.stuffit.com/imagecompression/ ) seem trivial at best.

      You're right. I read the list, reproduced below. Who'd want to:

      * Send more photos via email

      * Fit more on CDs, DVDs, and other backup media
      * Save time when sending pictures over the internet or across the network
      * Reduce bandwidth costs

      After all, electronic storage media is infinite, and bandwidth is free!
      • >After all, electronic storage media is infinite, and bandwidth is free!

        I can't really tell if you're being sarcastic here=)

        Media IS cheap, as is bandwidth (provided you live in an area that provides it, of course, but such areas are rapidly expanding). The price you pay for their product is probably a lot more than you save by not using some extra CD-Rs or DVD-Rs (which have incredibly small per-gigabyte costs).
      • by jbn-o ( 555068 ) <mail@digitalcitizen.info> on Wednesday January 12, 2005 @05:59AM (#11333082) Homepage
        Given that the software is likely to be proprietary and the algorithm will be patented, it becomes completely useless to me and it is completely unsuitable for archiving anything (smart people don't play nickel and dime games with archives and backups).

        Maybe if computers were a lot faster and the compression worked on any array of pixels, not just those that have undergone the lossy transformation of JPEG compression. But even then, it would have to match everything else PNG can do, and no patented format will beat PNG at its own game.

        In theory what you say sounds nice, but in practice I genuinely can't recall a situation where a little more compression would have allowed me to send all the photos I wanted to via e-mail. But the reasons I mentioned at the top of this post are more important reasons why this should be rejected out of hand.

        What meager benefits this affords are far outweighed by the costs. I don't see this going anywhere, and for very good reason.
    • Hardly any websites use 600-800 KB files. For the 30 KB images you download for most webpages, this compression would only reduce the load time a tiny bit. For slow dialup connections, it might actually improve load times. Some sites might find it useful. And in a year or two, faster computers will cut the delay in half. Of course the question is moot unless Firefox achieves >90% market share, since Microsoft isn't likely to add IE support for this format anytime soon.
  • I just installed an 800Gb hard disk in my system. I have a gigabyte worth of webmail space (more than that, if you consider that I can send myself gmail invitations). Storage is, as they say in the vernacular, very inexpensive.

    Even in cameras where storage is tight, the bounds of memory is expanding all the time. Whereas a couple years ago the average storage card size was a measly 64Mb, today we are talking about gigabytes of storage memory inside our *cameras*!

    So let's say we can squeeze another 30% of pictures onto a card. Does that really help us? Not really, if you consider that the compression itself rides atop JPEG compression and that computing time needs to be accounted for.

    Currently, the fastest continuous shooting digital camera (the Nikon D2X) can only take 4 shots in a row before its memory buffers get full and the whole camera becomes useless. Compare that with the 9 shots per second F5 and you can see that the speed of shooting isn't going to cut it for digital cameras.

    We need a compression method that is lossless, not one that creates compact files. Space is cheap, CPU cycles aren't.
    • ``We need a compression method that is lossless, not one that creates compact files. Space is cheap, CPU cycles aren't.''

      Err, didn't you mean 'fast' rather than "lossless"?
    • by NMerriam ( 15122 ) <NMerriam@artboy.org> on Wednesday January 12, 2005 @04:57AM (#11332857) Homepage
      Currently, the fastest continuous shooting digital camera (the Nikon D2X) can only take 4 shots in a row before its memory buffers get full and the whole camera becomes useless

      I beg your pardon? Just about every digital SLR on the market is able to handle more than 4 images in buffer at a time. My year and a half old 10D can buffer 9 RAW images, and the D70 processes JPEGs before they hit the buffer, so it can buffer JPEGs in the dozens.

      I doubt this is intended for any use other than archiving of images, where it will kick ass. It's clearly processor-intensive from the timing results, but for long-term storage that makes little difference.

      I've got a few TB of images in storage and I'd love to be able to save 20-30% of that space, regardless of how cheap it is. That means a little longer between burning DVDs, and having more stuff on mounted drives for reference.
    • by Motherfucking Shit ( 636021 ) on Wednesday January 12, 2005 @04:58AM (#11332866) Journal
      Dancin_Santa says:
      I just installed an 800Gb hard disk in my system.
      I always wondered how much space it took to keep track of who's been naughty and who's been nice...
    • http://www.dpreview.com/reviews/canoneos1dmkii/

      That's the fastest continuous-shooting camera; you'll notice it has room for 40 JPEG frames in its buffer, and it shoots at 8.3 fps.
  • Questions (Score:5, Interesting)

    by RAMMS+EIN ( 578166 ) on Wednesday January 12, 2005 @04:33AM (#11332746) Homepage Journal
    The linked page did not answer some of my questions:

    1. Does this only work for JPEG, or also for other (compressed or plain) files?

    2a. If it only works for JPEG, why?

    2b. If it works for others, how well?

    Anybody who can answer these?
    • Re:Questions (Score:5, Interesting)

      by ottffssent ( 18387 ) on Wednesday January 12, 2005 @04:51AM (#11332827)
      The whitepaper suggests they're tearing out the run-length encoding that's the final step in JPEG and replacing it with something more space-efficient.

      In a nutshell, JPEG works like:

      Original image data -> frequency domain -> high frequencies truncated (via division matrix) -> RLE

      RLE is fast, but not terribly compact. Replacing it with something better improves compression. However, RLE generates not-very-compressible output, which is why traditional compression software does poorly. I imagine if you took a JPEG, undid the RLE, and zip-compressed the result, you'd get something close to what the StuffIt folk are claiming. If someone wants to try that, I'd be interested in the results.
      • Re:Questions (Score:5, Informative)

        by azuretongue ( 180140 ) on Wednesday January 12, 2005 @05:30AM (#11333001)
        JPEG does not use run-length encoding as its last compression step. Quote from the FAQ: "The JPEG spec defines two different "back end" modules for the final output of compressed data: either Huffman coding or arithmetic coding is allowed." http://www.faqs.org/faqs/jpeg-faq/part1/section-18.html [faqs.org] It goes on to say that arithmetic coding is ~10% smaller, but is patented, so don't use it. So what they are doing is removing the known chubby Huffman coding and replacing it.
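
        The long-known, within-format version of that replacement is simply recomputing the Huffman tables for each image. A rough sketch of that baseline, assuming Python 3, libjpeg's jpegtran somewhere on the PATH, and a local photo.jpg (all assumptions, not anything from the whitepaper):

          import os
          import subprocess

          # Rebuild per-image Huffman tables; the result is still an ordinary
          # baseline JPEG that any viewer can open.
          subprocess.run(
              ["jpegtran", "-optimize", "-copy", "none",
               "-outfile", "optimized.jpg", "photo.jpg"],
              check=True,
          )
          print(os.path.getsize("photo.jpg"), "->",
                os.path.getsize("optimized.jpg"), "bytes")

        That route typically recovers only the few percent mentioned elsewhere in the thread, which is the gap StuffIt claims to have blown past with a different back end.
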
      • Re:Questions (Score:3, Interesting)

        by imsabbel ( 611519 )
        Only that JPEGs don't use RLE for encoding the coefficient data, but a higher-level compression algorithm. I'm not sure if it's Huffman, but something comparable. NO RLE.
        That's the reason zip fails: it's like zipping a zip. And try using RAR on a zip compared to unzipping and RARing the contents of the zip.
      • Re:Questions (Score:3, Informative)

        by tangent3 ( 449222 )
        High frequencies are not exactly truncated. It just so happens that after quantisation, most of them drop to 0. If there are any outstanding high frequencies that are non-zero, they will not be truncated.

        JPEG doesn't just rely on RLE; rather, it is a Huffman-RLE combination. Basically, for each non-zero element in the DCT block, JPEG will code the coefficient value together with the number of zero coefficients preceding it, with the code taken from a Huffman table.
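
        For the curious, here is a toy sketch of that symbol formation for one block's AC coefficients, in plain Python, with the coefficients assumed to be already in zig-zag order (the DC term, the category/value bit packing, and the actual Huffman coding are left out, and EOB is always emitted):

          def ac_symbols(zigzag_coeffs):
              # zigzag_coeffs: the 63 quantised AC coefficients of one 8x8 block.
              symbols = []
              run = 0
              last = max((i for i, c in enumerate(zigzag_coeffs) if c), default=-1)
              for c in zigzag_coeffs[:last + 1]:
                  if c == 0:
                      run += 1
                      if run == 16:
                          symbols.append(((15, 0), None))   # ZRL: a run of 16 zeros
                          run = 0
                      continue
                  size = abs(c).bit_length()                # bits for the magnitude
                  symbols.append(((run, size), c))
                  run = 0
              symbols.append(((0, 0), None))                # EOB: rest of block is zero
              return symbols

          block = [5, -3, 0, 0, 2, 1] + [0] * 57            # made-up toy block
          print(ac_symbols(block))

        Each (run, size) pair is what gets looked up in the Huffman table; the coefficient's sign/value bits then follow that code verbatim.
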
  • by g00z ( 81380 ) on Wednesday January 12, 2005 @04:34AM (#11332749) Homepage
    they are even proposing a new image format; StuffIt Image Format (SIF).
    Gee, I wonder where you could license that format?

    Man, they could have been a little bit more covert about their intentions and named it something a little less, umm, obvious.

    The current formats might not be perfect, but at least they are (relatively) free.
  • Um (Score:2, Interesting)

    by Anonymous Coward
    Stuffit repackages its expensive compression software every year (it seems). Now I would be happy to admit my mistake, but their main area of expertise seems to be marketing. Not technology. I reckon JPEG2000 or any number of other newer-than-JPEG formats already exceed whatever Stuffit purports to have accomplished. I suspect they are trying to tie their name to JPEG as a marketing gimmick to win hearts and minds. I doubt this is worth a mention on Slashdot.

    We don't need any new standards unless the
    • Re:Um (Score:3, Insightful)

      by dosius ( 230542 )
      I would go further: We don't need any new standards unless they are free of patents and open for use in FOSS.

      Moll.
    • " I suspect a company that actually specializes in imaging might have come up with a better solution a while ago."

      Shhhh!...

      They have a lot of CCDs and Memory cards to push out of stock yet. :->
  • I'm thinking that for cameras, a high-performance compression processor for this new algorithm might be the solution to the camera issue. But the issue on PCs is more the processor time, not space.

    Anyone know if compression on a chip for the camera is a feasible idea, or am I just not awake yet?
  • DV Video? (Score:3, Interesting)

    by patniemeyer ( 444913 ) * <pat@pat.net> on Wednesday January 12, 2005 @04:40AM (#11332777) Homepage
    Would this technique apply to DV video?
  • by putaro ( 235078 ) on Wednesday January 12, 2005 @04:41AM (#11332779) Journal
    If you read the whitepaper you will see that their algorithm is patent pending. The patent will almost certainly be granted, and, since no one else has done additional jpg compression before, it may even be deserved.

    However, do we want to subject ourselves to a new tax on images? If they make it, we don't have to go! Just say NO to patented file formats!
    • Nothing wrong with this kind of patent (method).

      If they start to enforce the patent immediately and are reasonable about it (going for high volume instead of high price), why not?

      If, however, they intend to hide it in the closet and go "Ahaaaaah" ten years later, screw them.
    • since no one else has done additional jpg compression before,

      You're kidding, right? [google.com] Don't buy into the patent office's self-serving assumption that software ideas are hard and deserving of government intervention. With 6,500,000,000 people in the world and the low entry bar for software it is a statistical certainty that most software ideas will be thought of by more than one person with no one person deserving protection.

      ---

      Patents by definition restrict distribution and are incompatible with sta

  • by wcitechnologies ( 836709 ) on Wednesday January 12, 2005 @04:42AM (#11332790)
    I have a friend whose father is a professional photographer. He has gigabytes and gigabytes of images stored for his customers, should they want to order re-prints. They're thinking about setting up a terabyte RAID file server. I can certainly say that this is good news for them!
  • WTF? (Score:3, Funny)

    by m50d ( 797211 ) on Wednesday January 12, 2005 @04:50AM (#11332825) Homepage Journal
    An image compression comparison with no Lenna? What's the world coming to?
    • Buncha snot-nosed little whippersnappers, that's what. When I wrote my first image compression program, I had to push the chads out of my paper tape. I had to open a window to let the heat from the processor out. I had to... Look, without Lenna [cmu.edu], we can't be sure this is an image algorithm at all. Probably just some genetic code gotten out of hand. Where's my RAID, anyway? Oh. It's stuffed full of JPEGs. Including a beautiful hi-res one of Lenna, considerably more than her head. See above link. :)
  • Bah (Score:2, Interesting)

    I thought they had found a method to further compress a JPEG while still maintaining the original format, i.e. it could still be viewed with a regular JPEG viewer. That kind of optimization would've been great, especially if it could be used on webpages, forums, etc.

    But this is somewhat disappointing. The compression changes the format, and it must be decompressed to view it. Plus they don't intend to release the format, and their proposal for a new filetype which can be read by a "plugin" reeks of incomp
  • Patents? (Score:5, Insightful)

    by arvindn ( 542080 ) on Wednesday January 12, 2005 @04:59AM (#11332870) Homepage Journal
    If they've really achieved 25-30% over JPEG, and it looks like they have, then it's a truly amazing invention considering that JPEG has been around for so long. It would save at least about ten dollars' worth of space on every digital camera. If you look at the humongous image archives that NASA and other research projects generate and the cost of tape to store them all, we're talking tens of billions of dollars of savings here.

    So, a question to slashdotters: do you think this kind of invention deserves to be protected by a patent? The standard response "software is already protected by copyright, patents are unnecessary" doesn't work, because anyone can study the code (even the binary will do), describe the algorithm to someone else, who can then reimplement it. Standard cleanroom process; takes only a couple of days for a competent team.

    If you're RMS, you probably believe that no one has the right to own anything and all inventions and ideas belong to the public, but the majority of us will agree that that's a tad extreme. So whaddya'll think? Myself, I'm totally undecided.
    • Depends what the patent is for. All they've done is strip off the existing lossless compression and applied a different compression algorithm. If that's what the patent is for, HELL NO. If it is for their particular algorithm, whatever.
    • Re:Patents? (Score:3, Insightful)

      by Vo0k ( 760020 )
      If they've really achieved 25-30% over JPEG, and it looks like they have, then it's a truly amazing invention considering that JPEG has been around for so long. ...because JPEG is free for all. Look up JPEG 2000, DjVu and several other revolutionary "JPEG killers" that would rule the net nowadays if only the authors had been insightful enough to release them as open standards. Now they all rot forgotten as nobody uses them, because nobody is willing to pay for using formats nobody uses because nobody is willing to
    • Re:Patents? (Score:5, Informative)

      by bit01 ( 644603 ) on Wednesday January 12, 2005 @08:49AM (#11334075)

      If you're RMS, you probably believe that no one has the right to own anything and all inventions and ideas belong to the public,

      Nonsense, RMS has never [fsf.org] said that. Please read more widely before making such malicious accusations again. Don't buy into the M$ marketing smear and FUD campaign.

      ---

      It's wrong that an intellectual property creator should not be rewarded for their work.
      It's equally wrong that an IP creator should be rewarded too many times for the one piece of work, for exactly the same reasons.
      Reform IP law and stop the M$/RIAA abuse.

  • http://www.djvuzone.org/wid/

    It's been around for a long time and open sourced.

  • by tangent3 ( 449222 ) on Wednesday January 12, 2005 @05:10AM (#11332907)
    The quantised DCT coefficients of a JPEG image are compressed using a standard JPEG Huffman table. From what I've seen, this table is far from optimized even for "the average of the majority" of images out there.

    Ogg Vorbis stores its own Huffman table in its own stream. The default encoder uses a table optimized for the general audio you find out there. There is a utility called "rehuff" (google it yourself, please) that will calculate and build a Huffman table optimized for a particular stream, and it seems that on average it reduces an Ogg Vorbis file size by about 5-10%.

    Building an optimized Huffman table for individual JPEGs will probably yield similarly improved compression rates. If the original JPEG tables are less optimized than the Ogg Vorbis ones, the reduction will be even higher. But 30% seems a little... optimistic.
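
    To make "building an optimized table" concrete: a per-image table just assigns short codes to that image's most frequent (run, size) symbols. A minimal sketch in plain Python that computes Huffman code lengths from a frequency count (the frequency numbers below are made up for illustration):

      import heapq
      from collections import Counter

      def huffman_code_lengths(freqs):
          # Classic Huffman merge; returns {symbol: code length in bits}.
          heap = [(f, i, [sym]) for i, (sym, f) in enumerate(freqs.items())]
          heapq.heapify(heap)
          if len(heap) == 1:                       # degenerate case: one symbol
              return {next(iter(freqs)): 1}
          lengths = {sym: 0 for sym in freqs}
          tiebreak = len(heap)
          while len(heap) > 1:
              f1, _, syms1 = heapq.heappop(heap)
              f2, _, syms2 = heapq.heappop(heap)
              for s in syms1 + syms2:              # each merge adds one bit
                  lengths[s] += 1
              heapq.heappush(heap, (f1 + f2, tiebreak, syms1 + syms2))
              tiebreak += 1
          return lengths

      freqs = Counter({(0, 1): 900, (0, 2): 400, (1, 1): 300, (15, 0): 20})
      print(huffman_code_lengths(freqs))

    Feeding in the actual symbol counts of a given JPEG, instead of the distribution the standard table assumes, is exactly the per-image optimization being described; real JPEG codecs also cap code lengths at 16 bits, which this sketch skips.
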
  • Are they getting a patent on it?
  • Try these links:
    Stuffit [nyud.net]
    compress JPGs by roughly 30% [nyud.net]
    whitepaper (PDF) [nyud.net]
  • by gotan ( 60103 ) on Wednesday January 12, 2005 @05:12AM (#11332922) Homepage
    To be honest I don't care about the 30% compression if there's the slightest danger that anyone might start a patent war over the image format or the compression algorithm.

    I've really seen enough of that (GIF, MP3, JPEG) and I'd rather spend the additional storage/bandwidth capacity than have another uppity "IP shop" come out to "0wn0rz" the internet with lawyering (maybe after a management change).

    Let's have another look at that compression algorithm in 20 years or so.
  • by happyhippy ( 526970 ) on Wednesday January 12, 2005 @05:28AM (#11332990)
    so I am probably one of the few lucky enough to have delved into the mind-warping subject.

    I won't bother going into the details of how it works (go read 'Fractal Image Compression' by Yuval Fisher*), but I concluded that fractal compression wasn't viable as there wasn't a general solution to suit all images. There are about 20 variables you can set, which give varying results in the final compressed image.
    And one set of variables would be excellent at compressing pics of trees down 80%, another set would be excellent at compressing pics of animals down 80%, but using each other's set would only give results equal to or worse than normal compression algorithms.
    Another factor at the time, five years ago, was that it took an hour to compress a 256x256 greyscale image on a 300MHz machine. Nowadays that isn't a factor.

    * If anyone has this as an ebook please post a link here, I'm dying to read this again.

  • Okay, so as near as I can figure out, this is a white paper because:
    It describes a new format, .SIF, as basically having everything a JPEG does except with higher compression.

    Some might think it's a press release, since the "White Paper"'s discussion of the new format is largely limited to the benefits "OEMs" will have in using this (soon-to-be patented) technology, and explaining how it integrates into Allume's fine line of existing and future Digital Lifestyle(tm) products.
    Some might further argue tha
  • open patents (Score:3, Informative)

    by temponaut ( 848887 ) on Wednesday January 12, 2005 @06:32AM (#11333199) Homepage
    The JPEG standard specifies 2 entropy coding methods: Huffman coding and arithmetic coding. As arithmetic coding is patented, it is not in use. The patents for this arithmetic coding, called Q-coding http://www.research.ibm.com/journal/rd/426/mitchell.html [ibm.com] , are in the hands of IBM. Perhaps they will allow OSS to use this patent along with the 500 other patents recently pledged? http://www.ibm.com/ibm/licensing/patents/pledgedpatents.pdf [ibm.com] The particular variant of arithmetic coding specified by the JPEG standard is called Q-coding. This variant has the advantage of not requiring any multiplications in the inner loop. Q-coding was invented a few years ago at IBM, and IBM has obtained patents on the algorithm. Thus, in order to use the JPEG arithmetic coding process, you must obtain a license from IBM. It appears that AT&T and Mitsubishi may also hold relevant patents.
  • JPEG - get it right (Score:3, Informative)

    by northcat ( 827059 ) on Wednesday January 12, 2005 @07:00AM (#11333310) Journal
    It's JPEG, not JPG. JPG is the file extension used for JPEG files on DOS systems because of the 3-character file-extension limitation. JPEG is the name of the format/compression and the extension (which should be) used on systems that support longer file extensions. Because of the internet, the JPG extension has spread, and now people ignorantly use JPG even to refer to the format/compression.
  • Here's why it works (Score:5, Informative)

    by Richard Kirk ( 535523 ) on Wednesday January 12, 2005 @07:53AM (#11333628)
    These are the old JPEG images. I worked on DCT compression systems before JPEG, and had a tiny contribution to the freeware DCT code. When I saw the posting I immediately suspected that the JPEG compression had been pushed up too high.

    The original JPEG compression algorithm had Huffman coding for the DCT variables, but it also had some fixed-length codes for the beginning and end of blocks. If you set your compression at about 10x then you can hardly see the difference with real images. Bring it up to 15x and the changes are still modest. However, yank it much over about 22x, and the image will go to hell. The reason is that the block-handling codes mean a JPEG image with no data at all - a flat tint - would only compress at about 64x, so at 22x compression these block-handling codes are about 1/3 of your overall code. The fractional-bit wastage you get from using Huffman coding instead of arithmetic coding mops up some of the rest, as you are using shorter Huffman codes. The codes are also very regular, as about 1/3 of the code is not particularly random. The 1/3 figure also matches the 30% compression figures, which isn't surprising.

    Why didn't the original JPEG developers do a better job of this? Well, doing an experimental DCT compression used to take me hours or days when I was developing on a shared PDP-11, and there was always the worry that a dropped bit would lose your place in the code and scramble the rest of the image. A little regular overhead was also useful for things like progressive JPEG control. I guess we all knew it was not as tight as things could have been, but it got the job done. We knew that if you want 40x compression, reducing your image to half size and then compressing that by 10x will look better. Unfortunately, people who just drag a slider to get more compression don't always know that.

    The right solution would be to use JPEG 2000, which has a much smaller block overhead and so fails much more gracefully at higher compressions.
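
    The 1/3 figure can be reproduced from the numbers above, assuming roughly one byte of fixed block overhead per 8x8 block (which is what the ~64x flat-tint limit implies for 1-byte-per-pixel data):

      PIXELS_PER_BLOCK = 64                 # one 8x8 block, ~1 byte/pixel
      FLAT_TINT_RATIO = 64                  # an "empty" block still costs ~1 byte
      overhead = PIXELS_PER_BLOCK / FLAT_TINT_RATIO

      for ratio in (10, 15, 22):
          bytes_per_block = PIXELS_PER_BLOCK / ratio
          print(f"{ratio}x: ~{overhead / bytes_per_block:.0%} of the file is block overhead")

    At 22x this works out to roughly a third of the file, matching both the failure mode described above and the ~30% lossless gain being claimed.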

    • Somebody mod this up (Score:3, Interesting)

      by pclminion ( 145572 )
      I was going to post a similar explanation, but this person did it first, and probably better than I could have anyway.

      As another person who has implemented JPEG, I vouch for his accuracy :-)

  • just another humbug (Score:3, Interesting)

    by l3v1 ( 787564 ) on Wednesday January 12, 2005 @08:48AM (#11334062)
    You know what? The breakthrough in JPEG compression was mainly JPEG-2000.

    It was, and has long been, possible to achieve better compression ratios, and guess what, even in the high times of JPEG we knew that what's inside is not the optimal solution. But there were certain aspects which made it stay the way it was, and that was good. The same goes for JPEG-2000. Since its appearance there have been many attempts to make it better, and some good results have been achieved (I have read and sometimes even reviewed papers dealing with the subject).

    It's really no question whether one can make an X% better compression than JPEG with the same quality (especially today, when JPEG is so old that every Joe and his dog has had time to develop better ways). The question is: does it have enough practical usability to justify its presence? Is there a well-justified reason why we should use it? Does it deliver
    - better compression rates (smaller size with the same objective & subjective quality)?
    - lower compression times?
    - compatibility?
    - is it any better for hardware implementation purposes (same or lower computation times)?

    If it's just a "better" compression for the compression's own sake and not for our sakes, then this is even less news than me cleaning my room.

  • Super! (Score:4, Insightful)

    by samrolken ( 246301 ) <samrolkenNO@SPAMgmail.com> on Wednesday January 12, 2005 @10:53AM (#11335596)
    Just yesterday I was looking to download their "Expander" in order to view some files a Macintosh user was sending me (proofing a newspaper ad). By default, I suppose, it used the .sit format.

    After my experience there, can I expect to be led through a complicated and deceptive trick into downloading the trial version of some overpriced software, where I'm required to give up my email address, and the whole thing never works anyway? StuffIt might have great technology, but a company that provides a proprietary format for anything is only useful if *anyone* can open the format. Adobe knows that. And trying desperately to hook people who *have* to turn to you to uncompress things, and sell them things they most likely don't need, isn't useful. It will just make StuffIt (.sit files) useless to the people who *are* paying customers, because it's such a hassle for people to open the files.
