Graphics Software

Choosing Better-Quality JPEG Images With Software?

kpoole55 writes "I've been googling for an answer to a question and I'm not making much progress. The problem is image collections, and finding the better of near-duplicate images. There are many programs, free and costly, CLI or GUI oriented, for finding visually similar images — but I'm looking for a next step in the process. It's known that saving the same source image in JPEG format at different quality levels produces different images, the one at the lower quality having more JPEG artifacts. I've been trying to find a method to compare two visually similar JPEG images and select the one with the fewest JPEG artifacts (or the one with the most JPEG artifacts, either will serve.) I also suspect that this is going to be one of those 'Well, of course, how else would you do it? It's so simple.' moments."
  • Easy (Score:3, Interesting)

    by Anonymous Coward on Thursday July 16, 2009 @06:05PM (#28723511)

    Paste both images into your image editor of choice, one layer on top of the other, and apply a difference/subtraction filter.
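
    A rough sketch of the same idea in Python with Pillow and numpy, for anyone who'd rather script it than open an editor (the filenames are placeholders, and both images need the same dimensions):

    from PIL import Image, ImageChops   # Pillow
    import numpy as np

    a = Image.open("a.jpg").convert("RGB")
    b = Image.open("b.jpg").convert("RGB")

    # Equivalent of a "difference" layer blend: per-pixel absolute difference
    diff = ImageChops.difference(a, b)
    diff.save("diff.png")

    # One number instead of eyeballing the result
    print("mean absolute difference:", np.asarray(diff).mean())

    Note that this only tells you where the two images differ, not which one is closer to the original.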

  • File size (Score:2, Insightful)

    by Tanman ( 90298 )

    it is lossy compression, after all . . .

    • Re:File size (Score:5, Informative)

      by Robotbeat ( 461248 ) on Thursday July 16, 2009 @06:14PM (#28723651) Journal

      File size doesn't tell you everything about quality.

      For instance, if you save an image as a JPEG vs. first saving as a dithered GIF and _then_ saving as JPEG, then the second one will have much worse actual quality, even if it has the same filesize (it may well have worse quality AND have a larger file size).

      • Re: (Score:3, Interesting)

        by Vectronic ( 1221470 )

        Also, stuff like Photoshop will insert a bunch of meta/EXIF bullshit but something like Paint doesn't... it's usually only about 2 to 3 KB, but it's still tainting your results if you are going by size alone.

    • Re:File size (Score:4, Insightful)

      by teko_teko ( 653164 ) on Thursday July 16, 2009 @06:16PM (#28723667) Homepage

      File size may not be accurate if it has been converted multiple times at different quality, or if the source is actually lower quality.

      The only way to properly compare is if you have the original as the control.

      If you compare two JPEGs saved at different quality levels, the program won't know which parts are the artifacts. You still have to decide yourself...

      • I just thought of a possible way to compare...

        Assuming both JPEGs aren't at the lowest (or very low) quality:

        1. Take image A, create 10 or 20 more copies using different levels of quality (5, 10, 15, and so on).
        2. Compare each of them with image A, from lowest to highest quality.
        3. Stop where the diff no longer changes from the previous step; then we can assume image A is at that step's quality level.

        Do the same with image B.
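
        A rough Pillow/numpy sketch of this approach (the filenames, quality step, and convergence tolerance are arbitrary assumptions, and re-encoding won't reproduce the original encoder exactly, so treat the result as a rough estimate):

        from io import BytesIO
        from PIL import Image, ImageChops
        import numpy as np

        def estimate_quality(path, step=5, tolerance=0.01):
            """Re-save at increasing quality levels and note where the diff stops shrinking."""
            img = Image.open(path).convert("RGB")
            prev = None
            for q in range(step, 100, step):
                buf = BytesIO()
                img.save(buf, format="JPEG", quality=q)
                resaved = Image.open(buf).convert("RGB")
                diff = np.asarray(ImageChops.difference(img, resaved), dtype=float).mean()
                if prev is not None and abs(prev - diff) < tolerance:
                    return q - step  # diff stopped changing: assume roughly the source quality
                prev = diff
            return 95  # never converged; treat as a high-quality save

        print(estimate_quality("a.jpg"), estimate_quality("b.jpg"))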

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      File size doesn't tell you anything. If I take a picture with a bunch of noise (e.g. poor lighting) in it then it will not compress as well. If I take the same picture with perfect lighting it might be higher quality but a smaller file size.

      Why this is modded up, I don't know. Too many morons out there.

      • I sort of had the impression the person was talking about the exact same picture, saved from the original to two different qualities of JPEG. If he were trying to tell the difference between the amount of JPEG artifacts in two different pictures, I imagine he would get inconsistent results, given many trials, for the reasons you say.

        I suppose he could have meant something different than what he said, but there aren't too many politicians trolling slashdot, I'd guess.

      • But if they're duplicate pictures (some kind of matching heuristic), then file size most certainly IS appropriate. You're starting from the same point, so choosing the result that lost less during compression, and is therefore larger, would be quite logical.
      • by 4D6963 ( 933028 )

        Why is yours modded up higher, I wonder. The OP wants to "compare two visually similar JPEG images and select the one with the fewest JPEG artifacts". That means they're the same image. That means file size will help you there, unless they're not the same resolution, although it should help regardless.

        If I take a picture with a bunch of noise (eg. poor lighting) in it then it will not compress as well. If I take the same picture with perfect lighting it might be higher quality but smaller file size.

        That's

      • Re:File size (Score:5, Insightful)

        by timeOday ( 582209 ) on Thursday July 16, 2009 @10:11PM (#28725545)
        This is the kind of problem you can solve in 2 minutes with 95% accuracy (by using file size), or never finish at all by listening to all the pedants on slashdot. When people know a little too much they love to go on about stuff like entropy and information gain, just because they (sort of) can.

        Try file size on the set of images of interest to you and see if it coincides with your intuition. If it does, you're done.

    • Re:File size (Score:5, Informative)

      by Shikaku ( 1129753 ) on Thursday July 16, 2009 @06:29PM (#28723825)

      http://linux.maruhn.com/sec/jpegoptim.html [maruhn.com]

      No. You can compress a JPEG losslessly.

    • NO. Not file size. File size would be a potential test if all images were from the same original source and if they were only ever JPEG-compressed once. Unfortunately, quite often one will come across images that have been JPEG-compressed and re-compressed, and the final re-compression was done at "high quality", so the file is large for the image, but it still contains all of the JPEG artifacts from the lower-quality compression. You can also see extra artifacts when one file has only been compressed once

  • But what if you saved both images in an uncompressed format (bmp?), then compressed them both using a lossless format (gzip?), and compared the file sizes...

    Do it with a bunch of images, and I expect you'll discover that the low-quality-gzipped image will be smaller than the high-quality-gzipped image...

    Maybe? *shrug*
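
    If you want to try it, here's a minimal sketch with Pillow and Python's gzip module (filenames are placeholders; as the replies below point out, it's not obvious which direction "wins"):

    import gzip
    import numpy as np
    from PIL import Image

    def gzipped_raw_size(path):
        """Decode to raw RGB bytes and gzip them; return the compressed size."""
        raw = np.asarray(Image.open(path).convert("RGB")).tobytes()
        return len(gzip.compress(raw))

    for f in ("a.jpg", "b.jpg"):   # placeholder filenames
        print(f, gzipped_raw_size(f))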

    • I agree that this would probably be the simplest method. That said, I wonder if something as simple as examining the file size of the JPEG would be good enough for most cases.
    • Good idea. I'm also not an expert, though I would think there is a limit to how well this would work. If it were cel-shaded to some extent, it might look better than a lossy JPEG but compress to a smaller size. The question is whether there would be any point in between where loss of information would actually result in better image quality.

      Imagine a chess board is in the image. If an image is sort of lossy, the lines between the black and white might get a little blurred with some black running into some whit

    • by sznupi ( 719324 )

      I'm also not an expert, but I suspect it might work in the other direction far too often.

      Perhaps the artifacts of low-quality JPEG images, embedded in a simple BMP stream, could look more like noise to a general-purpose compressor than "natural" photographs with gradual gradients do.

      And random noise is incompressible.

  • Have you tried just comparing the files' sizes with respect to the images' dimensions? It'll vary from encoder to encoder, but higher-quality JPEGs will be larger than lower-quality ones. You could just use the number of pixels in the picture and the file size to obtain a rough approximation of "quality per pixel" and choose the image with the higher value. It won't be perfect, but it's a lot easier than trying to pick out JPEG artifacts.

    Also, the number of artifacts doesn't tell the full story. One imag
    • by PCM2 ( 4486 )

      And BTW, isn't this what most of us do already when we're searching Google Images?

  • Given a set of pictures, it would be really nice to see them grouped by "these are several pictures of the same scene/object/subject". This is a tool I'm not aware of yet, and I'd love to hear what open-source tools people are using.

    As a next step, it would be neat to pick out the one that's most in focus...

    • by Chabo ( 880571 )

      I saw a piece of software that does something similar to what you're talking about: recently I watched James May's Big Ideas [wikipedia.org], where they showed a camera that you wear around to create a lifelog [wikipedia.org].

      The camera took photos every 30 seconds or so, and the software was able to divide sets of photos into "events"; it distinguished between the time the wearer was in the kitchen making breakfast, and when they sat at their computer typing up an article, for instance. I imagine that someone's created similar software for pub

    • Check out Tineye - http://tineye.com/faq [tineye.com]

      It does not do exactly what the above post suggests, but it partially does what the submitter asked for (finding similar images on the net).
  • I suppose you could recompress both images as JPEG with various quality settings, then do a pixel-by-pixel comparison computing a difference measure between each of the two source images and its recompressed version. Presumably, the one with more JPEG artefacts to start with will be more similar to its compressed version, at a certain key level of compression. This relies on your compression program generating the same kind of artefacts as the one used to make the images, but I suppose that cjpeg with the

  • or else the problem is not truly resolvable. The other way is to
    assume all the similar images come from the same source; if so, then
    it's as simple as looking at the compression level in the file format
    and the various levels of scaling applied to the lossy images.

  • by bcrowell ( 177657 ) on Thursday July 16, 2009 @06:12PM (#28723623) Homepage

    The ImageMagick package includes a command called identify, which can read the EXIF data in the JPEG file. You can use it like this:

    identify -verbose creek.jpg | grep Quality

    In my example, it gave " Quality: 94".

    This will not work on very old cameras (from ca. 2002 or earlier?), because they don't have EXIF data. This is different info than you'd get by just comparing file sizes. The JPEG quality setting is not the only factor that can influence file size. File size can depend on resolution, JPEG quality, and other manipulations such as blurring or sharpening, adjusting brightness levels, etc.
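
    If you have a whole directory to check, something along these lines should work (assuming ImageMagick's identify is on the PATH; %Q is its quality estimate, the same number grep pulls out above, and the filenames are placeholders):

    import subprocess

    def im_quality(path):
        """Ask ImageMagick's identify for its Quality estimate (the %Q format escape)."""
        out = subprocess.run(["identify", "-format", "%Q", path],
                             capture_output=True, text=True, check=True)
        return int(out.stdout.strip())

    for f in ("creek.jpg", "creek-copy.jpg"):
        print(f, im_quality(f))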

    • Re: (Score:3, Informative)

      ImageMagick can also compare two images and tell you how different they are -- that is, quantify the differences by returning a floating-point number or two (PSNR, RMSE), such that a more heavily compressed JPEG image returns a correspondingly different value. I know the question concerns two JPEG-compressed images, but if you do have an original image -- and you want to test which is closest to the original -- ImageMagick can do that. Use the ImageMagick compare function.
      See http://www.imag [imagemagick.org]
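
      A sketch of scripting that comparison, assuming you do have a reference image (compare prints the metric on stderr and exits non-zero when the images differ, so don't treat that as an error; filenames are placeholders):

      import subprocess

      def psnr(reference, candidate):
          """ImageMagick compare: 'null:' discards the difference image."""
          proc = subprocess.run(["compare", "-metric", "PSNR", reference, candidate, "null:"],
                                capture_output=True, text=True)
          return float(proc.stderr.split()[0])

      # Higher PSNR = closer to the reference
      print(psnr("original.png", "a.jpg"))
      print(psnr("original.png", "b.jpg"))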
    • It appears you're assuming he wants to compare images for which he himself is the source? What if the images he actually wants to compare are pr0n, of the same hi-res glamour photo sets obtained from different sources? He needs to decide which is the "best" pr0n to keep, right? (Never mind that he can probably jack off equally well to either/any... he's a COLLECTOR so it matters. :-)

      Such images will almost always have the EXIF data scrubbed from them, so your technique wouldn't work at all for rac

  • by Chyeld ( 713439 ) <chyeld.gmail@com> on Thursday July 16, 2009 @06:13PM (#28723627)

    Dear Slashdot,

    Recently I checked my porn drive and realized that I have over 50 gigabytes of JPEG porn collected. Unfortunately, I've noticed that a good portion of these are all the same picture of Natalie Portman eating hot grits. Could you please point me to a free program that will allow me to find the highest resolution, best quality version of this picture from my collection and delete the rest?

    Many Thanks!

  • all things being equal, jpeg quality gives a good index of the quality of the image,
    but it can be just as true that a lower jpeg-quality image might be a better quality image.

    for example, two images: the first image might be scanned off a badly faded
    colour photocopy of a famous painting - it is saved at 300 dpi - approximately
    2800 x 1200 pixels, and the jpeg quality set at 12 -- the second image is a
    well-lit photograph of the original painting, scanned on a Scitex scanner,
    and brought in as a tiff original -- a

  • To make a JPEG, you cut the image into blocks, run the DCT [wikipedia.org] on each block, mess with the 4:2:2 color formula, and pkzip the pieces... That said, I would think measuring the number of blocks would be related to the number of artifacts... In my barbaric approach to engineering (assuming there is no other suggested way on slashdot), I would get the source code to the JPEG encoder/decoder and print out statistics (number of blocks, block size) of each image...
  • It's easy (Score:5, Insightful)

    by Anonymous Coward on Thursday July 16, 2009 @06:16PM (#28723673)

    Run the DCT and check how much it's been quantized. The higher the greatest common factor, the more it has been compressed.

    Alternatively, check the raw data file size.

  • by angryargus ( 559948 ) on Thursday July 16, 2009 @06:17PM (#28723683)

    Others have mentioned file size, but another good approach is to look at the quantization tables in the image as an overall quality factor. E.g., JPEG over RTP (RFC 2435) uses a quantization factor to represent the actual tables, and the value of 'Q' generally maps to quality of the image. Wikipedia's doc on JPEG has a less technical discussion of the topic, although the Q it uses is probably different from the example RFC.
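
    Pillow happens to expose those tables on JPEG files via the quantization attribute, so a crude "smaller table values = higher quality" comparison can be scripted (filenames are placeholders; this says nothing about generation loss that happened before the final save):

    from PIL import Image

    def quant_table_sum(path):
        """Sum of all quantization table entries; smaller generally means higher quality."""
        tables = Image.open(path).quantization   # only present for JPEG images
        return sum(sum(t) for t in tables.values())

    for f in ("a.jpg", "b.jpg"):
        print(f, quant_table_sum(f))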

  • Measure sharpness? (Score:4, Interesting)

    by Anonymous Coward on Thursday July 16, 2009 @06:18PM (#28723693)

    Compute the root-mean-square difference between the original image and a gaussian-blurred version?
    JPEG tends to soften details and reduce areas of sharp contrast, so the sharper result will probably
    be better quality. This is similar to the PSNR metric for image quality.

    Bonus: very fast, and can be done by convolution, which optimizes very efficiently.
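
    A minimal sketch of that metric with Pillow and numpy (the blur radius and filenames are arbitrary; as the replies note, blocky artifacts can also register as "sharpness"):

    import numpy as np
    from PIL import Image, ImageFilter

    def sharpness(path, radius=2):
        """RMS difference between an image and a Gaussian-blurred copy of itself."""
        img = Image.open(path).convert("L")
        blurred = img.filter(ImageFilter.GaussianBlur(radius))
        a = np.asarray(img, dtype=float)
        b = np.asarray(blurred, dtype=float)
        return float(np.sqrt(np.mean((a - b) ** 2)))

    # Higher value = more fine detail survived (probably)
    print(sharpness("a.jpg"), sharpness("b.jpg"))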

    • by PCM2 ( 4486 )

      But this method requires a copy of the original -- or failing that, you'd need to already know which of the JPEGs is the highest quality, which defeats the purpose.

      • by uhmmmm ( 512629 )

        No it doesn't. This method has another problem (see my replies to it), but other than that, it could work. He's suggesting that, for each copy of the image, you look at the difference between that copy and a blurred version of it. This will give you an idea of how sharp that copy is. And since JPEG throws out high frequency information first, resulting in blurring, it would appear at first glance that the sharper image should be the higher quality one.

        As I said in another comment though, JPEG operates on

    • by uhmmmm ( 512629 ) <uhmmmm@gmCOUGARail.com minus cat> on Thursday July 16, 2009 @06:52PM (#28724109) Homepage

      Even faster is look at the DCT coefficients in the file itself. Doesn't even require decoding - JPEG compression works by quantizing the coefficients more heavily for higher compression rates, and particularly for the high frequency coefficients. If more high frequency coefficients are zero, it's been quantized more heavily, and is lower quality.

      Now, it's not foolproof. If one copy went through some intermediate processing (color dithering or something) before the final JPEG version was saved, it may have lost quality in places not accounted for by this method. Comparing the quality of two differently sized images isn't as straightforward either.

    • by uhmmmm ( 512629 )

      Also, JPEG works on blocks. While it's true that JPEG gets rid of high frequency details first (and thus results in blurring), this is only useful within each block. You can have high contrast areas at the edge of each block, and this is actually often some of the most annoying artifacting in images compressed at very low quality. So just because it has sharp edges doesn't mean it's high quality.

  • DCT (Score:5, Informative)

    by tomz16 ( 992375 ) on Thursday July 16, 2009 @06:18PM (#28723695)

    Just look at the manner in which JPEGs are encoded for your answer!

    Take the DCT (discrete cosine transform) of blocks of pixels throughout the image. Examine the frequency content of each of these blocks and determine the amount of spatial frequency suppression. This will correlate with the quality factor used during compression!

    • Re: (Score:3, Insightful)

      by mikenap ( 1258526 )
      This seems to me the best suggestion, and there's a simple visual way to accomplish it! The hardest-hit part of the image is going to be the chroma information, which your eye normally has reduced resolution sensitivity for in a normal scene. To overcome this, load your JPEGs into your favorite image editor and crank the saturation to the max (this throws away the luminance data). Now the JPEG artifacts in the chroma information will HIT YOU IN THE FACE, even in images that seemed rather clean before. Pick
    • Re:DCT (Score:4, Insightful)

      by eggnoglatte ( 1047660 ) on Thursday July 16, 2009 @10:12PM (#28725553)

      That works, but only if you have exact, pixel-to-pixel correspondence between the photos. It won't work if you just grab 2 photos from Flickr that both show the Eiffel Tower, and you wonder which one is "better".

      Luckily, there is a simple way to do it: use jpegtran to extract the quantization table from each image. Pick the one with the smaller values. This can easily be scripted.

      Caveat: this will not work if the images have been decoded and re-coded multiple times.

  • by Anonymous Coward on Thursday July 16, 2009 @06:20PM (#28723719)

    load up both images in Adobe After Effects or some other image compositing program and apply a "difference matte"

    Any differences in pixel values between the two images will show up as black on a white background or vice versa...

    adam
    BOXXlabs

    • Re: (Score:3, Insightful)

      by uhmmmm ( 512629 )

      So, that will show you which parts differ. How do you tell which is higher quality? Sure, you can probably do it by eye. But it sounds like the poster wants a fully automated method.

  • Try ThumbsPlus (Score:3, Informative)

    by Anonymous Coward on Thursday July 16, 2009 @06:21PM (#28723729)

    ThumbsPlus is an image management tool. It has a feature called "find similar" that should do what you want as far as identifying two pictures that are the same except for the compression level. Once the similar picture is found you can use ThumbsPlus to look at the file sizes and see which one is bigger.

  • Compute the number of bits per pixel of the image data.
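
    That's more or less a one-liner per file (a sketch with Pillow; filenames are placeholders):

    import os
    from PIL import Image

    def bits_per_pixel(path):
        """Compressed bits spent per pixel of the decoded image."""
        w, h = Image.open(path).size
        return os.path.getsize(path) * 8 / (w * h)

    for f in ("a.jpg", "b.jpg"):
        print(f, round(bits_per_pixel(f), 3))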
  • by trb ( 8509 ) on Thursday July 16, 2009 @06:28PM (#28723813)
    google (or scholar-google) for Hosaka plots, or image quality measures. Ref:

    HOSAKA K., A new picture quality evaluation method.
    Proc. International Picture Coding Symposium, Tokyo, Japan, 1986, 17-18.

  • Blur Detection? (Score:2, Informative)

    by HashDefine ( 590370 )

    I wonder if out-of-focus or blur detection methods will give you a metric that varies with the level of JPEG artifacts; after all, the JPEG artifacts should make it more difficult to do things like edge detection, etc., which are the same things that are made more difficult by blurry and out-of-focus images.

    A Google search for blur detection should bring up things that you can try. Here [kerrywong.com] is a series of posts that do a good job of explaining some of the work involved.

  • Assuming the only quality loss is due to JPEG compression, I guess a Fourier transform should give you a hint: I think the worse-quality image should have lower amplitude at high frequencies.

    Of course, that criterion may be misleading if the image was otherwise modified. For example, noise filters will typically reduce high frequencies as well, but you'd generally consider the result superior (otherwise you wouldn't have applied the filter).
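
    One way to put a number on that, sketched with numpy's FFT (the 25% cutoff for "low frequency" and the filenames are arbitrary assumptions, and as noted above, noise filtering or resizing will skew it):

    import numpy as np
    from PIL import Image

    def highfreq_share(path, cutoff=0.25):
        """Fraction of spectral energy outside a low-frequency box around DC."""
        img = np.asarray(Image.open(path).convert("L"), dtype=float)
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
        h, w = spectrum.shape
        cy, cx = h // 2, w // 2
        ry, rx = max(1, int(h * cutoff / 2)), max(1, int(w * cutoff / 2))
        low = spectrum[cy - ry:cy + ry, cx - rx:cx + rx].sum()
        return 1.0 - low / spectrum.sum()

    print(highfreq_share("a.jpg"), highfreq_share("b.jpg"))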

  • Filters (Score:5, Funny)

    by mypalmike ( 454265 ) on Thursday July 16, 2009 @06:47PM (#28724045) Homepage

    First, make a bumpmap of each image. Then, render them onto quads with a light at a 45 degree angle to the surface normal. Run a gaussian blur on each resulting image. Then run a quantize filter, followed by lens flare, solarize, and edge-detect. At this point, the answer will be clear: both images look horrible.

  • by yet-another-lobbyist ( 1276848 ) on Thursday July 16, 2009 @06:55PM (#28724165)
    For what it's worth: I remember using Paint Shop Pro 9 a few years ago. It has a function called "Removal of JPEG artifacts" (or similar). I remember being surprised how well it worked. I also remember that PSP has quite good functionality for batch processing. So what you could do is use the "remove artifact" function and look at the difference before/after this function. The image with the bigger difference has to be the one of lower quality.
    I am not sure if there is a tool that automatically calculates the difference between two images, but this is a task simple enough to be coded in a few lines (given the right libraries are at hand). For each color channel (RGB) of each pixel, you basically just calculate the square of the difference between the two images. Then you add all these numbers up (all pixels, all color channels). The bigger this number is, the bigger the difference between the images.
    Maybe not your push-one-button solution, but should be doable. Just my $0.02.
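
    The "few lines" might look something like this with numpy (assuming both files have identical dimensions; the _cleaned filenames stand in for whatever your artifact-removal step produced):

    import numpy as np
    from PIL import Image

    def squared_difference(path_a, path_b):
        """Sum over all pixels and RGB channels of the squared difference."""
        a = np.asarray(Image.open(path_a).convert("RGB"), dtype=float)
        b = np.asarray(Image.open(path_b).convert("RGB"), dtype=float)
        return float(np.sum((a - b) ** 2))

    print(squared_difference("a.jpg", "a_cleaned.jpg"))
    print(squared_difference("b.jpg", "b_cleaned.jpg"))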
  • compare both images against the original, not each other.
    count number of pixels different from the original, then calculate max and average difference between either image and the original.

    decide which parameter means more to you.

    go forward from there.

    • adding to that, you can run the following algorithm on the diff images:
      1. blur the image by an arbitrary value.
      2. darken the image by an arbitrary value.
      3. repeat until the image is all black.

      count the number of repetitions. given various values for steps one and two, you can tune the algorithm to find images that have large areas of mismatch.

      possibly not useful to you, but I've found it good for validation testing for image manipulation software.
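
      A sketch of that loop with Pillow (the blur radius, darkening step, and filenames are arbitrary assumptions; darkening here is a fixed subtraction so the loop is guaranteed to finish):

      import numpy as np
      from PIL import Image, ImageChops, ImageFilter

      def fade_steps(diff, blur_radius=1, darken_by=8, max_steps=200):
          """Blur and darken the diff image until it is all black; return the step count."""
          img = diff.convert("L")
          for step in range(max_steps):
              if np.asarray(img).max() == 0:
                  return step
              img = img.filter(ImageFilter.GaussianBlur(blur_radius))
              img = img.point(lambda p: max(p - darken_by, 0))
          return max_steps

      # diff of a candidate against the original, as the parent comment describes
      diff = ImageChops.difference(Image.open("a.jpg").convert("L"),
                                   Image.open("original.png").convert("L"))
      print(fade_steps(diff))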

  • How about audio? (Score:2, Interesting)

    I would very much like to do the same with audio. I have so many duplicate tracks in my music collection in different formats and bitrates.
    • If you're running a Mac and have all your files in an iTunes library, then Dupin [dougscripts.com] is extremely useful. It matches on name, size, length, bit rate, or all at once.

      It's pretty useful, and the freeware version lets you delete from the drive as well as the library.

      If you're on windows, I searched for years and couldn't find anything :(

  • by uhmmmm ( 512629 ) <uhmmmm@gmCOUGARail.com minus cat> on Thursday July 16, 2009 @07:09PM (#28724301) Homepage

    JPEG works by breaking the image into 8x8 blocks and doing a two dimensional discrete cosine transform on each of the color planes for each block. At this point, no information is lost (except possibly by some slight inaccuracies converting from RGB to YUV as is used in JPEG). The step where the artifacts are introduced is in quantizing the coefficients. High frequency coefficients are considered less important and are quantized more than low frequency coefficients. The level of quantization is raised across the board to increase the level of compression.

    Now, how is this useful? The reason heavily quantizing results in higher compression is because the coefficients get smaller. In fact, many become zero, which is particularly good for compression - and the high frequency coefficients in particular tend towards zero. So partially decode the images and look at the DCT coefficients. The image with more high frequency coefficients which are zero is likely the lower quality one.
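
    Actually parsing the coefficients out of the file needs a JPEG bitstream reader, but a rough stand-in is to recompute the 8x8 DCTs from the decoded pixels and count near-zero high-frequency coefficients (the 0.5 threshold, the choice of which quadrant counts as "high frequency", and the filenames are all assumptions):

    import numpy as np
    from PIL import Image
    from scipy.fftpack import dct

    def zero_highfreq_fraction(path, thresh=0.5):
        """Fraction of high-frequency DCT coefficients that are (near) zero."""
        img = np.asarray(Image.open(path).convert("L"), dtype=float) - 128.0
        h, w = img.shape
        h, w = h - h % 8, w - w % 8          # ignore partial edge blocks
        highfreq = np.ones((8, 8), dtype=bool)
        highfreq[:4, :4] = False             # treat the low-frequency quadrant separately
        zeros = total = 0
        for y in range(0, h, 8):
            for x in range(0, w, 8):
                block = img[y:y+8, x:x+8]
                coeffs = dct(dct(block.T, norm='ortho').T, norm='ortho')
                hf = np.abs(coeffs)[highfreq]
                zeros += int(np.sum(hf < thresh))
                total += hf.size
        return zeros / total                 # higher = heavier quantization (roughly)

    print(zero_highfreq_fraction("a.jpg"), zero_highfreq_fraction("b.jpg"))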

  • Something like $\frac{1}{N} \sum_{i=1}^{N}(x_i-y_i)^2$, where $x$ and $y$ are arrays of pixels, and $N$ is the number of pixels in each array?

  • Does the JPEG header have the compression method listed as well as the compression ratio? If not, is there any way to figure out what kind of compression engine was used based on how an image is constructed?

    If so, simply do some testing against some of the most popular compression engines based on the artifacts to determine what engine was used, then find out their compression ratio (perhaps a simple file size might work?). Then simply pick the image with the best quality based on engine used and ratio?
  • Compute the variance of the Fourier coefficients within each block and then calculate the average for each image. The better quality image should have lower variance. If a block has a lot of edges, then the higher frequency coefficients should have much higher values than the lower ones. If a block is uniform, then the lower frequency coefficients should have higher values. So if you have a good image, it will be easy to see the difference between uniform parts and edges. That is the coefficients of th
  • find dupes on the internet http://tineye.com/ [tineye.com]
    find dupes on your HDD http://www.bigbangenterprises.de/en/doublekiller/ [bigbangenterprises.de]

  • JPEG is pretty efficient at compressing images -- the only way they get smaller on average is by increasing the quality loss. Therefore, the larger of the two images in bytes is probably the better looking copy.

  • Well, your problem is that image quality is subjective. Can computers make good subjective judgements? Not really.

    Let's say you count the number of pixels that are different? Well, what if JPEG usually slightly alters the brightness? You could weight the difference, but what if JPEG sometimes moves an edge by a pixel?

    I think if you study a bit about how JPEG works, you might find that you can computationally determine how much information that is lost; but that does not mean that your computed number in

  • Expert's answer (Score:2, Interesting)

    by mezis ( 595240 )
    Exploit JPEG's weakness.

    JPEG encodes pixels by using a cosine transform on 8x8 pixel blocks. The most perceptually visible artifacts (and the artifacts most susceptible to causing trouble for machine vision algorithms) appear on block boundaries.

    Short answer:
    a. 2D-FFT your image
    b. Use the value of the 8-pixel period response in the X and Y directions as your quality metric. The higher, the worse the quality.

    This is a crude first approximation, but it works.
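
    A minimal numpy sketch of that metric (filenames are placeholders; the normalization is arbitrary, and images that were cropped or rescaled after compression will shift the block period away from 8 pixels):

    import numpy as np
    from PIL import Image

    def blockiness(path):
        """Magnitude of the spectral component with an 8-pixel period in X and Y."""
        img = np.asarray(Image.open(path).convert("L"), dtype=float)
        h, w = img.shape
        spectrum = np.abs(np.fft.fft2(img))
        return float(spectrum[h // 8, 0] + spectrum[0, w // 8]) / (h * w)

    # Higher value suggests more visible block boundaries
    print(blockiness("a.jpg"), blockiness("b.jpg"))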
  • Aside from the mathematical tests some have suggested, my gut tells me this is going to be almost impossible. There are tasks that a human can perform that just aren't doable given the present state of our software systems. The gap has as much to do with our understanding about how we perceive through our senses as it does with algorithms and calculation methodologies. We just don't know yet enough about the underlying processes to make a computer do it.

    The same goes for other areas where AI is sorely la

  • by rwa2 ( 4391 ) * on Friday July 17, 2009 @11:43AM (#28730659) Homepage Journal

    You probably don't necessarily want to find the "best quality" image, but rather the image that was closest to the original.

    I take it you're either trying to eliminate the low-quality duplicates or thumbnails from a really large collection of pr0n, or trying to write an image search engine that tries to present the "best" rendition of a particular image first.

    1. As a quick first pass (after you've run through to collect all the similar images into separate groups), you'd obviously want to find the version of the image with the highest resolution. This might let you easily throw out thumbnails or scaled down versions you might come across. Of course, some dorks will upscale images and post them somewhere, so you might still want to hang on to some of them for the second stage.

    2. For the second pass, you'd likely want to scan through the metadata first, especially stuff exposed by EXIF. So you'd want to give higher scores to EXIF data that makes it sound like it came directly off a digital camera or scanner, and bump down the desirability of pictures that appeared to have been edited by any sort of photo editing software.

    3. Then maybe you want to look at something that would rank down watermarks or other modifications.

    4. Another step would be to compare compression quality, but I think that's what most of the other posts are concentrating on. But this is a difficult step because it can be easily fooled, since idiots can re-save a low quality image with the compression quality cranked all the way up so the file size becomes high even though the actual image quality is worse than the original. You probably need to run it through one of those "photoshop detectors" that could tell you whether the image has been through smoothing or other filters in a photo editor. The originals (especially in raw format and maybe high quality JPEG) will have a certain type of CCD noise signature that your software might be able to detect. In the same vein, a poorly-compressed JPEG will have lots of JPEG quantization artifacts that your software might be able to detect as well. Otherwise, you're kinda left with zooming in on pics and eyeballing it.

    5. Finally you might be left with a group of images that are exactly the same but have different file names... you probably want some way to store some of the more useful bits of descriptive text as search/tag metadata, but then choose the most consistent file naming convention or slap on your own based on your own metadata.

    Hopefully this gives you a start to important parts of the process that you might have overlooked...
