Losslessly Compressing My JPEG Photos with jpegoptim

2019-09-21
I’ve recently been running low on disk space on my laptop. I’ve freed some by removing files, but I’ve also been looking for ways to save space through compression.
My photo collection is currently 117GB. And that’s after removing the “everyone closed their eyes” shots and walrus memes!
Looks like a prime candidate for compression.
jpegoptim is an open source utility for losslessly optimizing JPEGs.
It’s from simpler times when names were obvious and websites didn’t need CSS.
It works its magic by optimizing the Huffman coding used to compress the image data.
JPEG encoders don’t always find the optimal coding for an image, prioritizing speed over perfection. Camera software especially opts for speed to keep the “shutter” available.
I wanted to verify jpegoptim would be safe to use on my photos. I first optimized a single photo:
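Something like this, with a hypothetical filename — jpegoptim rewrites the file in place by default, so it's worth keeping a copy of the original to compare against:

```shell
# Hypothetical filename. jpegoptim overwrites the file in place by default,
# so keep a copy of the original to compare against later.
cp IMG_2019.jpg IMG_2019.orig.jpg
jpegoptim IMG_2019.jpg
```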
I verified jpegoptim preserved the exact pixels with GraphicsMagick compare on the before and after images:
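A sketch of that comparison, assuming hypothetical before/after filenames — gm compare with a numeric metric reports the error between the two decoded images:

```shell
# Hypothetical filenames: the untouched original and the optimized copy.
# A mean squared error (MSE) of 0 means the decoded pixels are identical.
gm compare -metric mse original.jpg optimized.jpg
```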
Zero difference - they’re exactly the same!
The hashes of the bitmap files are the same, so they contain exactly the same data.
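For example (hypothetical filenames), decoding each JPEG to an uncompressed bitmap and hashing the results:

```shell
# Decode each JPEG to an uncompressed bitmap, then hash the bitmaps.
# Hypothetical filenames; identical hashes mean identical pixel data.
gm convert original.jpg before.bmp
gm convert optimized.jpg after.bmp
sha256sum before.bmp after.bmp
```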
jpegoptim is indeed lossless.
JPEGs also have Exif metadata. This contains tags such as the date/time the photo was actually taken, the GPS coordinates, and the camera settings.
I used exiftool and diff to check that jpegoptim preserves all this metadata:
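A sketch of that check, with hypothetical filenames — exiftool dumps every tag as text, which diff can then compare:

```shell
# Dump all metadata tags from each file and compare the dumps.
# Hypothetical filenames.
exiftool original.jpg > before-tags.txt
exiftool optimized.jpg > after-tags.txt
diff before-tags.txt after-tags.txt
```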
The only differences are in non-Exif fields that exiftool outputs, such as the file name and creation time. The other Exif data remains intact, including the time the photo was actually taken.
But How Much Savings?
When I ran jpegoptim above, its output ended with:
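An illustrative summary line in jpegoptim's format (the file sizes here are made up to match the percentage):

```
IMG_2019.jpg 4032x3024 24bit N Exif  [OK] 2000000 --> 1700000 bytes (15.00%), optimized.
```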
15% saving - not bad! (Or “good” in non-British English.)
This image might have been randomly more or less compressible though.
To get a more accurate figure, I ran jpegoptim on my entire “incoming imports” folder.
This contains the last 30 days of photos imported from my phone (my only camera).
I checked the disk usage of the folder before and after:
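Something like this, with a hypothetical folder path — du -s totals the directory and -h prints human-readable sizes:

```shell
# Hypothetical path for the "incoming imports" folder.
du -sh ~/Pictures/incoming          # before
jpegoptim ~/Pictures/incoming/*.jpg
du -sh ~/Pictures/incoming          # after
```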
So we went from 2.4GB to 2.0GB: 0.4 / 2.4 ≈ 0.16, a 16% saving. Across my current collection that would be 117 ⨉ 0.16 ≈ 18.7GB. Not bad indeed!
(N.B. The first and last images in the folder, starting with random letters rather than IMG_, are from WhatsApp. It seems they are already better compressed than those taken with my phone’s camera, so there were fewer savings to be had. The camera images compressed by up to 40%.)
(When jpegoptim reports a “negative” saving for an image, like the last one in my output, it means it could only find worse codings. In that case it leaves the original in place, rather than increasing the file size!)
I’ll be running jpegoptim on my whole collection in due course.
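jpegoptim takes a list of files rather than directories, so a find one-liner covers a whole tree (hypothetical path; --preserve keeps the files' modification times):

```shell
# Hypothetical photos path. find hands batches of JPEGs to jpegoptim;
# --preserve keeps each file's original modification time.
find ~/Photos \( -iname '*.jpg' -o -iname '*.jpeg' \) -exec jpegoptim --preserve {} +
```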
It will take a lot of internet bandwidth afterwards: my backup software, Arq, will have to back up the photos again, since they’re entirely new data.
I’ll also make it part of my import process so I don’t need to think about it again.
I hope this post has helped advertise a great tool to you.