I would like to know if anybody has any experience with image compression using Go.
I have to create multiple smaller versions of many images (mostly jpg, some png), which works really great using Go.
But then the customer found Google PageSpeed Insights, which tells him that the images could be smaller by X% with compression, and so on...
Any ideas how I could shrink file sizes with native Go? I'm not a fan of shelling out to jpegoptim via CLI calls, I couldn't find a Go port, and to be honest my math-fu isn't good enough to understand and implement the underlying algorithms myself.
Also, what do you think about stripping EXIF and similar unnecessary data that Photoshop/Lightroom/etc. add to files?
Comments:
kjell_k:
Hi.
image/jpeg.Encode takes a quality option; you can lower it a bit to reduce file size and still keep good quality.
https://golang.org/pkg/image/jpeg/#Encode
Sorry for brief reply. Just have my Phone here :)
joetsai:
The PNG package in the standard library uses compress/flate under the hood. The flate algorithm has been vastly improved over the years, but its compression ratio still falls slightly short when aiming for highly compressed static content.
There are several reasons:
- For performance reasons, compress/flate only performs matches of 4 bytes or longer, while the DEFLATE format actually allows 3-byte matches. This sometimes matters for images without an alpha channel (RGB is 3 bytes per pixel).
- Most PNG crushers use a specialized implementation of DEFLATE that heavily sacrifices compression speed for compression ratio (see https://en.wikipedia.org/wiki/Zopfli). That algorithm does not exist in compress/flate.
- (not specific to compress/flate) Some PNG crushers are actually lossy: they discard details that humans cannot perceive so that DEFLATE can benefit from more matches.
I can't really speak for JPEG and other lossy formats.
WikiTextBot:
Zopfli
Zopfli is data compression software that encodes data into DEFLATE, gzip and zlib formats. It achieves higher compression than other DEFLATE/zlib implementations, but takes much longer to perform the compression. It was first released in February 2013 by Google as a free software programming library under the Apache License, Version 2.0. The name Zöpfli is the Swiss German diminutive of "Zopf", an unsweetened type of Hefezopf.
SSoreil:You can try this package if you want to experiment with lossy PNG compression: https://github.com/peterhellberg/lossypng
For JPEG you can use the quality option in the standard library to reduce the file size (and the visual quality).
Let me know if you find any (Go) native implementations of JPEG optimizers.
On stripping EXIF: it's unlikely to be a significant amount of data. Files with a lot of EXIF data are probably straight out of the camera, and those aren't what you're publishing to begin with. Also, stripping EXIF for no reason might not always be what you want.
I don't know much about the rather arbitrary goal you're trying to reach here, but if you want to crush files you probably want to call out to a CLI tool. I wrote bindings for pngquant (or whatever it's called) a few years back to call it as a library, and it was very inconvenient. I don't think that code still exists.
image/jpeg and image/png are probably good enough.
