Implement webp image format
Expected behavior
Use the latest and greatest image format to improve speed. Gracefully degrade to larger images when needed.
Actual behavior
We use old, dated image formats.
Have tests been done that show webp will improve performance across the board?
I ask because, generally, that isn't true; like all formats, webp has its use cases.
https://caniuse.com/#feat=webp
The Google Chrome audit shows improvements with webp, but I haven't run tests myself. I think the biggest improvement will be on the homepage, where we have a lot of full screenshots with transparent backgrounds.
Was it yellow or red? Sadly, Google is trying to get a lot of people onto their format through tactics I don't agree with. For one site I was on, it reported just 20 kB saved across images for the conversion (yellow). Even on dial-up, 20 kB isn't the end of the world.
For example, there is low-hanging fruit with the jpg images that has yielded significant savings using the C utility jpegoptim (available in apt): jpegoptim --max=80 --threshold=1 --dest=prod/assets/imgs/. This caps jpeg quality at 80 and sets the minimum saving threshold to 1%.
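For reference, a minimal sketch of how that pass could be scripted; the prod/assets/imgs/ destination is taken from the command above, while the _images/ source path and the find pattern are assumptions about the repo layout:

```sh
# Write optimized copies of every source JPEG into prod/assets/imgs/ (originals untouched).
# --max=80 caps quality at 80; --threshold=1 keeps a result only if it saves at least 1%.
mkdir -p prod/assets/imgs
find _images -type f \( -name '*.jpg' -o -name '*.jpeg' \) \
  -exec jpegoptim --max=80 --threshold=1 --dest=prod/assets/imgs/ {} +
```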
The 229,448-byte _images/screenshots/[email protected] image became 141,018 bytes without using any exotic commands. That is from the current website (a saving of just shy of 40%).
It's really up to you. There are optimizers for all assets; I'd still retain the originals if I were you, but testing something less exotic might, IMO, be less effort all round.
pngquant --ext .opt.png --quality 80 --iebug [email protected] basically halved the PNG file size, from 222,503 bytes to 92,447 bytes. It's also available on Debian/Ubuntu.
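The PNG pass can be batched the same way; this is just a sketch with the flags above, and the _images/screenshots/ source path is a guess at where the screenshots live:

```sh
# Write an optimized copy of each PNG next to the original with an .opt.png suffix,
# so nothing is overwritten and the results can be compared by eye before committing.
find _images/screenshots -type f -name '*.png' \
  -exec pngquant --ext .opt.png --quality 80 --iebug {} +
```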
It looks like the JPEG for this camera file can be halved to under 300 kB.
The PNG more than halved, from 2.5 MB to 764.3 kB.
@Lewiscowles1986 The images in the _images folder are not what is served on the site. Running npm run build will create an images directory that shows the final images that would be served.
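If it helps to compare like for like, a quick way to look at the built output rather than the sources (assuming the _images and images directory names above):

```sh
# Build the site, then compare the total size of the source images with what would be served.
npm run build
du -sh _images images
```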
The first set of example images was downloaded from the final site that was live last night. The other images were taken directly from the master branch of the repo.
Chrome shows it as red (1546 kb).
Also, the images directly from the master branch are not the ones we serve. All of the images are put through imagemin during the build process, which in turn runs them through jpegtran and pngquant.
The JPEG for the camera screenshot is served at 122 KB.
The settings I've linked take the original appcenter.jpg image from the homepage from 93.1 kB (on the website) down to 56.1 kB (with the method I've posted here). They take the original camera.jpg image from 125.1 kB (live website) down to 68.6 kB (with the method above in these comments).
I'm unsure which page is 1546 kB (1.5 MB), what settings you're passing to jpegtran and pngquant, or whether you've considered late loading and other mechanisms. If you're saying you'd rather use webp than switch tooling, that's fine. I'm just providing an alternative that works in vanilla browsers and saves quite a large percentage for the @2x and regular jpg and png images being served.
Looking at the source code, it seems that the jpeg command ends up being jpegtran -copy none -progressive -optimize. There is no option to change the quality.
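Since jpegtran is lossless-only, a lossy pass would have to come from somewhere else. Purely as a sketch, and assuming the built images directory mentioned earlier is what gets deployed, it could be as simple as running jpegoptim over the build output:

```sh
# After `npm run build`, re-compress the built JPEGs in place with a quality ceiling of 80,
# keeping a file only if the lossy pass saves at least 1%.
find images -type f -name '*.jpg' -exec jpegoptim --max=80 --threshold=1 {} +
```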
Chrome shows a 1546 kb improvement on the homepage with webp. We have not looked into late loading or viewport loading yet.
Those numbers are not inconsistent with the numbers I'm seeing here. Total savings on the @2x images at 80% quality (very high visual parity) are 0.8 MB (1.1 MB, down from 1.9 MB), and the regular-size homepage images come to 511.2 kB, down from 1.2 MB, so about 1.2 MB of savings on the homepage's 3 MB is available without any additional formats. These were all applied to the files currently served from the web.
If you were to accept minor visual effects (I get it, you're targeting a high-gloss user base), then you could push that a bit further: ~1 MB saved on the @2x images and 651.3 kB saved on the regular images (down from 1.2 MB to 548.7 kB), for a total of 1.4 MB served from 3.0 MB downloaded (a 1.6 MB saving, which is greater than the webp figures quoted).
I realize this is a lot of information; the only change for jpegoptim is the max quality number, from 80 down to 70. Looking at the branches, I also see you have put a fair bit of effort into webp.
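If it's useful to see how far that one knob moves things, a throwaway comparison along these lines (paths hypothetical, commands reused from earlier in this thread) puts the two quality ceilings side by side:

```sh
# Copy the built images twice and compress each copy with a different quality ceiling,
# then print the resulting directory sizes for comparison.
for q in 80 70; do
  cp -r images "images-q$q"
  find "images-q$q" -type f -name '*.jpg' -exec jpegoptim --max="$q" --threshold=1 {} +
  du -sh "images-q$q"
done
```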
In addition to those changes, something I use on a media-heavy client site is slick.js. It can tell when images are visible: you add an SVG loading image for all images, and the full file is not downloaded while an image isn't visible. That can also improve PageSpeed scores and reduce bandwidth from bounced visitors, but it adds to the front-end JS, and users without JS get left behind, which is a change from the current behavior.
I started looking at this repo by accident: I was trying to find a place to report a bug with the software center locking up, and a few mis-clicks and a bit of curiosity later, here I am down the rabbit hole :wink:.