
Compressing asset textures and a possible asset pipeline optimization?

Open riccardogiorato opened this issue 4 years ago • 4 comments

What I've done

I did a rough compression pass on some of the biggest assets: unpacking the .glb files to .gltf, extracting the textures, compressing them, and then repacking everything back to .glb. I'm running these tests in my fork: https://github.com/Giorat/webxr-input-profiles

I used:

All of this was done with a simple .bat file, since I'm on Windows, but it should map directly onto a Linux/macOS shell script.
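Roughly, the round trip looks like the sketch below — a minimal Node version assuming gltf-pipeline for the unpack/repack and the official tinify client for the tinypng pass (not necessarily the exact tools from my tests, and the file names are made up):

```js
const fsExtra = require('fs-extra');
const path = require('path');
const { glbToGltf, processGltf, gltfToGlb } = require('gltf-pipeline');
const tinify = require('tinify');

tinify.key = process.env.TINYPNG_API_KEY; // hypothetical env var

async function roundTrip(inputGlb, workDir, outputGlb) {
  // 1. Unpack the .glb into JSON glTF, then split the textures out
  //    into separate image files.
  const glb = fsExtra.readFileSync(inputGlb);
  const unpacked = await glbToGltf(glb);
  const { gltf, separateResources } = await processGltf(unpacked.gltf, {
    separateTextures: true,
  });

  // 2. Write each extracted texture to disk and compress it in place.
  for (const relativePath of Object.keys(separateResources)) {
    const file = path.join(workDir, relativePath);
    fsExtra.outputFileSync(file, separateResources[relativePath]);
    if (/\.(png|jpe?g)$/i.test(file)) {
      await tinify.fromFile(file).toFile(file); // lossy tinypng pass
    }
  }

  // 3. Repack the glTF, re-embedding the now-smaller textures into a .glb.
  const repacked = await gltfToGlb(gltf, { resourceDirectory: workDir });
  fsExtra.writeFileSync(outputGlb, repacked.glb);
}

roundTrip('some-profile.glb', './work', 'some-profile.min.glb').catch(console.error);
```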

Idea for an optimization pipeline

If it's useful, I could also add, in whatever folder you prefer, a more complete Node script to automatically compress new models for future assets. This could also make it easy to automatically produce models with Draco compression or Basis textures later on; a rough sketch follows below.
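As a starting point, here is a minimal sketch of what that script could look like, assuming gltf-pipeline's processGlb and made-up folder names:

```js
const fsExtra = require('fs-extra');
const path = require('path');
const { processGlb } = require('gltf-pipeline');

// Walk an assets folder and emit a Draco-compressed copy of every .glb.
async function compressAll(assetDir, outDir) {
  for (const name of fsExtra.readdirSync(assetDir)) {
    if (path.extname(name) !== '.glb') continue;
    const glb = fsExtra.readFileSync(path.join(assetDir, name));
    const { glb: compressed } = await processGlb(glb, {
      dracoOptions: { compressionLevel: 7 }, // 0-10; 7 is gltf-pipeline's default
    });
    fsExtra.outputFileSync(path.join(outDir, name), compressed);
  }
}

compressAll('./assets', './assets-draco').catch(console.error);
```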

Side question, also about the models

I have noticed that some of the assets have PNG textures and others have JPG. Which format should be used across all of them? Using PNG would help keep the quality consistent, without artifacts from lossy compression.

riccardogiorato avatar Jan 24 '20 06:01 riccardogiorato

Taken from @toji's comment on my first (now closed) PR here: https://github.com/immersive-web/webxr-input-profiles/pull/156 "I tried reading up on tinypng, but I'm still fuzzy on a couple of things. It is a lossy compression algorithm, right? While that is probably OK for base color maps, I'm concerned that it would introduce noticeable artifacts into non-sRGB inputs, such as normal maps or metallic-roughness maps. Also, does repeatedly running tinypng on an image cause stacking artifacts? And just to verify, the compressed files are still just standard PNGs, right?

To answer a question from the related issue: I would be totally fine with changing any JPG textures to PNG as long as it doesn't significantly increase file size. Many of the textures have large sections of flat or similar colors, so I suspect that we should be OK on that front with the tinypng compression."
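One way to respect that file-size caveat when converting: do the JPG-to-PNG conversion, but keep the JPG whenever the PNG comes out significantly larger. A minimal sketch, assuming the sharp image library (just a possible choice, nothing settled):

```js
const fs = require('fs');
const sharp = require('sharp');

// Convert a JPG texture to PNG, but keep the JPG if the PNG would
// significantly increase the file size (the 1.25 threshold is arbitrary).
async function maybeConvertToPng(jpgPath, maxGrowth = 1.25) {
  const pngPath = jpgPath.replace(/\.jpe?g$/i, '.png');
  await sharp(jpgPath).png().toFile(pngPath);
  const jpgSize = fs.statSync(jpgPath).size;
  const pngSize = fs.statSync(pngPath).size;
  if (pngSize > jpgSize * maxGrowth) {
    fs.unlinkSync(pngPath); // PNG grew too much: stick with the JPG
    return jpgPath;
  }
  fs.unlinkSync(jpgPath); // PNG is acceptable: drop the JPG
  return pngPath;
}

maybeConvertToPng('base-color.jpg').then(console.log).catch(console.error);
```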

I will post further options as comments here so we can discuss them before I open any new PR on this issue.

riccardogiorato avatar Jan 27 '20 10:01 riccardogiorato

Compressing the textures will help with download size, but they will still be loaded uncompressed on the end user's hardware.

Once Basis support lands in glTF, it may be better to use that format to get the compression benefits both on download and in graphics memory.

An additional form of compression that could be applied is Draco compression for the geometry, which I think glTF already supports, though I haven't tried it myself yet.

AdaRoseCannon avatar Jan 27 '20 10:01 AdaRoseCannon

Compressing the textures will help with download size, but they will still be loaded uncompressed on the end user's hardware.

In some cases there's no extra cost if you simply reduce the number of colors in the textures. But that will certainly change with Draco, which requires some decompression time.

riccardogiorato avatar Jan 27 '20 10:01 riccardogiorato

glTF already supports Draco (it's used for all the meshes on https://xrdinosaurs.com, for example), and while Basis isn't currently embeddable, it could be distributed separately alongside the mesh. But that's not the real hold-up with using those two libraries.

The primary issue is that we want to ensure the assets are accessible to as many pages as possible. While implementing Draco and Basis isn't overly difficult, we still don't want it to be a barrier to entry, and both libraries come with download and startup overhead that may not make sense if these controller assets are the only thing you're using them for.
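For a sense of that overhead, this is roughly what opting in to Draco looks like on the page side — a minimal sketch assuming a three.js consumer and hypothetical paths, which nothing here mandates:

```js
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { DRACOLoader } from 'three/examples/jsm/loaders/DRACOLoader.js';

const scene = new THREE.Scene();

// The Draco decoder (wasm + JS glue) is an extra download the page has to
// host or fetch before any compressed geometry can be parsed.
const dracoLoader = new DRACOLoader();
dracoLoader.setDecoderPath('/libs/draco/'); // hypothetical hosting path

const loader = new GLTFLoader();
loader.setDRACOLoader(dracoLoader);

loader.load('controller-draco.glb', (gltf) => scene.add(gltf.scene));
```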

It is absolutely possible that we could make Draco/Basis compression part of the build process and host separate assets in a different location for pages that do want to opt in to using them, but I don't have the time to put into that tooling at the moment. Also, given that we'd then be hosting multiple copies of each asset, I'd prefer to have a dedicated CDN available before making that leap. In the meantime, a compression method like this one introduces no new overhead on the part of the page, which makes it an ideal first step toward shrinking our asset sizes.

toji avatar Jan 27 '20 21:01 toji