jaxngp
Standalone hash encoding network?
Hello,
Thank you for this amazing work! I am interested in a lot of the components you have built for your NeRF application; for example, jax-tcnn is a really useful library on its own. I am not familiar with Nix and use Docker for my development environment (https://github.com/JAX-DIPS/JAX-DIPS). Could you provide instructions for installing only this library in my project? I appreciate any help you can offer.
Thank you, Pouria
Hi Pouria,
Thanks for the kind words, JAX-DIPS looks like a very useful library that I may use in the future.
I do not have much time to try out another environment at the moment, but I may be able to come up with a solution around next week.
On another note: if all you need is a usable standalone hashgrid encoder, you can directly copy the code of the HashGridEncoder class from models/encoders.py:
https://github.com/blurgyy/jaxngp/blob/1ef676a8c1fa7435aea9c17d42379ed17174e839/models/encoders.py#L16-L256
It's a JAX/Flax implementation of the hashgrid encoder, and its interface matches that of the hashgrid encoder from the tiny-cuda-nn library. The core part is its __call__ method, which depends only on JAX, so you can adapt it even if you are not using Flax. All the NeRF experiments reported in jaxngp's README use this implementation during training and tiny-cuda-nn's hashgrid during inference. I only created the jax-tcnn library as an attempt to speed up rendering during inference (it turned out the speed difference between this implementation and tiny-cuda-nn's hashgrid is quite small).
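For reference, the general idea behind such an encoder can be sketched in pure JAX. The snippet below is a minimal, illustrative multiresolution hash encoding for a single 3D point; it is not the HashGridEncoder from this repository, and all names, defaults, and hyperparameters here are assumptions for illustration only.

```python
# Hypothetical sketch of a multiresolution hash encoding in pure JAX.
# Not jaxngp's HashGridEncoder; names and defaults are illustrative.
import jax
import jax.numpy as jnp

# Per-dimension spatial-hash primes (the first one is conventionally 1).
PRIMES = jnp.array([1, 2654435761, 805459861], dtype=jnp.uint32)

def hash_encode(pos, table, n_min=16, growth=1.5):
    """Encode one 3D point pos in [0, 1)^3.

    table: (n_levels, table_size, n_features) learnable feature table.
    Returns a flat (n_levels * n_features,) feature vector.
    """
    n_levels, table_size, _ = table.shape
    feats = []
    for level in range(n_levels):
        res = int(n_min * growth ** level)           # grid resolution at this level
        scaled = pos * res
        base = jnp.floor(scaled).astype(jnp.uint32)  # lower corner of the voxel
        frac = scaled - jnp.floor(scaled)            # position inside the voxel
        acc = jnp.zeros(table.shape[-1])
        for corner in range(8):                      # trilinear interpolation
            offset = jnp.array([(corner >> d) & 1 for d in range(3)],
                               dtype=jnp.uint32)
            coords = base + offset
            h = coords * PRIMES                      # uint32 wraps, as intended
            idx = (h[0] ^ h[1] ^ h[2]) % table_size  # XOR spatial hash
            w = jnp.where(offset == 1, frac, 1.0 - frac).prod()
            acc = acc + w * table[level, idx]
        feats.append(acc)
    return jnp.concatenate(feats)

# Usage: a small random table and one query point.
key = jax.random.PRNGKey(0)
table = jax.random.normal(key, (4, 2 ** 14, 2)) * 1e-4
out = hash_encode(jnp.array([0.3, 0.5, 0.7]), table)
print(out.shape)  # (8,) = n_levels * n_features
```

In a real training setup the table would be a learnable Flax parameter and the encoding would be vmapped over a batch of points; the sketch keeps it to one point for clarity.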
Please let me know if that works for you.
Regards, blurgyy
Hi Blurgyy,
Thank you so much for your quick response; this is indeed really helpful. I will follow your instructions and use the code you mentioned. Thank you so much for your help 🙏
I have been looking across GitHub for some time, and honestly there are so many pieces of your project that stand out! I will keep following your development and learning from you.
Sincerely, Pouria