
30 comments by Aditya Singh

Wow! This was super helpful. Thanks @teddykoker

> Is there a good way to change **all** bias initializers to zero?

+1 for this. Right now I have a hacky method relying on recursive `getattr` to first get...

`Option 2` covers my use-cases with a minor modification: `if has_bias(x) and x.bias is not None`. Thanks!
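
For concreteness, here is a minimal sketch of what that check enables, built on `eqx.tree_at` (the helper names such as `zero_all_biases` are just illustrative, not part of the library):

```python
import jax
import jax.numpy as jnp
import equinox as eqx

def has_bias(node):
    # the check above: a sub-module whose bias is actually present
    return hasattr(node, "bias") and node.bias is not None

def zero_all_biases(model):
    # gather every bias array in the model...
    get_biases = lambda m: [
        node.bias
        for node in jax.tree_util.tree_leaves(m, is_leaf=has_bias)
        if has_bias(node)
    ]
    # ...and swap each one for zeros of the same shape
    zeros = [jnp.zeros_like(b) for b in get_biases(model)]
    return eqx.tree_at(get_biases, model, zeros)

# usage on an arbitrary equinox model
model = eqx.nn.MLP(2, 2, 8, 2, key=jax.random.PRNGKey(0))
model = zero_all_biases(model)
```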

Hi, I don't have much experience with neural rendering. To my limited understanding, the **main** novelty in this domain lies in the training procedure rather than in the `architecture` design. If...

Thanks for the suggestion on the `IterValueDict`! Regarding the additional methods, I only had a use-case for the `append` functionality and suggested the others based on the differences from `torch.Sequential`. For `append`,...
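
For reference, the kind of `append` I mean, sketched against the current API (this assumes `eqx.nn.Sequential` exposes its layers through a `.layers` attribute):

```python
import equinox as eqx
import jax.random as jr

k1, k2, k3 = jr.split(jr.PRNGKey(0), 3)
seq = eqx.nn.Sequential(
    [eqx.nn.Linear(2, 4, key=k1), eqx.nn.Linear(4, 4, key=k2)]
)

# modules are immutable, so "appending" today means rebuilding the container
seq = eqx.nn.Sequential(list(seq.layers) + [eqx.nn.Linear(4, 1, key=k3)])
```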

I saw that `dilation` is not supported at the moment. However, I decided against pushing that change here; I will open a separate request for it in a few days.
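
For anyone curious what the missing feature refers to: a dilated (atrous) convolution is available directly from `jax.lax`, which is presumably what the layer would delegate to (this sketch is not tied to the eventual equinox API):

```python
import jax
import jax.numpy as jnp

x = jnp.ones((1, 3, 32, 32))   # (batch, channels, height, width)
w = jnp.ones((8, 3, 3, 3))     # (out_channels, in_channels, kh, kw)

# rhs_dilation dilates the kernel, i.e. an atrous convolution
y = jax.lax.conv_general_dilated(
    x, w,
    window_strides=(1, 1),
    padding="SAME",
    rhs_dilation=(2, 2),
)
print(y.shape)  # (1, 8, 32, 32)
```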

Thanks for the detailed response! I am happy to stick with `filter_*` for stuff built on top of `equinox`. So the `p` field in `nn.Dropout` probably also requires a `static_field`...

Thanks! I agree with removing `static_field` and moving towards the `filter_*` approach for everything. I can give it a shot and see if everything still passes.
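
To make the comparison concrete, here is a small sketch (a hypothetical dropout-like module, not the library's actual `nn.Dropout`) of the two options being discussed: marking `p` with `static_field`, versus leaving it as an ordinary leaf and letting a `filter_*` transform treat non-array leaves as static.

```python
import jax
import jax.numpy as jnp
import equinox as eqx


class StaticDropout(eqx.Module):
    # option (a): `p` is baked into the treedef via static_field, so it is
    # never traced or differentiated
    p: float = eqx.static_field()

    def __call__(self, x, *, key):
        q = 1.0 - self.p
        mask = jax.random.bernoulli(key, q, x.shape)
        return jnp.where(mask, x / q, 0.0)


class FilteredDropout(eqx.Module):
    # option (b): `p` stays an ordinary (non-array) leaf; nothing is marked static
    p: float

    def __call__(self, x, *, key):
        q = 1.0 - self.p
        mask = jax.random.bernoulli(key, q, x.shape)
        return jnp.where(mask, x / q, 0.0)


x = jnp.ones((4,))
key = jax.random.PRNGKey(0)

# with a filter that only selects true arrays, the float `p` is filtered out
# at trace time, so option (b) needs no static_field at all
apply = eqx.filter_jit(
    lambda m, x, key: m(x, key=key), filter_spec=eqx.is_array
)
out_a = apply(StaticDropout(0.5), x, key)
out_b = apply(FilteredDropout(0.5), x, key)
```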

Unsurprisingly, a couple of tests fail. I just wanted to make sure that it is the tests that need fixing and not the behaviour that is breaking. A small representative example of a failing test case...

A little update on this:

```python
import jax
import equinox as eqx

def test_no_static(getkey):
    def h(x):
        return jax.tree_map(lambda u: u if eqx.is_array_like(u) else None, x)

    h = eqx.filter_jit(h, filter_spec=eqx.is_array_like)
    og_lin...
```