Daniel Ward

15 comments by Daniel Ward

The NaNs seem to be introduced in `updateμ!(r::GlmResp{V,D,L}) where {V`...

I think the issue is just that it does `exp` (the inverse of `log`) and the values are too large, leading to Infs. I presume I'll be able to fix this...
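The comment above concerns Julia's GLM.jl, but the failure mode is language-agnostic. A minimal Python sketch of how a large linear predictor overflows the inverse-log link to Inf, which then propagates to NaN downstream (variable names here are illustrative, not from GLM.jl):

```python
import math

# A large linear predictor overflows the inverse-log link.
eta = 800.0
try:
    mu = math.exp(eta)    # math.exp raises OverflowError on overflow...
except OverflowError:
    mu = math.inf         # ...plain float arithmetic would give inf instead

# Inf then becomes NaN in ordinary downstream arithmetic,
# e.g. anything of the form inf - inf.
resid = mu - mu
print(math.isinf(mu), math.isnan(resid))  # True True
```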

I've been using this in my package [flowjax](https://github.com/danielward27/flowjax) for registering parameters for equinox modules.

```python
def register_params(
    name: str,
    model: PyTree,
    filter_spec: Callable | PyTree = eqx.is_inexact_array,
):
    """Register numpyro...
```

Another quick thought on this issue. Importing at the top, I imagine, only partially solves the issue for ``@decorator_that_needs_to_be_bottom``. If you use a package-wide import hook, if you call...

FWIW I promise I don't actually write code like the above example. A pattern that is used a lot in jax is something like

```python
@beartype
@partial(jax.jit, static_argnames=["length"])
def fn(a, ...
```
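The jax pattern above hinges on decorator stacking order. A self-contained sketch with plain stand-in decorators (no jax or beartype dependency; `check_types` and `trace_compile` are hypothetical stand-ins for `beartype` and `jax.jit`):

```python
from functools import partial, wraps

def check_types(fn):
    """Stand-in for @beartype: validate positional args before the call."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        if not all(isinstance(a, int) for a in args):
            raise TypeError("expected ints")
        return fn(*args, **kwargs)
    return wrapper

def trace_compile(fn=None, *, static_argnames=()):
    """Stand-in for jax.jit: here just a pass-through wrapper."""
    if fn is None:  # called with arguments, e.g. @trace_compile(...)
        return partial(trace_compile, static_argnames=static_argnames)
    @wraps(fn)
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)
    return wrapper

# Decorators apply bottom-up: fn is first "compiled", then type-checked,
# so the checker runs on every call, before the compiled wrapper.
@check_types
@trace_compile(static_argnames=("length",))
def fn(a, length=3):
    return a * length

print(fn(2))  # 6
```

Putting the type checker on top means invalid arguments are rejected before they ever reach the compiled wrapper, which is the motivation for the ordering in the jax snippet above.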

Thanks for the information. Good to hear things will be improving! Right now Sphinx requires `from __future__ import annotations` to be used in order to have type aliases displayed nicely...
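A minimal illustration of what `from __future__ import annotations` (PEP 563) changes: annotations are left as unevaluated strings, so an alias name survives as written instead of being expanded (`Vector` here is a hypothetical alias, not from the thread):

```python
from __future__ import annotations

Vector = list[float]  # a type alias

def scale(v: Vector, k: float) -> Vector:
    return [k * x for x in v]

# With postponed evaluation, the stored annotation is the literal
# string "Vector" rather than the expanded list[float] object,
# which is what lets doc tools display the alias name.
print(scale.__annotations__["v"])  # Vector
```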

Great, thanks, that will do for the time being! Yes, I probably should switch; like you, I've found Sphinx to be on the clunkier side. It's just mostly my...

Sorry, I'll try to rephrase with a bit of code. So ``beartype`` at the bottom can sometimes avoid issues, e.g. with ``numba.jit``. But presumably, as soon as you start nesting...

I think your approach works, but it would add a bit of overhead since, as you said, the masked autoregressive network will still produce a set of (unused) parameters...

You can do that, but note that the transform of the transformed dimensions will be independent of the identity-transformed variables if you do...