Ricardo Vieira
This PR allows known shapes to be propagated to the outputs of Elemwise `Op`s. It won't solve #732, but it doesn't seem like there was a good reason not...
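As a rough illustration of what that propagation would enable (a sketch only; `specify_shape` stands in here for any source of known static shapes):

```python
import aesara.tensor as at

# Inputs whose static shapes are known at graph-construction time.
x = at.specify_shape(at.matrix("x"), (3, 4))
y = at.specify_shape(at.matrix("y"), (3, 4))

# Elemwise addition; with shape propagation the known (3, 4) shape
# could be reflected directly on the output's type.
z = x + y
print(z.type)
```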
This bug is an unexpected consequence of https://github.com/aesara-devs/aesara/pull/928 and rewrites that make certain assumptions: https://github.com/aesara-devs/aesara/issues/1089#issuecomment-1291561804

```python
import aesara
import aesara.tensor as at
import numpy as np

x_row = at.row("x_row")
x_matrix...
```
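The reproducer above is truncated; purely as an illustration of the type-level distinction those rewrites lean on (not the original example), `row` fixes the first dimension to length 1 while `matrix` leaves both dimensions unknown:

```python
import aesara.tensor as at

x_row = at.row("x_row")           # broadcastable pattern (True, False)
x_matrix = at.matrix("x_matrix")  # broadcastable pattern (False, False)

print(x_row.broadcastable)
print(x_matrix.broadcastable)
```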
I am getting wrong results for the gradient of a simple Normal logp when using the `NUMBA` backend (and indexing is involved). I have a gist documenting the problem here:...
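A minimal sketch of the kind of comparison the gist makes (the actual logp expression and indexing there may differ):

```python
import numpy as np
import aesara
import aesara.tensor as at

x = at.vector("x")
mu = at.vector("mu")

# Hypothetical Normal logp (up to a constant) with some indexing involved.
logp = -0.5 * ((x[1:] - mu[:-1]) ** 2).sum()
dlogp = aesara.grad(logp, mu)

f_default = aesara.function([x, mu], dlogp)
f_numba = aesara.function([x, mu], dlogp, mode="NUMBA")

x_val = np.random.normal(size=5)
mu_val = np.random.normal(size=5)

# The two backends should agree on the gradient values.
np.testing.assert_allclose(f_default(x_val, mu_val), f_numba(x_val, mu_val))
```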
https://numpy.org/doc/stable/reference/generated/numpy.ufunc.outer.html
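For reference, the NumPy behaviour in question: every ufunc exposes an `outer` method that applies the ufunc to all pairs drawn from its two inputs.

```python
import numpy as np

# np.add.outer builds the full "outer" table of the binary ufunc.
print(np.add.outer([1, 2, 3], [10, 20]))
# [[11 21]
#  [12 22]
#  [13 23]]
```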
Unlike most `Op`s, which convert their inputs with `as_tensor_variable`, several of the helper functions in `tensor.basic` do not convert the inputs, leading to some failures when trying to access attributes that...
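For context, `as_tensor_variable` is what turns raw Python/NumPy inputs into variables that actually carry tensor attributes (a minimal illustration):

```python
import aesara.tensor as at

raw = [1, 2, 3]           # a plain list has no `ndim`, `dtype`, `type`, ...
converted = at.as_tensor_variable(raw)
print(converted.ndim, converted.dtype)  # attributes the helpers expect to exist
```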
The documentation example fails, but that might be due to the variable number of steps (`n_steps`) and an intermediate `Op` that implies a dynamic shape.

```python
import aesara
import aesara.tensor as at...
```
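Not the documentation example itself (the excerpt is cut off above), but a generic sketch of a scan driven by a symbolic `n_steps`, which is the kind of setup where intermediate results end up with dynamic shapes:

```python
import aesara
import aesara.tensor as at

n_steps = at.iscalar("n_steps")

# Double a scalar state n_steps times; the length of `result` is only
# known at run time, so downstream shapes are dynamic.
result, updates = aesara.scan(
    fn=lambda prev: prev * 2,
    outputs_info=at.ones(()),
    n_steps=n_steps,
)
f = aesara.function([n_steps], result[-1], updates=updates)
print(f(5))  # 32.0
```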
```python
import aesara
import aesara.tensor as at

x = at.vector("x")
y = at.matrix("y")
z = x + y
aesara.dprint(z, print_type=True)
```
```
Elemwise{add,no_inplace} [id A] ''
 |InplaceDimShuffle{x,0} [id B] ''...
```
Probably quite some work, but it would be nice to have an equivalent: https://numpy.org/doc/stable/reference/generated/numpy.einsum.html
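For reference, the NumPy behaviour an equivalent would need to cover, e.g. a batched matrix product written as a subscript string:

```python
import numpy as np

a = np.random.rand(2, 3, 4)
b = np.random.rand(2, 4, 5)

# "bij,bjk->bik": contract over j for each batch index b.
c = np.einsum("bij,bjk->bik", a, b)
print(c.shape)  # (2, 3, 5)
```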
JAX version 0.2.12 was released in April 2021

```
UserWarning: JAX omnistaging couldn't be disabled: Disabling of omnistaging is no longer supported in JAX version 0.2.12 and higher: see https://github.com/google/jax/blob/main/design_notes/omnistaging.md....
```
This was attempted before, but due to some undocumented issues with Scan, the changes were reverted here: https://github.com/Theano/Theano/pull/2970

Not knowing what the issues were, the only reasonable thing is to...