Brandon T. Willard

202 results: issues authored by Brandon T. Willard

The scalar multiplication `Op`, `Mul`, has an `impl` method that actually uses `np.product`. When used in conjunction with `Elemwise`, optimizations like `local_mul_canonizer` construct graphs that essentially have `Elemwise(Mul)(a, b, c,...

bug
refactor
Op implementation
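The distinction above can be illustrated outside of Aesara. The following NumPy-only sketch (an assumption for illustration, not Aesara's actual `Mul.impl`) contrasts a single n-ary reduction via `np.prod` (of which `np.product` is a deprecated alias) with the pairwise fold that a variadic `Elemwise(Mul)(a, b, c, ...)` semantically denotes:

```python
import functools
import operator
import numpy as np

def mul_via_prod(*inputs):
    # One n-ary reduction over all stacked arguments, as an
    # `np.product`-style implementation would evaluate them.
    return np.prod(np.stack(inputs), axis=0)

def mul_pairwise(*inputs):
    # The pairwise fold ((a * b) * c) * ... that a variadic
    # scalar `Mul` denotes semantically.
    return functools.reduce(operator.mul, inputs)

a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])
c = np.array([5.0, 6.0])
assert np.array_equal(mul_via_prod(a, b, c), mul_pairwise(a, b, c))
```

The two agree numerically; the issue is about which form the `Op` and its rewrites assume when constructing graphs.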

The `tests.tensor.test_basic` module takes a while to run (e.g. another case of #23), and a big part of that is the unreasonably repetitious numeric testing that's performed in all...

help wanted
testing
important
refactor

The `deprecated` wrapper works fine for deprecating _functions_ but not for renamed classes and class instances—e.g. we don't want to break things by changing the expected type of an object....

enhancement
good first issue
help wanted
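One way to deprecate a renamed class without changing the expected type of its instances is to subclass the new class and warn on instantiation. This is a hypothetical sketch (the helper name `deprecated_class` and the example classes are assumptions, not Aesara's actual `deprecated` utility):

```python
import warnings

def deprecated_class(new_cls, old_name):
    """Create an alias for a renamed class that warns on instantiation
    but still satisfies `isinstance` checks against the new class."""

    class _Deprecated(new_cls):
        def __init__(self, *args, **kwargs):
            warnings.warn(
                f"{old_name} is deprecated; use {new_cls.__name__} instead",
                DeprecationWarning,
                stacklevel=2,
            )
            super().__init__(*args, **kwargs)

    _Deprecated.__name__ = old_name
    return _Deprecated

class NewThing:
    def __init__(self, x):
        self.x = x

OldThing = deprecated_class(NewThing, "OldThing")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    obj = OldThing(1)

assert isinstance(obj, NewThing)  # the expected type is preserved
assert caught and issubclass(caught[0].category, DeprecationWarning)
```

Because the alias is a subclass, downstream `isinstance(obj, NewThing)` checks keep working, which is exactly what a plain function wrapper can't guarantee.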

Like Dask does with Pandas/NumPy (see [here](https://github.com/dask/dask/blob/1d0262c303b00fb053500d775ddb83437c6fcfe0/dask/utils.py#L666)), we can use a decorator that copies the docstrings from the NumPy functions that we're emulating/wrapping and adds a disclaimer at the end...

documentation
good first issue
help wanted
important
refactor
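A minimal sketch of such a decorator, assuming we only copy `__doc__` and append a disclaimer (the names `copy_doc_from` and `my_sum` are hypothetical, not the Dask or Aesara implementation):

```python
import numpy as np

DISCLAIMER = (
    "\n\nThis docstring was copied from numpy.{name}.\n"
    "Some inconsistencies with this library's version may exist."
)

def copy_doc_from(numpy_func):
    """Copy the docstring of the NumPy function being emulated and
    append a disclaimer, Dask-style."""
    def decorator(func):
        if numpy_func.__doc__:
            func.__doc__ = numpy_func.__doc__ + DISCLAIMER.format(
                name=numpy_func.__name__
            )
        return func
    return decorator

@copy_doc_from(np.sum)
def my_sum(x, axis=None):
    return np.sum(x, axis=axis)
```

The guard on `numpy_func.__doc__` matters because docstrings can be stripped (e.g. under `python -OO`), in which case the wrapper should silently keep its own.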

We might be able to speed up Aesara's compiled function evaluations by Cythonizing classes like [`Function`](https://github.com/pymc-devs/aesara/blob/84936418af87e84322bd668c7e05e62766699906/aesara/compile/function/types.py#L236)—or at least the methods within it and/or the functions it calls. Every time an...

enhancement
question
performance concern

We can't afford to write or maintain documentation for external libraries like NumPy or even Python itself—no matter how big or small the material may be. The two most egregious...

documentation
good first issue
help wanted
refactor

The `CAReduce` `Op` base class doesn't have a general gradient implementation, but it seems like it could.

enhancement
help wanted
question
gradient implementations
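For a concrete instance of what a general `CAReduce` gradient would compute, here is a NumPy-only sketch (an assumption for illustration, not Aesara code) for the product reduction: since d prod(x)/d x_i = prod(x)/x_i, the output gradient is broadcast back along the reduced axis and scaled accordingly.

```python
import numpy as np

def prod_reduce_grad(x, g, axis=0):
    """Gradient of y = prod(x, axis) given output gradient g:
    d y / d x_i = y / x_i, with g broadcast back along `axis`."""
    y = np.prod(x, axis=axis, keepdims=True)
    return np.expand_dims(g, axis) * y / x

x = np.array([[1.0, 2.0], [3.0, 4.0]])
g = np.ones(2)
analytic = prod_reduce_grad(x, g, axis=0)

# Finite-difference check of one entry
eps = 1e-6
xp = x.copy()
xp[0, 0] += eps
fd = (np.prod(xp, axis=0)[0] - np.prod(x, axis=0)[0]) / eps
assert abs(analytic[0, 0] - fd) < 1e-4
```

The division form breaks down at zeros, so a real implementation would need a more careful formulation, but the broadcast-back structure is the part a generic `CAReduce` gradient could share across scalar ops.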

We still don't have a gradient implementation for `Eig` (and a few other `aesara.linalg` `Op`s like https://github.com/aesara-devs/aesara/issues/836). [Here's](https://github.com/aesara-devs/aesara/pull/1020#issuecomment-1175405721) an example implementation that could work.

enhancement
gradient implementations
Op implementation
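For intuition about the eigenvalue part of such a gradient, here is a NumPy sketch for the symmetric case (i.e. what `Eigh` handles; the general `Eig` gradient is more involved): the gradient of eigenvalue λ_i with respect to a symmetric A is the outer product of its eigenvector with itself, which we check against a finite difference.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
A = (A + A.T) / 2  # symmetrize

w, v = np.linalg.eigh(A)
grad_l0 = np.outer(v[:, 0], v[:, 0])  # d w[0] / d A for symmetric A

# Finite-difference check along one symmetric perturbation direction E
eps = 1e-6
E = np.zeros((3, 3))
E[0, 1] = E[1, 0] = 1.0
w_pert = np.linalg.eigh(A + eps * E)[0]
fd = (w_pert[0] - w[0]) / eps
analytic = np.sum(grad_l0 * E)  # directional derivative along E
assert abs(fd - analytic) < 1e-4
```

This formula assumes distinct eigenvalues; degenerate spectra (and the non-symmetric `Eig` case, with its separate left/right eigenvectors) need the more general treatment discussed in the linked PR.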

Some broken broadcasting-related logic is causing `local_elemwise_alloc` to produce graphs that fail to perform simple multiplications:

```python
import numpy as np
import aesara
import aesara.tensor as at
from aesara.compile.mode import...
```

bug
help wanted
important
graph rewriting

This PR implements https://github.com/aesara-devs/aesara/issues/695. It's currently just an outline.

enhancement
important
Op implementation