spidr
Accelerated machine learning with dependent types
In some cases it is useful to be able to say "this thing works for this shape and any shapes isomorphic to it". Isomorphic shapes include those with extra or fewer...
Support complex numbers (integral and/or floating point depending on what XLA supports).
Many `Tensor` tests don't cover inf and nan values. This is particularly true of hard-coded test cases (often arrays). All ops that use `Double` should have such tests.
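One reason nan needs explicit attention in tests is that naive equality silently passes over it: `nan != nan`, so a comparison-based test can never match a nan expectation. A small Python sketch (the helper name is hypothetical, not part of spidr) of a comparison that handles the special values:

```python
import math

def close_or_both_nan(x, y, tol=1e-9):
    """Compare two floats for testing, treating nan == nan
    and requiring infinities to match in sign."""
    if math.isnan(x) and math.isnan(y):
        return True
    if math.isinf(x) or math.isinf(y):
        return x == y  # inf only equals inf of the same sign
    return abs(x - y) <= tol

# naive equality misses nan entirely: nan != nan
assert float("nan") != float("nan")

# the helper handles the special values a hard-coded case might contain
assert close_or_both_nan(float("nan"), float("nan"))
assert close_or_both_nan(float("inf"), float("inf"))
assert not close_or_both_nan(float("inf"), float("-inf"))
assert close_or_both_nan(1.0, 1.0 + 1e-12)
```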
Probabilistic programming seems to be an increasingly popular approach to programming with probability distributions. Determine what that might look like in Idris, and whether it's worth adopting.
The Cholesky factor can be reused for inference and for calculating the marginal likelihood. Write an implementation that takes this into account so it's not recalculated. This may help: https://gregorygundersen.com/blog/2019/09/12/practical-gp-regression/
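To make the reuse concrete, here is a minimal Python/NumPy sketch (not spidr's API; the kernel and all names are illustrative) showing a single Cholesky factor serving both the GP predictive mean and the log marginal likelihood:

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    # squared-exponential kernel between two sets of 1-D points
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

x = np.array([-1.0, 0.0, 1.0])
y = np.array([0.1, 0.9, 0.2])
noise = 1e-2

K = rbf(x, x) + noise * np.eye(len(x))
L = np.linalg.cholesky(K)             # factor once ...

# ... reuse for inference: alpha = K^{-1} y via two triangular solves
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
x_new = np.array([0.5])
mean = rbf(x_new, x) @ alpha          # predictive mean at new points

# ... and reuse for the log marginal likelihood:
# log p(y) = -1/2 y^T alpha - sum(log diag(L)) - n/2 log(2 pi)
lml = (-0.5 * y @ alpha
       - np.log(np.diag(L)).sum()
       - 0.5 * len(x) * np.log(2 * np.pi))
```

The point is that both `alpha` and the log-determinant term fall out of the one factor `L`, so nothing about `K` is decomposed twice.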
Write a tutorial on Gaussian process inference and how it is designed in spidr. We can do this as a LaTeX literate file and include equations for the...
The [XLA docs](https://www.tensorflow.org/xla/broadcasting) explain how a tensor of shape [n, m] can be broadcast to [p, n, m] or [n, p, m] when the axes to match with are...
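The two embeddings the XLA docs describe can be mimicked in NumPy by choosing where the new axis is inserted before broadcasting (the values of `n`, `m`, `p` below are arbitrary):

```python
import numpy as np

n, m, p = 2, 3, 4
t = np.arange(n * m).reshape(n, m)           # shape [n, m]

# match [n, m] against the trailing axes of [p, n, m]
a = np.broadcast_to(t[None, :, :], (p, n, m))

# match [n, m] against axes 0 and 2 of [n, p, m]
b = np.broadcast_to(t[:, None, :], (n, p, m))

assert a.shape == (p, n, m) and b.shape == (n, p, m)
# every slice along the new axis is a copy of the original tensor
assert (a[0] == t).all() and (b[:, 0, :] == t).all()
```

In XLA proper the choice is made with the `broadcast_dimensions` argument rather than by reshaping, but the resulting semantics are the same.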
I suspect there is well-founded logic behind which shapes can and can't be broadcast to others. It would be really nice to have `Broadcastable` use that logic, so that we...
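One candidate for that logic is the right-aligned rule NumPy (and XLA's implicit broadcasting) uses: align shapes from the right, and a source dimension is compatible when it equals the target dimension or is 1. A hypothetical Python predicate (not the actual `Broadcastable` definition) sketching the rule:

```python
def broadcastable(from_shape, to_shape):
    """NumPy/XLA-style rule: align shapes from the right; each source
    dim must equal the target dim or be 1; missing leading dims are
    implied. Purely illustrative, not spidr's `Broadcastable`."""
    if len(from_shape) > len(to_shape):
        return False
    for f, t in zip(reversed(from_shape), reversed(to_shape)):
        if f != t and f != 1:
            return False
    return True

assert broadcastable([2, 3], [4, 2, 3])   # add a leading axis
assert broadcastable([1, 3], [2, 3])      # stretch a length-1 axis
assert not broadcastable([2, 3], [3, 2])  # mismatched dims
assert not broadcastable([2, 3], [3])     # can't drop axes
```

Encoding this as an inductive relation in Idris would let the compiler decide broadcast compatibility at the type level.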
TensorFlow allows broadcasting to dimensions of length 0. Do we allow this, and if not, should we? E.g. broadcasting `1 :: t` to `0 :: t`.
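For reference, NumPy also permits stretching a length-1 dimension to length 0, producing an empty tensor — a quick check of the semantics in question:

```python
import numpy as np

t = np.ones((1, 3))
z = np.broadcast_to(t, (0, 3))   # length-1 axis stretched to length 0
assert z.shape == (0, 3) and z.size == 0
```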
Implement a `ProbabilisticModel` for whatever was implemented in #62, and use it in the Bayesian optimization tutorial for the failure data. Whether it's appropriate for that usage is probably not...