nexus
Experimental tensor-typed deep learning
It somehow breaks again after the fix in #21. I think the link should be: https://github.com/ctongfei/nexus/blob/master/jvm-ref/backend/src/test/scala/nexus/XorTest.scala The diff: `jvm-ref-backend` -> `jvm-ref/backend`
- `jvm-ref-backend` -> `jvm-ref/backend` (Issue: #40)
- `nexus.algebra.IsComplex[C]`
- `nexus.algebra.IsComplexTensorK[TC[_], C, TR[_], R]`
- Update existing ops if they support complex
- Additional complex ops: `nexus.ops.{Re, Im, Conj, Arg}`
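A minimal sketch of what `IsComplex[C]` could look like; the member names below (`re`, `im`, `conj`, `arg`, mirroring the proposed `nexus.ops.{Re, Im, Conj, Arg}`) and the `Z` instance are assumptions for illustration, not the actual nexus API:

```scala
// Sketch only: member set is an assumption mirroring the proposed ops.
trait IsComplex[C] {
  type Real
  def re(c: C): Real
  def im(c: C): Real
  def conj(c: C): C
  def arg(c: C): Real
}

object IsComplex {
  // Hypothetical pair-of-doubles complex number, for demonstration.
  final case class Z(re: Double, im: Double)

  implicit val zIsComplex: IsComplex[Z] { type Real = Double } =
    new IsComplex[Z] {
      type Real = Double
      def re(c: Z): Double   = c.re
      def im(c: Z): Double   = c.im
      def conj(c: Z): Z      = Z(c.re, -c.im)
      def arg(c: Z): Double  = math.atan2(c.im, c.re)
    }
}
```

Existing real-valued ops would then gain complex overloads by requiring an `IsComplex` evidence parameter instead of (or alongside) the real-number typeclass.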
It would be nice if tensors could be indexed with other index-typed tensors.
Amazing idea from @kitsing: a `LongTensor` used as an indexer should not be typed as `LongTensor[U]`; instead it should be `IndexTensor[U, X TR[Append[U, I]] argmax: (T[U], I) => TI[Remove[U,...
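A minimal sketch of the idea under simplified assumptions (fixed-rank tensors instead of type-level axis lists; every name below is illustrative, not the nexus API): an index tensor remembers which axis its entries point into, so gathering with it removes exactly that axis from the result type.

```scala
// Phantom axis labels standing in for nexus's type-level axes.
sealed trait Axis
final class Batch extends Axis
final class Feature extends Axis

// A rank-2 tensor over axes (R, C), stored row-major; hypothetical type.
final case class Tensor2[R <: Axis, C <: Axis](rows: Vector[Vector[Float]])
// A rank-1 tensor over the single remaining axis R.
final case class Tensor1[R <: Axis](data: Vector[Float])
// An index tensor over axis R whose entries index into axis I —
// the point of the proposal: the indexed-into axis is part of the type.
final case class IndexTensor1[R <: Axis, I <: Axis](ix: Vector[Int])

// argmax along the C axis yields an IndexTensor1[R, C], not a bare LongTensor.
def argmaxAlongC[R <: Axis, C <: Axis](t: Tensor2[R, C]): IndexTensor1[R, C] =
  IndexTensor1(t.rows.map(r => r.indexOf(r.max)))

// Gathering with IndexTensor1[R, C] removes exactly axis C: Tensor2[R, C] => Tensor1[R].
def gather[R <: Axis, C <: Axis](t: Tensor2[R, C], i: IndexTensor1[R, C]): Tensor1[R] =
  Tensor1(t.rows.zip(i.ix).map { case (row, j) => row(j) })
```

With this typing, mixing up which axis an index tensor refers to becomes a compile-time error rather than a silent shape bug.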
For certain 2nd-order optimization algorithms, e.g. (Martens & Grosse, 2015 JMLR):

```scala
def backward[G[_]: Algebra, X, Y](dy: G[Y], y: Y, x: X): G[X]
```

where `G[_]` encapsulates backward computations. Trivially...
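A sketch of the trivial instantiation, with assumed stand-ins for `Algebra` and the backward rule of a single squaring node; none of these names are the actual nexus API:

```scala
// Stand-in for the Algebra typeclass over the backward-computation wrapper G.
trait Algebra[G[_]] {
  def map[A, B](ga: G[A])(f: A => B): G[B]
}

// The trivial case: G = Id carries plain gradient values,
// recovering ordinary first-order backprop.
final case class Id[A](value: A)
object Id {
  implicit val algebra: Algebra[Id] = new Algebra[Id] {
    def map[A, B](ga: Id[A])(f: A => B): Id[B] = Id(f(ga.value))
  }
}

// Hypothetical backward pass for the node y = x * x: since dy/dx = 2x,
// the incoming dy is mapped to dy * 2x, using only the Algebra operations on G.
def backwardSquare[G[_]](dy: G[Double], y: Double, x: Double)(
    implicit G: Algebra[G]): G[Double] =
  G.map(dy)(d => d * 2 * x)
```

A richer `G[_]` could instead accumulate curvature information alongside the gradient, which is what makes the signature interesting for 2nd-order methods.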
Given `D[_]: DifferentiableAlgebra`,

```scala
def jacobian[X, Y, `∂Y/∂X`](y: D[Y], x: D[X])(implicit J: Jacobian.Aux[X, Y, `∂Y/∂X`]): `∂Y/∂X`
```

How should we define `Jacobian.Aux` for all differentiable types?
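One possible shape for `Jacobian.Aux`, following the standard shapeless-style `Aux` pattern; the instances below (plain `Double`/`Vector` stand-ins for scalars, vectors, and matrices) are assumptions for illustration, not a proposed final design:

```scala
// Typeclass computing the Jacobian type `Out` of a Y-valued function of X.
trait Jacobian[X, Y] { type Out }

object Jacobian {
  // Standard Aux alias exposing the type member as a type parameter.
  type Aux[X, Y, O] = Jacobian[X, Y] { type Out = O }
  def apply[X, Y](implicit j: Jacobian[X, Y]): Aux[X, Y, j.Out] = j

  // Scalar-to-scalar: the Jacobian is itself a scalar.
  implicit val scalar: Aux[Double, Double, Double] =
    new Jacobian[Double, Double] { type Out = Double }

  // Vector-to-scalar: the Jacobian is a vector (the gradient).
  implicit val vecToScalar: Aux[Vector[Double], Double, Vector[Double]] =
    new Jacobian[Vector[Double], Double] { type Out = Vector[Double] }

  // Vector-to-vector: the Jacobian is a matrix.
  implicit val vecToVec: Aux[Vector[Double], Vector[Double], Vector[Vector[Double]]] =
    new Jacobian[Vector[Double], Vector[Double]] { type Out = Vector[Vector[Double]] }
}
```

The open question in nexus terms is how to derive such instances generically for all axis-typed tensor shapes, rather than enumerating them case by case as above.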
Hopefully in Scala 3, with https://github.com/lampepfl/dotty/pull/4672, we could write:

```scala
type Func1[X, Y] = [F[_]: Algebra] -> F[X] => F[Y]
```