Optional autodiff support?
It would be awesome if the backing array implementation supported automatic differentiation, so that we could access some kind of grad method from Cubed.
It looks like a bunch of stakeholder libraries have this functionality:
https://data-apis.org/array-api/latest/purpose_and_scope.html#stakeholders
Though, differentiable programming may be out of scope for Cubed. @TomNicholas @tomwhite @rbavery any thoughts here?
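To make the ask a bit more concrete, here's a rough sketch of the kind of entry point I'm imagining. I'm going from memory on the Spec/asarray details, and `cubed.grad` itself is entirely hypothetical -- nothing like it exists in Cubed today:

```python
# Hypothetical sketch only -- cubed has no grad() today.
import cubed
import cubed.array_api as xp

spec = cubed.Spec(work_dir="tmp", allowed_mem="2GB")

def loss(x):
    # An ordinary lazy array-API expression built by Cubed.
    return xp.sum(x * x)

x = xp.asarray([[1.0, 2.0], [3.0, 4.0]], chunks=(2, 2), spec=spec)

# Hypothetical: Cubed builds the gradient graph by delegating to an
# autodiff-capable backing array library (e.g. JAX) at the chunk level.
dloss_dx = cubed.grad(loss)(x)   # <-- does not exist, illustration only
dloss_dx.compute()
```

The interesting part is less the surface syntax than whether the gradient graph could be planned with the same bounded-memory guarantees as the forward graph, which is where the question below comes in.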
I have a pipe dream of turning Cubed into an ML framework, and I think this would play an important part.
I haven’t thought through all the implications, but one potential sharp edge that @shoyer once pointed out to me: there will probably be significant memory differences between an op graph and its gradient. Can Cubed’s spec model be extended to account for this?
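For what it's worth, here's a tiny JAX example (not Cubed code, just JAX standing in for an autodiff-capable backend) of why that asymmetry shows up: reverse mode has to keep forward intermediates alive for the backward pass, and the gradient is input-shaped even when the forward output is a scalar, so the peak memory of the gradient graph can be much larger than the forward output suggests.

```python
import jax
import jax.numpy as jnp

def f(x):
    a = jnp.tanh(x)        # intermediate needed by the backward pass
    b = jnp.exp(a)         # another intermediate kept as a residual
    return jnp.sum(b)      # forward output is a single scalar

x = jnp.ones((4096, 4096))              # ~64 MB of float32 input
y, vjp_fn = jax.vjp(f, x)               # vjp_fn closes over the residuals
(grad_x,) = vjp_fn(jnp.ones_like(y))    # gradient is input-shaped: another ~64 MB
print(y, grad_x.shape)
```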