Don Syme

Results 1218 comments of Don Syme

@cgravill and I came across a need for this when porting a sample: `let f = dsharp.einsum("ijk,mik->mij", L, x)`
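For readers unfamiliar with the spec string, here is a NumPy analogy (not DiffSharp code) of what that `einsum` computes, with hypothetical shapes chosen to match the spec:

```python
import numpy as np

# Hypothetical shapes matching "ijk,mik->mij":
# L has shape (I, J, K), x has shape (M, I, K), result has shape (M, I, J).
rng = np.random.default_rng(0)
L = rng.standard_normal((2, 3, 4))   # (I=2, J=3, K=4)
x = rng.standard_normal((5, 2, 4))   # (M=5, I=2, K=4)

# out[m, i, j] = sum over k of L[i, j, k] * x[m, i, k]
out = np.einsum("ijk,mik->mij", L, x)
print(out.shape)  # (5, 2, 3)
```

The spec string batches a matrix-vector-style contraction over `k` for each `(m, i)` pair.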

> instead of which we can again support more general comparisons like PyTorch

Annoyingly the `

Some technical notes regarding Fable:

1. A Fable or WebSharper cross-compile would be somewhat intrusive, both in the code and in the testing.
2. Because of this, the best approach...

OK cool. We'll aim to make sure the support is in TorchSharp when we want to use it.

@pkese reports that all tests now pass! https://github.com/DiffSharp/DiffSharp/pull/121#issuecomment-637740358 We will now need to automate the testing, both in-repo and during CI.

Just to note that Torch testing on GPU could in theory be automated in Azure DevOps using GPU-enabled containers. However:

* We need a manual container group in the devops,...

Yes, that looks good. Maybe add to DEVGUIDE.md the single-line invocation of `dotnet test`, providing the property on the command line, for cut-and-paste.
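As a sketch of the kind of single-line invocation meant: the property name below is made up, standing in for whichever property the test project actually defines, but the `/p:` pass-through syntax is standard MSBuild.

```shell
# Pass an MSBuild property straight through `dotnet test`;
# DIFFSHARP_TESTGPU is a hypothetical name, not the real property.
dotnet test tests/DiffSharp.Tests --configuration Release /p:DIFFSHARP_TESTGPU=true
```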

More generally, we need test cases for taking hyper-gradients of each of the optimizers; see https://github.com/DiffSharp/DiffSharp/blob/dev/tests/DiffSharp.Tests/TestDerivatives.Nested.fs
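To make the term concrete, here is a minimal NumPy sketch (not DiffSharp code) of a hyper-gradient: the derivative of the loss after one SGD step with respect to the learning rate, computed by the chain rule and checked against finite differences. The toy loss and starting point are arbitrary choices for illustration.

```python
import numpy as np

def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

def post_step_loss(lr, w0=0.0):
    # One SGD step, then evaluate the loss: L(w0 - lr * L'(w0))
    return loss(w0 - lr * grad(w0))

lr = 0.1
# Analytic hyper-gradient via the chain rule:
# d/dlr L(w0 - lr*g) = -g * L'(w0 - lr*g), where g = L'(w0)
g = grad(0.0)
analytic = -g * grad(0.0 - lr * g)

# Numerical check with central differences
eps = 1e-6
numeric = (post_step_loss(lr + eps) - post_step_loss(lr - eps)) / (2 * eps)
```

Taking hyper-gradients of a full optimizer is the nested version of this: differentiating through the whole sequence of update steps.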

> Just to check: the immutability mentioned here also relates to in-place operations, is that correct?

Yes, in-place operations = mutable. To use an in-place operation you'd need to...
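As a general illustration of why in-place operations and immutability are at odds (a NumPy analogy, not DiffSharp's API): an in-place operation mutates a buffer that every alias can see, while an out-of-place operation allocates a fresh result.

```python
import numpy as np

a = np.ones(3)
b = a            # b aliases the same underlying buffer as a

a += 1           # in-place: mutates the shared buffer
print(b)         # b observes the change: [2. 2. 2.]

c = np.ones(3)
d = c + 1        # out-of-place: allocates a fresh array
print(c)         # c is untouched: [1. 1. 1.]
```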

Some sample symbolic shape inference, for example the Conv2DTranspose model:

```
Model.AnalyseShapes(optionals=false)
```

gives

```
DiffSharp.Model.Conv3d(C,outChannels)
  Conv3d__bias : [outChannels]
  Conv3d__weight : [outChannels,C,1,1,1]
  forward([N,C,D,H,W]) : [N,outChannels,D,H,W]
```

This is correctly inferring...
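A small Python sketch of the shape arithmetic behind that inference, using the standard convolution output-size formula (the helper name and sample sizes are made up):

```python
def conv_out(size, kernel, stride=1, padding=0):
    # Standard convolution output-size formula:
    # floor((size + 2*padding - kernel) / stride) + 1
    return (size + 2 * padding - kernel) // stride + 1

# With a 1x1x1 kernel, stride 1, and no padding, every spatial dimension
# is preserved, which is why the inference above can report
# forward([N,C,D,H,W]) : [N,outChannels,D,H,W] symbolically.
D, H, W = 8, 16, 16
spatial = [conv_out(s, kernel=1) for s in (D, H, W)]
print(spatial)  # [8, 16, 16]
```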