TorchSharp
A .NET library that provides access to the library that powers PyTorch.
GELU has an optional mode that internally uses the tanh approximation. See more here: https://pytorch.org/docs/stable/generated/torch.nn.GELU.html#torch.nn.GELU I was expecting this to just work: `var gelu = nn.GELU(approximate: "tanh");` When the approximate argument...
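As a minimal workaround sketch, assuming the `nn.GELU` overload in your TorchSharp version does not expose an `approximate` argument: the tanh approximation can be computed directly from plain tensor ops (the helper name `GeluTanh` is hypothetical).

```csharp
using System;
using TorchSharp;
using static TorchSharp.torch;

var y = GeluTanh(torch.randn(8));   // example usage

// Hypothetical helper (not the library API): the tanh approximation of GELU,
// per the formula in the PyTorch docs linked above, built from tensor ops.
static Tensor GeluTanh(Tensor x)
{
    var inner = (x + x.pow(3) * 0.044715) * Math.Sqrt(2.0 / Math.PI);
    return x * 0.5 * (inner.tanh() + 1.0);
}
```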
Fix #1356
#1354 introduces an API-breaking change; we need to check in a suppression file to get past the API compatibility check and bump the version.
It seems that PyTorch has something called `IterableDataset`, which is different from `IterableDataset` in TorchSharp.
In looking at TorchSharp's build, I had a few thoughts for improvement.
- [ ] Evaluate using Arcade and the build templates it provides.
- [ ] Remove as much as possible...
https://github.com/dotnet/TorchSharp/issues/1314#issuecomment-2133339026
Here are two Tensor fields, which are just regular fields. However, after calling RegisterComponents(), TorchSharp turns these two fields into parameters. Could you add an attribute to mark...
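A minimal sketch of the situation being described (the module and field names are made up): two plain `Tensor` fields on a custom module get picked up when `RegisterComponents()` is called in the constructor.

```csharp
using TorchSharp;
using static TorchSharp.torch;
using static TorchSharp.torch.nn;

// Hypothetical module illustrating the report above: `scale` and `shift` are
// ordinary Tensor fields, but RegisterComponents() walks the fields and
// registers them with the module.
class ScaleShift : Module<Tensor, Tensor>
{
    private Tensor scale = torch.ones(1);
    private Tensor shift = torch.zeros(1);

    public ScaleShift() : base(nameof(ScaleShift))
    {
        RegisterComponents();   // after this call, the fields are registered
    }

    public override Tensor forward(Tensor input) => input * scale + shift;
}
```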
Are there any plans to support torchaudio, such as the StreamReader and StreamWriter classes?
Fixes #692. TL;DR: `Tensor.backward` has a different parameter order compared to PyTorch and also swaps `retain_graph` and `create_graph` in its internal function call. See https://pytorch.org/docs/stable/generated/torch.Tensor.backward.html for backward's signature: `Tensor.backward(gradient=None,...
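A hedged usage note: the no-argument form below is unaffected by the ordering issue; when the graph flags are needed, passing them by name avoids relying on positional order, assuming the TorchSharp parameters keep the PyTorch spelling (`retain_graph` / `create_graph`), which should be verified against the signature in your version.

```csharp
using TorchSharp;
using static TorchSharp.torch;

var x = torch.randn(3, requires_grad: true);
var loss = (x * x).sum();

// Plain call, no flags: this form is unaffected by the parameter-order issue.
loss.backward();

// When the flags are needed, naming them avoids depending on positional order
// (parameter names assumed to match the PyTorch spelling), e.g.:
//   loss.backward(retain_graph: true, create_graph: false);
```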
This is printed when I call `functional.scaled_dot_product_attention`:

> [W914 13:25:36.000000000 sdp_utils.cpp:555] Warning: 1Torch was not compiled with flash attention. (function operator ())

I'm on Windows with `TorchSharp-cuda-windows=0.103.0`.
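For context, a minimal call sketch (the batch/head/sequence/dim sizes are arbitrary and require a CUDA device); the warning is emitted by the native libtorch backend (`sdp_utils.cpp`) rather than by TorchSharp itself.

```csharp
using System;
using TorchSharp;
using static TorchSharp.torch;
using F = TorchSharp.torch.nn.functional;

// Arbitrary shapes: batch=1, heads=8, seq=16, head_dim=64.
var q = torch.rand(new long[] { 1, 8, 16, 64 }).cuda();
var k = torch.rand(new long[] { 1, 8, 16, 64 }).cuda();
var v = torch.rand(new long[] { 1, 8, 16, 64 }).cuda();

// The call itself succeeds; the "not compiled with flash attention" warning
// comes from libtorch while it selects an attention backend.
var attn = F.scaled_dot_product_attention(q, k, v);
Console.WriteLine(attn.shape[^1]);   // 64
```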