HCareLou

31 comments by HCareLou

```
var m = torch.nn.Conv2d(3, 64, 7, 2, 3, bias: false).cuda();
for (int i = 0; i < 1000000; i++)
{
    using var x = torch.randn(1, 3, 224, 224).@float().cuda();
    using...
```

`torch.NewDisposeScope()` is a relatively elegant solution, although not as elegant as PyTorch.
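For reference, the usual pattern (as I understand TorchSharp's API; the `Normalize` example itself is hypothetical) is that every tensor created while a scope is alive is tracked by that scope and disposed with it, and anything that must outlive the scope is moved out explicitly:

```csharp
using TorchSharp;
using static TorchSharp.torch;

public static class Example
{
    public static Tensor Normalize(Tensor x)
    {
        // All tensors created while the scope is alive are tracked by it.
        using var scope = torch.NewDisposeScope();
        var mean = x.mean();
        var std = x.std();
        var y = (x - mean) / std;   // intermediates are disposed with the scope
        // Only the returned tensor is detached from this scope and survives.
        return y.MoveToOuterDisposeScope();
    }
}
```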

I'm close to giving up: unexpected exceptions occur when torch.NewDisposeScope is nested, especially when one function calls another and the callee also opens its own torch.NewDisposeScope. Objects that shouldn't...
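To make the nesting hazard concrete, here is a self-contained toy model of nested dispose scopes (hypothetical names throughout; this is not TorchSharp's actual implementation). It shows how a callee's scope would dispose an object the caller still needs, and how handing the object to the outer scope before returning avoids that:

```csharp
using System;
using System.Collections.Generic;

// Toy model of nested dispose scopes (hypothetical, for illustration only).
// Objects registered while a scope is current are disposed when that scope
// ends, unless moved to the enclosing scope first.
sealed class ToyScope : IDisposable
{
    [ThreadStatic] private static ToyScope current;
    private readonly ToyScope outer;
    private readonly List<ToyTensor> tracked = new List<ToyTensor>();

    public ToyScope() { outer = current; current = this; }

    public static void Register(ToyTensor t)
    {
        if (current != null) current.tracked.Add(t);
    }

    // Analogue of MoveToOuterDisposeScope(): hand the object to the caller's scope.
    public static void MoveToOuter(ToyTensor t)
    {
        if (current == null) return;
        current.tracked.Remove(t);
        if (current.outer != null) current.outer.tracked.Add(t);
    }

    public void Dispose()
    {
        foreach (var t in tracked) t.Dispose();
        current = outer;
    }
}

sealed class ToyTensor : IDisposable
{
    public bool Disposed { get; private set; }
    public ToyTensor() { ToyScope.Register(this); }
    public void Dispose() { Disposed = true; }
}

static class Demo
{
    // A callee with its own scope: without MoveToOuter, the returned object
    // would already be disposed when the caller receives it.
    public static ToyTensor Callee()
    {
        using (var scope = new ToyScope())
        {
            var result = new ToyTensor();
            ToyScope.MoveToOuter(result);
            return result;
        }
    }
}
```

In this toy, deleting the `MoveToOuter` call makes `Callee`'s return value come back already disposed, which mirrors the surprise with nested `torch.NewDisposeScope` calls.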

```
public static class Ops
{
    public static Tensor clip_boxes(Tensor boxes, int[] shape)
    {
        using (torch.NewDisposeScope())
        {
            boxes[TensorIndex.Ellipsis, 0] = boxes[TensorIndex.Ellipsis, 0].clamp(0, shape[1]);
            boxes[TensorIndex.Ellipsis, 1] = boxes[TensorIndex.Ellipsis, 1].clamp(0, shape[0]);
            boxes[TensorIndex.Ellipsis, 2]...
```

Yes, the simulated code resolves the issue once MoveToOuterDisposeScope() is removed, but I cannot handle the code where the actual memory leak occurs that way. I need...

No no no, your code will cause a memory leak in my project, but adding .clone() fixes it. However, the simulated code still leaks memory even with .clone() added. Please...

Please watch the screen recording. https://github.com/dotnet/TorchSharp/assets/55724885/20f3a37a-1267-4ebd-a2e8-81365a122a8a

Sorry about that, it’s not convenient at the moment.

Not too sure, haven't found the exact cause yet, it's a bit odd.

```
public static Tensor clip_boxes(Tensor boxes, int[] shape)
{
    using (torch.NewDisposeScope())
    {
        boxes[TensorIndex.Ellipsis, 0].clamp_(0, shape[1]);
        boxes[TensorIndex.Ellipsis, 1].clamp_(0, shape[0]);
        boxes[TensorIndex.Ellipsis, 2].clamp_(0, shape[1]);
        boxes[TensorIndex.Ellipsis, 3].clamp_(0, shape[0]);
        return boxes;
    }
}
```

This...