AlphaBetaGamma96

42 comments of AlphaBetaGamma96

Hi @samdow, so just to check, I need to download the latest pytorch nightly from https://pytorch.org/get-started/locally/ and then install functorch from source (from https://github.com/pytorch/functorch#installing-functorch-main), and that should be ok? (Assuming...

For completeness, I thought I'd share the results for the latest nightly version:

```
PyTorch version: 1.13.0.dev20220820
CUDA version: 11.6
FuncTorch version: 0.3.0a0+86a9049
N: 1 | Walltime: 0.4445 (s)
N:...
```

Hi @samdow, apologies for re-opening this issue but could the `torch.linalg.lu*` functions also be added for a batching rule? It seems that when `torch.linalg.slogdet` is called it calls `torch.linalg.lu` for...

Hi @zou3519, sorry for the late response. I've installed the latest pytorch nightly and the `UserWarning` isn't there anymore. Thank you! EDIT: removed issue with functorch install. Fresh install works...

Hi @zou3519, I've just noticed that if `torch.slogdet` (instead of `torch.linalg.slogdet`) is used, it falls back to a for-loop and stack, whereas `torch.linalg.slogdet` works fine as expected. I assume `torch.slogdet` is...
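For reference, here is a minimal sketch of the behaviour being described, assuming a recent PyTorch where functorch's `vmap` lives under `torch.func`: `vmap`-ing `torch.linalg.slogdet` (which has a batching rule) should agree with an explicit for-loop-and-stack version.

```python
import torch
from torch.func import vmap  # functorch's vmap now lives in torch.func

# A batch of random double-precision matrices (almost surely invertible).
A = torch.randn(8, 4, 4, dtype=torch.float64)

# Batched via vmap, which dispatches to the registered batching rule.
sign_v, logabsdet_v = vmap(torch.linalg.slogdet)(A)

# The equivalent for-loop-and-stack fallback.
signs, logdets = zip(*(torch.linalg.slogdet(a) for a in A))
sign_l, logabsdet_l = torch.stack(signs), torch.stack(logdets)

assert torch.allclose(sign_v, sign_l)
assert torch.allclose(logabsdet_v, logabsdet_l)
```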

Yes, after posting I did notice that it looked pretty similar to #292. I only used `torch.slogdet` within this example as a simplified version of my actual network, which has...

Hi @zou3519, I think I have a solution for removing `.item` from `slogdet_backward`. The only problem is that it requires a batching rule for `at::equal`. The backward I've just written is...
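As background on what `slogdet_backward` computes, the gradient of log|det A| with respect to A is A^{-T}. A quick numerical sketch (my own illustration, not the patch being discussed):

```python
import torch

# Sanity check: d log|det A| / dA = A^{-T}.
# A random double-precision matrix is almost surely invertible.
A = torch.randn(4, 4, dtype=torch.float64, requires_grad=True)
sign, logabsdet = torch.linalg.slogdet(A)
logabsdet.backward()

# Autograd's gradient should match the inverse-transpose.
assert torch.allclose(A.grad, torch.linalg.inv(A).mT)
```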

Hi @zou3519, I was just wondering if there's been any update on adding custom autograd Function support to FuncTorch? Thank you! :)

I came across this, which explains the issue and gives some solutions to it! https://pytorch.org/functorch/stable/batch_norm.html

Hi @MaxH1996, that's unfortunate. Potentially you could copy the source for `BatchNorm2D` and replace all in-place operations with their out-of-place equivalents? (If that's at all possible.) @samdow Hutchinson estimator is...
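A lighter-weight alternative from the functorch batch-norm page linked above: constructing the layer with `track_running_stats=False` removes the running-statistics buffers entirely, so there is no in-place buffer update for `vmap` to trip over. A sketch (illustrative model, not the one from the issue):

```python
import torch
from torch import nn

# With track_running_stats=False, BatchNorm2d registers no
# running_mean/running_var buffers, so the forward pass performs
# no in-place buffer mutation.
net = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.BatchNorm2d(8, track_running_stats=False),
    nn.ReLU(),
)

# No running-statistics buffers exist on this model.
assert not any("running" in name for name, _ in net.named_buffers())

x = torch.randn(5, 3, 16, 16)
out = net(x)
assert out.shape == (5, 8, 16, 16)
```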