ti.ad.no_grad decorator and needs_grad=False do not prevent creation of gradient tensors

Describe the bug
I call a Taichi kernel through a function decorated with ti.ad.no_grad, but the Taichi engine still creates gradient tensors (and it does so on the forward pass!).
To Reproduce
```python
import torch
import taichi as ti

device = 'cuda'
ti.init(getattr(ti, device))

# Ndarray annotation with gradients explicitly disabled.
ti2d = ti.types.ndarray(ndim=2, needs_grad=False)

@ti.kernel
def taichi_sinfactor(x: ti2d, out: ti2d, factor: ti.f32):
    for i, j in x:
        out[i, j] = ti.sin(x[i, j]) * factor

# Wrapped in ti.ad.no_grad, so no gradient tensors should be created.
@ti.ad.no_grad
def taichi_nograd(x, out, factor):
    taichi_sinfactor(x, out, factor)

N = 64
x = torch.rand([N, N], device=device, requires_grad=True)
factor = 5.
out = torch.zeros_like(x)

print([x.grad, out.grad])  # [None, None]
taichi_nograd(x, out, factor)
print([x.grad, out.grad])  # expected [None, None], but x.grad is now a zero tensor
```
Log/Screenshots
```
$ python my_sample_code.py
[Taichi] version 1.8.0, llvm 15.0.1, commit 5a994612, win, python 3.8.8
[I 01/10/24 11:50:00.948 13700] [shell.py:_shell_pop_print@23] Graphical python shell detected, using wrapped sys.stdout
[Taichi] Starting on arch=cuda
[None, None]
[tensor([[0., 0., 0., ..., 0., 0., 0.],
        [0., 0., 0., ..., 0., 0., 0.],
        [0., 0., 0., ..., 0., 0., 0.],
        ...,
        [0., 0., 0., ..., 0., 0., 0.],
        [0., 0., 0., ..., 0., 0., 0.],
        [0., 0., 0., ..., 0., 0., 0.]], device='cuda:0'), None]
```
Additional comments
The following hotfix avoids the problem by detaching every tensor argument before it reaches the kernel, so Taichi never sees requires_grad=True:

```python
def taichi_nograd(*argin):
    return taichi_sinfactor(*[a.detach() if isinstance(a, torch.Tensor) else a
                              for a in argin])
```
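The hotfix above can be generalized into a reusable decorator so any kernel wrapper gets the same treatment. This is a minimal sketch of the workaround, not part of the Taichi API; the name `detach_tensors` is my own:

```python
import torch

def detach_tensors(fn):
    """Workaround decorator (hypothetical name): detach every
    torch.Tensor argument so the wrapped Taichi kernel never sees
    requires_grad=True and allocates no gradient tensors."""
    def wrapper(*args, **kwargs):
        args = [a.detach() if isinstance(a, torch.Tensor) else a
                for a in args]
        kwargs = {k: (v.detach() if isinstance(v, torch.Tensor) else v)
                  for k, v in kwargs.items()}
        return fn(*args, **kwargs)
    return wrapper
```

Because detach() returns a view sharing storage with the original tensor, the kernel's writes to `out` are still visible to the caller; only the autograd tracking is severed.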