
TypeError: autotune() got an unexpected keyword argument 'use_cuda_graph'

Open · CharlotteHao opened this issue 1 year ago · 2 comments

/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/timm/models/layers/__init__.py:48: FutureWarning: Importing from timm.models.layers is deprecated, please import via timm.layers
  warnings.warn(f"Importing from {name} is deprecated, please import via timm.layers", FutureWarning)
Traceback (most recent call last):
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/./scripts/demo.py", line 5, in <module>
    from unidepth.models import UniDepthV1, UniDepthV2
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/models/__init__.py", line 1, in <module>
    from .unidepthv1 import UniDepthV1
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/models/unidepthv1/__init__.py", line 1, in <module>
    from .unidepthv1 import UniDepthV1
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/models/unidepthv1/unidepthv1.py", line 17, in <module>
    from unidepth.models.unidepthv1.decoder import Decoder
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/models/unidepthv1/decoder.py", line 14, in <module>
    from unidepth.layers import (MLP, AttentionBlock, ConvUpsample, NystromBlock,
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/layers/__init__.py", line 5, in <module>
    from .nystrom_attention import NystromBlock
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/layers/nystrom_attention.py", line 7, in <module>
    from xformers.components.attention import NystromAttention
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/components/__init__.py", line 15, in <module>
    from .attention import Attention, build_attention  # noqa
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/components/attention/__init__.py", line 18, in <module>
    from ._sputnik_sparse import SparseCS
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/components/attention/_sputnik_sparse.py", line 9, in <module>
    from xformers.ops import masked_matmul
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/__init__.py", line 8, in <module>
    from .fmha import (
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/fmha/__init__.py", line 10, in <module>
    from . import (
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/fmha/triton_splitk.py", line 110, in <module>
    from ._triton.splitk_kernels import _fwd_kernel_splitK, _splitK_reduce
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/fmha/_triton/splitk_kernels.py", line 632, in <module>
    _fwd_kernel_splitK_autotune[num_groups] = autotune_kernel(
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/fmha/_triton/splitk_kernels.py", line 614, in autotune_kernel
    kernel = triton.autotune(
TypeError: autotune() got an unexpected keyword argument 'use_cuda_graph'
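The traceback shows the error is not in UniDepth itself: xformers passes a `use_cuda_graph` keyword to `triton.autotune()`, which the installed Triton does not accept, so the two package versions are mismatched. A quick way to diagnose this is to inspect the installed Triton's `autotune` signature; the `accepts_kwarg` helper below is my own sketch, not part of either library:

```python
import inspect


def accepts_kwarg(fn, name):
    """Return True if callable `fn` accepts a keyword argument called `name`."""
    params = inspect.signature(fn).parameters
    return name in params or any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )


try:
    import triton
    print("triton", triton.__version__)
    # xformers' splitk_kernels.py calls triton.autotune(..., use_cuda_graph=...),
    # so this should print True for a compatible Triton install.
    print("autotune accepts use_cuda_graph:",
          accepts_kwarg(triton.autotune, "use_cuda_graph"))
except ImportError:
    print("triton is not installed in this environment")
```

If this prints `False`, the installed Triton is too old (or otherwise incompatible) for the installed xformers, and aligning the two versions should clear the error.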

CharlotteHao · Nov 26 '24 13:11

How can I solve this problem? (Sincerely asking for help!)

CharlotteHao · Nov 26 '24 13:11

Hey, I ran into the same error, though I'm not sure your setup is the same as mine.

I split this into several small fixes:

1. Use CUDA 12.4, not 11.8: it turns out Triton does not support CUDA 11.8 on Windows.
2. I changed "from xformers.components.attention import NystromAttention" to "from xformers.components.attention import *"; there is probably some structural difference in the package that makes the original import location unresolvable.
3. I used Triton 3.2.0, not 3.3.0.
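As a defensive variant of point 2, the import can be guarded so a missing or relocated `NystromAttention` degrades gracefully instead of crashing at import time. This is my own sketch, not UniDepth's actual code; the `nystrom_available` helper is a name I introduced for illustration:

```python
# Hedged sketch: tolerate xformers layouts where NystromAttention has moved
# or is absent, instead of failing the whole import chain.
try:
    from xformers.components.attention import NystromAttention
except ImportError:  # also covers ModuleNotFoundError (xformers not installed)
    NystromAttention = None  # callers must check before taking the Nystrom path


def nystrom_available():
    """Report whether the optional NystromAttention import succeeded."""
    return NystromAttention is not None
```

A caller would then branch on `nystrom_available()` and fall back to standard attention when it returns False, rather than relying on a wildcard import.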

Hope this helps!

zniihgnexy · Mar 20 '25 14:03