
TypeError: autotune() got an unexpected keyword argument 'use_cuda_graph' #84

Open

CharlotteHao opened this issue Nov 26, 2024 · 1 comment

@CharlotteHao:
/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/timm/models/layers/__init__.py:48: FutureWarning: Importing from timm.models.layers is deprecated, please import via timm.layers
  warnings.warn(f"Importing from {name} is deprecated, please import via timm.layers", FutureWarning)
Traceback (most recent call last):
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/./scripts/demo.py", line 5, in <module>
    from unidepth.models import UniDepthV1, UniDepthV2
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/models/__init__.py", line 1, in <module>
    from .unidepthv1 import UniDepthV1
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/models/unidepthv1/__init__.py", line 1, in <module>
    from .unidepthv1 import UniDepthV1
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/models/unidepthv1/unidepthv1.py", line 17, in <module>
    from unidepth.models.unidepthv1.decoder import Decoder
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/models/unidepthv1/decoder.py", line 14, in <module>
    from unidepth.layers import (MLP, AttentionBlock, ConvUpsample, NystromBlock,
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/layers/__init__.py", line 5, in <module>
    from .nystrom_attention import NystromBlock
  File "/home/shahao/my_study/PF3plat/UniDepth/UniDepth/unidepth/layers/nystrom_attention.py", line 7, in <module>
    from xformers.components.attention import NystromAttention
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/components/__init__.py", line 15, in <module>
    from .attention import Attention, build_attention  # noqa
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/components/attention/__init__.py", line 18, in <module>
    from ._sputnik_sparse import SparseCS
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/components/attention/_sputnik_sparse.py", line 9, in <module>
    from xformers.ops import masked_matmul
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/__init__.py", line 8, in <module>
    from .fmha import (
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/fmha/__init__.py", line 10, in <module>
    from . import (
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/fmha/triton_splitk.py", line 110, in <module>
    from ._triton.splitk_kernels import _fwd_kernel_splitK, _splitK_reduce
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/fmha/_triton/splitk_kernels.py", line 632, in <module>
    _fwd_kernel_splitK_autotune[num_groups] = autotune_kernel(
                                              ^^^^^^^^^^^^^^^^
  File "/home/shahao/anaconda3/envs/Unidepth/lib/python3.11/site-packages/xformers/ops/fmha/_triton/splitk_kernels.py", line 614, in autotune_kernel
    kernel = triton.autotune(
             ^^^^^^^^^^^^^^^^
TypeError: autotune() got an unexpected keyword argument 'use_cuda_graph'
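The traceback shows xformers' split-K kernels passing a `use_cuda_graph` keyword to `triton.autotune`, which the installed Triton does not accept — typically a version mismatch between xformers and Triton. As a minimal sketch of that diagnosis, the helper below checks whether a callable accepts a given keyword via `inspect.signature`; the two `autotune`-like stand-in functions are hypothetical stubs (not the real Triton API), so the snippet runs without Triton installed:

```python
import inspect

def accepts_kwarg(fn, name):
    """Return True if `fn` can be called with keyword argument `name`."""
    params = inspect.signature(fn).parameters
    # A **kwargs parameter would absorb any keyword.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return True
    return name in params

# Hypothetical stand-ins for an older vs. newer autotune signature
# (illustration only -- inspect your installed triton.autotune instead):
def old_autotune(configs, key, reset_to_zero=None):
    pass

def new_autotune(configs, key, use_cuda_graph=False):
    pass

print(accepts_kwarg(old_autotune, "use_cuda_graph"))  # False -> same mismatch as the traceback
print(accepts_kwarg(new_autotune, "use_cuda_graph"))  # True
```

Running `accepts_kwarg(triton.autotune, "use_cuda_graph")` in the failing environment would confirm whether the installed Triton predates that keyword; if it returns `False`, aligning the Triton and xformers versions is the likely direction for a fix.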

@CharlotteHao (Author):

How can I solve this problem? (Asking sincerely!)
