

Visual Studio 2022, CUDA 12.4, but it still can't build the wheel, and returns this #1161

Open
zslefour opened this issue Nov 22, 2024 · 2 comments


zslefour commented Nov 22, 2024

❓ Questions and Help

Visual Studio 2022, CUDA 12.4, but it still can't build the wheel, and returns the following (for readability, I removed some of the output that looked fine):
H:\V.0.2.7>python_embeded\python.exe -m pip install --pre -U xformers
Collecting xformers
Using cached xformers-0.0.29.dev941.tar.gz (7.8 MB)
Preparing metadata (setup.py) ... done
Requirement already satisfied: torch>=2.4 in h:\v.0.2.7\python_embeded\lib\site-packages (from xformers) (2.5.1+cu124)
Requirement already satisfied: MarkupSafe>=2.0 in h:\v.0.2.7\python_embeded\lib\site-packages (from jinja2->torch>=2.4->xformers) (3.0.2)
Building wheels for collected packages: xformers
Building wheel for xformers (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [5829 lines of output]
fatal: not a git repository (or any of the parent directories): .git
running bdist_wheel
running build
running build_py
creating build\lib.win-amd64-cpython-312\xformers
copying xformers\attn_bias_utils.py -> build\lib.win-amd64-cpython-312\xformers
copying xformers\_flash_attn\ops\triton\__init__.py -> build\lib.win-amd64-cpython-312\xformers\_flash_attn\ops\triton
running build_ext
H:\V.0.2.7\python_embeded\Lib\site-packages\torch\utils\cpp_extension.py:382: UserWarning: Error checking compiler version for cl: [WinError 2] The system cannot find the file specified.
warnings.warn(f'Error checking compiler version for {compiler}: {error}')
building 'xformers._C_flashattention' extension
creating C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn
creating C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src
Emitting ninja build file C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\build.ninja...
Compiling objects...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
[1/85] C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\bin\nvcc --generate-dependencies-with-compile --dependency-output C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_bf16_sm80.obj.d -std=c++17 --use-local-env -Xcompiler /MD -Xcompiler /wd4819 -Xcompiler /wd4251 -Xcompiler /wd4244 -Xcompiler /wd4267 -Xcompiler /wd4275 -Xcompiler /wd4018 -Xcompiler /wd4190 -Xcompiler /wd4624 -Xcompiler /wd4067 -Xcompiler /wd4068 -Xcompiler /EHsc -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\cutlass\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\TH -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\include" -IH:\V.0.2.7\python_embeded\include -IH:\V.0.2.7\python_embeded\Include "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Auxiliary\VS\include" "-IC:\Program 
Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\cppwinrt" -c C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_bf16_sm80.cu -o C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_bf16_sm80.obj -D__CUDA_NO_HALF_OPERATORS__
-D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -DHAS_PYTORCH --use_fast_math -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ --extended-lambda -D_ENABLE_EXTENDED_ALIGNED_STORAGE -std=c++17 --generate-line-info -DNDEBUG --threads 4 --ptxas-options=-v -Xcompiler /Zc:lambda -Xcompiler /Zc:preprocessor -Xcompiler /Zc:__cplusplus -O3 -std=c++17 -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ -U__CUDA_NO_HALF2_OPERATORS__ -U__CUDA_NO_BFLOAT16_CONVERSIONS__ --expt-relaxed-constexpr --expt-extended-lambda --use_fast_math --ptxas-options=-v -gencode=arch=compute_80,code=sm_80 -gencode=arch=compute_86,code=sm_86 -gencode=arch=compute_90,code=sm_90 -DFLASHATTENTION_DISABLE_ALIBI --generate-line-info -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=_C_flashattention -D_GLIBCXX_USE_CXX11_ABI=0
FAILED: C:/Users/Monday/AppData/Local/Temp/pip-install-vu94rg2a/xformers_85fe5e8f6a0041038e778ab49585b085/build/temp.win-amd64-cpython-312/Release/Users/Monday/AppData/Local/Temp/pip-install-vu94rg2a/xformers_85fe5e8f6a0041038e778ab49585b085/third_party/flash-attention/csrc/flash_attn/src/flash_bwd_hdim128_bf16_sm80.obj
C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\bin\nvcc --generate-dependencies-with-compile --dependency-output C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_bf16_sm80.obj.d -std=c++17 --use-local-env -Xcompiler /MD -Xcompiler /wd4819 -Xcompiler /wd4251 -Xcompiler /wd4244 -Xcompiler /wd4267 -Xcompiler /wd4275 -Xcompiler /wd4018 -Xcompiler /wd4190 -Xcompiler /wd4624 -Xcompiler /wd4067 -Xcompiler /wd4068 -Xcompiler /EHsc -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\cutlass\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\TH -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\include" -IH:\V.0.2.7\python_embeded\include -IH:\V.0.2.7\python_embeded\Include "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Auxiliary\VS\include" "-IC:\Program Files 
(x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\cppwinrt" -c C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_bf16_sm80.cu -o C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_bf16_sm80.obj -D__CUDA_NO_HALF_OPERATORS__
-D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -DHAS_PYTORCH --use_fast_math -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ --extended-lambda -D_ENABLE_EXTENDED_ALIGNED_STORAGE -std=c++17 --generate-line-info -DNDEBUG --threads 4 --ptxas-options=-v -Xcompiler /Zc:lambda -Xcompiler /Zc:preprocessor -Xcompiler /Zc:__cplusplus -O3 -std=c++17 -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ -U__CUDA_NO_HALF2_OPERATORS__ -U__CUDA_NO_BFLOAT16_CONVERSIONS__ --expt-relaxed-constexpr --expt-extended-lambda --use_fast_math --ptxas-options=-v -gencode=arch=compute_80,code=sm_80 -gencode=arch=compute_86,code=sm_86 -gencode=arch=compute_90,code=sm_90 -DFLASHATTENTION_DISABLE_ALIBI --generate-line-info -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=_C_flashattention -D_GLIBCXX_USE_CXX11_ABI=0
flash_bwd_hdim128_bf16_sm80.cu
cl: Command line warning D9025: overriding '/D__CUDA_NO_HALF_OPERATORS__' with '/U__CUDA_NO_HALF_OPERATORS__'
cl: Command line warning D9025: overriding '/D__CUDA_NO_HALF_CONVERSIONS__' with '/U__CUDA_NO_HALF_CONVERSIONS__'
cl: Command line warning D9025: overriding '/D__CUDA_NO_HALF2_OPERATORS__' with '/U__CUDA_NO_HALF2_OPERATORS__'
cl: Command line warning D9025: overriding '/D__CUDA_NO_BFLOAT16_CONVERSIONS__' with '/U__CUDA_NO_BFLOAT16_CONVERSIONS__'
flash_bwd_hdim128_bf16_sm80.cu
fatal : Could not open output file C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_bf16_sm80.obj.d

  [2/85] C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\bin\nvcc --generate-dependencies-with-compile --dependency-output C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim160_fp16_sm80.obj.d -std=c++17 --use-local-env -Xcompiler /MD -Xcompiler /wd4819 -Xcompiler /wd4251 -Xcompiler /wd4244 -Xcompiler /wd4267 -Xcompiler /wd4275 -Xcompiler /wd4018 -Xcompiler /wd4190 -Xcompiler /wd4624 -Xcompiler /wd4067 -Xcompiler /wd4068 -Xcompiler /EHsc -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\cutlass\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\TH -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\include" -IH:\V.0.2.7\python_embeded\include -IH:\V.0.2.7\python_embeded\Include "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Auxiliary\VS\include" "-IC:\Program 
Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\um" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\shared" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\winrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\cppwinrt" -c C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim160_fp16_sm80.cu -o C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim160_fp16_sm80.obj -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -DHAS_PYTORCH --use_fast_math -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ --extended-lambda -D_ENABLE_EXTENDED_ALIGNED_STORAGE -std=c++17 --generate-line-info -DNDEBUG --threads 4 --ptxas-options=-v -Xcompiler /Zc:lambda -Xcompiler /Zc:preprocessor -Xcompiler /Zc:__cplusplus -O3 -std=c++17 -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ -U__CUDA_NO_HALF2_OPERATORS__ -U__CUDA_NO_BFLOAT16_CONVERSIONS__ --expt-relaxed-constexpr --expt-extended-lambda --use_fast_math --ptxas-options=-v -gencode=arch=compute_80,code=sm_80 -gencode=arch=compute_86,code=sm_86 -gencode=arch=compute_90,code=sm_90 -DFLASHATTENTION_DISABLE_ALIBI --generate-line-info -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=_C_flashattention -D_GLIBCXX_USE_CXX11_ABI=0
  FAILED: C:/Users/Monday/AppData/Local/Temp/pip-install-vu94rg2a/xformers_85fe5e8f6a0041038e778ab49585b085/build/temp.win-amd64-cpython-312/Release/Users/Monday/AppData/Local/Temp/pip-install-vu94rg2a/xformers_85fe5e8f6a0041038e778ab49585b085/third_party/flash-attention/csrc/flash_attn/src/flash_bwd_hdim160_fp16_sm80.obj
  C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\bin\nvcc --generate-dependencies-with-compile --dependency-output C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim160_fp16_sm80.obj.d -std=c++17 --use-local-env -Xcompiler /MD -Xcompiler /wd4819 -Xcompiler /wd4251 -Xcompiler /wd4244 -Xcompiler /wd4267 -Xcompiler /wd4275 -Xcompiler /wd4018 -Xcompiler /wd4190 -Xcompiler /wd4624 -Xcompiler /wd4067 -Xcompiler /wd4068 -Xcompiler /EHsc -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\cutlass\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\TH -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\include" -IH:\V.0.2.7\python_embeded\include -IH:\V.0.2.7\python_embeded\Include "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Auxiliary\VS\include" "-IC:\Program Files 
(x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\um" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\shared" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\winrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\cppwinrt" -c C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim160_fp16_sm80.cu -o C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim160_fp16_sm80.obj -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -DHAS_PYTORCH --use_fast_math -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ --extended-lambda -D_ENABLE_EXTENDED_ALIGNED_STORAGE -std=c++17 --generate-line-info -DNDEBUG --threads 4 --ptxas-options=-v -Xcompiler /Zc:lambda -Xcompiler /Zc:preprocessor -Xcompiler /Zc:__cplusplus -O3 -std=c++17 -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ -U__CUDA_NO_HALF2_OPERATORS__ -U__CUDA_NO_BFLOAT16_CONVERSIONS__ --expt-relaxed-constexpr --expt-extended-lambda --use_fast_math --ptxas-options=-v -gencode=arch=compute_80,code=sm_80 -gencode=arch=compute_86,code=sm_86 -gencode=arch=compute_90,code=sm_90 -DFLASHATTENTION_DISABLE_ALIBI --generate-line-info -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=_C_flashattention -D_GLIBCXX_USE_CXX11_ABI=0
  flash_bwd_hdim160_fp16_sm80.cu
  cl: Command line warning D9025: overriding '/D__CUDA_NO_HALF_OPERATORS__' with '/U__CUDA_NO_HALF_OPERATORS__'
  cl: Command line warning D9025: overriding '/D__CUDA_NO_HALF_CONVERSIONS__' with '/U__CUDA_NO_HALF_CONVERSIONS__'
  cl: Command line warning D9025: overriding '/D__CUDA_NO_HALF2_OPERATORS__' with '/U__CUDA_NO_HALF2_OPERATORS__'
  cl: Command line warning D9025: overriding '/D__CUDA_NO_BFLOAT16_CONVERSIONS__' with '/U__CUDA_NO_BFLOAT16_CONVERSIONS__'
  flash_bwd_hdim160_fp16_sm80.cu
  fatal   : Could not open output file C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim160_fp16_sm80.obj.d

  [3/85] C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\bin\nvcc --generate-dependencies-with-compile --dependency-output C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_fp16_causal_sm80.obj.d -std=c++17 --use-local-env -Xcompiler /MD -Xcompiler /wd4819 -Xcompiler /wd4251 -Xcompiler /wd4244 -Xcompiler /wd4267 -Xcompiler /wd4275 -Xcompiler /wd4018 -Xcompiler /wd4190 -Xcompiler /wd4624 -Xcompiler /wd4067 -Xcompiler /wd4068 -Xcompiler /EHsc -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\cutlass\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\TH -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\include" -IH:\V.0.2.7\python_embeded\include -IH:\V.0.2.7\python_embeded\Include "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Auxiliary\VS\include" 
"-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\um" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\shared" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\winrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\cppwinrt" -c C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_fp16_causal_sm80.cu -o C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_fp16_causal_sm80.obj -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -DHAS_PYTORCH --use_fast_math -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ --extended-lambda -D_ENABLE_EXTENDED_ALIGNED_STORAGE -std=c++17 --generate-line-info -DNDEBUG --threads 4 --ptxas-options=-v -Xcompiler /Zc:lambda -Xcompiler /Zc:preprocessor -Xcompiler /Zc:__cplusplus -O3 -std=c++17 -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ -U__CUDA_NO_HALF2_OPERATORS__ -U__CUDA_NO_BFLOAT16_CONVERSIONS__ --expt-relaxed-constexpr --expt-extended-lambda --use_fast_math --ptxas-options=-v -gencode=arch=compute_80,code=sm_80 -gencode=arch=compute_86,code=sm_86 -gencode=arch=compute_90,code=sm_90 -DFLASHATTENTION_DISABLE_ALIBI --generate-line-info -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=_C_flashattention -D_GLIBCXX_USE_CXX11_ABI=0
  FAILED: C:/Users/Monday/AppData/Local/Temp/pip-install-vu94rg2a/xformers_85fe5e8f6a0041038e778ab49585b085/build/temp.win-amd64-cpython-312/Release/Users/Monday/AppData/Local/Temp/pip-install-vu94rg2a/xformers_85fe5e8f6a0041038e778ab49585b085/third_party/flash-attention/csrc/flash_attn/src/flash_bwd_hdim128_fp16_causal_sm80.obj
  C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\bin\nvcc --generate-dependencies-with-compile --dependency-output C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_fp16_causal_sm80.obj.d -std=c++17 --use-local-env -Xcompiler /MD -Xcompiler /wd4819 -Xcompiler /wd4251 -Xcompiler /wd4244 -Xcompiler /wd4267 -Xcompiler /wd4275 -Xcompiler /wd4018 -Xcompiler /wd4190 -Xcompiler /wd4624 -Xcompiler /wd4067 -Xcompiler /wd4068 -Xcompiler /EHsc -Xcudafe --diag_suppress=base_class_has_different_dll_interface -Xcudafe --diag_suppress=field_without_dll_interface -Xcudafe --diag_suppress=dll_interface_conflict_none_assumed -Xcudafe --diag_suppress=dll_interface_conflict_dllexport_assumed -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src -IC:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\cutlass\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\torch\csrc\api\include -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\TH -IH:\V.0.2.7\python_embeded\Lib\site-packages\torch\include\THC "-IC:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\include" -IH:\V.0.2.7\python_embeded\include -IH:\V.0.2.7\python_embeded\Include "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.41.34120\include" "-IC:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Auxiliary\VS\include" "-IC:\Program 
Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\um" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\shared" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\winrt" "-IC:\Program Files (x86)\Windows Kits\10\\include\10.0.22621.0\\cppwinrt" -c C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_fp16_causal_sm80.cu -o C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a\xformers_85fe5e8f6a0041038e778ab49585b085\third_party\flash-attention\csrc\flash_attn\src\flash_bwd_hdim128_fp16_causal_sm80.obj -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr -DHAS_PYTORCH --use_fast_math -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ --extended-lambda -D_ENABLE_EXTENDED_ALIGNED_STORAGE -std=c++17 --generate-line-info -DNDEBUG --threads 4 --ptxas-options=-v -Xcompiler /Zc:lambda -Xcompiler /Zc:preprocessor -Xcompiler /Zc:__cplusplus -O3 -std=c++17 -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ -U__CUDA_NO_HALF2_OPERATORS__ -U__CUDA_NO_BFLOAT16_CONVERSIONS__ --expt-relaxed-constexpr --expt-extended-lambda --use_fast_math --ptxas-options=-v -gencode=arch=compute_80,code=sm_80 -gencode=arch=compute_86,code=sm_86 -gencode=arch=compute_90,code=sm_90 -DFLASHATTENTION_DISABLE_ALIBI --generate-line-info -DTORCH_API_INCLUDE_EXTENSION_H -DTORCH_EXTENSION_NAME=_C_flashattention -D_GLIBCXX_USE_CXX11_ABI=0
lw (Contributor) commented Nov 22, 2024

Sorry, our Windows support is best-effort; I'm not sure how I can help you. One common issue on Windows is long-path support; there are already a few threads on GitHub about it, and I recommend you take a look.
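For context, the classic Windows path-length limit (MAX_PATH) is 260 characters, and the object-file path in the FAILED line above is well past it, because the build tree embeds the temp directory twice. A quick sketch that just measures the path copied from the log (illustrative only, not part of the build):

```python
# The output file nvcc reported it could not open, copied from the build log.
obj_path = (
    r"C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a"
    r"\xformers_85fe5e8f6a0041038e778ab49585b085"
    r"\build\temp.win-amd64-cpython-312\Release"
    r"\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a"
    r"\xformers_85fe5e8f6a0041038e778ab49585b085"
    r"\third_party\flash-attention\csrc\flash_attn\src"
    r"\flash_bwd_hdim128_bf16_sm80.obj.d"
)

MAX_PATH = 260  # classic Windows path-length limit

print("path length:", len(obj_path))
print("exceeds MAX_PATH:", len(obj_path) > MAX_PATH)  # prints True
```

So even with the directories created, any tool in the chain that still uses non-long-path-aware file APIs can fail to open files under this tree.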

zslefour (Author) commented
Sorry, our Windows support is best-effort, I'm not sure how I can help you. One common issue with Windows is long-path support, there's already a few threads on GitHub about it, I recommend you take a look.


Thanks for the reply.
But I found nothing that points to long-path support on Windows.
I paused during the build process and found that the long paths had been created successfully.

Note: I checked in Explorer, and the following commands successfully created the required long paths:
creating C:\Users\Monday\AppData\Local\Temp\pip-install-4ls6iq2w\xformers_bcf4f67735df4e9aae791b64a7228251\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-4ls6iq2w\xformers_bcf4f67735df4e9aae791b64a7228251\third_party\flash-attention\csrc\flash_attn
creating C:\Users\Monday\AppData\Local\Temp\pip-install-4ls6iq2w\xformers_bcf4f67735df4e9aae791b64a7228251\build\temp.win-amd64-cpython-312\Release\Users\Monday\AppData\Local\Temp\pip-install-4ls6iq2w\xformers_bcf4f67735df4e9aae791b64a7228251\third_party\flash-attention\csrc\flash_attn\src

Note: it looks like the previous command didn't write an output file to that path, because the directory underneath was empty:
FAILED: C:/Users/Monday/AppData/Local/Temp/pip-install-4ls6iq2w/xformers_bcf4f67735df4e9aae791b64a7228251/build/temp.win-amd64-cpython-312/Release/Users/Monday/AppData/Local/Temp/pip-install-4ls6iq2w/xformers_bcf4f67735df4e9aae791b64a7228251/third_party/flash-attention/csrc/flash_attn/src/flash_bwd_hdim160_fp16_sm80.obj

So cl then tries to override these definitions?
cl: Command line warning D9025: overriding '/D__CUDA_NO_HALF_OPERATORS__' with '/U__CUDA_NO_HALF_OPERATORS__'

Does cl.exe not support long paths?
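The duplicated temp directory in the FAILED path is worth noting: a distutils/setuptools-style build places each object file under build_temp plus the source file's drive-stripped absolute path, so when the source tree itself lives under %TEMP%, the temp path appears twice and the total length roughly doubles. A minimal sketch of that construction, under the assumption that this is how the object path is derived (paths taken from the log; the "C:" drive is stripped manually so the sketch runs on any OS):

```python
# build_temp and one source file, as they appear in the failing build log.
build_temp = (
    r"C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a"
    r"\xformers_85fe5e8f6a0041038e778ab49585b085"
    r"\build\temp.win-amd64-cpython-312\Release"
)
source = (
    r"C:\Users\Monday\AppData\Local\Temp\pip-install-vu94rg2a"
    r"\xformers_85fe5e8f6a0041038e778ab49585b085"
    r"\third_party\flash-attention\csrc\flash_attn\src"
    r"\flash_bwd_hdim160_fp16_sm80.cu"
)

# Strip the drive prefix ("C:") and append the remainder under build_temp,
# swapping the .cu extension for .obj.
tail = source[2:] if source[1:2] == ":" else source
obj = build_temp + tail[: -len(".cu")] + ".obj"

temp_dir = r"\Users\Monday\AppData\Local\Temp"
print("temp dir occurrences:", obj.count(temp_dir))  # appears twice
print("object path length:", len(obj))  # well past the 260-char MAX_PATH
```

Directory creation goes through Python, which can handle long paths when they are enabled system-wide, while the compiler toolchain writing the .obj/.obj.d files may still be limited, which would explain directories existing but output files failing to open.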
