install error on windows, need help #109

Open
gaowayne opened this issue Nov 3, 2024 · 5 comments

gaowayne commented Nov 3, 2024

Hello experts, I need your help. I run into the error below when trying to install on Windows.

C:\Wayne\code\ktransformers>install.bat
The system cannot find the file specified.
The system cannot find the file specified.
The system cannot find the file specified.
The system cannot find the file specified.
Could Not Find C:\Wayne\code\ktransformers\ktransformers\ktransformers_ext\cuda\*.egg-info
Installing python dependencies from requirements.txt
Collecting fire (from -r requirements-local_chat.txt (line 1))
  Downloading fire-0.7.0.tar.gz (87 kB)
  Preparing metadata (setup.py) ... done
Requirement already satisfied: transformers in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from -r requirements-local_chat.txt (line 2)) (4.46.1)
Requirement already satisfied: numpy in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from -r requirements-local_chat.txt (line 3)) (1.26.4)
Requirement already satisfied: torch>=2.3.0 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from -r requirements-local_chat.txt (line 4)) (2.4.0)
Requirement already satisfied: packaging in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from -r requirements-local_chat.txt (line 5)) (24.1)
Collecting cpufeature (from -r requirements-local_chat.txt (line 6))
  Downloading cpufeature-0.2.1-cp311-cp311-win_amd64.whl.metadata (5.5 kB)
Requirement already satisfied: protobuf in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from -r requirements-local_chat.txt (line 7)) (4.25.4)
Requirement already satisfied: termcolor in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from fire->-r requirements-local_chat.txt (line 1)) (2.4.0)
Requirement already satisfied: filelock in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from transformers->-r requirements-local_chat.txt (line 2)) (3.16.1)
Requirement already satisfied: huggingface-hub<1.0,>=0.23.2 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from transformers->-r requirements-local_chat.txt (line 2)) (0.26.2)
Requirement already satisfied: pyyaml>=5.1 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from transformers->-r requirements-local_chat.txt (line 2)) (6.0.2)
Requirement already satisfied: regex!=2019.12.17 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from transformers->-r requirements-local_chat.txt (line 2)) (2024.9.11)
Requirement already satisfied: requests in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from transformers->-r requirements-local_chat.txt (line 2)) (2.32.3)
Requirement already satisfied: safetensors>=0.4.1 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from transformers->-r requirements-local_chat.txt (line 2)) (0.4.5)
Requirement already satisfied: tokenizers<0.21,>=0.20 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from transformers->-r requirements-local_chat.txt (line 2)) (0.20.1)
Requirement already satisfied: tqdm>=4.27 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from transformers->-r requirements-local_chat.txt (line 2)) (4.66.6)
Requirement already satisfied: typing-extensions>=4.8.0 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from torch>=2.3.0->-r requirements-local_chat.txt (line 4)) (4.12.2)
Requirement already satisfied: sympy in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from torch>=2.3.0->-r requirements-local_chat.txt (line 4)) (1.12)
Requirement already satisfied: networkx in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from torch>=2.3.0->-r requirements-local_chat.txt (line 4)) (3.0)
Requirement already satisfied: jinja2 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from torch>=2.3.0->-r requirements-local_chat.txt (line 4)) (3.1.2)
Requirement already satisfied: fsspec in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from torch>=2.3.0->-r requirements-local_chat.txt (line 4)) (2024.9.0)
Requirement already satisfied: colorama in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from tqdm>=4.27->transformers->-r requirements-local_chat.txt (line 2)) (0.4.6)
Requirement already satisfied: MarkupSafe>=2.0 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from jinja2->torch>=2.3.0->-r requirements-local_chat.txt (line 4)) (2.1.2)
Requirement already satisfied: charset-normalizer<4,>=2 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from requests->transformers->-r requirements-local_chat.txt (line 2)) (2.1.1)
Requirement already satisfied: idna<4,>=2.5 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from requests->transformers->-r requirements-local_chat.txt (line 2)) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from requests->transformers->-r requirements-local_chat.txt (line 2)) (1.26.13)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from requests->transformers->-r requirements-local_chat.txt (line 2)) (2022.12.7)
Requirement already satisfied: mpmath>=0.19 in c:\users\wgao\appdata\local\programs\python\python311\lib\site-packages (from sympy->torch>=2.3.0->-r requirements-local_chat.txt (line 4)) (1.3.0)
Downloading cpufeature-0.2.1-cp311-cp311-win_amd64.whl (15 kB)
Building wheels for collected packages: fire
  Building wheel for fire (setup.py) ... done
  Created wheel for fire: filename=fire-0.7.0-py3-none-any.whl size=114262 sha256=ab3c04d0e625e83168f131e5694383479c5bb74b9ba4e5ff0089f9b637ee6fa2
  Stored in directory: c:\users\wgao\appdata\local\pip\cache\wheels\46\54\24\1624fd5b8674eb1188623f7e8e17cdf7c0f6c24b609dfb8a89
Successfully built fire
Installing collected packages: cpufeature, fire
Successfully installed cpufeature-0.2.1 fire-0.7.0
Installing ktransformers
Processing c:\wayne\code\ktransformers
  Preparing metadata (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Preparing metadata (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [17 lines of output]
      Traceback (most recent call last):
        File "C:\Users\wgao\AppData\Local\Programs\Python\Python311\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
          main()
        File "C:\Users\wgao\AppData\Local\Programs\Python\Python311\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\wgao\AppData\Local\Programs\Python\Python311\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 149, in prepare_metadata_for_build_wheel
          return hook(metadata_directory, config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\wgao\AppData\Local\Programs\Python\Python311\Lib\site-packages\setuptools\build_meta.py", line 377, in prepare_metadata_for_build_wheel
          self.run_setup()
        File "C:\Users\wgao\AppData\Local\Programs\Python\Python311\Lib\site-packages\setuptools\build_meta.py", line 335, in run_setup
          exec(code, locals())
        File "<string>", line 294, in <module>
        File "<string>", line 132, in get_package_version
        File "<string>", line 54, in get_cuda_bare_metal_version
      TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
Installation completed successfully
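For context: the TypeError above is raised from ktransformers' setup.py (the "<string>" frames in the traceback), inside a helper called get_cuda_bare_metal_version. Helpers with that name conventionally take the CUDA install directory and run nvcc from it to read the toolkit version; if no CUDA toolkit can be located, that directory is None and the string concatenation fails exactly as shown. Below is a minimal sketch of this failure mode, assuming the common CUDA_HOME/nvcc lookup pattern; the locate_cuda_home helper and the parsing details are illustrative, not ktransformers' actual code.

# Illustrative sketch of the failure mode, assuming setup.py locates CUDA via
# CUDA_HOME/CUDA_PATH or an nvcc binary on PATH (names here are hypothetical).
import os
import shutil
import subprocess

def locate_cuda_home():
    # Environment variables first, then the directory two levels above nvcc;
    # returns None when no CUDA toolkit is visible to this shell.
    cuda_home = os.environ.get("CUDA_HOME") or os.environ.get("CUDA_PATH")
    if cuda_home is None:
        nvcc = shutil.which("nvcc")
        if nvcc is not None:
            cuda_home = os.path.dirname(os.path.dirname(nvcc))
    return cuda_home

def get_cuda_bare_metal_version(cuda_dir):
    # When cuda_dir is None, the concatenation below raises
    # "unsupported operand type(s) for +: 'NoneType' and 'str'".
    raw_output = subprocess.check_output(
        [cuda_dir + "/bin/nvcc", "-V"], universal_newlines=True
    )
    release_line = next(line for line in raw_output.splitlines() if "release" in line)
    return release_line.split("release")[-1].strip().split(",")[0]

if __name__ == "__main__":
    cuda_home = locate_cuda_home()
    print("CUDA home:", cuda_home)
    if cuda_home is not None:
        print("nvcc reports release", get_cuda_bare_metal_version(cuda_home))

In other words, the metadata step fails before any compilation starts whenever nvcc is not reachable and CUDA_HOME/CUDA_PATH is unset in the shell running install.bat.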
squik67 (Contributor) commented Feb 10, 2025

Same here on Linux, with a Python 3.12 venv:

./pip install ktransformers --no-build-isolation
Collecting ktransformers
  Using cached ktransformers-0.1.4.tar.gz (7.2 MB)
  Preparing metadata (pyproject.toml) ... error
  error: subprocess-exited-with-error
  
  × Preparing metadata (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [17 lines of output]
      Traceback (most recent call last):
        File "/usr/local/venv/lib/python3.12/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/usr/local/venv/lib/python3.12/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/venv/lib/python3.12/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 149, in prepare_metadata_for_build_wheel
          return hook(metadata_directory, config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 377, in prepare_metadata_for_build_wheel
          self.run_setup()
        File "/usr/local/venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 320, in run_setup
          exec(code, locals())
        File "<string>", line 294, in <module>
        File "<string>", line 132, in get_package_version
        File "<string>", line 54, in get_cuda_bare_metal_version
      TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

Moreover, you should update the documentation: the command should be pip install torch packaging ninja cpufeature numpy (cpufeature and numpy are currently missing).

On Linux, I managed to solve this error by installing the nvidia-cuda-toolkit package.
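To confirm the toolkit is actually visible to the shell and venv that run pip (the metadata hook only sees that environment), a quick pre-flight check along these lines can help; this is plain Python and nothing ktransformers-specific:

# Pre-flight check: the build needs nvcc (or CUDA_HOME/CUDA_PATH) to be
# discoverable from the exact same shell/venv in which pip is invoked.
import os
import shutil

print("nvcc on PATH:", shutil.which("nvcc"))
print("CUDA_HOME   :", os.environ.get("CUDA_HOME"))
print("CUDA_PATH   :", os.environ.get("CUDA_PATH"))

If all three print None, setup.py has nothing to derive the CUDA version from and fails with the TypeError shown above.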


Reactantvr commented Feb 10, 2025

I get the same error on Windows 10. I tried installing CUDA Toolkit 12.8, but the error persists.

Installing ktransformers
Processing j:\ktransformers
Preparing metadata (pyproject.toml) ... error
error: subprocess-exited-with-error

× Preparing metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [17 lines of output]
Traceback (most recent call last):
  File "C:\Users\matt\anaconda3\envs\kt3\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 389, in <module>
    main()
  File "C:\Users\matt\anaconda3\envs\kt3\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 373, in main
    json_out["return_val"] = hook(**hook_input["kwargs"])
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\matt\anaconda3\envs\kt3\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 175, in prepare_metadata_for_build_wheel
    return hook(metadata_directory, config_settings)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\matt\anaconda3\envs\kt3\Lib\site-packages\setuptools\build_meta.py", line 377, in prepare_metadata_for_build_wheel
    self.run_setup()
  File "C:\Users\matt\anaconda3\envs\kt3\Lib\site-packages\setuptools\build_meta.py", line 320, in run_setup
    exec(code, locals())
  File "<string>", line 296, in <module>
  File "<string>", line 132, in get_package_version
  File "<string>", line 54, in get_cuda_bare_metal_version
TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
Installation completed successfully

Edit 1:
I finally got past the error using the following command:
pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu126

Edit 2:
Well, a new error now. I've spent half a day troubleshooting this and I am just going to give up. This app clearly is not set up to work on Windows. I'll just stick with llama.cpp and its slower speed, because it just works.

error: [WinError 2] The system cannot find the file specified
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for ktransformers
Failed to build ktransformers
ERROR: Failed to build installable wheels for some pyproject.toml based projects (ktransformers)
Installation completed successfully


fshurra commented Feb 12, 2025

(Quoting @Reactantvr's comment above.)

I did a build-from-source install with version 0.1.3 recently and ran into similar problems. Here are my solutions:

For part 1 (the TypeError during metadata preparation), I fixed it the same way.
For part 2 (the WinError 2 during the wheel build), please check that cmake and cl are installed and present in your PATH (a quick check is sketched below).
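One quick way to verify that, run from the same shell session you install in (illustrative Python, not part of ktransformers):

# Check that the native build tools the wheel build shells out to are reachable.
# On Windows, cl.exe is normally only on PATH inside an "x64 Native Tools
# Command Prompt for VS" (or after running vcvars64.bat).
import shutil

for tool in ("cmake", "ninja", "cl", "nvcc"):
    print(f"{tool:6} -> {shutil.which(tool)}")

Any None in that output is a likely cause of the "[WinError 2] The system cannot find the file specified" failure during the wheel build.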

@Reactantvr

@fshurra

cl.exe was not present in my PATH, and adding it changed what happens. Now "Building wheel for ktransformers (pyproject.toml) ..." runs for several minutes and then spits out errors (something like 100 pages of them).

[Image: screenshot of the build errors]

@chengdidididi

@Reactantvr I ran into a similar problem. I installed the latest cmake directly in my conda environment (updated with conda update -c conda-forge cmake), and the build also runs for several minutes and then outputs a large amount of error messages, the same errors as yours. I suspect this project needs a specific version of cl to compile, but the documentation does not say which, so the project's Windows compatibility still seems quite rough.
