
dependency conflict #126

Open
Nengzyue opened this issue Oct 17, 2024 · 1 comment

@Nengzyue
Has anyone else run into dependency conflicts? The dependencies for my flash-attn installation keep conflicting. How can I resolve this?
(tinyllava_factory) bdca@bdca-poweredge-t640:~/ynz/tinyllava/TinyLLaVA_Factory$ python tinyllava/serve/app.py --model-path tinyllava/TinyLLaVA-Phi-2-SigLIP-3.1B
[2024-10-17 14:07:05,891] [INFO] [real_accelerator.py:191:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1510, in _get_module
return importlib.import_module("." + module_name, self.name)
File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/importlib/init.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "", line 1050, in _gcd_import
File "", line 1027, in _find_and_load
File "", line 1006, in _find_and_load_unlocked
File "", line 688, in _load_unlocked
File "", line 883, in exec_module
File "", line 241, in _call_with_frames_removed
File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/models/phi/modeling_phi.py", line 56, in
from flash_attn import flash_attn_func, flash_attn_varlen_func
File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/flash_attn/init.py", line 3, in
from flash_attn.flash_attn_interface import (
File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in
import flash_attn_2_cuda as flash_attn_cuda
ImportError: /home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/bdca/ynz/tinyllava/TinyLLaVA_Factory/tinyllava/serve/app.py", line 20, in
from tinyllava.utils import *
File "/home/bdca/ynz/tinyllava/TinyLLaVA_Factory/tinyllava/utils/init.py", line 7, in
from .eval_utils import *
File "/home/bdca/ynz/tinyllava/TinyLLaVA_Factory/tinyllava/utils/eval_utils.py", line 7, in
from transformers import StoppingCriteria, PhiForCausalLM
File "", line 1075, in _handle_fromlist
File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1501, in getattr
value = getattr(module, name)
File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1500, in getattr
module = self._get_module(self._class_to_module[name])
File "/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1512, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.phi.modeling_phi because of the following error (look up to see its traceback):
/home/bdca/miniconda3/envs/tinyllava_factory/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
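The undefined symbol _ZN3c104cuda9SetDeviceEi demangles to c10::cuda::SetDevice(int), which is exported by PyTorch's c10 CUDA library, so this error usually means the installed flash-attn wheel was compiled against a different torch build than the one in the environment. A minimal sketch for checking which versions a replacement wheel must match (the output will of course depend on the environment):

```
# Report torch's version, the CUDA version torch was built with, and the C++ ABI flag.
# A flash-attn wheel must match all three, plus the Python version (cp310 here).
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.compiled_with_cxx11_abi())"
```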

@ZhangXJ199 (Collaborator) commented Oct 18, 2024

You can download a flash-attention build that matches your environment from https://github.com/Dao-AILab/flash-attention/releases, then install it with pip.
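For example, a minimal sketch of that fix (the wheel filename below is illustrative only; pick the release asset whose tags match the Python, torch, CUDA, and C++ ABI versions reported by the check above):

```
# Remove the mismatched build first.
pip uninstall -y flash-attn

# Install a prebuilt wheel downloaded from the releases page; the filename encodes
# the CUDA version (cu122), torch version (torch2.2), C++ ABI flag, and Python tag (cp310).
pip install flash_attn-2.5.8+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```

Installing a prebuilt wheel that matches the environment also avoids a lengthy source build of flash-attn.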
