Self Checks
- This template is only for bug reports. For questions, please visit Discussions.
- I have thoroughly reviewed the project documentation (installation, training, inference) but couldn't find information to solve my problem.
- I have searched for existing issues, including closed ones.
- I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thanks! :)
- Please do not modify this template, and fill in all required fields.
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/home/coders/anaconda3/envs/fish-speech/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 38, in <module>
from .auto_factory import _LazyAutoMapping
File "/home/coders/anaconda3/envs/fish-speech/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 40, in <module>
from ...generation import GenerationMixin
File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
File "/home/coders/anaconda3/envs/fish-speech/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1805, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/home/coders/anaconda3/envs/fish-speech/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1819, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
numpy.core.multiarray failed to import
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/coders/fish-speech/tools/llama/merge_lora.py", line 12, in <module>
from fish_speech.models.text2semantic.llama import BaseTransformer
File "/home/coders/fish-speech/fish_speech/models/text2semantic/llama.py", line 17, in <module>
from transformers import AutoTokenizer
File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
File "/home/coders/anaconda3/envs/fish-speech/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1806, in __getattr__
value = getattr(module, name)
File "/home/dev/anaconda3/envs/fish-speech/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1805, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/home/coders/anaconda3/envs/fish-speech/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1819, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.auto.tokenization_auto because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
numpy.core.multiarray failed to import
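For context, `numpy.core.multiarray failed to import` typically means the installed NumPy is binary-incompatible with compiled extensions in the environment, most often NumPy 2.x installed alongside packages built against the 1.x ABI (an assumption about this setup, not confirmed from the log). A quick check inside the same conda environment:

```python
# Check the installed NumPy version; a 2.x install alongside extensions
# built against the 1.x ABI commonly triggers
# "numpy.core.multiarray failed to import" (assumption for this setup).
import numpy

major = int(numpy.__version__.split(".")[0])
print("NumPy version:", numpy.__version__)
if major >= 2:
    # Possible fix: pin NumPy below 2.0 and re-run the failing script.
    print('Try: pip install "numpy<2"')
```

If the version reported is 2.x, downgrading with `pip install "numpy<2"` and re-running the script is a reasonable first step.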
Cloud or Self Hosted
Self Hosted (Source)

Environment Details
Ubuntu; everything else is as mentioned in the docs.

Steps to Reproduce
I run the script and get the traceback shown above.
✔️ Expected Behavior
What should I do now with this error?

❌ Actual Behavior
No response