After running `python3 convert-hf-to-ggml.py bigscience/bloomz-7b1 ./models`, I hit a problem loading the model:
model_path = "/aidata/yh/BelleGroup_BELLE-7B-1M-fp16/"  # modify this path to point at your local model
model = AutoModelForCausalLM.from_pretrained(model_path, from_tf=True)
model = model.half().cuda()
tokenizer = AutoTokenizer.from_pretrained(model_path)
return model, tokenizer
Traceback (most recent call last):
File "/root/anaconda3/lib/python3.9/site-packages/streamlit/scriptrunner/script_runner.py", line 557, in _run_script
exec(code, module.__dict__)
File "lianjie_web.py", line 20, in &lt;module&gt;
model,tokenizer= load_model()
File "/root/anaconda3/lib/python3.9/site-packages/streamlit/legacy_caching/caching.py", line 573, in wrapped_func
return get_or_create_cached_value()
File "/root/anaconda3/lib/python3.9/site-packages/streamlit/legacy_caching/caching.py", line 557, in get_or_create_cached_value
return_value = func(*args, **kwargs)
File "lianjie_web.py", line 15, in load_model
model = AutoModelForCausalLM.from_pretrained(model_path,from_tf=True)
File "/root/anaconda3/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 471, in from_pretrained
return model_class.from_pretrained(
File "/root/anaconda3/lib/python3.9/site-packages/transformers/modeling_utils.py", line 2612, in from_pretrained
model, loading_info = load_tf2_checkpoint_in_pytorch_model(
File "/root/anaconda3/lib/python3.9/site-packages/transformers/modeling_tf_pytorch_utils.py", line 401, in load_tf2_checkpoint_in_pytorch_model
from .modeling_tf_utils import load_tf_weights
File "/root/anaconda3/lib/python3.9/site-packages/transformers/modeling_tf_utils.py", line 40, in &lt;module&gt;
from .generation import GenerationConfig, TFGenerationMixin
ImportError: cannot import name 'TFGenerationMixin' from 'transformers.generation' (/root/anaconda3/lib/python3.9/site-packages/transformers/generation/__init__.py)
What causes this error?
It seems you're passing `from_tf=True`.
I found this issue that is related to your case: huggingface/transformers#20611
Could you try the fix suggested there?
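Before applying that fix, it is also worth checking whether `from_tf=True` is needed at all: the failing import only runs on the TensorFlow weight-conversion path, so loading PyTorch weights directly avoids it entirely. A minimal sketch of that check, assuming the usual Hugging Face checkpoint layout (`pytorch_model.bin` for PyTorch weights, `tf_model.h5` for TF weights); the helper name is hypothetical:

```python
import os

def needs_from_tf(model_path: str) -> bool:
    """Return True only when the checkpoint ships TF weights and no
    PyTorch weights -- otherwise from_tf=True is unnecessary and pulls
    in the TensorFlow import path that raises the error above."""
    has_pt = os.path.exists(os.path.join(model_path, "pytorch_model.bin"))
    has_tf = os.path.exists(os.path.join(model_path, "tf_model.h5"))
    return has_tf and not has_pt
```

If the BELLE checkpoint directory contains `pytorch_model.bin` (fp16 releases usually do), dropping `from_tf=True` from the `from_pretrained` call should sidestep the `TFGenerationMixin` import altogether.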