Hello, I encountered an error while following the honesty_contrast_vec_TQA_mc example. When using the ContrastVecLlamaForCausalLM class with the Llama-2-7b-hf model (I also tested with Llama-2-7b-chat-hf), I received the following error:
   1727         if name in modules:
   1728             return modules[name]
-> 1729         raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
   1730
   1731     def __setattr__(self, name: str, value: Union[Tensor, 'Module']) -> None:
AttributeError: 'LlamaModel' object has no attribute '_use_flash_attention_2'
This error occurred at the following line:
model_baseline_acc = get_tqa_accuracy(model, questions, answers, labels, tokenizer, batch_size=batch_size)
I tried downgrading and upgrading the relevant packages (transformers, torch, and accelerate), but the error persisted.
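For context, this AttributeError typically indicates a transformers version mismatch: the private `_use_flash_attention_2` flag was removed from model classes in newer transformers releases, so custom model code that reads it directly breaks. A minimal, hedged sketch of a defensive patch (the stand-in class below is hypothetical, standing in for a `LlamaModel` on a version without the attribute) is to guard the access with `getattr`:

```python
# Hypothetical stand-in for a LlamaModel instance on a transformers
# version where the private _use_flash_attention_2 attribute was removed.
class FakeLlamaModel:
    pass

model = FakeLlamaModel()

# Direct access (model._use_flash_attention_2) would raise the same
# AttributeError seen in the traceback; getattr with a default instead
# falls back gracefully to the non-flash-attention code path.
use_flash_attention_2 = getattr(model, "_use_flash_attention_2", False)
print(use_flash_attention_2)
```

Alternatively, pinning transformers to a version contemporaneous with the example code may avoid patching entirely; the exact compatible version is an assumption to check against the repository's requirements file.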
Environment Details