err_cl
GPU Driver '470' detected
WARNING: You are using pip version 20.1.1; however, version 23.1.2 is available.
You should consider upgrading via the '/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/bin/python -m pip install --upgrade pip' command.
/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/models/t5/tokenization_t5.py:163: FutureWarning: This tokenizer was incorrectly instantiated with a model max length of 512 which will be corrected in Transformers v5.
For now, this behavior is kept to avoid breaking backwards compatibility when padding/encoding with `truncation is True`.
- Be aware that you SHOULD NOT rely on t5-large automatically truncating your input to 512 when padding/encoding.
- If you want to encode/pad to sequences longer than 512 you can either instantiate this tokenizer with `model_max_length` or pass `max_length` when encoding/padding.
- To avoid this warning, please instantiate this tokenizer with `model_max_length` set to your preferred value.
warnings.warn(
Traceback (most recent call last):
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/configuration_utils.py", line 629, in _get_config_dict
resolved_config_file = cached_file(
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/utils/hub.py", line 417, in cached_file
resolved_file = hf_hub_download(
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 112, in _inner_fn
validate_repo_id(arg_value)
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 160, in validate_repo_id
raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '../cl_checkpoints/min_path-tahoe-word/run_15/after_0'. Use `repo_type` argument if needed.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "run.py", line 137, in <module>
evaluate(args, save_results=True)
File "run.py", line 108, in evaluate
result_matrix = evaluate_all_models_over_all_domains(args, cl_run_input, save_results=save_results)
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/cl_domain/evaluation.py", line 18, in evaluate_all_models_over_all_domains
model_i = T5ForConditionalGeneration.from_pretrained(
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/modeling_utils.py", line 2251, in from_pretrained
config, model_kwargs = cls.config_class.from_pretrained(
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/configuration_utils.py", line 547, in from_pretrained
config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/configuration_utils.py", line 574, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/configuration_utils.py", line 650, in _get_config_dict
raise EnvironmentError(
OSError: Can't load the configuration of '../cl_checkpoints/min_path-tahoe-word/run_15/after_0'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure '../cl_checkpoints/min_path-tahoe-word/run_15/after_0' is the correct path to a directory containing a config.json file
/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/models/t5/tokenization_t5.py:163: FutureWarning: This tokenizer was incorrectly instantiated with a model max length of 512 which will be corrected in Transformers v5.
For now, this behavior is kept to avoid breaking backwards compatibility when padding/encoding with `truncation is True`.
- Be aware that you SHOULD NOT rely on t5-large automatically truncating your input to 512 when padding/encoding.
- If you want to encode/pad to sequences longer than 512 you can either instantiate this tokenizer with `model_max_length` or pass `max_length` when encoding/padding.
- To avoid this warning, please instantiate this tokenizer with `model_max_length` set to your preferred value.
warnings.warn(
Traceback (most recent call last):
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/configuration_utils.py", line 629, in _get_config_dict
resolved_config_file = cached_file(
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/utils/hub.py", line 417, in cached_file
resolved_file = hf_hub_download(
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 112, in _inner_fn
validate_repo_id(arg_value)
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 160, in validate_repo_id
raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '../cl_checkpoints/max_path-vicious-tree/run_15/after_0'. Use `repo_type` argument if needed.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "run.py", line 137, in <module>
evaluate(args, save_results=True)
File "run.py", line 108, in evaluate
result_matrix = evaluate_all_models_over_all_domains(args, cl_run_input, save_results=save_results)
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/cl_domain/evaluation.py", line 18, in evaluate_all_models_over_all_domains
model_i = T5ForConditionalGeneration.from_pretrained(
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/modeling_utils.py", line 2251, in from_pretrained
config, model_kwargs = cls.config_class.from_pretrained(
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/configuration_utils.py", line 547, in from_pretrained
config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/configuration_utils.py", line 574, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/blue/boyer/amogh.mannekote/continual-learning-nlu/continual-learning-nlu/venv/lib/python3.8/site-packages/transformers/configuration_utils.py", line 650, in _get_config_dict
raise EnvironmentError(
OSError: Can't load the configuration of '../cl_checkpoints/max_path-vicious-tree/run_15/after_0'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure '../cl_checkpoints/max_path-vicious-tree/run_15/after_0' is the correct path to a directory containing a config.json file
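
Note on the failure above: both tracebacks come from the same call site (evaluation.py, line 18), where T5ForConditionalGeneration.from_pretrained() is given a relative checkpoint path that does not resolve to a directory containing config.json, so transformers falls back to treating the string as a Hugging Face Hub repo id and raises the HFValidationError first. Below is a minimal sketch of a guard one might add before the load; the checkpoint path is taken from the log, but the assumption that it should resolve relative to the directory run.py is launched from is mine.

# Hypothetical guard around the failing checkpoint load (not the project's code).
from pathlib import Path

from transformers import T5ForConditionalGeneration

# Path copied from the error log; assumed to be relative to run.py's working directory.
checkpoint_dir = Path("../cl_checkpoints/min_path-tahoe-word/run_15/after_0")

# from_pretrained() only falls through to Hub repo-id resolution when the local
# directory is unusable, which is what produces the errors above. Checking for
# config.json first surfaces a clearer message.
if not (checkpoint_dir / "config.json").is_file():
    raise FileNotFoundError(
        f"No config.json under {checkpoint_dir.resolve()}; "
        "verify that training actually saved a checkpoint at this path before evaluating."
    )

model = T5ForConditionalGeneration.from_pretrained(checkpoint_dir)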