Hi! Maybe you can help me with the following:
After creating a conda environment with:
I am calling:
and get the error:
File "/home/uu_cs_nlpsoc/awegmann/StyleTokenizer/src/run_value.py", line 743, in <module>
main()
File "/home/uu_cs_nlpsoc/awegmann/StyleTokenizer/src/run_value.py", line 553, in main
raw_datasets = raw_datasets.map(
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/datasets/dataset_dict.py", line 869, in map
{
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/datasets/dataset_dict.py", line 870, in <dictcomp>
k: dataset.map(
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 602, in wrapper
out: Union["Dataset", "DatasetDict"] = func(self, *args, **kwargs)
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 567, in wrapper
out: Union["Dataset", "DatasetDict"] = func(self, *args, **kwargs)
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 3161, in map
for rank, done, content in Dataset._map_single(**dataset_kwargs):
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 3552, in _map_single
batch = apply_function_on_filtered_inputs(
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 3421, in apply_function_on_filtered_inputs
processed_inputs = function(*fn_args, *additional_args, **fn_kwargs)
File "/home/uu_cs_nlpsoc/awegmann/StyleTokenizer/src/run_value.py", line 517, in preprocess_function
conversions1 = [
File "/home/uu_cs_nlpsoc/awegmann/StyleTokenizer/src/run_value.py", line 518, in <listcomp>
dialect.convert_sae_to_dialect(example)
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/multivalue/BaseDialect.py", line 193, in convert_sae_to_dialect
self.update(string)
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/multivalue/BaseDialect.py", line 218, in update
self.coref_clusters = self.create_coref_cluster(string)
File "/hpc/local/Rocky8/uu_cs_nlpsoc/miniconda3/envs/aw_value/lib/python3.10/site-packages/multivalue/BaseDialect.py", line 237, in create_coref_cluster
assert [tok.text for tok in tokens] == [
AssertionError: Spacy and Stanza word mismatch
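If it helps narrow things down, the failing assertion in BaseDialect.create_coref_cluster seems to compare spaCy's tokens against Stanza's, so something like the following sketch (mine, not from multivalue; it assumes en_core_web_sm and the English Stanza tokenizer are downloaded, swap in whatever spaCy model your multivalue install actually loads) should show whether a given sentence trips it:

```python
# Hypothetical repro sketch (not from multivalue): check whether spaCy and Stanza
# tokenize a sentence differently, which is what the failing assertion appears to test.
import spacy
import stanza

# any sentence from the split being mapped; this one is just an illustration
text = "I can't believe it's already 5 p.m., y'all."

nlp_spacy = spacy.load("en_core_web_sm")
nlp_stanza = stanza.Pipeline(lang="en", processors="tokenize", verbose=False)

spacy_tokens = [tok.text for tok in nlp_spacy(text)]
stanza_tokens = [word.text for sent in nlp_stanza(text).sentences for word in sent.words]

if spacy_tokens != stanza_tokens:
    print("Mismatch:")
    print("spaCy :", spacy_tokens)
    print("Stanza:", stanza_tokens)
else:
    print("Tokenizations agree.")
```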
Any experience with this error? Does run_glue.py still work for you in your env? I also had to delete the mapping argument in AfricanAmericanVernacular(mapping, ...).
FYI: I renamed run_glue.py to run_value.py.
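In the meantime I'm tempted to just catch the failure per example inside preprocess_function rather than letting the whole .map() call die; a rough sketch (hypothetical, with `dialect` being whatever multivalue dialect object run_value.py constructs):

```python
# Hypothetical workaround sketch: fall back to the untransformed text for examples
# whose tokenization triggers the "Spacy and Stanza word mismatch" assertion.
def safe_convert(text, dialect):
    try:
        return dialect.convert_sae_to_dialect(text)
    except AssertionError:
        return text  # skip the dialect transformation for this example

# inside preprocess_function, roughly:
# conversions1 = [safe_convert(example, dialect) for example in examples]
```

That only papers over the error, of course, so I'd still be curious why the two tokenizers disagree on those examples.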
My current goal is to run VALUE with a model I trained. Would you recommend doing this via the source install or via the older repo https://github.com/SALT-NLP/value/?
Thanks for your patience - we don't get a ton of issues, so I'm not in the habit of checking!
This repo is updated beyond the original VALUE repo, so it is the preferred one. The main reason you'd want to use the original VALUE repo is if you were trying to reproduce the experiments from that paper exactly.
Let me look into the bug you are seeing and try to resolve it!