I think this is happening because of the raise defined in https://github.com/embeddings-benchmark/leaderboard/blob/main/refresh.py#L218. Our model does not define `hidden_dim`/`n_positions` in its config.json. However, in my opinion the model size and embedding dimension should still be extracted if they are available via the safetensors, and this raise should not prevent that from happening. For example, our model has no max sequence length limitation, so that key does not make sense for it.
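As a sketch of the idea (not the leaderboard's actual code): the safetensors format stores an 8-byte little-endian length prefix followed by a JSON header that lists every tensor's dtype and shape, so parameter count and memory footprint can be computed from the header alone, without any config.json keys. The function name and dtype table below are illustrative assumptions:

```python
import json
import struct

# Bytes per element for common safetensors dtypes (illustrative subset).
DTYPE_BYTES = {"F64": 8, "F32": 4, "F16": 2, "BF16": 2,
               "I64": 8, "I32": 4, "I8": 1, "U8": 1}

def params_and_bytes(path):
    """Read only the safetensors header: an 8-byte little-endian u64
    header length, followed by that many bytes of JSON mapping tensor
    names to {"dtype", "shape", "data_offsets"}."""
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(header_len))
    n_params = 0
    n_bytes = 0
    for name, info in header.items():
        if name == "__metadata__":  # optional metadata entry, not a tensor
            continue
        count = 1
        for dim in info["shape"]:
            count *= dim
        n_params += count
        n_bytes += count * DTYPE_BYTES[info["dtype"]]
    return n_params, n_bytes
```

The embedding dimension could likewise be read from the shape of the embedding weight tensor in the same header, rather than from config.json.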
Hi,
Two models that were uploaded recently (https://huggingface.co/minishlab/M2V_base_glove and https://huggingface.co/minishlab/M2V_base_output) do not have Model Size and Embedding Dimensions on the leaderboard. However, both display this information correctly on their model cards and have working safetensors, and calling get_model_parameters_memory returns correct results for both models: (102, 0.38) and (8, 0.03). I was wondering how we can fix this.
Thanks in advance!