
Adjustment of the hash calculation method #1837

Conversation

joaorura (Contributor)


When trying to load the saved models after adaptation, alerts like these were always triggered:

Loaded prompt hash does not match the saved hash.
Loaded prompt hash does not match the saved hash.

Furthermore, in Python, the built-in hash() function may yield different results for the same string across different interpreter sessions (string hashing is salted per process via PYTHONHASHSEED). To achieve consistent hash values, this change uses the hashlib module, which provides stable hashing algorithms, to compute the hash of the prompt.
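A minimal sketch of the idea behind the fix; the helper name `stable_prompt_hash` and the choice of SHA-256 are illustrative assumptions, not necessarily the exact algorithm the merged change adopts:

```python
import hashlib

def stable_prompt_hash(prompt: str) -> str:
    # hashlib.sha256 is deterministic across processes, unlike the
    # built-in hash(), which is salted per interpreter session
    # (PYTHONHASHSEED), so a saved digest compares equal on reload.
    # (Hypothetical helper for illustration only.)
    return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

# Built-in hash("some prompt") can differ between two Python runs;
# stable_prompt_hash("some prompt") always returns the same digest.
```

Because the digest depends only on the prompt bytes, a hash stored alongside a saved prompt will match when loaded in a later session, which avoids the spurious mismatch warnings above.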

@dosubot (bot) added the size:M label (This PR changes 30-99 lines, ignoring generated files) on Jan 11, 2025

@jjmachan (Member) left a comment:


thanks a lot @joaorura for helping us with the Fix - really really appreciate this ❤️

@jjmachan merged commit f2d1ce1 into explodinggradients:main on Jan 14, 2025 (15 checks passed)