
community[patch]: Fix HuggingFace LLM to not repeat the prompt as part of the result #17363

Closed
lin-calvin wants to merge 8 commits

Conversation

@lin-calvin (Contributor) commented Feb 10, 2024

Description: Make the HuggingFace LLM stop repeating the prompt as part of its result (a minimal sketch of the behavior change follows the list below).
Issue: #16972
Breaking changes: This changes the behavior of the HuggingFace LLM:

  • Before: the HuggingFace LLM repeats the prompt in its result
  • After: the HuggingFace LLM does not repeat the prompt in its result
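
As an illustration of the behavior change (not the actual PR diff), here is a minimal sketch using the raw `transformers` text-generation pipeline that the LangChain HuggingFace wrapper builds on; the model name, prompt, and generation settings are placeholders chosen for the example:

```python
# Minimal sketch of the before/after behavior, assuming the standard
# `transformers` text-generation pipeline (model and prompt are illustrative).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "The capital of France is"

# The pipeline's "generated_text" includes the prompt itself.
full_text = generator(prompt, max_new_tokens=5)[0]["generated_text"]

# Before this PR: the wrapper returned `full_text`, which starts with the prompt.
print(full_text)

# After this PR: only the completion is returned, roughly equivalent to
# stripping the prompt prefix:
completion = full_text[len(prompt):]
print(completion)
```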


@dosubot dosubot bot added size:XS This PR changes 0-9 lines, ignoring generated files. Ɑ: models Related to LLMs or chat model modules 🤖:improvement Medium size change to existing code to handle new use-cases labels Feb 10, 2024
@eyurtsev (Collaborator) commented

@calvinweb this looks like a breaking change, but also a fix for incorrect behavior. Any chance you could include a pr summary that explains the before and after of this change?

@eyurtsev eyurtsev self-assigned this Feb 10, 2024
@lin-calvin (Contributor, Author) commented

> @calvinweb this looks like a breaking change, but also a fix for incorrect behavior. Any chance you could include a pr summary that explains the before and after of this change?

It is now included in the PR description above.

@eyurtsev (Collaborator) commented

Fixed a typo: a parameter was being passed twice. Ready to merge as long as the unit tests pass.

@eyurtsev eyurtsev changed the title Let huggingface llm don't repeat prompt community[patch]: Fix update in HuggingFace LLM to not repeat the LLM prompt in the result Feb 13, 2024
@eyurtsev eyurtsev changed the title community[patch]: Fix update in HuggingFace LLM to not repeat the LLM prompt in the result community[patch]: Fix HuggingFace LLM to not repeat the prompt as part of the result Feb 13, 2024
@dosubot dosubot bot added the lgtm PR looks good. Use to confirm that a PR is ready for merging. label Feb 13, 2024
@baskaryan (Collaborator) commented

cc @aymeric-roucher, I believe #17254 also addresses this, correct?

@aymeric-roucher (Contributor) commented

Yes it does @baskaryan!

@lin-calvin (Contributor, Author) commented

> cc @aymeric-roucher, I believe #17254 also addresses this, correct?

Since it has been merged, should this PR be closed?

@eyurtsev eyurtsev self-assigned this Mar 19, 2024
@eyurtsev (Collaborator) commented

Closing, since this was already addressed by the other PR.

@eyurtsev eyurtsev closed this Mar 19, 2024