
Support for conversations with message history #234

Merged: 36 commits into neo4j:main on Dec 20, 2024

Conversation

@leila-messallem (Contributor) commented Dec 13, 2024

Description

Adding chat functionality with message history.

  • Added an optional chat_history parameter to the invoke method of the LLMInterface.
  • The chat_history is a list of dicts with role and content keys, where role is either "user" or "assistant".
  • Added a system_instruction parameter to LLM class instantiation. This means a separate LLM instance is needed for each use case; the motivation is that on Vertex AI, system_instruction is set on the GenerativeModel object, separate from the question prompt.
  • Added a summary of the chat history to the query embedding, so that the relevant conversation context is included in the embedding.
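As a rough sketch of the message shape described above (the validator is illustrative and not part of the library; names may differ from the final API), a message history could look like this:

```python
# Sketch of the message-history shape from the PR description:
# a list of dicts with "role" and "content" keys.
ALLOWED_ROLES = {"user", "assistant"}

def validate_message_history(history: list[dict[str, str]]) -> list[dict[str, str]]:
    """Check that each message has exactly 'role' and 'content' keys
    and that the role is 'user' or 'assistant'."""
    for message in history:
        if set(message) != {"role", "content"}:
            raise ValueError(f"unexpected keys: {sorted(message)}")
        if message["role"] not in ALLOWED_ROLES:
            raise ValueError(f"unexpected role: {message['role']!r}")
    return history

message_history = [
    {"role": "user", "content": "Who directed The Matrix?"},
    {"role": "assistant", "content": "Lana and Lilly Wachowski."},
]
validate_message_history(message_history)
```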

Type of Change

  • New feature
  • Bug fix
  • Breaking change
  • Documentation update
  • Project configuration change

Complexity

Complexity: High

How Has This Been Tested?

  • Unit tests
  • E2E tests
  • Manual tests

Checklist

The following requirements should have been met (depending on the changes in the branch):

  • Documentation has been updated
  • Unit tests have been updated
  • E2E tests have been updated
  • Examples have been updated
  • New files have copyright header
  • CLA (https://neo4j.com/developer/cla/) has been signed
  • CHANGELOG.md updated if appropriate

* an extra llm call adds to latency and cost, but including the entire - or even part of - the chat history can potentially create a very large embedding context
* from list to anthropic.MessageParam
* upgrading from Cohere API v1 to API v2, as the v2 is handling chat history in a way that is consistent with the other providers
@leila-messallem leila-messallem requested a review from a team as a code owner December 13, 2024 08:27
* plus added the `options` parameter to the ollama `chat` call
* an idea of how to override the system instructions for some invocations
* for the type declaration of the `message_history` parameter
leila-messallem and others added 3 commits December 16, 2024 20:56
* bring back list[dict[str,str]] type declaration for the `message_history` parameter
summarization_prompt = ChatSummaryTemplate().format(
    message_history=message_history
)
summary = self.llm.invoke(summarization_prompt)
Contributor
I'm wondering if we should allow the user to use a different LLM for summarization. I'm thinking users might want to use a "small" LLM for this simple task, and use a "better" one for the Q&A part. But we can leave it for a later improvement.
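The suggestion could look roughly like this (all names are hypothetical, not the library's API): an optional, cheaper summarizer LLM that falls back to the main LLM when not provided.

```python
# Illustration of the reviewer's idea: a separate, lightweight LLM for
# summarization, defaulting to the main Q&A LLM when none is given.
class GraphRAGSketch:
    def __init__(self, llm, summarizer_llm=None):
        self.llm = llm
        self.summarizer_llm = summarizer_llm or llm

    def summarize(self, message_history) -> str:
        prompt = "Summarize:\n" + "\n".join(
            f"{m['role']}: {m['content']}" for m in message_history
        )
        return self.summarizer_llm.invoke(prompt)

# Stub LLM used only to demonstrate which model handles which task.
class EchoLLM:
    def __init__(self, name):
        self.name = name

    def invoke(self, prompt):
        return f"{self.name}: {prompt[:20]}"

rag = GraphRAGSketch(llm=EchoLLM("big"), summarizer_llm=EchoLLM("small"))
```

The fallback keeps the parameter optional, so existing single-LLM setups would be unaffected.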

@leila-messallem (Author) commented:

@CodiumAI-Agent /update_changelog

@CodiumAI-Agent commented:

Changelog updates: 🔄

2024-12-19

Added

  • Support for conversations with message history, including a new message_history parameter for LLM interactions.
  • Ability to include system instructions and override them for specific invocations.
  • Summarization of chat history to enhance query embedding and context handling.

Changed

  • Updated LLM implementations to handle message history consistently across providers.
  • Enhanced prompt templates to support chat history summarization and conversation context.

To commit the new content to the CHANGELOG.md file, type:
'/update_changelog --pr_update_changelog.push_changelog_changes=true'

* ... for query embedding and summarization to the GraphRAG class
@stellasia (Contributor) left a comment:
Nice work! 🥳🥳

Thanks for dealing with all the issues, especially the vendor specificity 🙏

@leila-messallem leila-messallem merged commit f8092fc into neo4j:main Dec 20, 2024
7 checks passed
@leila-messallem leila-messallem deleted the chat-history branch December 20, 2024 11:35
@CodiumAI-Agent commented:

Changelog updates: 🔄

2024-12-20

Added

  • Support for conversations with message history, including a message_history parameter for LLM interactions.
  • Summarization of chat history to improve query embedding and context handling.
  • Ability to include and override system instructions for specific LLM invocations.

Changed

  • Updated LLM implementations to handle message history consistently across providers.

To commit the new content to the CHANGELOG.md file, type:
'/update_changelog --pr_update_changelog.push_changelog_changes=true'
