Add cohere chat generator #88
Conversation
Signed-off-by: sunilkumardash9 <[email protected]>
Thanks for this effort @sunilkumardash9. Can you use the existing ChatMessage like other layers do?
Signed-off-by: sunilkumardash9 <[email protected]>
update repo
Signed-off-by: sunilkumardash9 <[email protected]>
Hi @vblagoje, I was curious how this import works, as I can't seem to import chat_generator into test files with a relative import.
Signed-off-by: sunilkumardash9 <[email protected]>
Signed-off-by: sunilkumardash9 <[email protected]>
Signed-off-by: sunilkumardash9 <[email protected]>
stream_chunk = self._build_chunk(chunk)
self.streaming_callback(stream_chunk)
chat_message = ChatMessage(content=response.texts, role=None, name=None)
chat_message.metadata.update(
Is there any way we can try to match the OpenAI metadata? They have:

{
    "model": chunk.model,
    "index": 0,
    "finish_reason": "stop",
    "usage": {},  # here we put token counts for prompt and response
}
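For concreteness, here is a minimal sketch of how a Cohere response could be mapped onto this OpenAI-style layout. The Cohere response attributes used below (finish_reason and the token_count dictionary with its key names) are assumptions about the cohere SDK and may differ between versions; the helper itself is hypothetical.

def build_openai_style_meta(response, model_name: str) -> dict:
    # Token counts, if the Cohere response exposes them (assumed attribute/key names).
    token_count = getattr(response, "token_count", {}) or {}
    return {
        "model": model_name,
        "index": 0,
        "finish_reason": getattr(response, "finish_reason", None),  # assumed attribute
        "usage": {
            "prompt_tokens": token_count.get("prompt_tokens"),
            "completion_tokens": token_count.get("response_tokens"),
            "total_tokens": token_count.get("total_tokens"),
        },
    }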
Aha, I see token counts are available. @anakin87 should we try to match the OpenAI format here so all the chat generators are more or less interchangeable? Ideally we match the OpenAI format and then provide whatever additional Cohere-specific metadata is available; it can't hurt.
It would be nice to do that!
@sunilkumardash9 solid work, thank you. Would you please address my minor comments, and don't forget to add a release note using the reno tool. See the contributing section; thanks for your great work so far.
2. add doc strings 3. change metadata to match other chat generators
Signed-off-by: sunilkumardash9 <[email protected]>
update repo
Apologies for the delay, @sunilkumardash9. I'll take a look on Friday, play with the generator, and hopefully we will integrate it at that time.
@vblagoje No issues. Do let me know in case I have missed anything.
Hey @sunilkumardash9, if I understood their chat API correctly, I think we need to set the message parameter to be our last message, i.e. message[-1], not concatenate all messages into one and then send it over. Rather, we need to pass all messages before the last as the chat_history parameter. But please correct me if I'm wrong; I'm looking at the API docs, and you have played with the model. Also, please consider playing with an example chat script to prove to yourself that everything is working as expected after these changes.
Also, it seems like we have to translate the standard roles that everyone else agreed upon to the Cohere roles "User" and "Chatbot". So I think you'll need to do a bit of conversion of the ChatMessages given to the CohereChatGenerator run method, e.g. user_message = {"user_name": "User", "text": message}.
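To illustrate the conversion being suggested, here is a rough sketch. It assumes Haystack ChatMessage objects whose role is an enum whose value is a lowercase string ("user" / "assistant"); the payload keys ("message", "chat_history", "user_name", "text") follow the Cohere chat API as described above, but the helper and the role mapping are only a sketch, not the final implementation.

# Hypothetical helper: translate Haystack ChatMessages into Cohere chat kwargs.
COHERE_ROLES = {"user": "User", "assistant": "Chatbot"}

def to_cohere_chat_kwargs(messages):
    # Everything before the last message becomes chat_history ...
    chat_history = [
        {"user_name": COHERE_ROLES.get(str(m.role.value), "User"), "text": m.content}
        for m in messages[:-1]
    ]
    # ... and only the last message is sent as the `message` parameter.
    return {"message": messages[-1].content, "chat_history": chat_history}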
2. Adds a unit test for 1
Signed-off-by: sunilkumardash9 <[email protected]>
@vblagoje I was wondering the same after seeing Gemini chat, which has a similar chat history implementation. I made some changes to adjust the default chat roles to Cohere-specific ones, and it seems to be working well.
Excellent, @sunilkumardash9. I took a quick look and this is much better. I'll play with it over the holidays and get back to you early next week.
Hey @sunilkumardash9, I tried your CohereChatGenerator, and it works well. The only issue I faced was in streaming mode. The first response from the LLM streams fine, but when I tried continuing the chat conversation, the second request failed with a 400 response. Out of curiosity, do you get the same issue?
@vblagoje It is working fine for me, even with subsequent requests. There might be some issues with the Cohere endpoint.
@sunilkumardash9, this looks okay to me as a first version. Let's track the progress at #150. Please update the task completion there so we can push it toward the finish line.
Thanks for this contribution @sunilkumardash9
Sorry, sorry... For consistency, we are changing each mention of it. @sunilkumardash9, can you please update this PR to reflect this change?
Right, I've seen it too and forgot to mention it to @sunilkumardash9. Thanks @anakin87. Please update these references, @sunilkumardash9.
update repo
Hi @anakin87, I am getting metadata references in chat_message and chat_role in haystack-ai v2.0.0b3.
@sunilkumardash9 Sorry, my fault.
---
preview:
  - |
    Adds `CohereChatGenerator` for text and chat generation.
We currently don't have release notes for this repository.
Please remove this file.
Signed-off-by: sunilkumardash9 <[email protected]>
support for cohere chat endpoint