Add cohere chat generator #88
Merged
Commits (showing changes from 9 of 15 commits)
- 837261d add cohere chat generator (sunilkumardash9)
- f9840ad remove chat_message.py (sunilkumardash9)
- d56b347 Merge pull request #1 from deepset-ai/main (sunilkumardash9)
- ed81a0d add unit and integration tests (sunilkumardash9)
- 4f6432b improve tests (sunilkumardash9)
- d2e08e8 fix lint errors (sunilkumardash9)
- f97ccd1 fix lint errors(1) (sunilkumardash9)
- 4725c48 Merge branch 'main' into main (masci)
- f14a046 Merge branch 'main' into main (masci)
- bc9674c 1. add releasenote (sunilkumardash9)
- ea574cc Merge pull request #2 from deepset-ai/main (sunilkumardash9)
- 33f218e Merge remote-tracking branch 'origin/main' (sunilkumardash9)
- 518a6b9 1. Adds ChatRole and convert default role to Cohere compliant role (sunilkumardash9)
- ea4c027 Merge pull request #3 from deepset-ai/main (sunilkumardash9)
- 3d3c924 remove releasenotes (sunilkumardash9)
integrations/cohere/src/cohere_haystack/chat/chat_generator.py (135 additions, 0 deletions)
```python
import logging
import os
from typing import Any, Callable, Dict, List, Optional

from haystack import component, default_from_dict, default_to_dict
from haystack.components.generators.utils import deserialize_callback_handler, serialize_callback_handler
from haystack.dataclasses import ChatMessage, StreamingChunk
from haystack.lazy_imports import LazyImport

with LazyImport(message="Run 'pip install cohere'") as cohere_import:
    import cohere

logger = logging.getLogger(__name__)


class CohereChatGenerator:
    def __init__(
        self,
        api_key: Optional[str] = None,
        model_name: str = "command",
        streaming_callback: Optional[Callable[[StreamingChunk], None]] = None,
        api_base_url: Optional[str] = None,
        generation_kwargs: Optional[Dict[str, Any]] = None,
        **kwargs,
    ):
        cohere_import.check()

        if not api_key:
            api_key = os.environ.get("COHERE_API_KEY")
        if not api_key:
            error = "CohereChatGenerator needs an API key to run. Either provide it as init parameter or set the env var COHERE_API_KEY."  # noqa: E501
            raise ValueError(error)

        if not api_base_url:
            api_base_url = cohere.COHERE_API_URL
        if generation_kwargs is None:
            generation_kwargs = {}
        self.api_key = api_key
        self.model_name = model_name
        self.streaming_callback = streaming_callback
        self.api_base_url = api_base_url
        self.generation_kwargs = generation_kwargs
        self.model_parameters = kwargs
        self.client = cohere.Client(api_key=self.api_key, api_url=self.api_base_url)

    def _get_telemetry_data(self) -> Dict[str, Any]:
        """
        Data that is sent to Posthog for usage analytics.
        """
        return {"model": self.model_name}

    def to_dict(self) -> Dict[str, Any]:
        """
        Serialize this component to a dictionary.

        :return: The serialized component as a dictionary.
        """
        callback_name = serialize_callback_handler(self.streaming_callback) if self.streaming_callback else None
        return default_to_dict(
            self,
            model_name=self.model_name,
            streaming_callback=callback_name,
            api_base_url=self.api_base_url,
            generation_kwargs=self.generation_kwargs,
        )

    @classmethod
    def from_dict(cls, data: Dict[str, Any]) -> "CohereChatGenerator":
        """
        Deserialize this component from a dictionary.

        :param data: The dictionary representation of this component.
        :return: The deserialized component instance.
        """
        init_params = data.get("init_parameters", {})
        serialized_callback_handler = init_params.get("streaming_callback")
        if serialized_callback_handler:
            data["init_parameters"]["streaming_callback"] = deserialize_callback_handler(serialized_callback_handler)
        return default_from_dict(cls, data)

    @component.output_types(replies=List[ChatMessage])
    def run(self, messages: List[ChatMessage], generation_kwargs: Optional[Dict[str, Any]] = None):
        # merge the generation kwargs given at init time with those passed to
        # the run method; run-time values take precedence
        generation_kwargs = {**self.generation_kwargs, **(generation_kwargs or {})}
        message = [message.content for message in messages]
        response = self.client.chat(
            message=message[0], model=self.model_name, stream=self.streaming_callback is not None, **generation_kwargs
        )
        if self.streaming_callback:
            for chunk in response:
                if chunk.event_type == "text-generation":
                    stream_chunk = self._build_chunk(chunk)
                    self.streaming_callback(stream_chunk)
            chat_message = ChatMessage(content=response.texts, role=None, name=None)
            chat_message.metadata.update(
                {
                    "token_count": response.token_count,
                    "finish_reason": response.finish_reason,
                    "documents": response.documents,
                    "citations": response.citations,
                    "chat-history": response.chat_history,
                }
            )
        else:
            chat_message = self._build_message(response)
        return {"replies": [chat_message]}

    def _build_chunk(self, chunk) -> StreamingChunk:
        """
        Converts a streaming chunk from the Cohere API to a StreamingChunk.

        :param chunk: The chunk returned by the Cohere API.
        :return: The StreamingChunk.
        """
        chat_message = StreamingChunk(
            content=chunk.text, metadata={"index": chunk.index, "event_type": chunk.event_type}
        )
        return chat_message

    def _build_message(self, cohere_response):
        """
        Converts the non-streaming response from the Cohere API to a ChatMessage.

        :param cohere_response: The completion returned by the Cohere API.
        :return: The ChatMessage.
        """
        content = cohere_response.text
        message = ChatMessage(content=content, role=None, name=None)
        message.metadata.update(
            {
                "token_count": cohere_response.token_count,
                "meta": cohere_response.meta,
                "citations": cohere_response.citations,
                "documents": cohere_response.documents,
                "chat-history": cohere_response.chat_history,
            }
        )
        return message
```
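The `run` method merges the `generation_kwargs` given at construction time with those passed at call time, with call-time values winning. A minimal sketch of that precedence, using hypothetical parameter names rather than any confirmed Cohere options:

```python
# Illustrative sketch of the kwargs merge in run(); parameter names are
# examples only, not a statement of what the Cohere API accepts.
init_kwargs = {"temperature": 0.3, "max_tokens": 256}  # set in __init__
call_kwargs = {"temperature": 0.9}                     # passed to run()

# dict unpacking: later entries override earlier ones with the same key
merged = {**init_kwargs, **(call_kwargs or {})}
print(merged)  # temperature from run() wins, max_tokens survives from init
```

This mirrors the common pattern in Haystack generators of treating init-time kwargs as defaults that each call can override selectively.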
Any way we can try to match the OpenAI metadata? They have:
Aha, I see token counts are available. @anakin87, should we try to match the OpenAI format here so all the chat generators are more or less interchangeable? Ideally we match the OpenAI format and then additionally provide whatever Cohere-specific metadata is available; it can't hurt.
It would be nice to do that!
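The convergence the reviewers suggest could be sketched as a small normalization step that maps Cohere-style response fields onto OpenAI-style metadata keys while keeping the Cohere extras. The field names below are illustrative assumptions, not the actual Cohere client API:

```python
from typing import Any, Dict


def to_openai_style_meta(cohere_meta: Dict[str, Any]) -> Dict[str, Any]:
    """Sketch: map assumed Cohere-style metadata onto OpenAI-style keys,
    carrying Cohere-specific extras alongside. All names are hypothetical."""
    normalized = {
        "model": cohere_meta.get("model"),
        "finish_reason": cohere_meta.get("finish_reason"),
        # OpenAI generators report token counts under a nested "usage" dict
        "usage": {"total_tokens": cohere_meta.get("token_count")},
    }
    # keep Cohere-specific fields unchanged so no information is lost
    for extra in ("citations", "documents"):
        if extra in cohere_meta:
            normalized[extra] = cohere_meta[extra]
    return normalized


example = to_openai_style_meta(
    {"model": "command", "finish_reason": "COMPLETE", "token_count": 42, "citations": []}
)
```

With a shape like this, downstream components could read `usage` and `finish_reason` uniformly across chat generators while Cohere users still get citations and documents.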