feat: add STACKITChatGenerator #1274
base: main
Conversation
Looks good to me! I've added only two comments about some type warnings I am getting.
```python
component = STACKITChatGenerator(model="neuralmagic/Meta-Llama-3.1-70B-Instruct-FP8")
results = component.run(chat_messages)
assert len(results["replies"]) == 1
message: ChatMessage = results["replies"][0]
```
Here I think we can avoid the mypy `annotation-unchecked` warning by adding `None` as the test's return type. This should tell mypy to type check the function bodies.
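As a minimal sketch (test name hypothetical), annotating a test with `-> None` is enough to make mypy check its body:

```python
# Without a return annotation, mypy treats the function as untyped and
# skips its body, emitting an "annotation-unchecked" note. Adding
# `-> None` opts the body into type checking.
def test_run_returns_single_reply() -> None:
    replies = ["a reply"]  # stand-in for results["replies"]
    assert len(replies) == 1
```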
```python
    )
],
created=int(datetime.now(tz=pytz.timezone("UTC")).timestamp()),
usage={"prompt_tokens": 57, "completion_tokens": 40, "total_tokens": 97},
```
This line may be:

```python
from openai.types.completion_usage import CompletionUsage
# ...
usage=CompletionUsage(prompt_tokens=57, completion_tokens=40, total_tokens=97)
```

mypy triggers this if we add `Any` as the mock method's return type.
Some docstring comments
...tions/stackit/src/haystack_integrations/components/generators/stackit/chat/chat_generator.py (five outdated review comments, resolved)
Co-authored-by: Daria Fokina <[email protected]>
…rators/stackit/chat/chat_generator.py Co-authored-by: Daria Fokina <[email protected]>
Good job!
I left a few comments here and there.
Will we release this after 2.9.0? My impression is that the implementation depends on it.
Is it worth creating an issue with this template to make sure we don't forget anything?
```toml
dynamic = ["version"]
description = 'An integration of STACKIT as a StackitChatGenerator'
readme = "README.md"
requires-python = ">=3.8"
```
Python 3.8 has reached end of life (https://devguide.python.org/versions/), so I recommend supporting >=3.9.
Same for everything below.
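As a sketch, the corresponding `pyproject.toml` change (assuming only the Python floor moves) would be:

```toml
# Python 3.8 is EOL; raise the supported floor to 3.9
requires-python = ">=3.9"
```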
```python
def _prepare_api_call(  # noqa: PLR0913
    self,
    *,
    messages: List[ChatMessage],
    streaming_callback: Optional[StreamingCallbackT] = None,
    generation_kwargs: Optional[Dict[str, Any]] = None,
    tools: Optional[List[Tool]] = None,
    tools_strict: Optional[bool] = None,
) -> Dict[str, Any]:
    prepared_api_call = super(STACKITChatGenerator, self)._prepare_api_call(
        messages=messages,
        streaming_callback=streaming_callback,
        generation_kwargs=generation_kwargs,
        tools=tools,
        tools_strict=tools_strict,
    )
    prepared_api_call.pop("tools")
    return prepared_api_call
```
Now that deepset-ai/haystack#8702 has been merged, can we remove/refactor this part?
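For context, a minimal standalone sketch of what the override does (all names hypothetical, not the actual integration code): the base payload always carries a `tools` key, even when its value is `None`, and some OpenAI-compatible endpoints reject a null field, so the key is dropped before the request is sent:

```python
from typing import Any, Dict, List, Optional


def prepare_api_call(
    messages: List[Dict[str, str]],
    tools: Optional[List[Dict[str, Any]]] = None,
) -> Dict[str, Any]:
    # Base behaviour: always include the "tools" key, even when None
    payload: Dict[str, Any] = {"messages": messages, "tools": tools}
    # Override: the endpoint rejects a null "tools" field, so remove
    # the key entirely instead of sending "tools": null
    payload.pop("tools")
    return payload


call = prepare_api_call([{"role": "user", "content": "hi"}])
```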
```python
"embedders: embedders tests",
"chat_generators: chat_generators tests",
```
Suggested change: remove these lines, as these markers seem unused.
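As a sketch (test name hypothetical), a marker registered in the pytest config only pays off when tests actually carry it, so that runs can be filtered with `pytest -m chat_generators`:

```python
import pytest


# A marker declared in the config ("chat_generators: chat_generators
# tests") is only useful when at least one test applies it; otherwise
# the registration is dead configuration.
@pytest.mark.chat_generators
def test_chat_generator_smoke() -> None:
    assert True
```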
Thanks for all the feedback! This PR can wait until after the 2.9 release, yes. I will also update the issue to use the proper template for new integrations.
Related Issues
Proposed Changes:
_prepare_api_call
How did you test it?
Ran unit tests, integration tests, and the examples locally.
Notes for the reviewer
I was thinking about setting a
`model`
by default, which would simplify initialization of the component. However, there is no obvious default value. That's why I made the `model` parameter mandatory.

I was surprised that I had to overwrite the `_prepare_api_call` method and wonder whether the implementation in OpenAIChatGenerator could cause problems for other OpenAI-compatible model serving services users like to use.

There is already a draft PR for the integration page here and a draft for the documentation page here.
Checklist
`fix:`, `feat:`, `build:`, `chore:`, `ci:`, `docs:`, `style:`, `refactor:`, `perf:`, `test:`.