add groq chat stream #88
Conversation
Actionable comments posted: 2
🧹 Outside diff range and nitpick comments (1)
src/Google.VertexAI/src/VertexAIChatModel.cs (1)
Line range hint 22-41: Consider enhancing error handling and cancellation support

While the streaming implementation looks good, consider these improvements:
- Add cancellation token support to the non-streaming path
- Add try-catch blocks for proper error handling in the streaming path
Here's a suggested implementation:
```diff
request = request ?? throw new ArgumentNullException(nameof(request));

var prompt = ToPrompt(request.Messages);
var watch = Stopwatch.StartNew();

if (settings?.UseStreaming == true)
{
    var responseStream = Provider.Api.StreamGenerateContent(prompt).GetResponseStream();
    try
    {
        await foreach (var response in responseStream.WithCancellation(cancellationToken))
        {
            yield return BuildChatResponse(response, request, settings, watch);
        }
    }
    catch (OperationCanceledException)
    {
        throw;
    }
    catch (Exception ex)
    {
        throw new InvalidOperationException("Error processing stream response", ex);
    }
}
else
{
-    var response = await Provider.Api.GenerateContentAsync(prompt).ConfigureAwait(false);
+    var response = await Provider.Api.GenerateContentAsync(
+        prompt,
+        cancellationToken).ConfigureAwait(false);
    yield return BuildChatResponse(response, request, settings, watch);
}
```
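Re-throwing OperationCanceledException in its own catch clause keeps cancellation observable to the caller instead of hiding it behind the generic InvalidOperationException wrapper.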
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (2)
- src/Google.VertexAI/src/VertexAIChatModel.cs (1 hunks)
- src/Groq/src/GroqChatModel.cs (2 hunks)
🔇 Additional comments (3)
src/Google.VertexAI/src/VertexAIChatModel.cs (3)
22-22: LGTM: Improved null check syntax

The simplified null check using the null-coalescing operator is more concise and maintains the same functionality.
Line range hint 43-62: LGTM: Well-structured response building

The BuildChatResponse method is well-implemented, with proper handling of function calls and immutable response construction.
Line range hint 91-104: Verify ProjectId access safety

The ProjectId is accessed through `serviceAccountCredential?.ProjectId`, but there is no fallback if the credential is of a different type or if ProjectId is null. Consider using the project ID from the configuration directly.

Let's check if ProjectId is available in other credential types:
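Separately from that check, here is a minimal sketch of the fallback the comment suggests, assuming the model's configuration carries a project id; `Configuration.ProjectId` is a hypothetical name, not an existing property of this repository:

```csharp
// Hypothetical fallback: prefer the credential's project id, otherwise fall back to the
// project id supplied via configuration, and fail loudly if neither is available.
var projectId = serviceAccountCredential?.ProjectId
    ?? Configuration.ProjectId
    ?? throw new InvalidOperationException("A Google Cloud project id must be configured.");
```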
```csharp
if (settings?.UseStreaming == true)
{
    var responseStream = Provider.Api.CreateChatCompletionStreamAsync(prompt);
```
Pass the cancellation token to API methods for proper cancellation handling

Currently, the `cancellationToken` is not passed to the `CreateChatCompletionStreamAsync` and `CreateChatCompletionAsync` methods. To ensure that the operations can be cancelled promptly, pass the `cancellationToken` to these API methods if they support it.
Apply this diff to pass the cancellation token:
```diff
- var responseStream = Provider.Api.CreateChatCompletionStreamAsync(prompt);
+ var responseStream = Provider.Api.CreateChatCompletionStreamAsync(prompt, cancellationToken);

- var response = await Provider.Api.CreateChatCompletionAsync(prompt).ConfigureAwait(false);
+ var response = await Provider.Api.CreateChatCompletionAsync(prompt, cancellationToken).ConfigureAwait(false);
```
Also applies to: 35-35
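For illustration, here is a minimal sketch of how the token could be threaded through the whole method, assuming the Groq client actually exposes `CancellationToken` overloads (the review hedges on this with "if they support it"). The signature and type names (`ChatRequest`, `ChatSettings`, `ChatResponse`, `BuildChatResponse`) are illustrative rather than copied from the repository, and the usual `System.Runtime.CompilerServices` and `System.Threading` usings are assumed:

```csharp
// Minimal sketch only: assumes CreateChatCompletionStreamAsync returns IAsyncEnumerable<T>
// and that both API methods accept a CancellationToken overload.
public async IAsyncEnumerable<ChatResponse> GenerateAsync(
    ChatRequest request,
    ChatSettings? settings = null,
    [EnumeratorCancellation] CancellationToken cancellationToken = default)
{
    request = request ?? throw new ArgumentNullException(nameof(request));
    var prompt = ToPrompt(request.Messages);

    if (settings?.UseStreaming == true)
    {
        // Hypothetical overload: the token tears down the underlying HTTP stream on cancellation.
        var responseStream = Provider.Api.CreateChatCompletionStreamAsync(prompt, cancellationToken);

        await foreach (var chunk in responseStream.WithCancellation(cancellationToken))
        {
            // BuildChatResponse stands in for whatever mapping helper the model uses.
            yield return BuildChatResponse(chunk, request, settings);
        }
    }
    else
    {
        // Hypothetical overload: the non-streaming call honours the same token.
        var response = await Provider.Api.CreateChatCompletionAsync(prompt, cancellationToken)
            .ConfigureAwait(false);
        yield return BuildChatResponse(response, request, settings);
    }
}
```

Passing the token to the API call cancels the underlying request, while `WithCancellation` and `[EnumeratorCancellation]` ensure the enumeration itself also observes cancellation.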
```csharp
var result = request.Messages.ToList();
provider.AddUsage(usage);
```
Correct the property reference from `provider` to `Provider`

The code uses `provider.AddUsage(usage);`, but `provider` is not defined in this context. You should use the `Provider` property instead.
Apply this diff to fix the property reference:
```diff
- provider.AddUsage(usage);
+ Provider.AddUsage(usage);
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```suggestion
Provider.AddUsage(usage);
```
Summary by CodeRabbit

New Features
- The `GenerateAsync` method in `GroqChatModel` now supports streaming responses.
- Updates to `VertexAIChatModel`.

Bug Fixes
- Changes to `VertexAIChatModel` for improved reliability.

Documentation