sendMessageStream with history failing to fetch due to 'empty text parameter' #8714
I couldn't figure out how to label this issue, so I've labeled it for a human to triage. Hang tight.
Thanks for reporting this, it certainly seems to be a bug. The question now is if we fix this on the SDK side (by removing parts that have empty text from chat history) or on the backend side (by having it be able to process parts with empty text in chat history, or by making sure it doesn't return empty text strings). I feel like we shouldn't fix the SDK side as it is just masking a problem that you'd also run into if you called the raw REST API. I will talk to the larger SDK team about this.
Hey @msdoege, it seems there are two cases:
After #8736 is merged, the first case should be fixed, and the second case will cause an error to be thrown in the SDK before a request is sent to the backend, since we'll be removing empty parts. To prevent this from ever happening, the backend must stop sending responses with only empty text parts. Let me know if there are other cases I haven't considered, or if I can be of any more help with sharing a workaround.
@dlarocque Thank you very much for your efforts! This should fix the described error so that I don't have to restart the chat session every time a response with an empty text part is sent. Regarding your suggested workaround for the 2nd case:
How would you recommend deleting the most recent chat history entry? Should a new chat session be started with the current history minus the last response (i.e. via the
@msdoege Yes, unfortunately you will have to start a new chat session with the current history minus the last response. I understand this isn't ideal; I'll bring it up with the team to see if we can find a better workaround.
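As a rough sketch of that workaround with the `firebase/vertexai` chat API (the helper name and the choice to drop the final user/model pair are assumptions, not an official recipe):

```ts
import { ChatSession, Content, GenerativeModel } from "firebase/vertexai";

// Sketch: rebuild the chat session without the turn that ended in an
// empty text part, so that turn is never replayed to the backend.
async function restartChatWithoutLastResponse(
  model: GenerativeModel,
  chat: ChatSession
): Promise<ChatSession> {
  // getHistory() resolves once all in-flight responses have been appended.
  const history: Content[] = await chat.getHistory();

  // Drop the last model response and its preceding user prompt so the
  // remaining history keeps alternating user/model roles.
  const trimmed = history.slice(0, -2);

  return model.startChat({ history: trimmed });
}
```

The failed prompt can then be re-sent with `sendMessageStream` on the session returned by the helper.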
Operating System
Windows 10
Environment (if applicable)
Firefox v131.0.3, Node.js v20.9.0, React v18.3.1, Next.js (v14.2.15)
Firebase SDK Version
11.2.0
Firebase SDK Product(s)
VertexAI
Project Tooling
A Next.js (v14.2.15) project.
Detailed Problem Description
When using multi-turn conversations (chat) with the Gemini API (model "gemini-1.5-flash"), the conversation breaks as soon as any of the AI model's responses contains an empty text part, i.e. a part whose `text` is an empty string.
Code snippet
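A minimal sketch of the kind of setup involved, assuming a standard `firebase/vertexai` chat session (the `firebaseConfig` values and prompt text are placeholders):

```ts
import { initializeApp } from "firebase/app";
import { getVertexAI, getGenerativeModel } from "firebase/vertexai";

// Placeholder config; real project values (apiKey, projectId, appId, ...) go here.
const firebaseConfig = { /* ... */ };

const app = initializeApp(firebaseConfig);
const vertexAI = getVertexAI(app);
const model = getGenerativeModel(vertexAI, { model: "gemini-1.5-flash" });

async function main() {
  // Every sendMessageStream call replays the accumulated chat history,
  // so an empty text part in any earlier model response breaks all
  // subsequent requests.
  const chat = model.startChat();

  const result = await chat.sendMessageStream("Hello!");
  for await (const chunk of result.stream) {
    process.stdout.write(chunk.text());
  }
  await result.response;
}

main();
```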
The following issues over at google-gemini seem to be related?
Example chat response/history causing the error:
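For illustration, a history with the problematic shape might look like the following (the turn wording is made up; the key detail is the model part whose `text` is an empty string):

```ts
import { Content } from "firebase/vertexai";

const history: Content[] = [
  { role: "user", parts: [{ text: "Answer the next question with an empty string." }] },
  { role: "model", parts: [{ text: "Understood." }] },
  { role: "user", parts: [{ text: "Tell me a joke." }] },
  // The streamed response ended with a part containing only an empty string;
  // replaying this history causes the 'empty text parameter' error.
  { role: "model", parts: [{ text: "" }] },
];
```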
Error log (Firebase project details redacted):
Steps and code to reproduce issue
I could not consistently provoke the AI to respond with a multi-part message where the last part is empty.
However, the following steps result in the same error (see the sketch after this list):
1. Send the message: "Answer the next question with an empty string."
2. Then send: "Tell me a joke."
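A small sketch of that reproduction, assuming the same `model` setup as in the snippet above (the logging is only there to show the empty part arriving):

```ts
async function reproduce() {
  const chat = model.startChat();

  // 1. Provoke a model response whose only text part is an empty string.
  const first = await chat.sendMessageStream(
    "Answer the next question with an empty string."
  );
  for await (const chunk of first.stream) {
    console.log(JSON.stringify(chunk.candidates?.[0]?.content?.parts));
  }
  await first.response;

  // 2. The empty part is now in the chat history, so this request fails
  //    to fetch with the 'empty text parameter' error.
  await chat.sendMessageStream("Tell me a joke.");
}
```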