GoogleAIGeminiChatGenerator: possible bug #654

Closed · anakin87 opened this issue Apr 10, 2024 · 1 comment · Fixed by #772
Labels: bug (Something isn't working), integration:google-ai, P2

anakin87 (Member) commented:
From StackOverflow. We should investigate.


I am trying my best to run this tutorial right off the Haystack website using Google Colab:

https://docs.haystack.deepset.ai/reference/integrations-google-ai

It works up until I reach this code:

from haystack.utils import Secret
from haystack.dataclasses.chat_message import ChatMessage
from haystack_integrations.components.generators.google_ai import GoogleAIGeminiChatGenerator


gemini_chat = GoogleAIGeminiChatGenerator(model="gemini-pro", api_key=Secret.from_token("<MY_API_KEY>"))

messages = [ChatMessage.from_user("What is the most interesting thing you know?")]
res = gemini_chat.run(messages=messages)
for reply in res["replies"]:
    print(reply.content)

messages += res["replies"] + [ChatMessage.from_user("Tell me more about it")]
res = gemini_chat.run(messages=messages)
for reply in res["replies"]:
    print(reply.content)

I then get this error:

TypeError                                 Traceback (most recent call last)
<ipython-input-23-868854587ba9> in <cell line: 14>()
     12 
     13 messages += res["replies"] + [ChatMessage.from_user("Tell me more about it")]
---> 14 res = gemini_chat.run(messages=messages)
     15 for reply in res["replies"]:
     16     print(reply.content)

7 frames
/usr/local/lib/python3.10/dist-packages/google/generativeai/types/content_types.py in to_blob(blob)
    150                 "Could not recognize the intended type of the `dict`\n" "A content should have "
    151             )
--> 152         raise TypeError(
    153             "Could not create `Blob`, expected `Blob`, `dict` or an `Image` type"
    154             "(`PIL.Image.Image` or `IPython.display.Image`).\n"

TypeError: Could not create `Blob`, expected `Blob`, `dict` or an `Image` type(`PIL.Image.Image` or `IPython.display.Image`).
Got a: <class 'google.ai.generativelanguage_v1beta.types.content.Content'>
Value: parts {
  text: "What is the most interesting thing you know?"
}
role: "user"

Checking where that error takes place, we're in the to_blob function of content_types.py. We are clearly not passing a blob there, so that is strange.
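
To see what the SDK is doing at that point, here is a minimal sketch of just that check (assuming google-generativeai and its google-ai-generativelanguage dependency are installed; the text value is illustrative):

from google.ai import generativelanguage as glm
from google.generativeai.types import content_types

# to_blob() only knows how to handle Blob, dict, and image types, so a
# Content object falls through its checks and raises the TypeError above.
content = glm.Content(
    parts=[glm.Part(text="What is the most interesting thing you know?")],
    role="user",
)
content_types.to_blob(content)  # raises the TypeError quoted above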

In any case, I've tried everything I can think of to get this to work. I've tried explicitly calling:

messages.append(ChatMessage.from_system(content="Tell me more about it"))

and I get the same error. I also tried putting the message into a dict in the format Gemini would expect (with 'role' and 'parts'), and I get a different error earlier on. It is definitely expecting ChatMessage objects here.
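
For reference, the dict-shaped attempt looked roughly like this (a sketch with illustrative values; it fails during input validation, before anything is sent to Gemini):

# run() expects ChatMessage objects, so a plain dict is rejected earlier on.
messages.append({"role": "user", "parts": ["Tell me more about it"]})
res = gemini_chat.run(messages=messages)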

It is strange: there are about three similar tutorials on the official Haystack site, and all of them hit this error, even though they differ slightly in how they put the conversation together.

Also, it only seems to have a problem if you use ChatMessage.from_system, or if you construct a ChatMessage() with the role explicitly set to system. Trying assistant throws an error saying that assistant is not valid (even though it is clearly one of the valid options in the ChatRole enum).

So I'm stumped as to how to get Haystack to work with Gemini for this tutorial.

anakin87 added the bug (Something isn't working) and integration:google-ai labels on Apr 10, 2024
masci added the P1 and P2 labels and removed the P1 label on May 10, 2024

antoniomuzzolini (Contributor) commented May 29, 2024:

I'm stuck on the same problem. While debugging, I found that the error could be in the _message_to_content method.
Removing the early returns inside the if statements, as follows, seems to make it work properly.

    def _message_to_content(self, message: ChatMessage) -> Content:
        if message.role == ChatRole.SYSTEM and message.name:
            part = Part()
            part.function_call.name = message.name
            part.function_call.args = {}
            for k, v in message.content.items():
                part.function_call.args[k] = v
        elif message.role == ChatRole.SYSTEM:
            part = Part()
            part.text = message.content
            # return part
        elif message.role == ChatRole.FUNCTION:
            part = Part()
            part.function_response.name = message.name
            part.function_response.response = message.content
            # return part
        elif message.role == ChatRole.USER:
            part = self._convert_part(message.content)
        else:
            msg = f"Unsupported message role {message.role}"
            raise ValueError(msg)
        role = "user" if message.role in [ChatRole.USER, ChatRole.FUNCTION] else "model"
        return Content(parts=[part], role=role)

Checking the VertexAIGeminiChatGenerator, this approach seems to be the correct one, but maybe there's something else I'm missing?
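
As a quick local check (a sketch; _message_to_content is a private method, and gemini_chat is a generator instance configured as in the snippet from the issue), the patched method should now return a Content for every supported role instead of a bare Part:

from haystack.dataclasses.chat_message import ChatMessage
from google.ai import generativelanguage as glm

for msg in [
    ChatMessage.from_user("What is the most interesting thing you know?"),
    ChatMessage.from_system("Tell me more about it"),
]:
    out = gemini_chat._message_to_content(msg)
    # Before the change, the SYSTEM branch returned a bare Part here,
    # which later reached to_blob() and caused the TypeError.
    assert isinstance(out, glm.Content), type(out)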
