
ChatAnthropicVertex prompt caching support #651

Open
jthack opened this issue Dec 18, 2024 · 1 comment
jthack commented Dec 18, 2024

Hello,

As of recently, prompt caching is supposedly in preview in Vertex AI. Can you add support for it to ChatAnthropicVertex?

Thanks!


ShaharZivanOnvego commented Dec 25, 2024

I just want to bump this, and to clarify that the "regular" method I use for prompt caching with the standard ChatAnthropic raises an error:

    from langchain_core.messages import SystemMessage
    from langchain_core.prompts import ChatPromptTemplate

    content = [{
        "type": "text",
        "text": "Do something or other...",
        "cache_control": {"type": "ephemeral"},
    }]

    prompt = ChatPromptTemplate.from_messages(
        [
            SystemMessage(content=content),
            ("placeholder", "{messages}"),
        ]
    )

This fails when the prompt is passed to ChatAnthropicVertex, with the error:

    File ".../python3.11/site-packages/langchain_google_vertexai/_anthropic_utils.py", line 143, in _format_messages_anthropic
        raise ValueError(
    ValueError: System message must be a string, instead was: <class 'list'>

So simply modifying it to accept a list as well as a string would be enough to enable caching. This could be a quick fix.
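To illustrate, the fix could look roughly like the sketch below. The function name `format_system_content` and its structure are hypothetical, not the actual `langchain_google_vertexai` code; the idea is just that `_format_messages_anthropic` would normalize a string into Anthropic's list-of-content-blocks form instead of rejecting lists outright, so `cache_control` entries pass through:

```python
def format_system_content(content):
    """Normalize system-message content into Anthropic's list-of-blocks
    form, preserving any cache_control entries.

    Illustrative sketch only, not the real implementation.
    """
    if isinstance(content, str):
        # Plain string: wrap it in a single text block.
        return [{"type": "text", "text": content}]
    if isinstance(content, list):
        # Already a list of content blocks (possibly carrying
        # cache_control): pass it through unchanged.
        return content
    raise ValueError(
        f"System message must be a string or list, instead was: {type(content)}"
    )
```

With this, both `SystemMessage(content="...")` and the cached-block form shown above would reach the API in the same shape.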
