Replies: 2 comments 1 reply
-
It looks like the issue you're encountering is due to a mismatch in the expected input format for the model you selected. BedrockEmbeddings sends its payload under an inputText key, which is the format Amazon's Titan embedding models expect, but Mistral models on Bedrock require a prompt key instead, so the request is rejected as malformed. To fix this, use a model that supports the embeddings input format (such as a Titan embedding model) and make sure your input data is correctly formatted according to that model's requirements. If you continue to face issues or have any more questions, feel free to ask for further assistance.
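A minimal sketch of the mismatch, assuming Bedrock's documented request shapes (the literal strings below are illustrative, not taken from the thread): Titan embedding models read an inputText field, while Mistral text-generation models read a prompt field, so the Titan-style body that BedrockEmbeddings builds is rejected by a Mistral endpoint.

```python
import json

# Request body shape BedrockEmbeddings builds for Titan embedding models
titan_body = json.dumps({"inputText": "some text to embed"})

# Request body shape a Mistral text-generation model on Bedrock expects
mistral_body = json.dumps({"prompt": "some text", "max_tokens": 250})

# Sending the Titan-style body to a Mistral model is what triggers
# "required key [prompt] not found / extraneous key [inputText] is not permitted".
print(sorted(json.loads(titan_body)))    # ['inputText']
print(sorted(json.loads(mistral_body)))  # ['max_tokens', 'prompt']
```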
-
For anyone else that stumbles on this error, since the documentation is non-obvious on this: the error ultimately happens because we're trying to invoke embeddings on a model that does not support them. Amazon's Titan series and Cohere's models on Bedrock offer embedding modalities; most others do not (as of now). Make sure you compare the models on Bedrock so that you know which models support embedding.
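The comparison above can be sketched as a small client-side guard. The helper and the model-ID prefixes here are assumptions based on Bedrock's catalog at the time of the thread, not an official API; verify the current list in the Bedrock console before relying on it.

```python
# Hypothetical guard: model-ID prefixes of Bedrock models known to expose
# an embedding modality (assumption -- check the Bedrock model catalog).
EMBEDDING_MODEL_PREFIXES = ("amazon.titan-embed", "cohere.embed")

def supports_embeddings(model_id: str) -> bool:
    """Rough check before constructing BedrockEmbeddings with a model ID."""
    return model_id.startswith(EMBEDDING_MODEL_PREFIXES)

print(supports_embeddings("amazon.titan-embed-text-v1"))          # True
print(supports_embeddings("mistral.mixtral-8x7b-instruct-v0:1"))  # False
```

Failing fast like this avoids the opaque ValidationException from InvokeModel when a text-generation model is handed an embeddings payload.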
-
Description
Hello, is BedrockEmbeddings supposed to work with Mistral or Anthropic models?
When I use
BedrockEmbeddings(model_id="mistral.mixtral-8x7b-instruct-v0:1", model_kwargs={"max_tokens": 250})
I get the following error: "ValueError: Error raised by inference endpoint: An error occurred (ValidationException) when calling the InvokeModel operation: Malformed input request: # : required key [prompt] not found # : extraneous key [inputText] is not permitted, please reformat your input and try again."
System Info
langchain==0.1.16
langchain-community==0.0.32
langchain-core==0.1.42
langchain-mistralai==0.1.2
langchain-text-splitters==0.0.1
macOS 14.3.1
python 3.9