When I try to connect to my LLM deployed on Azure AI Studio using AzureMLOnlineEndpoint, I get a 404 error #24793
Example Code

```python
import os

import dotenv

dotenv.load_dotenv()

from langchain_community.llms.azureml_endpoint import (
    AzureMLEndpointApiType,
    CustomOpenAIContentFormatter,
    AzureMLOnlineEndpoint,
)
from langchain_core.messages import HumanMessage
from langchain_core.prompts import ChatPromptTemplate

llm = AzureMLOnlineEndpoint(
    endpoint_url="https://Meta-Llama-3-1-8B-Instruct-*****.swedencentral.models.ai.azure.com/score",
    endpoint_api_type=AzureMLEndpointApiType.dedicated,
    endpoint_api_key=os.getenv("LLAMA3_API_KEY"),
    content_formatter=CustomOpenAIContentFormatter(),
    model_kwargs={"temperature": 0.8, "max_new_tokens": 400},
)

response = llm.invoke("Write me a song about sparkling water:")
print(response)
```

Description

I have changed the endpoint URL to fit the expected format. When I don't change it, I get an error saying it should be in the provided format. What should I do?

System Info

python 3
Replies: 1 comment
@mmentese21 Hello there! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this issue together!

To resolve the 404 error when connecting to your LLM deployed on Azure AI Studio using `AzureMLOnlineEndpoint`, ensure that you have correctly set the `endpoint_url`, `endpoint_api_type`, and `endpoint_api_key`. Here is an example of how to set these parameters correctly:

```python
from langchain_community.llms.azureml_endpoint import (
    AzureMLEndpointApiType,
    CustomOpenAIContentFormatter,
    AzureMLOnlineEndpoint,
)

llm = AzureMLOnlineEndpoint(
    endpoint_url="https://<your-endpoint>.<your_region>.inference.ml.azure.com/score",  # Ensure this URL is correct
    endpoint_api_type=AzureMLEndpointApiType.dedicated,  # Ensure this matches your deployment type
    endpoint_api_key="my-api-key",  # Ensure this API key is correct
    content_formatter=CustomOpenAIContentFormatter(),
    model_kwargs={"temperature": 0.8, "max_new_tokens": 400},
)

response = llm.invoke("Write me a song about sparkling water:")
print(response)
```

Make sure that:

- the `endpoint_url` points to the scoring route of your deployment,
- the `endpoint_api_type` matches how the model was deployed (dedicated vs. serverless; see the sketch below), and
- the `endpoint_api_key` is valid for that endpoint.
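If the model was deployed from Azure AI Studio as a serverless (pay-as-you-go) endpoint rather than as a dedicated managed online endpoint, the `/score` path and `AzureMLEndpointApiType.dedicated` will not match it. Here is a minimal sketch of the serverless configuration, assuming an OpenAI-style `/v1/completions` route; the URL and key are placeholders, and the exact host depends on your deployment:

```python
from langchain_community.llms.azureml_endpoint import (
    AzureMLEndpointApiType,
    CustomOpenAIContentFormatter,
    AzureMLOnlineEndpoint,
)

# Placeholders: substitute your own serverless endpoint URL and key.
llm = AzureMLOnlineEndpoint(
    endpoint_url="https://<your-endpoint>.<your_region>.models.ai.azure.com/v1/completions",
    endpoint_api_type=AzureMLEndpointApiType.serverless,  # pay-as-you-go deployment
    endpoint_api_key="my-api-key",
    content_formatter=CustomOpenAIContentFormatter(),
    model_kwargs={"temperature": 0.8, "max_new_tokens": 400},
)

print(llm.invoke("Write me a song about sparkling water:"))
```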
If these parameters are correctly set and you still encounter a 404 error, verify that the endpoint is active and accessible; a quick way to do that is to call the scoring URL directly, as in the sketch below.
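A minimal sanity check outside of LangChain using `requests`, assuming a dedicated endpoint; the payload shape is an assumption (dedicated Azure ML deployments commonly expect an `input_data` wrapper), so check the "Consume" tab of your deployment for the exact schema:

```python
import os

import requests

# Placeholders: substitute your own endpoint URL and key.
url = "https://<your-endpoint>.<your_region>.inference.ml.azure.com/score"
api_key = os.getenv("LLAMA3_API_KEY")

headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}

# Assumed request body; adjust to match your deployment's expected schema.
payload = {
    "input_data": {
        "input_string": ["Write me a song about sparkling water:"],
        "parameters": {"temperature": 0.8, "max_new_tokens": 400},
    }
}

resp = requests.post(url, headers=headers, json=payload, timeout=60)
print(resp.status_code)  # 404 here means the URL/path is wrong or the deployment does not exist
print(resp.text)
```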