diff --git a/docs/how-to/llm-connections.mdx b/docs/how-to/llm-connections.mdx
index 542a9c1105..a2fc540ccf 100644
--- a/docs/how-to/llm-connections.mdx
+++ b/docs/how-to/llm-connections.mdx
@@ -125,10 +125,10 @@ You can connect to OpenAI-compatible LLMs using either environment variables or
-    ```python Code
-    llm = LLM(
-        model="custom-model-name",
-        api_key="your-api-key",
+    ```python Code
+    llm = LLM(
+        model="custom-model-name",
+        api_key="your-api-key",
         base_url="https://api.your-provider.com/v1"
     )
     agent = Agent(llm=llm, ...)
@@ -179,4 +179,4 @@ This is particularly useful when working with OpenAI-compatible APIs or when you
 
 ## Conclusion
 
-By leveraging LiteLLM, CrewAI offers seamless integration with a vast array of LLMs. This flexibility allows you to choose the most suitable model for your specific needs, whether you prioritize performance, cost-efficiency, or local deployment. Remember to consult the [LiteLLM documentation](https://docs.litellm.ai/docs/) for the most up-to-date information on supported models and configuration options.
\ No newline at end of file
+By leveraging LiteLLM, CrewAI offers seamless integration with a vast array of LLMs. This flexibility allows you to choose the most suitable model for your specific needs, whether you prioritize performance, cost-efficiency, or local deployment. Remember to consult the [LiteLLM documentation](https://docs.litellm.ai/docs/) for the most up-to-date information on supported models and configuration options.