Replies: 4 comments 3 replies
-
You set the temperature when you create the LLM object, e.g.:

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # api_key="...",
)
-
Thank you!
-
Also, have a look at the new training feature.
-
I set the LLM temperature=0, but the output is still different. Does CrewAI have models built in? How can I ensure that CrewAI's model input and output stay unchanged? Is it OK to set a fixed random seed?
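For background on the fixed-seed idea: seeding a random number generator makes stochastic sampling reproducible across runs. A minimal stdlib sketch (not CrewAI- or OpenAI-specific; the vocabulary and function name here are illustrative) showing why a fixed seed yields identical output:

```python
import random

def sample_tokens(vocab, n, seed):
    # A dedicated Random instance seeded with a fixed value
    # produces the same sequence of choices every time.
    rng = random.Random(seed)
    return [rng.choice(vocab) for _ in range(n)]

vocab = ["the", "cat", "sat", "on", "the", "mat"]
run1 = sample_tokens(vocab, 5, seed=42)
run2 = sample_tokens(vocab, 5, seed=42)
assert run1 == run2  # same seed -> identical "sampled" output
```

Note that a hosted LLM API can still return slightly different outputs even with temperature=0, because the seed above only controls client-side randomness; server-side factors are outside your control.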