How to change top_p in ChatOpenAI #14887
I couldn't find the `top_p` parameter in the `ChatOpenAI` class. Could anyone provide insights or guidance on how to modify it?
Replies: 1 comment
🤖

Hey there, @SpaceDandy13! 👋 I'm Dosu, your friendly bot helper. I'm here to assist you with bugs, answer your questions, and guide you towards becoming a contributor while the human maintainers are busy. Just think of me as your personal guide in the world of LangChain. Let's tackle this coding journey together, shall we? 😊🚀

To modify the `top_p` parameter in the `ChatOpenAI` class in LangChain, you can pass it as a key-value pair in the `model_kwargs` dictionary when creating an instance of the class. The `model_kwargs` dictionary holds any model parameters valid for the underlying `create` call that are not explicitly specified in the class.

Here is an example of how you can do this:

```python
from langchain_community.chat_models import ChatOpenAI

openai = ChatOpenAI(model_name="gpt-3.5-turbo", model_kwargs={"top_p": 0.9})
```

In this example, `top_p` is set to 0.9 through `model_kwargs`, so it is forwarded to the OpenAI API with each request.

I hope this helps! If you have any other questions, feel free to ask.
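Conceptually, anything placed in `model_kwargs` is merged into the parameters sent with each API call. The sketch below illustrates that merging pattern without requiring LangChain or an API key; `build_payload` is a made-up helper for illustration, not the actual LangChain implementation.

```python
def build_payload(model_name: str, temperature: float = 0.7, **model_kwargs) -> dict:
    """Merge explicitly declared parameters with extra keyword arguments,
    mimicking how ChatOpenAI forwards unrecognized model parameters
    (top_p, presence_penalty, ...) to the OpenAI `create` call.
    Simplified illustration only -- not the real LangChain source."""
    payload = {"model": model_name, "temperature": temperature}
    payload.update(model_kwargs)  # extra parameters land alongside the declared ones
    return payload


payload = build_payload("gpt-3.5-turbo", top_p=0.9)
# payload == {"model": "gpt-3.5-turbo", "temperature": 0.7, "top_p": 0.9}
```

This is why passing `{"top_p": 0.9}` via `model_kwargs` works even though `top_p` is not a declared field on the class: the dictionary is flattened into the request payload as-is.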
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.