fix: Replace deprecated text-davinci-003 model with gpt-3.5-turbo-instruct model #265

Merged: 1 commit, Dec 29, 2023
4 changes: 2 additions & 2 deletions tutorials/21_Customizing_PromptNode.ipynb
@@ -102,7 +102,7 @@
"source": [
"## Trying Out PromptNode\n",
"\n",
"The PromptNode is the central abstraction in Haystack's large language model (LLM) support. It uses [`google/flan-t5-base`](https://huggingface.co/google/flan-t5-base) model by default, but you can replace the default model with a flan-t5 model of a different size such as `google/flan-t5-large` or a model by OpenAI such as `text-davinci-003`.\n",
"The PromptNode is the central abstraction in Haystack's large language model (LLM) support. It uses [`google/flan-t5-base`](https://huggingface.co/google/flan-t5-base) model by default, but you can replace the default model with a flan-t5 model of a different size such as `google/flan-t5-large` or a model by OpenAI such as `gpt-3.5-turbo-instruct`.\n",
"\n",
"[Large language models](https://docs.haystack.deepset.ai/docs/language_models#large-language-models-llms) are huge models trained on enormous amounts of data. That’s why these models have general knowledge of the world, so you can ask them anything and they will be able to answer.\n",
"\n",
@@ -145,7 +145,7 @@
"source": [
"> Note: To use PromptNode with an OpenAI model, change the model name and provide an `api_key`: \n",
"> ```python\n",
"> prompt_node = PromptNode(model_name_or_path=\"text-davinci-003\", api_key=<YOUR_API_KEY>)\n",
"> prompt_node = PromptNode(model_name_or_path=\"gpt-3.5-turbo-instruct\", api_key=<YOUR_API_KEY>)\n",
"> ```"
]
},
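For anyone reviewing this file, here is a minimal sketch of how the updated snippet runs under Haystack 1.x; the default flan-t5 call and the example question are illustrative only and not part of this diff, and `<YOUR_API_KEY>` must be replaced with a real OpenAI key.

```python
from haystack.nodes import PromptNode

# Default local model (google/flan-t5-base); no API key required.
local_node = PromptNode()
print(local_node("What is the capital of Germany?"))

# OpenAI model as updated in this PR; replace the placeholder with a real key.
openai_node = PromptNode(model_name_or_path="gpt-3.5-turbo-instruct", api_key="<YOUR_API_KEY>")
print(openai_node("What is the capital of Germany?"))
```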
4 changes: 2 additions & 2 deletions tutorials/23_Answering_Multihop_Questions_with_Agents.ipynb
@@ -419,7 +419,7 @@
"\n",
"The `Agent` needs to determine the next best course of action at each iteration. It does this by using an LLM, and a prompt designed specially for this use case. Our `Agent` uses a `PromptNode` with the default [\"zero-shot-react\" `PromptTemplate` ](https://github.com/deepset-ai/haystack/blob/444a3116c42d2c8852d27aa8093ac92c8e85ab88/haystack/nodes/prompt/prompt_node.py#L337). \n",
"\n",
"Here, let's define an `Agent` that uses the `text-davinci-003` model by OpenAI."
"Here, let's define an `Agent` that uses the `gpt-3.5-turbo-instruct` model by OpenAI."
]
},
{
@@ -433,7 +433,7 @@
"from haystack.agents import Agent\n",
"from haystack.nodes import PromptNode\n",
"\n",
"prompt_node = PromptNode(model_name_or_path=\"text-davinci-003\", api_key=api_key, stop_words=[\"Observation:\"])\n",
"prompt_node = PromptNode(model_name_or_path=\"gpt-3.5-turbo-instruct\", api_key=api_key, stop_words=[\"Observation:\"])\n",
"agent = Agent(prompt_node=prompt_node)"
]
},
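As a sanity check for this hunk, a rough sketch of running the updated `Agent`: it assumes the OpenAI key is available in the environment and that a search tool has already been registered with `agent.add_tool(...)`, as the tutorial does later; the question below is only an illustration, not part of this diff.

```python
import os

from haystack.agents import Agent
from haystack.nodes import PromptNode

api_key = os.environ["OPENAI_API_KEY"]  # assumes the key is exported in the environment

# Same construction as in the notebook, with the new instruct model.
prompt_node = PromptNode(
    model_name_or_path="gpt-3.5-turbo-instruct",
    api_key=api_key,
    stop_words=["Observation:"],
)
agent = Agent(prompt_node=prompt_node)

# The tutorial registers a search tool before asking multihop questions;
# without a tool the agent has nothing to call.
result = agent.run("Which country is the author of Harry Potter from?")
print(result["answers"])
```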
2 changes: 1 addition & 1 deletion tutorials/25_Customizing_Agent.ipynb
@@ -285,7 +285,7 @@
")\n",
"\n",
"prompt_node = PromptNode(\n",
" model_name_or_path=\"text-davinci-003\", api_key=openai_api_key, default_prompt_template=prompt_template\n",
" model_name_or_path=\"gpt-3.5-turbo-instruct\", api_key=openai_api_key, default_prompt_template=prompt_template\n",
")\n",
"\n",
"generative_pipeline = Pipeline()\n",
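Finally, a hedged sketch of how the updated `PromptNode` slots into the rest of the generative pipeline in this tutorial; the stand-in retriever, the node names, and the example query are assumptions based on the surrounding notebook, and the real notebook also passes `default_prompt_template=prompt_template` as shown in the hunk above.

```python
import os

from haystack import Pipeline
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, PromptNode

# Stand-in components; the tutorial populates the document store with real documents
# and defines a custom prompt_template before this point.
document_store = InMemoryDocumentStore(use_bm25=True)
retriever = BM25Retriever(document_store=document_store)
prompt_node = PromptNode(
    model_name_or_path="gpt-3.5-turbo-instruct",
    api_key=os.environ["OPENAI_API_KEY"],
)

generative_pipeline = Pipeline()
generative_pipeline.add_node(component=retriever, name="retriever", inputs=["Query"])
generative_pipeline.add_node(component=prompt_node, name="prompt_node", inputs=["retriever"])

# The PromptNode's generations are returned under the "results" key.
response = generative_pipeline.run("What does the Rhodes Statue look like?")
print(response["results"])
```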