From 56a4d78bb7032d2e13eb1abf613f6c4b59934616 Mon Sep 17 00:00:00 2001
From: Carmelo Daniele <32391685+c-daniele@users.noreply.github.com>
Date: Sat, 20 Apr 2024 19:12:15 +0200
Subject: [PATCH] Update prompt.py

The current prompt works reliably only with the OpenAI LLM. I ran several
tests with other models such as LLAMA2, Claude 2, and Claude 3, and they all
failed with a "No Connection Adapter" error. It turned out to be a minor
issue in the URL prompt: these models prepend explanatory text to the
generated URL, which APIChain then tries to request as-is. With this small
prompt adjustment, APIChain also works with LLAMA2 and Claude.
---
 libs/langchain/langchain/chains/api/prompt.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/libs/langchain/langchain/chains/api/prompt.py b/libs/langchain/langchain/chains/api/prompt.py
index 0ffc389ad3d06..21bf5cb4593b8 100644
--- a/libs/langchain/langchain/chains/api/prompt.py
+++ b/libs/langchain/langchain/chains/api/prompt.py
@@ -5,6 +5,7 @@
 {api_docs}
 Using this documentation, generate the full API url to call for answering the user question.
 You should build the API url in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call.
+Return just the raw API URL without any prefix text.
 
 Question:{question}
 API url:"""
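
For reference, a minimal sketch of how the failure can be reproduced and verified. The Bedrock-hosted Claude model, the Open-Meteo API docs, and the sample question are illustrative assumptions, not part of this patch; any non-OpenAI LLM wrapper would do.

# Illustrative repro sketch (assumptions: Bedrock-hosted Claude, Open-Meteo docs).
# APIChain passes the model's completion straight to requests, so any prefix text
# such as "Here is the API url: ..." triggers the "No connection adapters" error.
from langchain.chains import APIChain
from langchain_community.llms import Bedrock

api_docs = """
BASE URL: https://api.open-meteo.com/v1/forecast
Parameters: latitude (float), longitude (float), current_weather (bool)
"""

llm = Bedrock(model_id="anthropic.claude-v2")  # assumption: any non-OpenAI LLM

chain = APIChain.from_llm_and_api_docs(
    llm,
    api_docs,
    limit_to_domains=["https://api.open-meteo.com/"],
    verbose=True,
)

# With the extra prompt line the completion is a bare URL and the request succeeds.
print(chain.run("What is the current temperature at latitude 45.46, longitude 9.19?"))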