Chat GPT Constructor: Consider Model when caching
VesnaT committed Dec 20, 2023
1 parent b4b9c17 commit 90a39f7
Showing 1 changed file with 3 additions and 3 deletions.
orangecontrib/prototypes/widgets/owchatgptconstructor.py
@@ -48,15 +48,15 @@ def ask_gpt(self, state) -> List:
             if state.is_interruption_requested():
                 raise Exception

-            args = (text.strip(),
+            args = (MODELS[self.model_index],
+                    text.strip(),
                     self.prompt_start.strip(),
                     self.prompt_end.strip())
             if args in self.cache:
                 answer = self.cache[args]
             else:
                 try:
-                    answer = run_gpt(self.access_key, MODELS[self.model_index],
-                                     *args)
+                    answer = run_gpt(self.access_key, *args)
                     self.cache[args] = answer
                 except Exception as ex:
                     answer = ex
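The effect of the change: the selected model is now part of the cache key, so an answer cached while one model was active is not returned after the user switches to another model. A minimal sketch of the resulting caching pattern, not the widget code itself (fake_run_gpt is a hypothetical stand-in for the real run_gpt):

# Sketch of the cache-key pattern after this commit; fake_run_gpt is a
# hypothetical stand-in for run_gpt.
cache = {}

def fake_run_gpt(model, text, prompt_start, prompt_end):
    # Pretend each model produces a different answer.
    return f"{model}: answer to {text!r}"

def ask(model, text, prompt_start="", prompt_end=""):
    # The model is part of the key, so each model gets its own cache entry.
    key = (model, text.strip(), prompt_start.strip(), prompt_end.strip())
    if key not in cache:
        cache[key] = fake_run_gpt(model, *key[1:])
    return cache[key]

print(ask("gpt-3.5-turbo", "Summarize this."))
print(ask("gpt-4", "Summarize this."))  # recomputed, not reused across models

Before the commit, the key held only the text and the two prompts, so the second call above would have returned the answer produced by the first model.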
