Commit

CHANGELOG
amaiya committed Dec 19, 2024
1 parent e19ea89 commit 479afbc
Showing 3 changed files with 3 additions and 3 deletions.
CHANGELOG.md (2 changes: 1 addition, 1 deletion)

@@ -15,7 +15,7 @@ Most recent releases are shown at the top. Each release shows:
 - N/A

 ### fixed:
-- Temporary fix for chat template issue (#113)
+- Fix for HF chat template issue (#113/#114)


 ## 0.7.0 (2024-12-16)
nbs/00_llm.base.ipynb (2 changes: 1 addition, 1 deletion)

@@ -575,8 +575,8 @@
 " prompt = U.format_string(prompt_template, prompt=prompt)\n",
 " stop = stop if stop else self.stop\n",
 " if self.is_hf():\n",
-" # Temporary fix for ISSUE #113\n",
 " tokenizer = llm.llm.pipeline.tokenizer\n",
+" # FIX for #113/#114\n",
 " prompt = [{'role':'user', 'content':prompt}] if tokenizer.chat_template else prompt\n",
 " # Call HF pipeline directly instead of `invoke`\n",
 " # since LangChain is not passing along stop_strings\n",
onprem/llm/base.py (2 changes: 1 addition, 1 deletion)

@@ -540,8 +540,8 @@ def prompt(self,
 prompt = U.format_string(prompt_template, prompt=prompt)
 stop = stop if stop else self.stop
 if self.is_hf():
-    # Temporary fix for ISSUE #113
     tokenizer = llm.llm.pipeline.tokenizer
+    # FIX for #113/#114
     prompt = [{'role':'user', 'content':prompt}] if tokenizer.chat_template else prompt
     # Call HF pipeline directly instead of `invoke`
     # since LangChain is not passing along stop_strings
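The change in both files is the same one-line fix: when the Hugging Face tokenizer defines a `chat_template`, the raw prompt string is wrapped in the chat-messages format (a list of `{'role': ..., 'content': ...}` dicts) before being handed to the HF pipeline; otherwise the plain string is passed through. A minimal standalone sketch of that logic, using an illustrative `format_prompt` helper and a stub tokenizer in place of a real `transformers` tokenizer (which exposes the same `chat_template` attribute):

```python
def format_prompt(prompt, tokenizer):
    """Wrap `prompt` as chat messages if the tokenizer has a chat template."""
    if getattr(tokenizer, "chat_template", None):
        return [{"role": "user", "content": prompt}]
    return prompt


class FakeTokenizer:
    """Stub standing in for a transformers tokenizer in this sketch."""
    def __init__(self, chat_template=None):
        self.chat_template = chat_template


plain = FakeTokenizer(chat_template=None)            # e.g. a base LM tokenizer
chatty = FakeTokenizer(chat_template="{{ messages }}")  # e.g. an instruct-model tokenizer

print(format_prompt("Hello", plain))   # -> Hello
print(format_prompt("Hello", chatty))  # -> [{'role': 'user', 'content': 'Hello'}]
```

The `getattr` check mirrors the diff's `if tokenizer.chat_template` guard: base models without a chat template keep receiving raw strings, while instruct-tuned models get the message structure their template expects.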
