Commit

Feature/enhance huggingfacepipeline to handle different return type (#11394)
**Description:** Avoid having HuggingFacePipeline truncate the response when the
user sets `return_full_text=False` on the underlying Hugging Face pipeline.

**Dependencies:** None
**Tag maintainer:** Maybe @sam-h-bean?

---------

Co-authored-by: Bagatur <[email protected]>
hsuyuming and baskaryan authored Oct 12, 2023
1 parent 2aba9ab commit 0b743f0
Showing 1 changed file with 17 additions and 2 deletions.
libs/langchain/langchain/llms/huggingface_pipeline.py

@@ -202,8 +202,23 @@ def _generate(
     response = response[0]

 if self.pipeline.task == "text-generation":
-    # Text generation return includes the starter text
-    text = response["generated_text"][len(batch_prompts[j]) :]
+    try:
+        from transformers.pipelines.text_generation import ReturnType
+
+        remove_prompt = (
+            self.pipeline._postprocess_params.get("return_type")
+            != ReturnType.NEW_TEXT
+        )
+    except Exception as e:
+        logger.warning(
+            f"Unable to extract pipeline return_type. "
+            f"Received error:\n\n{e}"
+        )
+        remove_prompt = True
+    if remove_prompt:
+        text = response["generated_text"][len(batch_prompts[j]) :]
+    else:
+        text = response["generated_text"]
 elif self.pipeline.task == "text2text-generation":
     text = response["generated_text"]
 elif self.pipeline.task == "summarization":
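The patch hinges on one decision: the old code always sliced `len(prompt)` characters off the front of `generated_text`, which is only correct when the pipeline echoes the prompt back (the `return_full_text=True` default). When the user passes `return_full_text=False`, the pipeline already returns only the new text, so slicing would silently drop the start of the actual completion. Below is a minimal, self-contained sketch of that decision; the `ReturnType` enum and `extract_text` helper here are simplified stand-ins for illustration, not the real transformers objects.

```python
from enum import Enum


class ReturnType(Enum):
    # Stand-in mirroring transformers.pipelines.text_generation.ReturnType
    FULL_TEXT = 0
    NEW_TEXT = 1


def extract_text(generated_text: str, prompt: str, postprocess_params: dict) -> str:
    """Strip the prompt only if the pipeline echoed it back in the output."""
    remove_prompt = postprocess_params.get("return_type") != ReturnType.NEW_TEXT
    return generated_text[len(prompt):] if remove_prompt else generated_text


prompt = "Q: 2+2? A:"

# return_full_text=True (the default): output includes the prompt, so slice it off.
full = extract_text(prompt + " 4", prompt, {"return_type": ReturnType.FULL_TEXT})

# return_full_text=False: the pipeline already stripped the prompt; slicing here
# would truncate the real answer, which is the bug this commit fixes.
new = extract_text(" 4", prompt, {"return_type": ReturnType.NEW_TEXT})

print(repr(full), repr(new))  # → ' 4' ' 4'
```

The `try/except` in the actual patch exists because `_postprocess_params` is a private transformers attribute; if it is missing or the import fails, the code falls back to the old prompt-stripping behavior rather than raising.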
