Replies: 1 comment
- Like so :)
- See:
https://github.com/hwchase17/langchain/blob/master/langchain/chains/combine_documents/map_reduce.py
with the default
token_max: int = 3000
Am I correct that if, after the map step, the combined result is > 3000 tokens, it is collapsed further using the collapse_llm or collapse_prompt?
Curious why the default is set to 3000?
How do we override this when using map-reduce chains for, say, Q&A?
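
If I'm reading the same version of map_reduce.py, that understanding looks right: combine_docs splits the mapped results into groups that fit under token_max and collapses each group with the collapse chain (built from collapse_llm/collapse_prompt when given, otherwise the combine chain), looping until the total fits. The 3000 default presumably just leaves headroom under the ~4k-token context window of the original OpenAI completion models. As for overriding it, below is a minimal sketch, assuming an older langchain release where extra inputs passed to the chain are forwarded to combine_docs as keyword arguments; the document contents, question text, and token value are illustrative:

```python
# Sketch only: assumes an older langchain release where extra chain inputs are
# forwarded to MapReduceDocumentsChain.combine_docs() as kwargs, which is where
# token_max: int = 3000 is consumed. Check the signature in your installed version.
from langchain.llms import OpenAI
from langchain.docstore.document import Document
from langchain.chains.question_answering import load_qa_chain

llm = OpenAI(temperature=0)
chain = load_qa_chain(llm, chain_type="map_reduce")

docs = [Document(page_content="...")]  # your split documents go here

result = chain(
    {
        "input_documents": docs,
        "question": "What does token_max control?",  # illustrative question
        "token_max": 6000,  # raise the collapse threshold above the 3000 default
    }
)
print(result["output_text"])
```

Newer releases expose this differently: token_max is a field on ReduceDocumentsChain, so if the kwarg pass-through above doesn't work on your version, set it when constructing the reduce chain instead.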