How to load user conversations and GPT responses persisted in RedisCache into langchain memory #5298
Unanswered · shashi1992 asked this question in Q&A
Replies: 0 comments
Hi All,
Context: I am building my own chatbot in which each user's questions and the responses to them are persisted in a Redis database. The queries and responses are persisted so that users can resume their chat conversations at a later point in time.
Things tried so far:
I used the RedisChatMessageHistory class from langchain.memory to persist the human and AI messages. I now want to load the persisted messages as memory into LLMChain via its memory parameter, the way it is done with ConversationBufferMemory.
I could not find any references for this.
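For reference, the persistence side described above can be sketched without a live Redis server. In the sketch below a plain dict stands in for Redis (mimicking RPUSH/LRANGE on a list key), and messages are stored as JSON with a type and content field, similar in spirit to how RedisChatMessageHistory serializes them. The class name, key format, and method names here are illustrative stand-ins, not the library's API:

```python
import json

# Stand-in for a Redis connection: maps a session key to a list of
# JSON-serialized messages (imitating RPUSH/LRANGE on a Redis list).
fake_redis: dict = {}

class DictChatMessageHistory:
    """Illustrative analogue of RedisChatMessageHistory, backed by a dict."""

    def __init__(self, session_id: str):
        # One Redis key per user session, so conversations stay separate.
        self.key = f"message_store:{session_id}"

    def add_user_message(self, text: str) -> None:
        self._append({"type": "human", "content": text})

    def add_ai_message(self, text: str) -> None:
        self._append({"type": "ai", "content": text})

    def _append(self, message: dict) -> None:
        fake_redis.setdefault(self.key, []).append(json.dumps(message))

    @property
    def messages(self) -> list:
        # Deserialize everything stored under this session's key.
        return [json.loads(raw) for raw in fake_redis.get(self.key, [])]

# Persist one question/answer pair for a user session.
history = DictChatMessageHistory(session_id="user-42")
history.add_user_message("What is LangChain?")
history.add_ai_message("A framework for building LLM applications.")
print(history.messages)
```

Re-creating a DictChatMessageHistory with the same session_id later sees the same stored messages, which is the resume-a-conversation behavior the real Redis-backed history provides.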
Guidance Needed:
Is there a way to load the conversations persisted in Redis into langchain memory?
If not, can the data stored in Redis be loaded as an LLM cache instead? Would that approach have any drawbacks as the number of tokens and the number of users grow?
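On the first question: ConversationBufferMemory exposes a chat_memory field, so it appears the persisted history can be handed to it directly, roughly `ConversationBufferMemory(chat_memory=RedisChatMessageHistory(session_id=..., url=...))`, with that memory object then passed to LLMChain's memory parameter (worth verifying against the installed langchain version). Since that exact wiring needs a running Redis instance, the sketch below imitates the same composition with stand-in classes; every name in it is hypothetical, chosen only to mirror the shape of the langchain objects:

```python
class StoredHistory:
    """Stand-in for RedisChatMessageHistory: messages already persisted
    for one session, e.g. previously loaded from Redis by session_id."""

    def __init__(self, messages):
        self.messages = messages

class BufferMemory:
    """Illustrative analogue of ConversationBufferMemory(chat_memory=...)."""

    def __init__(self, chat_memory, human_prefix="Human", ai_prefix="AI"):
        self.chat_memory = chat_memory
        self.prefixes = {"human": human_prefix, "ai": ai_prefix}

    def load_memory_variables(self, inputs: dict) -> dict:
        # Flatten the stored messages into the single "history" string
        # that a chain's prompt template would receive.
        lines = [
            f"{self.prefixes[m['type']]}: {m['content']}"
            for m in self.chat_memory.messages
        ]
        return {"history": "\n".join(lines)}

# Messages previously persisted for this user's session.
persisted = StoredHistory([
    {"type": "human", "content": "Hello"},
    {"type": "ai", "content": "Hi! How can I help?"},
])
memory = BufferMemory(chat_memory=persisted)
print(memory.load_memory_variables({})["history"])
# A real chain would receive this object as LLMChain(..., memory=memory).
```

The key design point is composition: the memory object does not own the storage, it wraps whatever chat-message-history backend it is given, so swapping an in-memory history for a Redis-backed one should not require changing the chain itself.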
Thanks,
Shashidhar