Local Langchain chatbot with chroma vector storage memory #12902
ossirytk started this conversation in Show and tell
Replies: 1 comment · 10 replies
I've added support for JSON lorebooks and metadata filtering. It's a bit hacky at the moment, but I'll look at improving the filtering once Chroma supports more complex operations.
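For context, Chroma-style metadata filtering means restricting a vector query with a `where` clause matched against each document's metadata. The operator names below (`$eq`, `$ne`, `$in`, `$and`, `$or`) mirror Chroma's public filter syntax, but this matcher is a simplified pure-Python illustration of the semantics, not Chroma's own code or this project's implementation:

```python
# Sketch of Chroma-style "where" metadata filter semantics.
# Simplified illustration only; Chroma evaluates these server-side.

def matches(where: dict, metadata: dict) -> bool:
    """Return True if a document's metadata satisfies the filter."""
    for key, cond in where.items():
        if key == "$and":
            if not all(matches(sub, metadata) for sub in cond):
                return False
        elif key == "$or":
            if not any(matches(sub, metadata) for sub in cond):
                return False
        elif isinstance(cond, dict):
            # Operator form, e.g. {"topic": {"$in": ["lore", "world"]}}
            op, value = next(iter(cond.items()))
            if op == "$eq" and metadata.get(key) != value:
                return False
            if op == "$ne" and metadata.get(key) == value:
                return False
            if op == "$in" and metadata.get(key) not in value:
                return False
        else:
            # Shorthand equality, e.g. {"source": "lorebook.json"}
            if metadata.get(key) != cond:
                return False
    return True

docs = [
    {"text": "The kingdom's founding myth",
     "meta": {"source": "lorebook.json", "topic": "lore"}},
    {"text": "Chat transcript line",
     "meta": {"source": "chat", "topic": "memory"}},
]
# Only the lorebook entry survives the filter.
hits = [d["text"] for d in docs if matches({"source": "lorebook.json"}, d["meta"])]
print(hits)
```

In real Chroma usage the same filter dict is passed as the `where` argument to a collection query; only documents whose metadata matches are considered for similarity search.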
I find that there is a woeful lack of more complex examples. To help with that, I've made a character AI chatbot with Chroma vector storage memory to serve as an example and a simple drop-in platform for testing things.
The main chatbot is built using llama-cpp-python, LangChain, and Chainlit. It supports JSON, YAML, V2, and Tavern character card formats.
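A V2 character card is a JSON document with a `"spec": "chara_card_v2"` marker and the character fields nested under `"data"` (Tavern PNG cards additionally base64-encode this JSON inside an image chunk, which is not handled here). The loader below is a hypothetical sketch of reading such a card; the project's actual parser may differ:

```python
import json

# Hypothetical V2 character card loader. Field names follow the public
# "chara_card_v2" layout; this is an illustration, not the project's code.

def load_card(raw: str) -> dict:
    card = json.loads(raw)
    if card.get("spec") == "chara_card_v2":
        data = card["data"]  # V2 nests character fields under "data"
    else:
        data = card          # older flat JSON card
    return {
        "name": data.get("name", ""),
        "description": data.get("description", ""),
        "first_message": data.get("first_mes", ""),
    }

v2 = ('{"spec": "chara_card_v2", "spec_version": "2.0", '
      '"data": {"name": "Aria", "description": "A librarian.", '
      '"first_mes": "Hello."}}')
print(load_card(v2)["name"])  # Aria
```

Normalizing both card shapes into one dict up front keeps the rest of the chatbot indifferent to which format the card arrived in.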
It uses LangChain's LlamaCpp embeddings to parse documents into Chroma vector storage collections. There is also a test script for querying and testing the collections.
Everything is local and written in Python. It runs in a virtual env with minimal dependencies, needs no API keys, and runs on CPU with optional GPU offloading.
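The ingestion step described above (parsing documents into Chroma collections) generally means splitting each document into overlapping chunks before embedding them. The project presumably uses LangChain's text splitters and LlamaCpp embeddings for this; the plain character-based chunker below just illustrates the chunk/overlap idea:

```python
# Sketch of the chunking that precedes embedding into a Chroma collection.
# Plain character windows with overlap; real splitters are token- or
# separator-aware, so treat this as an illustration only.

def chunk_text(text: str, size: int = 512, overlap: int = 64) -> list[str]:
    """Split text into `size`-character chunks, carrying `overlap` chars over."""
    if size <= overlap:
        raise ValueError("chunk size must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap  # step back by `overlap` for context carryover
    return chunks

# Each chunk would then be embedded and stored with its metadata, e.g.
# (embedding, {"source": "lorebook.json", "chunk": i}), so queries can
# filter on metadata as well as vector similarity.
```

The overlap matters because a fact split across a chunk boundary would otherwise be unretrievable by similarity search against either half.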
https://github.com/ossirytk/llama-cpp-chat-memory