Replies: 2 comments
-
Yes, it can be used without OpenAI models; both the embedder and the LLM are customizable. To start, check out this example, which uses local models: https://github.com/pathwaycom/llm-app/blob/main/examples/pipelines/local/app.py
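  To illustrate the idea (not the exact llm-app API), here is a minimal sketch of a local embedding function built on SentenceTransformers. The model name `all-MiniLM-L6-v2` and the `embed` helper are illustrative assumptions; the actual place to plug a custom embedder into the pipeline is shown in the linked local example.

  ```python
  # Minimal sketch: computing embeddings fully offline with SentenceTransformers,
  # as a stand-in for the OpenAI embedding API. The model choice and the helper
  # name are assumptions, not prescribed by llm-app.
  from sentence_transformers import SentenceTransformer

  # Any locally available SentenceTransformers checkpoint works; this is a
  # common lightweight default.
  model = SentenceTransformer("all-MiniLM-L6-v2")

  def embed(texts: list[str]) -> list[list[float]]:
      """Return one embedding vector per input text, computed locally."""
      return model.encode(texts, convert_to_numpy=True).tolist()

  if __name__ == "__main__":
      vectors = embed(["What is Pathway?", "How do I use a local embedder?"])
      print(len(vectors), len(vectors[0]))  # e.g. 2 vectors of dimension 384
  ```

  The LLM can be swapped out in the same way; the reply above notes that both the embedder and the completion model are customizable, and the linked local example shows the full wiring.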
-
@bennylam Just to confirm: is the embedder model interface linked by @janchorowski above what you were looking to use?
-
I have installed several quantized open-source LLMs (e.g. LLaMA.cpp, ChatGLM2-6B-int4, etc.) for private chatbot apps, and I want to use a locally installed embedding model (e.g. SentenceTransformers) instead of the OpenAI embedding API.
My questions are:
Can the Pathway llm-app be used with open-source LLMs, without the OpenAI embedding API?
Are there any examples or tutorials on how to do that?
Thanks
Benny