Replies: 1 comment
-
Please give me a clue if anyone knows how to call llama_cpp.server using LangChain.
-
We now know we can run llama-cpp-python as a web server with `python3 -m llama_cpp.server --model blabla`, and the LlamaCpp server exposes endpoints that are compatible with the OpenAI API (navigate to http://localhost to see the OpenAPI documentation). But `langchain.llms.LlamaCpp` only creates a runtime model alongside the application code, which is sometimes inconvenient. Is there an API adapter for the LlamaCpp server, like `langchain.llms.OpenAI` and `langchain.chat_models.ChatOpenAI`?
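Since the server speaks the OpenAI wire format, one workaround is to talk to it directly over HTTP rather than through a dedicated LangChain adapter. The sketch below is an assumption-laden illustration, not a confirmed LangChain feature: the base URL `http://localhost:8000/v1`, the `/chat/completions` path, and the model name are placeholders you would adjust to your own server setup.

```python
import json
import urllib.request


def build_chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask_local_server(prompt: str,
                     base_url: str = "http://localhost:8000/v1",  # assumed local server address
                     model: str = "local-model") -> str:          # placeholder model name
    """POST to the server's OpenAI-compatible chat endpoint and return the reply text."""
    payload = build_chat_payload(model, prompt)
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses put the text under choices[0].message.content
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Requires a running server: python3 -m llama_cpp.server --model blabla
    print(ask_local_server("Hello!"))
```

Alternatively, because LangChain's OpenAI wrappers accept a configurable API base URL, pointing `ChatOpenAI` at the local server's base URL (with a dummy API key) is a commonly suggested route to the same effect.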