Replies: 1 comment
-
Hi Chris, this isn't an issue 🫡, so I've moved it to the Discussions tab for now.
-
I have a networked computer running Linux, on which I've set up an https://ollama.ai service.
The steps to start the Ollama service are:
- install and run Ollama, which is an LLM server
- install LiteLLM, which is an OpenAI API emulator
- run the LiteLLM proxy
This means I now have an OpenAI-compatible endpoint at 192.168.0.15:9000.
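For reference, the steps above can be sketched roughly as the following commands. This is a hedged sketch, not the poster's exact invocation: the model name `llama2` and the `--port 9000` flag are assumptions chosen to match the endpoint mentioned above.

```shell
# Assumed commands for a stock Ollama + LiteLLM setup (model name is an example)
ollama serve &                              # start the Ollama LLM server (default port 11434)
ollama pull llama2                          # fetch an example model
litellm --model ollama/llama2 --port 9000   # start the OpenAI-compatible LiteLLM proxy
```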
Unfortunately, when I tried using the ChatGPT wizard, it came back with the error "No answer, try again".
On the Linux side, the log said:
Docs: https://docs.litellm.ai/docs/proxy_server
Worker Initialized
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:9000 (Press CTRL+C to quit)
192.168.0.196:50629 - "POST / HTTP/1.1" 405 Method Not Allowed
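The `405 Method Not Allowed` on `POST /` suggests the client is posting to the server root rather than to an OpenAI-style route. As a hedged sketch (the base URL matches the setup above; the model name is a hypothetical example), a request that an OpenAI-compatible proxy such as LiteLLM would accept looks roughly like this:

```python
import json

# Assumption: LiteLLM proxy listening at this address, as described above.
base_url = "http://192.168.0.15:9000"

# OpenAI-compatible servers expect chat requests at /chat/completions,
# not at the root path "/" -- a POST to "/" yields 405 Method Not Allowed.
endpoint = f"{base_url}/chat/completions"

payload = {
    "model": "ollama/llama2",  # hypothetical model identifier
    "messages": [{"role": "user", "content": "Hello"}],
}

# The client should POST this JSON body to `endpoint`;
# here we just print what the request would look like.
print(endpoint)
print(json.dumps(payload))
```

If the ChatGPT wizard only lets you configure a base URL, the fix may be pointing it at the full path (or checking its docs for which route it appends).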
A Google search shows others with this problem:
openai/chatgpt-retrieval-plugin#74