MODELS=`[ variable problem when I docker run #1436
Comments
Docker doesn't like the formatting of the `MODELS` variable. Try passing the file's contents in a single `DOTENV_LOCAL` environment variable instead; that should work.
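A minimal sketch of that approach (assuming the image picks up `DOTENV_LOCAL`, as suggested in this thread):

```bash
# Load the whole .env.local into one variable, then pass it through to the container.
DOTENV_LOCAL=$(<.env.local) docker run -d -p 3000:3000 \
  -e DOTENV_LOCAL \
  -v chat-ui:/data \
  ghcr.io/huggingface/chat-ui-db
```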
You can also bind mount the `.env.local` file into the container, something like the sketch below.
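A hedged sketch of the bind mount; the `/app/.env.local` destination path inside the container is an assumption here, so adjust it to the image's actual layout:

```bash
# Mount the local .env.local read-only into the container's app directory.
# The /app/.env.local destination is an assumption; adjust to the image's layout.
docker run -d -p 3000:3000 \
  -v "$(pwd)/.env.local:/app/.env.local:ro" \
  -v chat-ui:/data \
  ghcr.io/huggingface/chat-ui-db
```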
I tried this solution, but none of my changes are picked up when I access the UI. I also tried the `DOTENV_LOCAL` approach (the `--preserve-env=DOTENV_LOCAL` flag keeps the variable across `sudo`):

```bash
DOTENV_LOCAL=$(<.env.local) sudo --preserve-env=DOTENV_LOCAL docker run -d -p 3000:3000 \
  --env-file /dev/null -e DOTENV_LOCAL -v chat-ui:/data --name chat-ui \
  --network proxy ghcr.io/huggingface/chat-ui-db \
  && sudo docker network connect backend chat-ui
```

The file still doesn't seem to be taken into account.
Also, I tried this solution with my `.env.local`. Using it, when I remove the `MODELS` variable everything works well, so I think the problem is related to the Ollama configuration.
Here's an update on the situation. Using the configuration from the documentation at https://huggingface.co/docs/chat-ui/configuration/models/providers/ollama, I get this error:
I changed that part following this documentation instead, https://github.com/huggingface/chat-ui/blob/main/docs/source/configuration/models/providers/ollama.md, and I don't get the 500 error ("An error occured" ❌) anymore! Here's the changed part:
But now I get a "fetch error" ❌ when I ask my AI model a question. Here is the error when I execute:
I don't know what that means exactly. I don't think it's an Ollama endpoint problem, because there is no error indicating that.
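One way to sanity-check that hypothesis is to probe the Ollama API directly over the shared Docker network; the `ollama` hostname and `backend` network below are the names used earlier in this thread, so adjust them if yours differ:

```bash
# Probe the Ollama API from a throwaway container on the same Docker network.
# "ollama" and "backend" are the names used in this thread; adjust as needed.
docker run --rm --network backend curlimages/curl -s http://ollama:11434/api/tags
```

If this returns a JSON list of models, the endpoint is reachable and the problem is elsewhere in the configuration.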
Everything now works on my side; the problem was the Ollama configuration inside the `MODELS` variable. Here is the working `.env.local` entry:

```env
MODELS=`[
  {
    "name": "Ollama Mistral",
    "chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}} {{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s> {{/ifAssistant}}{{/each}}",
    "parameters": {
      "temperature": 0.1,
      "top_p": 0.95,
      "repetition_penalty": 1.2,
      "top_k": 50,
      "truncate": 3072,
      "max_new_tokens": 1024,
      "stop": ["</s>"]
    },
    "endpoints": [
      {
        "type": "ollama",
        "url": "http://ollama:11434",
        "ollamaName": "mistral"
      }
    ]
  }
]`
```

Thanks a lot!
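A usage note, not from the original thread: `ollamaName` must match a model that has already been pulled inside the Ollama container. Assuming the container is named `ollama` as in the config above:

```bash
# Make sure the "mistral" model referenced by ollamaName actually exists in Ollama.
docker exec -it ollama ollama pull mistral
```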
Hello,
I want to use the Mistral model through Ollama, and I followed the documentation below: https://huggingface.co/docs/chat-ui/configuration/models/providers/ollama
deploy.sh:
docker-compose.yml:
.env.local:
When I start my script, at the end of the execution the container fails to launch and I get the following error:
I already tried putting the chat-ui and mongodb containers in the same docker-compose.yml, and it doesn't work, same as in this issue: #614. Any solutions?
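For anyone comparing setups, here is a minimal sketch of that single-compose layout; the service names, image tags, in-container `.env.local` path, and `MONGODB_URL` wiring are illustrative assumptions, not the poster's actual files:

```yaml
# Hypothetical minimal docker-compose.yml: chat-ui plus a separate MongoDB service.
services:
  chat-ui:
    image: ghcr.io/huggingface/chat-ui   # non-db image, since MongoDB runs separately
    ports:
      - "3000:3000"
    environment:
      - MONGODB_URL=mongodb://mongodb:27017
    volumes:
      - ./.env.local:/app/.env.local:ro   # assumed in-container path
    depends_on:
      - mongodb
  mongodb:
    image: mongo:latest
    volumes:
      - mongo-data:/data/db
volumes:
  mongo-data:
```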
Thanks in advance.