Commit
App Submission: Open WebUI (#1977)
Co-authored-by: nmfretz <[email protected]>
al-lac and nmfretz authored Jan 30, 2025
1 parent 0a18f6f commit 5cd488e
Showing 3 changed files with 58 additions and 0 deletions.
Empty file.
18 changes: 18 additions & 0 deletions open-webui/docker-compose.yml
@@ -0,0 +1,18 @@
version: '3.7'

services:
  app_proxy:
    environment:
      APP_HOST: open-webui_web_1
      APP_PORT: 8080
      PROXY_AUTH_ADD: "false"

  web:
    image: ghcr.io/open-webui/open-webui:v0.5.7@sha256:b9a3425659236186df16ccf4432a247a353e54dec9549fb475d8b57f0c29a93d
    volumes:
      - ${APP_DATA_DIR}/data/open-webui:/app/backend/data
    environment:
      # Exported from ollama app, which is currently a required dependency.
      # This will need to change once optional dependencies are supported.
      OLLAMA_BASE_URL: $APP_OLLAMA_URL
    restart: on-failure
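
As a side note on the compose file above: the web container locates its model backend through OLLAMA_BASE_URL, which Umbrel injects from the Ollama app as $APP_OLLAMA_URL. Below is a minimal connectivity-check sketch, assuming Ollama's documented /api/tags endpoint; the localhost fallback URL is purely illustrative and not part of this submission.

import json
import os
import urllib.request

# OLLAMA_BASE_URL is provided to the container by the compose file above;
# the localhost fallback here is only an assumption for local testing.
base_url = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")

# GET /api/tags lists the models Ollama currently has available.
with urllib.request.urlopen(f"{base_url}/api/tags", timeout=10) as resp:
    tags = json.load(resp)

for model in tags.get("models", []):
    print(model.get("name"))

Run inside the web container (or anywhere that can reach the Ollama app), this should print the names of the locally available models.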
40 changes: 40 additions & 0 deletions open-webui/umbrel-app.yml
@@ -0,0 +1,40 @@
manifestVersion: 1
id: open-webui
name: Open WebUI
tagline: User-friendly AI Interface
category: ai
version: "0.5.7"
port: 2876
description: >-
  Open WebUI lets you chat with advanced AI models running locally on your own device or connect to online models using an API key.

  **Getting Started with Local AI Models:**

  🦙 Install Ollama: Start by installing the Ollama app from the Umbrel App Store. Ollama enables you to download and run large language models like Llama 3 and DeepSeek-R1 directly on your device.

  ⬇️ Download a Model: In the Open WebUI app, type the name of the model you want in the search bar and click “Pull from Ollama.com.” A full list of models is available at https://ollama.com/.

  🤖 Example - Running DeepSeek-R1 1.5B: To use the DeepSeek-R1 model with 1.5 billion parameters, type deepseek-r1:1.5b in the search bar and start the download. Once it's ready, you can chat with the model directly in Open WebUI.

  ⚠️ Warning: Before running a model, make sure your device has enough free RAM to support it. Attempting to run a model that exceeds your available memory could cause your device to crash or become unresponsive. Always check the model requirements before downloading or starting it.
developer: Open WebUI
website: https://openwebui.com/
submitter: al-lac
submission: https://github.com/getumbrel/umbrel-apps/pull/1977
repo: https://github.com/open-webui/open-webui
support: https://github.com/open-webui/open-webui/issues
gallery:
  - 1.jpg
  - 2.jpg
  - 3.jpg
defaultUsername: ""
defaultPassword: ""
dependencies:
  - ollama
releaseNotes: ""
path: ""
