Refactor Retrievers related Examples #1387

Merged · 18 commits · Jan 16, 2025
4 changes: 2 additions & 2 deletions ChatQnA/docker_compose/amd/gpu/rocm/README.md
@@ -94,7 +94,7 @@ cd GenAIComps
### 2. Build Retriever Image

```bash
-docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile .
+docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile .
```
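A quick, optional sanity check (not part of the original guide) confirms the image now exists under its new unified name:

```bash
# List local images and confirm the renamed retriever image is present
docker images | grep opea/retriever
```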

### 3. Build Dataprep Image
@@ -143,7 +143,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a

Then run the command `docker images`, you will have the following 5 Docker Images:

-1. `opea/retriever-redis:latest`
+1. `opea/retriever:latest`
2. `opea/dataprep-redis:latest`
3. `opea/chatqna:latest`
4. `opea/chatqna-ui:latest` or `opea/chatqna-react-ui:latest`
4 changes: 3 additions & 1 deletion ChatQnA/docker_compose/amd/gpu/rocm/compose.yaml
@@ -49,7 +49,7 @@ services:
security_opt:
- seccomp:unconfined
chatqna-retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: chatqna-retriever-redis-server
depends_on:
- chatqna-redis-vector-db
@@ -63,6 +63,8 @@ services:
REDIS_URL: ${CHATQNA_REDIS_URL}
INDEX_NAME: ${CHATQNA_INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: ${CHATQNA_TEI_EMBEDDING_ENDPOINT}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
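The two added variables carry the refactor: `RETRIEVER_COMPONENT_NAME` selects which backend the unified `opea/retriever` image loads (Redis here, Qdrant or Pinecone in the other compose files), and `LOGFLAG` toggles verbose logging. A minimal sketch for verifying the new settings take effect — setting `LOGFLAG=true` is an assumption about its accepted values, not something this file mandates:

```bash
# Enable verbose logging, recreate the retriever service, and check its startup log
export LOGFLAG=true
docker compose up -d chatqna-retriever
docker logs chatqna-retriever-redis-server 2>&1 | grep -i retriever
```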
chatqna-tei-reranking-service:
image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
4 changes: 2 additions & 2 deletions ChatQnA/docker_compose/intel/cpu/aipc/README.md
@@ -21,7 +21,7 @@ export https_proxy="Your_HTTPs_Proxy"
### 1. Build Retriever Image

```bash
-docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile .
+docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile .
```

### 2. Build Dataprep Image
@@ -61,7 +61,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a
Then run the command `docker images`, you will have the following 6 Docker Images:

1. `opea/dataprep-redis:latest`
-2. `opea/retriever-redis:latest`
+2. `opea/retriever:latest`
3. `opea/chatqna:latest`
4. `opea/chatqna-ui:latest`
5. `opea/nginx:latest`
4 changes: 3 additions & 1 deletion ChatQnA/docker_compose/intel/cpu/aipc/compose.yaml
@@ -39,7 +39,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-redis-server
depends_on:
- redis-vector-db
@@ -55,6 +55,8 @@ services:
INDEX_NAME: ${INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
tei-reranking-service:
image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
4 changes: 2 additions & 2 deletions ChatQnA/docker_compose/intel/cpu/xeon/README.md
@@ -105,7 +105,7 @@ cd GenAIComps
### 1. Build Retriever Image

```bash
-docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile .
+docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile .
```

### 2. Build Dataprep Image
@@ -167,7 +167,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a
Then run the command `docker images`, you will have the following 5 Docker Images:

1. `opea/dataprep-redis:latest`
-2. `opea/retriever-redis:latest`
+2. `opea/retriever:latest`
3. `opea/chatqna:latest` or `opea/chatqna-without-rerank:latest`
4. `opea/chatqna-ui:latest`
5. `opea/nginx:latest`
6 changes: 3 additions & 3 deletions ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md
@@ -108,7 +108,7 @@ cd GenAIComps
### 1. Build Retriever Image

```bash
-docker build --no-cache -t opea/retriever-pinecone:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/pinecone/langchain/Dockerfile .
+docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile .
```

### 2. Build Dataprep Image
@@ -170,7 +170,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a
Then run the command `docker images`, you will have the following 5 Docker Images:

1. `opea/dataprep-pinecone:latest`
-2. `opea/retriever-pinecone:latest`
+2. `opea/retriever:latest`
3. `opea/chatqna:latest` or `opea/chatqna-without-rerank:latest`
4. `opea/chatqna-ui:latest`
5. `opea/nginx:latest`
@@ -352,7 +352,7 @@ click [here](https://raw.githubusercontent.com/opea-project/GenAIComps/v1.1/comp
Or run this command to get the file on a terminal.

```bash
-wget https://raw.githubusercontent.com/opea-project/GenAIComps/main/comps/retrievers/redis/data/nke-10k-2023.pdf
+wget https://raw.githubusercontent.com/opea-project/GenAIComps/v1.1/comps/retrievers/redis/data/nke-10k-2023.pdf

```
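After downloading the PDF, it can be ingested through the dataprep service so the retriever has content to search. A hedged example, assuming the dataprep container is still published on port 6007 at `/v1/dataprep` as elsewhere in this guide and that `host_ip` is exported:

```bash
# Upload the sample 10-K filing to the dataprep endpoint
curl -X POST "http://${host_ip}:6007/v1/dataprep" \
  -H "Content-Type: multipart/form-data" \
  -F "files=@./nke-10k-2023.pdf"
```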

4 changes: 2 additions & 2 deletions ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md
@@ -73,7 +73,7 @@ cd GenAIComps
### 1. Build Retriever Image

```bash
-docker build --no-cache -t opea/retriever-qdrant:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/qdrant/haystack/Dockerfile .
+docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile .
```

### 2. Build Dataprep Image
@@ -128,7 +128,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a
Then run the command `docker images`, you will have the following 5 Docker Images:

1. `opea/dataprep-qdrant:latest`
-2. `opea/retriever-qdrant:latest`
+2. `opea/retriever:latest`
3. `opea/chatqna:latest`
4. `opea/chatqna-ui:latest`
5. `opea/nginx:latest`
4 changes: 3 additions & 1 deletion ChatQnA/docker_compose/intel/cpu/xeon/compose.yaml
@@ -39,7 +39,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-redis-server
depends_on:
- redis-vector-db
@@ -55,6 +55,8 @@ services:
INDEX_NAME: ${INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
tei-reranking-service:
image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
6 changes: 3 additions & 3 deletions ChatQnA/docker_compose/intel/cpu/xeon/compose_pinecone.yaml
@@ -12,8 +12,6 @@ services:
- tei-embedding-service
ports:
- "6007:6007"
- "6008:6008"
- "6009:6009"
environment:
no_proxy: ${no_proxy}
http_proxy: ${http_proxy}
@@ -37,7 +35,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-pinecone:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-pinecone-server
ports:
- "7000:7000"
Expand All @@ -51,6 +49,8 @@ services:
LANGCHAIN_API_KEY: ${LANGCHAIN_API_KEY}
TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_PINECONE"
restart: unless-stopped
tei-reranking-service:
image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
8 changes: 5 additions & 3 deletions ChatQnA/docker_compose/intel/cpu/xeon/compose_qdrant.yaml
@@ -22,8 +22,8 @@ services:
https_proxy: ${https_proxy}
QDRANT_HOST: qdrant-vector-db
QDRANT_PORT: 6333
-COLLECTION_NAME: ${INDEX_NAME}
-TEI_ENDPOINT: http://tei-embedding-service:80
+QDRANT_INDEX_NAME: ${INDEX_NAME}
+TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
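Because `COLLECTION_NAME` and `TEI_ENDPOINT` are renamed here, stale host-side values under the old names are silently ignored. One way to confirm the running container received the new names — the container name `dataprep-qdrant-server` is assumed from the matching README, not shown in this hunk:

```bash
# Print the environment injected into the dataprep container and filter for the renamed keys
docker exec dataprep-qdrant-server env | grep -E 'QDRANT_INDEX_NAME|TEI_EMBEDDING_ENDPOINT'
```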
tei-embedding-service:
image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
@@ -39,7 +39,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-qdrant:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-qdrant-server
depends_on:
- qdrant-vector-db
@@ -54,6 +54,8 @@ services:
QDRANT_PORT: 6333
INDEX_NAME: ${INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: ${TEI_EMBEDDING_ENDPOINT}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_QDRANT"
restart: unless-stopped
tei-reranking-service:
image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
4 changes: 3 additions & 1 deletion ChatQnA/docker_compose/intel/cpu/xeon/compose_vllm.yaml
@@ -39,7 +39,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-redis-server
depends_on:
- redis-vector-db
@@ -55,6 +55,8 @@ services:
INDEX_NAME: ${INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
tei-reranking-service:
image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
@@ -39,7 +39,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-redis-server
depends_on:
- redis-vector-db
@@ -55,6 +55,8 @@ services:
INDEX_NAME: ${INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
tgi-service:
image: ghcr.io/huggingface/text-generation-inference:2.4.0-intel-cpu
4 changes: 2 additions & 2 deletions ChatQnA/docker_compose/intel/hpu/gaudi/README.md
@@ -78,7 +78,7 @@ cd GenAIComps
### 1. Build Retriever Image

```bash
-docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile .
+docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile .
```

### 2. Build Dataprep Image
@@ -156,7 +156,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a

Then run the command `docker images`, you will have the following 5 Docker Images:

-- `opea/retriever-redis:latest`
+- `opea/retriever:latest`
- `opea/dataprep-redis:latest`
- `opea/chatqna:latest`
- `opea/chatqna-ui:latest`
4 changes: 3 additions & 1 deletion ChatQnA/docker_compose/intel/hpu/gaudi/compose.yaml
@@ -40,7 +40,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate --otlp-endpoint $OTEL_EXPORTER_OTLP_TRACES_ENDPOINT
retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-redis-server
depends_on:
- redis-vector-db
@@ -57,6 +57,8 @@ services:
TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
TELEMETRY_ENDPOINT: ${TELEMETRY_ENDPOINT}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
tei-reranking-service:
image: ghcr.io/huggingface/tei-gaudi:1.5.0
@@ -78,7 +78,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-redis-server
depends_on:
- redis-vector-db
@@ -94,6 +94,8 @@ services:
INDEX_NAME: ${INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
tei-reranking-service:
image: ghcr.io/huggingface/tei-gaudi:1.5.0
4 changes: 3 additions & 1 deletion ChatQnA/docker_compose/intel/hpu/gaudi/compose_vllm.yaml
@@ -39,7 +39,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-redis-server
depends_on:
- redis-vector-db
@@ -55,6 +55,8 @@ services:
INDEX_NAME: ${INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
tei-reranking-service:
image: ghcr.io/huggingface/tei-gaudi:1.5.0
@@ -39,7 +39,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-redis-server
depends_on:
- redis-vector-db
@@ -55,6 +55,8 @@ services:
INDEX_NAME: ${INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: http://tei-embedding-service:80
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
tgi-service:
image: ghcr.io/huggingface/tgi-gaudi:2.0.6
@@ -46,7 +46,7 @@ bee1132464cd opea/chatqna:latest "python c
f810f3b4d329 opea/embedding:latest "python embedding_te…" 2 minutes ago Up 2 minutes 0.0.0.0:6000->6000/tcp, :::6000->6000/tcp embedding-server
325236a01f9b opea/llm-textgen:latest "python llm.py" 2 minutes ago Up 2 minutes 0.0.0.0:9000->9000/tcp, :::9000->9000/tcp llm-textgen-gaudi-server
2fa17d84605f opea/dataprep-redis:latest "python prepare_doc_…" 2 minutes ago Up 2 minutes 0.0.0.0:6007->6007/tcp, :::6007->6007/tcp dataprep-redis-server
-69e1fb59e92c opea/retriever-redis:latest "/home/user/comps/re…" 2 minutes ago Up 2 minutes 0.0.0.0:7000->7000/tcp, :::7000->7000/tcp retriever-redis-server
+69e1fb59e92c opea/retriever:latest "/home/user/comps/re…" 2 minutes ago Up 2 minutes 0.0.0.0:7000->7000/tcp, :::7000->7000/tcp retriever-redis-server
313b9d14928a opea/reranking-tei:latest "python reranking_te…" 2 minutes ago Up 2 minutes 0.0.0.0:8000->8000/tcp, :::8000->8000/tcp reranking-tei-gaudi-server
174bd43fa6b5 ghcr.io/huggingface/tei-gaudi:1.5.0 "text-embeddings-rou…" 2 minutes ago Up 2 minutes 0.0.0.0:8090->80/tcp, :::8090->80/tcp tei-embedding-gaudi-server
05c40b636239 ghcr.io/huggingface/tgi-gaudi:2.0.6 "text-generation-lau…" 2 minutes ago Exited (1) About a minute ago tgi-gaudi-server
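With the stack up, the renamed retriever can be exercised directly. A sketch of the usual ChatQnA validation call, assuming the service keeps its default `/v1/retrieval` endpoint on port 7000, a 768-dimension embedding model, and an exported `host_ip`:

```bash
# Send a dummy embedding to the retriever microservice and expect a JSON list of retrieved docs
your_embedding=$(python3 -c "import random; print([random.uniform(-1, 1) for _ in range(768)])")
curl http://${host_ip}:7000/v1/retrieval \
  -X POST \
  -H 'Content-Type: application/json' \
  -d "{\"text\":\"What was Nike revenue in 2023?\",\"embedding\":${your_embedding}}"
```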
4 changes: 2 additions & 2 deletions ChatQnA/docker_compose/nvidia/gpu/README.md
@@ -104,7 +104,7 @@ cd GenAIComps
### 2. Build Retriever Image

```bash
-docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile .
+docker build --no-cache -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile .
```

### 3. Build Dataprep Image
@@ -153,7 +153,7 @@ docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-a

Then run the command `docker images`, you will have the following 5 Docker Images:

-1. `opea/retriever-redis:latest`
+1. `opea/retriever:latest`
2. `opea/dataprep-redis:latest`
3. `opea/chatqna:latest`
4. `opea/chatqna-ui:latest` or `opea/chatqna-react-ui:latest`
4 changes: 3 additions & 1 deletion ChatQnA/docker_compose/nvidia/gpu/compose.yaml
@@ -40,7 +40,7 @@ services:
https_proxy: ${https_proxy}
command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
retriever:
-image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
container_name: retriever-redis-server
depends_on:
- redis-vector-db
@@ -55,6 +55,8 @@ services:
REDIS_HOST: redis-vector-db
INDEX_NAME: ${INDEX_NAME}
TEI_EMBEDDING_ENDPOINT: ${TEI_EMBEDDING_ENDPOINT}
+LOGFLAG: ${LOGFLAG}
+RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
restart: unless-stopped
tei-reranking-service:
image: ghcr.io/huggingface/text-embeddings-inference:1.5