Fixes port issues and a dataprep endpoint referenced by the UI #50
Conversation
Signed-off-by: Lianhao Lu <[email protected]>
Use local hub cache for AgentQnA test to save workspace. Signed-off-by: chensuyue <[email protected]>
All Gaudi values updated with extra flags. Added Helm support for 2 new examples, Text2Image and SearchQnA. Minor fix for llm-uservice. Signed-off-by: Dolpher Du <[email protected]>
Use vllm-fork for Gaudi. Fixes the issue opea-project#1451
This PR fixes the missing protocol in the curl command mentioned in the ChatQnA README for tei-embedding-service.
Signed-off-by: Chingis Yundunov <[email protected]>
…ject#1455) Signed-off-by: lvliang-intel <[email protected]>
The test compose CD workflow depends on the image build, so if we want to run both compose and Helm charts deployment in the CD workflow, this condition should be removed. Signed-off-by: chensuyue <[email protected]>
Signed-off-by: Melanie Buehler <[email protected]>
@@ -334,15 +334,6 @@ export audio_fn="AudioSample.wav"
wget https://github.com/intel/intel-extension-for-transformers/raw/main/intel_extension_for_transformers/neural_chat/assets/audio/sample.wav -O ${audio_fn}
```

```bash
export DATAPREP_MMR_PORT=6007
```
Lines 338-344 are being added to ROCm but removed here; is this intentional?
I'm not adding them to ROCm; I'm changing them from using external port 5000 (a mistake, I think) to using external port 6007. I did not make that external port dynamic like the others, because we decided to change ROCm as little as possible and the pre-existing ROCm env vars file doesn't include that config.
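For anyone reading along, here is a rough sketch of what that external-port change means for a README-style curl call. The dataprep route shown below is illustrative only and is not quoted from the ROCm README:

```bash
# Illustrative only: the dataprep route here is an assumption, not copied from this PR.
# Before this change, the ROCm instructions pointed clients at external port 5000:
#   curl -X POST "http://${host_ip}:5000/v1/dataprep/get"
# After this change, the external port is 6007, while the container still listens on 5000 internally:
curl -X POST "http://${host_ip}:6007/v1/dataprep/get"
```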
@@ -282,15 +282,6 @@ wget https://github.com/intel/intel-extension-for-transformers/raw/main/intel_ex

Test dataprep microservice with generating transcript. This command updates a knowledge base by uploading a local video .mp4 and an audio .wav file.

```bash
export DATAPREP_MMR_PORT=6007
```
Same question as above.
The Xeon and Gaudi READMEs got these extra exports somehow during the refactor, but if you set them with external port 6007 initially, you don't need to do it again.
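To make the "set it once" flow concrete, here is a minimal sketch assuming the README's transcript-generation route; the exact endpoint path, host IP resolution, and file name are assumptions rather than quotes from this PR:

```bash
# Assumptions: the route name, host_ip resolution, and audio_fn are illustrative.
export host_ip=$(hostname -I | awk '{print $1}')
export DATAPREP_MMR_PORT=6007   # set the external dataprep port once

# Later dataprep calls reuse the same variable instead of re-exporting it:
curl -X POST "http://${host_ip}:${DATAPREP_MMR_PORT}/v1/dataprep/generate_transcripts" \
  -H "Content-Type: multipart/form-data" \
  -F "files=@./${audio_fn}"
```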
@@ -28,7 +28,7 @@ services:
- redis-vector-db
- lvm
ports:
- "6007:${DATAPREP_MMR_PORT}"
- "${DATAPREP_MMR_PORT}:5000"
Should this not be 6007?
The internal port 5000 (right side) is hard-coded, and that's okay as I understand it. The configurable port is the external one, and that one should be set via an env var (we use 6007 in tests and READMEs).
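A quick way to sanity-check that mapping after the change, assuming the compose service is named dataprep-multimodal-redis (the service name is a guess for illustration):

```bash
# With `ports: - "${DATAPREP_MMR_PORT}:5000"`, the host-side port is whatever the env var says,
# while 5000 stays fixed inside the container.
export DATAPREP_MMR_PORT=6007
docker compose up -d
docker compose port dataprep-multimodal-redis 5000   # expected output: 0.0.0.0:6007
```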
LGTM
Signed-off-by: Melanie Buehler <[email protected]>
Description
Per discussion with Liang Lv, the MultimodalQnA dataprep internal port number is 5000 and the external one should be 6007. This also fixes an endpoint in the UI code that was missed in the dataprep refactor.
Issues
N/A
Type of change
List the type of change like below. Please delete options that are not relevant.
Dependencies
N/A
Tests
N/A