
Fixes port issues and a dataprep endpoint referenced by the UI #50

Closed
wants to merge 12 commits into from

Conversation

mhbuehler
Owner

Description

Per discussion with Liang Lv, the MultimodalQnA dataprep internal port number is 5000 and the external one should be 6007. This also fixes an endpoint in the UI code that was missed in the dataprep refactor.
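As a sketch, the intended compose port mapping described above would look like the following (the service name `dataprep-multimodal-redis` is an assumption for illustration; the mapping string itself comes from this PR's diff):

```yaml
services:
  dataprep-multimodal-redis:   # service name is illustrative
    ports:
      # External (host) port is configurable via env var, defaulting to 6007;
      # internal (container) port is fixed at 5000.
      - "${DATAPREP_MMR_PORT:-6007}:5000"
```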

Issues

N/A

Type of change

List the type of change as below. Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)

Dependencies

N/A

Tests

N/A

lianhao and others added 9 commits January 22, 2025 09:18
Use local hub cache for AgentQnA test to save workspace.

Signed-off-by: chensuyue <[email protected]>
All gaudi values updated with extra flags.
Added helm support for 2 new examples Text2Image and SearchQnA. Minor fix for llm-uservice.

Signed-off-by: Dolpher Du <[email protected]>
Use vllm-fork for gaudi.

fix the issue opea-project#1451
This PR fixes the missing protocol in the curl command mentioned in the ChatQnA README for tei-embedding-service.
The compose CD test workflow depends on the image build, so if we want to run both the compose and Helm charts deployments in the CD workflow, this condition should be removed.

Signed-off-by: chensuyue <[email protected]>
@@ -334,15 +334,6 @@ export audio_fn="AudioSample.wav"
wget https://github.com/intel/intel-extension-for-transformers/raw/main/intel_extension_for_transformers/neural_chat/assets/audio/sample.wav -O ${audio_fn}
```

```bash
export DATAPREP_MMR_PORT=6007
Collaborator

Lines 338-344 are being added to ROCm but removed here. Is this intentional?

Owner Author

I'm not adding them to ROCm; I'm changing them from using external port 5000 (a mistake, I think) to using external port 6007. I did not make that external port dynamic like the others, because we decided to change ROCm as little as possible, and its pre-existing env vars file doesn't include that config.

@@ -282,15 +282,6 @@ wget https://github.com/intel/intel-extension-for-transformers/raw/main/intel_ex

Test dataprep microservice with generating transcript. This command updates a knowledge base by uploading a local video .mp4 and an audio .wav file.

```bash
export DATAPREP_MMR_PORT=6007
Collaborator

Same question as above.

Owner Author

The Xeon and Gaudi READMEs got these extra exports somehow during the refactor, but if you set them with external port 6007 initially, you don't need to do it again.

@@ -28,7 +28,7 @@ services:
- redis-vector-db
- lvm
ports:
- "6007:${DATAPREP_MMR_PORT}"
- "${DATAPREP_MMR_PORT}:5000"
Collaborator

Should this not be 6007?

Owner Author

The internal port 5000 (right side) is hard-coded, and that's okay as I understand it. The configurable port is the external one, and it should be set via an env var (we use 6007 in tests and READMEs).
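A minimal shell sketch of the point above: only the external (host) side of the mapping is configurable, so exporting `DATAPREP_MMR_PORT` determines the host:container pair that compose will bind.

```shell
# The external (host) side of the compose mapping is configurable;
# the internal (container) side stays fixed at 5000.
export DATAPREP_MMR_PORT=6007

# This is the host:container pair that "${DATAPREP_MMR_PORT}:5000" expands to:
echo "${DATAPREP_MMR_PORT}:5000"
# prints "6007:5000"
```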

Collaborator

@dmsuehir dmsuehir left a comment

LGTM

Signed-off-by: Melanie Buehler <[email protected]>
@mhbuehler mhbuehler closed this Jan 24, 2025
@mhbuehler mhbuehler deleted the melanie/mmqna_ui_port_fix branch January 24, 2025 00:21