[DEVHAS-736] Create techdoc for running applications in RHOAI #25

Merged · 3 commits · Aug 8, 2024
1 change: 1 addition & 0 deletions scripts/envs/audio-to-text
@@ -2,6 +2,7 @@ export APP_NAME="audio-to-text"
export APP_DISPLAY_NAME="Audio to Text Application"
export APP_DESC="Audio to Text Application example with AI enabled audio transcription"
export APP_TAGS='["ai", "whispercpp", "python", "asr"]'
export APP_RUN_COMMAND="streamlit run whisper_client.py"
export INIT_CONTAINER="quay.io/redhat-ai-dev/whisper-small:latest"
export INIT_CONTAINER_COMMAND="['/usr/bin/install', '/model/model.file', '/shared/']"
export MODEL_PATH="/model/model.file"
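These env files feed template generation. As a minimal sketch of how the new variable gets consumed — assuming a script that sources an env file and renders `${APP_RUN_COMMAND}`-style placeholders with `envsubst`; the repo's actual tooling likely differs (note the `sed.edit.APPTAGS` markers in `template.yaml` below):

```bash
#!/bin/bash
# render-template.sh (illustrative): render the skeleton for one sample app
set -euo pipefail

# Exports APP_NAME, APP_RUN_COMMAND, APP_PORT, INIT_CONTAINER, ...
source scripts/envs/audio-to-text

# Substitute only the ${APP_*} placeholders, leaving ${{ parameters.* }}
# expressions for the Backstage scaffolder to resolve later
envsubst '${APP_NAME} ${APP_DISPLAY_NAME} ${APP_DESC} ${APP_RUN_COMMAND} ${APP_PORT}' \
  < skeleton/template.yaml > "templates/${APP_NAME}/template.yaml"
```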
1 change: 1 addition & 0 deletions scripts/envs/chatbot
@@ -2,6 +2,7 @@ export APP_NAME="chatbot"
export APP_DISPLAY_NAME="Chatbot Application"
export APP_DESC="Chatbot Application example with LLM enabled chat applications"
export APP_TAGS='["ai", "llamacpp", "vllm", "python"]'
export APP_RUN_COMMAND="streamlit run chatbot_ui.py"
export INIT_CONTAINER="quay.io/redhat-ai-dev/granite-7b-lab:latest"
export INIT_CONTAINER_COMMAND="['/usr/bin/install', '/model/model.file', '/shared/']"
export MODEL_PATH="/model/model.file"
1 change: 1 addition & 0 deletions scripts/envs/codegen
@@ -2,6 +2,7 @@ export APP_NAME="codegen"
export APP_DISPLAY_NAME="Code Generation Application"
export APP_DESC="Code Generation Application example that generate code in countless programming languages."
export APP_TAGS='["ai", "llamacpp", "vllm", "python"]'
export APP_RUN_COMMAND="streamlit run codegen-app.py"
export INIT_CONTAINER="quay.io/redhat-ai-dev/mistral-7b-code-16k-qlora:latest"
export INIT_CONTAINER_COMMAND="['/usr/bin/install', '/model/model.file', '/shared/']"
export MODEL_PATH="/model/model.file"
1 change: 1 addition & 0 deletions scripts/envs/object-detection
@@ -2,6 +2,7 @@ export APP_NAME="object-detection"
export APP_DISPLAY_NAME="Object Detection Application"
export APP_DESC="AI enabled Object Detection Application example using DEtection TRansformer(DETR) model to detect objects in an image"
export APP_TAGS='["ai", "detr", "python"]'
export APP_RUN_COMMAND="streamlit run object_detection_client.py"
export INIT_CONTAINER="quay.io/redhat-ai-dev/detr-resnet-101:latest"
export INIT_CONTAINER_COMMAND="['cp', '-R', '/model/detr-resnet-101', '/shared/detr-resnet-101']"
export MODEL_PATH="/model/detr-resnet-101"
Binary file added skeleton/techdoc/docs/.assets/open-terminal.png
Binary file added skeleton/techdoc/docs/.assets/workbench-name.png
37 changes: 37 additions & 0 deletions skeleton/techdoc/docs/rhoai.md
@@ -0,0 +1,37 @@
# Running Samples in OpenShift AI

This document outlines how to build and run your sample application within an OpenShift AI workbench.

## Prerequisites

- Red Hat OpenShift AI installed, and `Create workbench for OpenShift AI` selected during component creation.
- The `oc` CLI installed
  - `oc` can be downloaded from https://mirror.openshift.com/pub/openshift-v4/clients/ocp/stable/
- Permissions to run `oc port-forward` on the cluster, specifically an account with the following RBAC permissions (a sketch of a matching role follows this list):
  - `get`, `create`, and `list` on the `pods/portforward` subresource
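
If your account is missing these permissions, a cluster admin can grant them. A minimal sketch using `oc create role` — the role, user, and project names here are illustrative, not part of this PR:

```bash
# Namespaced role with just enough access to list pods and port-forward to them
oc create role workbench-portforward \
  --verb=get,create,list \
  --resource=pods,pods/portforward \
  -n <project>

# Bind the role to the developer's account
oc adm policy add-role-to-user workbench-portforward <user> \
  --role-namespace=<project> -n <project>
```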

## Running the Sample

1) Navigate to the OpenShift AI workbench created for your sample application

2) Go to `File->Open` and select `Terminal`
![image](./.assets/open-terminal.png)

3) In the terminal, run `cd ${{ values.name }}` to navigate to your sample app's directory

4) Run `pip install --upgrade -r requirements.txt` to install the dependencies for your application

5) Run `${{ values.appRunCommand }}` to run the sample in the workbench.
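
Taken together, steps 3–5 amount to a terminal session like the following — a sketch in which `my-sample` and the Streamlit command stand in for the rendered `${{ values.name }}` and `${{ values.appRunCommand }}`:

```bash
# Inside the workbench terminal
cd my-sample                                 # ${{ values.name }} after rendering
pip install --upgrade -r requirements.txt    # install the app's dependencies
streamlit run app.py                         # ${{ values.appRunCommand }} after rendering
```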

## Accessing the Sample

With the sample app now running, the following steps let you access it in your browser:

1) Navigate back to the OpenShift AI dashboard, and find the name of your workbench.
![image](./.assets/workbench-name.png)

2) In a terminal window on your machine, run `oc get pods -l app=<workbench-name>`. This will retrieve the name of the pod where the workbench is running.

3) Run `oc port-forward <pod-name> ${{ values.appPort }}` to port forward the sample application's port to your local machine.

4) Finally, visit `http://localhost:${{ values.appPort }}` in your browser to access the application.
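
As a worked example, assuming a workbench named `my-workbench` serving the app on port `8501` (both illustrative), the access flow is:

```bash
# On your local machine, logged in to the cluster with oc
oc get pods -l app=my-workbench            # look up the workbench pod name
oc port-forward my-workbench-0 8501:8501   # forward the app port locally
# Now open http://localhost:8501 in your browser
```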
1 change: 1 addition & 0 deletions skeleton/techdoc/mkdocs.yml
@@ -5,6 +5,7 @@ nav:
- Source Component: source-component.md
- Pipelines: pipelines.md
- GitOps Application: gitops-application.md
- OpenShift AI: rhoai.md

plugins:
- techdocs-core
3 changes: 2 additions & 1 deletion skeleton/template.yaml
@@ -205,6 +205,7 @@ spec:
srcRepoURL: https://${{ parameters.githubServer if parameters.hostType === 'GitHub' else parameters.gitlabServer }}/${{ parameters.repoOwner }}/${{ parameters.repoName }}
appContainer: ${{ 'quay.io/redhat-ai-dev/ai-template-bootstrap-app:latest' if parameters.hostType === 'GitHub' else '${APP_INTERFACE_CONTAINER}' }}
appPort: ${APP_PORT}
appRunCommand: "${APP_RUN_COMMAND}"
modelServiceContainer: ${MODEL_SERVICE_CONTAINER}
modelServicePort: ${MODEL_SERVICE_PORT}
# Renders all the template variables into the files and directory names and content, and places the result in the workspace.
@@ -225,7 +226,7 @@ spec:
tags: 'sed.edit.APPTAGS'
owner: ${{ parameters.owner }}
repoSlug: '${{ parameters.imageOrg }}/${{ parameters.imageName }}'
defaultBranch: ${{ parameters.branch }}
- id: fetch-github-action
name: Fetch GitHub Action
action: fetch:plain
37 changes: 37 additions & 0 deletions templates/audio-to-text/content/docs/rhoai.md
@@ -0,0 +1,37 @@
# Running Samples in OpenShift AI

This document outlines how to build and run your sample application within an OpenShift AI workbench.

## Prerequisites

- Red Hat OpenShift AI installed, and `Create workbench for OpenShift AI` selected during component creation.
- The `oc` CLI installed
  - `oc` can be downloaded from https://mirror.openshift.com/pub/openshift-v4/clients/ocp/stable/
- Permissions to run `oc port-forward` on the cluster, specifically an account with the following RBAC permissions:
  - `get`, `create`, and `list` on the `pods/portforward` subresource

## Running the Sample

1) Navigate to the OpenShift AI workbench created for your sample application

2) Go to `File->Open` and select `Terminal`
![image](./.assets/open-terminal.png)

3) In the terminal, run `cd ${{ values.name }}` to navigate to your sample app's directory

4) Run `pip install --upgrade -r requirements.txt` to install the dependencies for your application

5) Run `${{ values.appRunCommand }}` to run the sample in the workbench.

## Accessing the Sample

With the sample app now running, the following steps let you access it in your browser:

1) Navigate back to the OpenShift AI dashboard, and find the name of your workbench.
![image](./.assets/workbench-name.png)

2) In a terminal window on your machine, run `oc get pods -l app=<workbench-name>`. This will retrieve the name of the pod where the workbench is running.

3) Run `oc port-forward <pod-name> ${{ values.appPort }}` to port forward the sample application's port to your local machine.

4) Finally, visit `http://localhost:${{ values.appPort }}` in your browser to access the application.
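
Since this template's rendered values are concrete (`appRunCommand: streamlit run whisper_client.py`, `appPort: 8501`, per the `template.yaml` change below), the full flow for an audio-to-text sample looks roughly like this — workbench and pod names are illustrative:

```bash
# In the workbench terminal
cd <your-app-name>
pip install --upgrade -r requirements.txt
streamlit run whisper_client.py        # this template's appRunCommand

# On your local machine
oc get pods -l app=<workbench-name>    # find the workbench pod
oc port-forward <pod-name> 8501        # this template's appPort
# then open http://localhost:8501
```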
1 change: 1 addition & 0 deletions templates/audio-to-text/content/mkdocs.yml
@@ -5,6 +5,7 @@ nav:
- Source Component: source-component.md
- Pipelines: pipelines.md
- GitOps Application: gitops-application.md
- OpenShift AI: rhoai.md

plugins:
- techdocs-core
3 changes: 2 additions & 1 deletion templates/audio-to-text/template.yaml
@@ -165,6 +165,7 @@ spec:
srcRepoURL: https://${{ parameters.githubServer if parameters.hostType === 'GitHub' else parameters.gitlabServer }}/${{ parameters.repoOwner }}/${{ parameters.repoName }}
appContainer: ${{ 'quay.io/redhat-ai-dev/ai-template-bootstrap-app:latest' if parameters.hostType === 'GitHub' else 'quay.io/redhat-ai-dev/audio-to-text:latest' }}
appPort: 8501
appRunCommand: "streamlit run whisper_client.py"
modelServiceContainer: quay.io/redhat-ai-dev/whispercpp:latest
modelServicePort: 8001
# Renders all the template variables into the files and directory names and content, and places the result in the workspace.
@@ -185,7 +186,7 @@ spec:
tags: '["ai", "whispercpp", "python", "asr"]'
owner: ${{ parameters.owner }}
repoSlug: '${{ parameters.imageOrg }}/${{ parameters.imageName }}'
defaultBranch: ${{ parameters.branch }}
- id: fetch-github-action
name: Fetch GitHub Action
action: fetch:plain
37 changes: 37 additions & 0 deletions templates/chatbot/content/docs/rhoai.md
@@ -0,0 +1,37 @@
# Running Samples in OpenShift AI

This document outlines how to build and run your sample application within an OpenShift AI workbench.

## Prerequisites

- Red Hat OpenShift AI installed, and `Create workbench for OpenShift AI` selected during component creation.
- The `oc` CLI installed
  - `oc` can be downloaded from https://mirror.openshift.com/pub/openshift-v4/clients/ocp/stable/
- Permissions to run `oc port-forward` on the cluster, specifically an account with the following RBAC permissions:
  - `get`, `create`, and `list` on the `pods/portforward` subresource

## Running the Sample

1) Navigate to the OpenShift AI workbench created for your sample application

2) Go to `File->Open` and select `Terminal`
![image](./.assets/open-terminal.png)

3) In the terminal, run `cd ${{ values.name }}` to navigate to your sample app's directory

4) Run `pip install --upgrade -r requirements.txt` to install the dependencies for your application

5) Run `${{ values.appRunCommand }}` to run the sample in the workbench.

## Accessing the Sample

With the sample app now running, the following steps let you access it in your browser:

1) Navigate back to the OpenShift AI dashboard, and find the name of your workbench.
![image](./.assets/workbench-name.png)

2) In a terminal window on your machine, run `oc get pods -l app=<workbench-name>`. This will retrieve the name of the pod where the workbench is running.

3) Run `oc port-forward <pod-name> ${{ values.appPort }}` to port forward the sample application's port to your local machine.

4) Finally, visit `http://localhost:${{ values.appPort }}` in your browser to access the application.
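
With this template's rendered values (`appRunCommand: streamlit run chatbot_ui.py`, `appPort: 8501`, per the `template.yaml` change below), the condensed flow is — names illustrative:

```bash
# Workbench terminal
cd <your-app-name> && pip install --upgrade -r requirements.txt
streamlit run chatbot_ui.py            # this template's appRunCommand

# Local machine
oc port-forward <workbench-pod> 8501   # this template's appPort
# then open http://localhost:8501
```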
1 change: 1 addition & 0 deletions templates/chatbot/content/mkdocs.yml
@@ -5,6 +5,7 @@ nav:
- Source Component: source-component.md
- Pipelines: pipelines.md
- GitOps Application: gitops-application.md
- OpenShift AI: rhoai.md

plugins:
- techdocs-core
3 changes: 2 additions & 1 deletion templates/chatbot/template.yaml
@@ -179,6 +179,7 @@ spec:
srcRepoURL: https://${{ parameters.githubServer if parameters.hostType === 'GitHub' else parameters.gitlabServer }}/${{ parameters.repoOwner }}/${{ parameters.repoName }}
appContainer: ${{ 'quay.io/redhat-ai-dev/ai-template-bootstrap-app:latest' if parameters.hostType === 'GitHub' else 'quay.io/redhat-ai-dev/chatbot:latest' }}
appPort: 8501
appRunCommand: "streamlit run chatbot_ui.py"
modelServiceContainer: quay.io/ai-lab/llamacpp_python:latest
modelServicePort: 8001
# Renders all the template variables into the files and directory names and content, and places the result in the workspace.
@@ -199,7 +200,7 @@ spec:
tags: '["ai", "llamacpp", "vllm", "python"]'
owner: ${{ parameters.owner }}
repoSlug: '${{ parameters.imageOrg }}/${{ parameters.imageName }}'
defaultBranch: ${{ parameters.branch }}
- id: fetch-github-action
name: Fetch GitHub Action
action: fetch:plain
37 changes: 37 additions & 0 deletions templates/codegen/content/docs/rhoai.md
@@ -0,0 +1,37 @@
# Running Samples in OpenShift AI

This document outlines how to build and run your sample application within an OpenShift AI workbench.

## Prerequisites

- Red Hat OpenShift AI installed, and `Create workbench for OpenShift AI` selected during component creation.
- The `oc` CLI installed
  - `oc` can be downloaded from https://mirror.openshift.com/pub/openshift-v4/clients/ocp/stable/
- Permissions to run `oc port-forward` on the cluster, specifically an account with the following RBAC permissions:
  - `get`, `create`, and `list` on the `pods/portforward` subresource

## Running the Sample

1) Navigate to the OpenShift AI workbench created for your sample application

2) Go to `File->Open` and select `Terminal`
![image](./.assets/open-terminal.png)

3) In the terminal, run `cd ${{ values.name }}` to navigate to your sample app's directory

4) Run `pip install --upgrade -r requirements.txt` to install the dependencies for your application

5) Run `${{ values.appRunCommand }}` to run the sample in the workbench.

## Accessing the Sample

With the sample app now running, the following steps let you access it in your browser:

1) Navigate back to the OpenShift AI dashboard, and find the name of your workbench.
![image](./.assets/workbench-name.png)

2) In a terminal window on your machine, run `oc get pods -l app=<workbench-name>`. This will retrieve the name of the pod where the workbench is running.

3) Run `oc port-forward <pod-name> ${{ values.appPort }}` to port forward the sample application's port to your local machine.

4) Finally, visit `http://localhost:${{ values.appPort }}` in your browser to access the application.
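
With this template's rendered values (`appRunCommand: streamlit run codegen-app.py`, `appPort: 8501`, per the `template.yaml` change below), the condensed flow is — names illustrative:

```bash
# Workbench terminal
cd <your-app-name> && pip install --upgrade -r requirements.txt
streamlit run codegen-app.py           # this template's appRunCommand

# Local machine
oc port-forward <workbench-pod> 8501   # this template's appPort
# then open http://localhost:8501
```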
1 change: 1 addition & 0 deletions templates/codegen/content/mkdocs.yml
@@ -5,6 +5,7 @@ nav:
- Source Component: source-component.md
- Pipelines: pipelines.md
- GitOps Application: gitops-application.md
- OpenShift AI: rhoai.md

plugins:
- techdocs-core
3 changes: 2 additions & 1 deletion templates/codegen/template.yaml
@@ -179,6 +179,7 @@ spec:
srcRepoURL: https://${{ parameters.githubServer if parameters.hostType === 'GitHub' else parameters.gitlabServer }}/${{ parameters.repoOwner }}/${{ parameters.repoName }}
appContainer: ${{ 'quay.io/redhat-ai-dev/ai-template-bootstrap-app:latest' if parameters.hostType === 'GitHub' else 'quay.io/redhat-ai-dev/codegen:latest' }}
appPort: 8501
appRunCommand: "streamlit run codegen-app.py"
modelServiceContainer: quay.io/ai-lab/llamacpp_python:latest
modelServicePort: 8001
# Renders all the template variables into the files and directory names and content, and places the result in the workspace.
@@ -199,7 +200,7 @@ spec:
tags: '["ai", "llamacpp", "vllm", "python"]'
owner: ${{ parameters.owner }}
repoSlug: '${{ parameters.imageOrg }}/${{ parameters.imageName }}'
defaultBranch: ${{ parameters.branch }}
- id: fetch-github-action
name: Fetch GitHub Action
action: fetch:plain
37 changes: 37 additions & 0 deletions templates/object-detection/content/docs/rhoai.md
@@ -0,0 +1,37 @@
# Running Samples in OpenShift AI

This document outlines how to build and run your sample application within an OpenShift AI workbench.

## Prerequisites

- Red Hat OpenShift AI installed, and `Create workbench for OpenShift AI` selected during component creation.
- The `oc` CLI installed
  - `oc` can be downloaded from https://mirror.openshift.com/pub/openshift-v4/clients/ocp/stable/
- Permissions to run `oc port-forward` on the cluster, specifically an account with the following RBAC permissions:
  - `get`, `create`, and `list` on the `pods/portforward` subresource

## Running the Sample

1) Navigate to the OpenShift AI workbench created for your sample application

2) Go to `File->Open` and select `Terminal`
![image](./.assets/open-terminal.png)

3) In the terminal, run `cd ${{ values.name }}` to navigate to your sample app's directory

4) Run `pip install --upgrade -r requirements.txt` to install the dependencies for your application

5) Run `${{ values.appRunCommand }}` to run the sample in the workbench.

## Accessing the Sample

With the sample app now running, the following steps let you access it in your browser:

1) Navigate back to the OpenShift AI dashboard, and find the name of your workbench.
![image](./.assets/workbench-name.png)

2) In a terminal window on your machine, run `oc get pods -l app=<workbench-name>`. This will retrieve the name of the pod where the workbench is running.

3) Run `oc port-forward <pod-name> ${{ values.appPort }}` to port forward the sample application's port to your local machine.

4) Finally, visit `http://localhost:${{ values.appPort }}` in your browser to access the application.
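
With this template's rendered values (`appRunCommand: streamlit run object_detection_client.py`, `appPort: 8501`, per the `template.yaml` change below), the condensed flow is — names illustrative:

```bash
# Workbench terminal
cd <your-app-name> && pip install --upgrade -r requirements.txt
streamlit run object_detection_client.py   # this template's appRunCommand

# Local machine
oc port-forward <workbench-pod> 8501       # this template's appPort
# then open http://localhost:8501
```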
1 change: 1 addition & 0 deletions templates/object-detection/content/mkdocs.yml
@@ -5,6 +5,7 @@ nav:
- Source Component: source-component.md
- Pipelines: pipelines.md
- GitOps Application: gitops-application.md
- OpenShift AI: rhoai.md

plugins:
- techdocs-core
3 changes: 2 additions & 1 deletion templates/object-detection/template.yaml
@@ -165,6 +165,7 @@ spec:
srcRepoURL: https://${{ parameters.githubServer if parameters.hostType === 'GitHub' else parameters.gitlabServer }}/${{ parameters.repoOwner }}/${{ parameters.repoName }}
appContainer: ${{ 'quay.io/redhat-ai-dev/ai-template-bootstrap-app:latest' if parameters.hostType === 'GitHub' else 'quay.io/redhat-ai-dev/object_detection:latest' }}
appPort: 8501
appRunCommand: "streamlit run object_detection_client.py"
modelServiceContainer: quay.io/redhat-ai-dev/object_detection_python:latest
modelServicePort: 8000
# Renders all the template variables into the files and directory names and content, and places the result in the workspace.
@@ -185,7 +186,7 @@ spec:
tags: '["ai", "detr", "python"]'
owner: ${{ parameters.owner }}
repoSlug: '${{ parameters.imageOrg }}/${{ parameters.imageName }}'
defaultBranch: ${{ parameters.branch }}
- id: fetch-github-action
name: Fetch GitHub Action
action: fetch:plain