initial commit
Scaffolder committed Nov 29, 2024
0 parents commit 72b4232
Showing 19 changed files with 380 additions and 0 deletions.
21 changes: 21 additions & 0 deletions .github/workflows/automerge.yml
@@ -0,0 +1,21 @@
name: Trigger PipelineRun on Push

on:
  workflow_dispatch:
    inputs:
      pr_url:
        description: 'The PR URL that needs to be merged to trigger the PipelineRun'
        required: true
jobs:
  automerge:
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: write
      repository-projects: write
    steps:
      - name: Enable Automerge
        run: gh pr merge --auto --squash "$PR_URL"
        env:
          PR_URL: ${{ github.event.inputs.pr_url }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
22 changes: 22 additions & 0 deletions .tekton/README.md
@@ -0,0 +1,22 @@
# docker-build-ai-rhdh

## Shared Git resolver model for pipeline and tasks

This pipeline is used to create Containerfile-based SSCS (Software Supply Chain Security) builds. Each pipeline run clones the source, builds an image with an SBOM (Software Bill of Materials), attests it, and pushes the results to the user's image registry.

The pipeline and task references come from this [repository](https://github.com/redhat-ai-dev/rhdh-pipelines): the pipeline is defined in `pac/pipelines` and the tasks in `pac/tasks`. They are referenced by URL using the Git resolver in Tekton.

When the pipelines and tasks in that repository are updated, all future runs of existing pipelines pick up the changes.

A developer can override these tasks with a local copy and updated annotations.

### Example

To override the `git-clone` task, copy the task definition into your `.tekton` directory and then point the task annotation at the local copy, as shown below.

`pipelinesascode.tekton.dev/task-0: ".tekton/git-clone.yaml"`
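
In context, a minimal sketch of how this override sits in a PipelineRun's metadata next to the remote references (the name and task indices below are illustrative):

```yaml
# Sketch only: the git-clone task overridden with a local copy; other tasks stay remote.
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: example-on-push                                        # hypothetical name
  annotations:
    pipelinesascode.tekton.dev/pipeline: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/pipelines/docker-build-ai-rhdh.yaml"
    # local override picked up from this repository's .tekton directory
    pipelinesascode.tekton.dev/task-0: ".tekton/git-clone.yaml"
    # remaining tasks still resolved from the shared repository
    pipelinesascode.tekton.dev/task-1: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/init.yaml"
spec:
  pipelineRef:
    name: docker-build-ai-rhdh
```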

## Templates
These pipelines are in template format. The reference to this repository in the PaC template is `{{values.rawUrl}}`, which is updated to point to this repository or to a fork of it.

The intended workflow is to fork this repository and update the reference in the Developer Hub templates directory.
53 changes: 53 additions & 0 deletions .tekton/docker-pull-request.yaml
@@ -0,0 +1,53 @@
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: nov29-oldargocd-on-pull-request
  annotations:
    pipelinesascode.tekton.dev/on-event: "[pull_request]"
    pipelinesascode.tekton.dev/on-target-branch: "[main]"
    pipelinesascode.tekton.dev/max-keep-runs: "2"
    pipelinesascode.tekton.dev/pipeline: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/pipelines/docker-build-ai-rhdh.yaml"
    pipelinesascode.tekton.dev/task-0: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/init.yaml"
    pipelinesascode.tekton.dev/task-1: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/git-clone.yaml"
    pipelinesascode.tekton.dev/task-2: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/buildah-ai-rhdh.yaml"
    pipelinesascode.tekton.dev/task-3: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/update-deployment.yaml"
    pipelinesascode.tekton.dev/task-4: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/show-sbom-rhdh.yaml"
    pipelinesascode.tekton.dev/task-5: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/summary.yaml"
  labels:
    argocd/app-name: nov29-oldargocd
    janus-idp.io/tekton: nov29-oldargocd
    backstage.io/kubernetes-id: nov29-oldargocd
    backstage.io/kubernetes-namespace: rhdh-app
    app.kubernetes.io/part-of: nov29-oldargocd
spec:
  params:
    - name: dockerfile
      value: Containerfile
    - name: git-url
      value: '{{repo_url}}'
    - name: image-expires-after
      value: 5d
    - name: output-image
      value: quay.io/jdubrick-ai/nov29-oldargocd:on-pr-{{revision}}
    - name: path-context
      value: .
    - name: revision
      value: '{{revision}}'
    - name: event-type
      value: '{{event_type}}'
    - name: gitops-auth-secret-name
      value: gitops-auth-secret
  pipelineRef:
    name: docker-build-ai-rhdh
  workspaces:
    - name: git-auth
      secret:
        secretName: "{{ git_auth_secret }}"
    - name: workspace
      volumeClaimTemplate:
        spec:
          accessModes:
            - ReadWriteOnce
          resources:
            requests:
              storage: 1Gi
53 changes: 53 additions & 0 deletions .tekton/docker-push.yaml
@@ -0,0 +1,53 @@
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: nov29-oldargocd-on-push
  annotations:
    pipelinesascode.tekton.dev/on-event: "[push]"
    pipelinesascode.tekton.dev/on-target-branch: "[main]"
    pipelinesascode.tekton.dev/max-keep-runs: "2"
    pipelinesascode.tekton.dev/pipeline: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/pipelines/docker-build-ai-rhdh.yaml"
    pipelinesascode.tekton.dev/task-0: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/init.yaml"
    pipelinesascode.tekton.dev/task-1: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/git-clone.yaml"
    pipelinesascode.tekton.dev/task-2: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/buildah-ai-rhdh.yaml"
    pipelinesascode.tekton.dev/task-3: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/update-deployment.yaml"
    pipelinesascode.tekton.dev/task-4: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/show-sbom-rhdh.yaml"
    pipelinesascode.tekton.dev/task-5: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/summary.yaml"
  labels:
    argocd/app-name: nov29-oldargocd
    janus-idp.io/tekton: nov29-oldargocd
    backstage.io/kubernetes-id: nov29-oldargocd
    backstage.io/kubernetes-namespace: rhdh-app
    app.kubernetes.io/part-of: nov29-oldargocd
spec:
  params:
    - name: dockerfile
      value: Containerfile
    - name: git-url
      value: '{{repo_url}}'
    - name: image-expires-after
      value: 5d
    - name: output-image
      value: quay.io/jdubrick-ai/nov29-oldargocd:{{revision}}
    - name: path-context
      value: .
    - name: revision
      value: '{{revision}}'
    - name: event-type
      value: '{{event_type}}'
    - name: gitops-auth-secret-name
      value: gitops-auth-secret
  pipelineRef:
    name: docker-build-ai-rhdh
  workspaces:
    - name: git-auth
      secret:
        secretName: "{{ git_auth_secret }}"
    - name: workspace
      volumeClaimTemplate:
        spec:
          accessModes:
            - ReadWriteOnce
          resources:
            requests:
              storage: 1Gi
8 changes: 8 additions & 0 deletions Containerfile
@@ -0,0 +1,8 @@
FROM registry.access.redhat.com/ubi9/python-311:1-77.1726664316
WORKDIR /codegen
COPY requirements.txt .
RUN pip install --upgrade pip
RUN pip install --no-cache-dir --upgrade -r /codegen/requirements.txt
COPY codegen-app.py .
EXPOSE 8501
ENTRYPOINT ["streamlit", "run", "codegen-app.py"]
23 changes: 23 additions & 0 deletions catalog-info.yaml
@@ -0,0 +1,23 @@
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: nov29-oldargocd
  description: Secure Supply Chain Example for Code Generation Application
  links:
    - url: https://www.redhat.com/en/solutions/trusted-software-supply-chain
      title: Trusted Secure Supply Chain
      icon: dashboard
      type: admin-dashboard
  annotations:
    # ArgoCD apps from this template used rhtap-gitops as the grouping
    argocd/app-selector: rhtap/gitops=nov29-oldargocd
    janus-idp.io/tekton: nov29-oldargocd
    backstage.io/kubernetes-id: nov29-oldargocd
    backstage.io/techdocs-ref: dir:.
    quay.io/repository-slug: jdubrick-ai/nov29-oldargocd
  tags: ["ai", "llamacpp", "vllm", "python"]
spec:
  type: service
  owner: user:guest
  lifecycle: experimental

72 changes: 72 additions & 0 deletions codegen-app.py
@@ -0,0 +1,72 @@
import os
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_community.callbacks import StreamlitCallbackHandler

import streamlit as st
import requests
import time

model_service = os.getenv("MODEL_ENDPOINT", "http://localhost:8001")
model_service = f"{model_service}/v1"
model_service_bearer = os.getenv("MODEL_ENDPOINT_BEARER")
request_kwargs = {}
if model_service_bearer is not None:
    request_kwargs = {"headers": {"Authorization": f"Bearer {model_service_bearer}"}}

@st.cache_resource(show_spinner=False)
def checking_model_service():
    # Poll the model service until its /models endpoint responds.
    start = time.time()
    print("Checking Model Service Availability...")
    ready = False
    while not ready:
        try:
            request = requests.get(f'{model_service}/models', **request_kwargs)
            if request.status_code == 200:
                ready = True
        except requests.exceptions.RequestException:
            pass
        time.sleep(1)
    print("Model Service Available")
    print(f"{time.time()-start} seconds")

with st.spinner("Checking Model Service Availability..."):
    checking_model_service()

st.title("Code Generation App")

if "messages" not in st.session_state:
    st.session_state["messages"] = [{"role": "assistant",
                                     "content": "How can I help you?"}]

for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])

model_name = os.getenv("MODEL_NAME", "")

llm = ChatOpenAI(base_url=model_service,
                 model=model_name,
                 api_key="EMPTY" if model_service_bearer is None else model_service_bearer,
                 streaming=True)

# Define the Langchain chain
prompt = ChatPromptTemplate.from_template("""You are a helpful code assistant that helps developers write code for a given {input}.
Generate the code block first, and explain the code at the end.
If the {input} does not make sense, ask for clarification.""")
chain = (
    {"input": RunnablePassthrough()}
    | prompt
    | llm
)

if prompt := st.chat_input():
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").markdown(prompt)

    st_callback = StreamlitCallbackHandler(st.container())
    response = chain.invoke(prompt, {"callbacks": [st_callback]})

    st.chat_message("assistant").markdown(response.content)
    st.session_state.messages.append({"role": "assistant", "content": response.content})
    st.rerun()
15 changes: 15 additions & 0 deletions docs/gitops-application.md
@@ -0,0 +1,15 @@
# ai-lab-template-gitops

## GitOps Repo Patterns

This repository contains an HTTP GitOps repository format component for use as the AI-Lab GitOps template.

### HTTP

This format contains a deployment with the following characteristics:

- **Model service image:** `quay.io/ai-lab/llamacpp_python:latest`, listening on port `8001`.
- **App interface image:** `quay.io/redhat-ai-dev/ai-template-bootstrap-app:latest`, listening on port `8501` for service and routing.

This matches the current AI-Lab software template default deployment.
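
For illustration, a minimal sketch of that deployment pattern (resource names, labels, and the single-pod layout are assumptions, not the generated GitOps manifests):

```yaml
# Illustrative sketch only -- not the generated GitOps manifests.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nov29-oldargocd              # hypothetical name
  namespace: rhdh-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: nov29-oldargocd
  template:
    metadata:
      labels:
        app: nov29-oldargocd
    spec:
      containers:
        - name: model-service
          image: quay.io/ai-lab/llamacpp_python:latest
          ports:
            - containerPort: 8001
        - name: app-interface
          image: quay.io/redhat-ai-dev/ai-template-bootstrap-app:latest
          ports:
            - containerPort: 8501            # exposed via a Service and Route
          env:
            - name: MODEL_ENDPOINT
              value: http://localhost:8001   # model service in the same pod
```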
Binary file added docs/images/access-openshift-ai.png
Binary file added docs/images/access-workbench.png
Binary file added docs/images/data-science-projects.png
Binary file added docs/images/open-terminal.png
Binary file added docs/images/workbench-name.png
20 changes: 20 additions & 0 deletions docs/index.md
@@ -0,0 +1,20 @@
# AI Software Template

This application, nov29-oldargocd, is created from an AI Software Template. These software templates create a new source code repository as well as a new GitOps deployment repository.

The chosen sample source application is included in the source code repository.

## Sample Source Application

A Large Language Model (LLM)-enabled Streamlit code generation application. This specialized bot helps with code-related queries.

## Repositories

The source code for your application can be found at [https://github.com/jdubrick-ai/nov29-oldargocd](https://github.com/jdubrick-ai/nov29-oldargocd).

The GitOps repository, which contains the Kubernetes manifests for the application, can be found at
[https://github.com/jdubrick-ai/nov29-oldargocd-gitops](https://github.com/jdubrick-ai/nov29-oldargocd-gitops).

## Application namespaces

The default application is deployed in the **`rhdh-app`** namespace. Applications can be deployed into their own unique namespaces, or multiple software templates can deploy their applications into the same namespace.
22 changes: 22 additions & 0 deletions docs/pipelines.md
@@ -0,0 +1,22 @@
# docker-build-ai-rhdh

## Shared Git resolver model for pipeline and tasks

This pipeline is used to create Containerfile-based SSCS (Software Supply Chain Security) builds. Each pipeline run clones the source, builds an image with an SBOM (Software Bill of Materials), attests it, and pushes the results to the user's image registry.

The pipeline and task references come from this [repository](https://github.com/redhat-ai-dev/rhdh-pipelines): the pipeline is defined in `pac/pipelines` and the tasks in `pac/tasks`. They are referenced by URL using the Git resolver in Tekton.

When the pipelines and tasks in that repository are updated, all future runs of existing pipelines pick up the changes.

A developer can override these tasks with a local copy and updated annotations.

### Example

To override the `git-clone` task, copy the task definition into your `.tekton` directory and then point the task annotation at the local copy, as shown below.

`pipelinesascode.tekton.dev/task-0: ".tekton/git-clone.yaml"`

## Templates
These pipelines are in template format. The reference to this repository in the PaC template is `{{values.rawUrl}}`, which is updated to point to this repository or to a fork of it.

The intended workflow is to fork this repository and update the reference in the Developer Hub templates directory.
48 changes: 48 additions & 0 deletions docs/rhoai.md
@@ -0,0 +1,48 @@
# Running Samples in OpenShift AI

This document outlines how you can build and run your sample applications within an OpenShift AI workbench.

## Prerequisites

- Red Hat OpenShift AI installed, and `Create workbench for OpenShift AI` selected during component creation.
- `oc` CLI installed
  - `oc` can be downloaded from https://mirror.openshift.com/pub/openshift-v4/clients/ocp/stable/
- Permissions to run `oc port-forward` on the cluster, specifically an account with the following roles (a minimal RBAC sketch follows this list):
  - `get`, `create`, and `list` for the `pods/portforward` subresource
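
A minimal RBAC sketch that grants these verbs (role, namespace, and subject names are illustrative assumptions):

```yaml
# Sketch only: grants the pods/portforward verbs listed above.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: port-forward              # hypothetical name
  namespace: rhdh-app
rules:
  - apiGroups: [""]
    resources: ["pods/portforward"]
    verbs: ["get", "create", "list"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: port-forward-binding      # hypothetical name
  namespace: rhdh-app
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: port-forward
subjects:
  - kind: User
    name: example-user            # replace with your user or service account
```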

## Running the Sample

1) On the Console, click the square "apps" icon in the upper-right corner (next to the notifications icon). `OpenShift AI` is listed in the drop-down list.

![image](./images/access-openshift-ai.png)

2) Go to the `Data Science Projects` section and access your application's project named `rhdh-app`.

![image](./images/data-science-projects.png)

3) Access the `workbench` named `nov29-oldargocd-notebook`.

![image](./images/access-workbench.png)

4) Go to `File->Open` and select `Terminal`.

![image](./images/open-terminal.png)

5) In the terminal, run `cd nov29-oldargocd` to navigate to your sample app's directory.

6) Run `pip install --upgrade -r requirements.txt` to install the dependencies for your application.

7) Run `streamlit run codegen-app.py` to run the sample in the workbench.

## Accessing the Sample

With the sample app now running, complete the following steps to access the sample app in your browser:

1) Go to the OpenShift AI dashboard, and find the name of your workbench.
![image](./images/workbench-name.png)

2) In a terminal window on your machine, run `oc get pods -l app=<workbench-name>`. This retrieves the name of the pod where the workbench is running.

3) Run `oc port-forward <pod-name> 8501` to forward the sample application's port to your local machine.

4) Finally, visit `http://localhost:8501` in your browser to access the application.
9 changes: 9 additions & 0 deletions docs/source-component.md
@@ -0,0 +1,9 @@
# AI-lab samples

## Usage in AI-lab templates

This repository is used in [ai-lab-template](https://github.com/redhat-ai-dev/ai-lab-template) as component source code for users to start with.

This is a copy of the AI Lab sample apps' source code. The master copy of those apps is in [ai-lab-recipes](https://github.com/containers/ai-lab-recipes).

To pull in the latest changes, run `./pull-sample-app.sh`, and commit the changes.
11 changes: 11 additions & 0 deletions mkdocs.yml
@@ -0,0 +1,11 @@
site_name: 'Documentation'

nav:
  - Home: index.md
  - Source Component: source-component.md
  - Pipelines: pipelines.md
  - GitOps Application: gitops-application.md
  - OpenShift AI: rhoai.md

plugins:
  - techdocs-core
3 changes: 3 additions & 0 deletions requirements.txt
@@ -0,0 +1,3 @@
langchain==0.1.20
langchain-openai==0.1.7
streamlit==1.34.0
