Commit 7db3413: initial commit
Scaffolder committed Nov 8, 2024 (0 parents)
Showing 19 changed files with 417 additions and 0 deletions.
21 changes: 21 additions & 0 deletions .github/workflows/automerge.yml
@@ -0,0 +1,21 @@
name: Trigger PipelineRun on Push

on:
  workflow_dispatch:
    inputs:
      pr_url:
        description: 'The PR URL that needs to be merged to trigger the pipelinerun'
        required: true
jobs:
  automerge:
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: write
      repository-projects: write
    steps:
      - name: Enable Automerge
        run: gh pr merge --auto --squash "$PR_URL"
        env:
          PR_URL: ${{ github.event.inputs.pr_url }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
22 changes: 22 additions & 0 deletions .tekton/README.md
@@ -0,0 +1,22 @@
# docker-build-ai-rhdh

## Shared Git resolver model for shared pipeline and tasks

This pipeline creates Containerfile-based SSCS (Software Supply Chain Security) builds. Each run clones the source, builds an image with an SBOM (Software Bill of Materials), attests the build, and pushes the results to the user's image registry.

The pipeline and task references come from this [repository](https://github.com/redhat-ai-dev/rhdh-pipelines): the pipeline is defined under `pac/pipelines` and the tasks under `pac/tasks`. The tasks are referenced by URL using the Tekton git resolver.

When the pipeline and task definitions in that repository are updated, all future runs of existing pipelines automatically pick up the shared changes.

A developer can override these tasks with a local copy and updated annotations.

Example

To override the `git-clone` task, copy the referenced task definition into your `.tekton` directory and point the remote task annotation at the local file:

`pipelinesascode.tekton.dev/task-0: ".tekton/git-clone.yaml"`
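For context, here is a minimal sketch of how the annotation block of a `PipelineRun` could look with `git-clone` overridden locally while everything else still resolves remotely (the run name is illustrative and only a subset of the task annotations is shown):

```yaml
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: docker-build-on-push   # illustrative name
  annotations:
    pipelinesascode.tekton.dev/on-event: "[push]"
    pipelinesascode.tekton.dev/on-target-branch: "[main]"
    # The pipeline definition is still resolved remotely.
    pipelinesascode.tekton.dev/pipeline: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/pipelines/docker-build-ai-rhdh.yaml"
    # Local override: git-clone now comes from this repository's .tekton directory.
    pipelinesascode.tekton.dev/task-0: ".tekton/git-clone.yaml"
    # Other tasks keep resolving from the shared repository.
    pipelinesascode.tekton.dev/task-1: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/init.yaml"
```

Pipelines-as-Code matches the fetched task definitions against the task names referenced by the pipeline, so the numeric suffix on the annotation only needs to be unique.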

## Templates
These pipelines are in template format. The reference to this repository in the PaC template is `{{values.rawUrl}}`, which is updated to point to this repository or a fork of it.

The intended use of the template is to fork this repository and update the reference to it in the Developer Hub templates directory.
53 changes: 53 additions & 0 deletions .tekton/docker-pull-request.yaml
@@ -0,0 +1,53 @@
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: test-newscript-nov8-on-pull-request
  annotations:
    pipelinesascode.tekton.dev/on-event: "[pull_request]"
    pipelinesascode.tekton.dev/on-target-branch: "[main]"
    pipelinesascode.tekton.dev/max-keep-runs: "2"
    pipelinesascode.tekton.dev/pipeline: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/pipelines/docker-build-ai-rhdh.yaml"
    pipelinesascode.tekton.dev/task-0: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/init.yaml"
    pipelinesascode.tekton.dev/task-1: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/git-clone.yaml"
    pipelinesascode.tekton.dev/task-2: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/buildah-ai-rhdh.yaml"
    pipelinesascode.tekton.dev/task-3: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/update-deployment.yaml"
    pipelinesascode.tekton.dev/task-4: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/show-sbom-rhdh.yaml"
    pipelinesascode.tekton.dev/task-5: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/summary.yaml"
  labels:
    argocd/app-name: test-newscript-nov8
    janus-idp.io/tekton: test-newscript-nov8
    backstage.io/kubernetes-id: test-newscript-nov8
    backstage.io/kubernetes-namespace: rhdh-app
    app.kubernetes.io/part-of: test-newscript-nov8
spec:
  params:
    - name: dockerfile
      value: Containerfile
    - name: git-url
      value: '{{repo_url}}'
    - name: image-expires-after
      value: 5d
    - name: output-image
      value: quay.io/jdubrick-ai/test-newscript-nov8:on-pr-{{revision}}
    - name: path-context
      value: .
    - name: revision
      value: '{{revision}}'
    - name: event-type
      value: '{{event_type}}'
    - name: gitops-auth-secret-name
      value: gitops-auth-secret
  pipelineRef:
    name: docker-build-ai-rhdh
  workspaces:
    - name: git-auth
      secret:
        secretName: "{{ git_auth_secret }}"
    - name: workspace
      volumeClaimTemplate:
        spec:
          accessModes:
            - ReadWriteOnce
          resources:
            requests:
              storage: 1Gi
53 changes: 53 additions & 0 deletions .tekton/docker-push.yaml
@@ -0,0 +1,53 @@
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: test-newscript-nov8-on-push
  annotations:
    pipelinesascode.tekton.dev/on-event: "[push]"
    pipelinesascode.tekton.dev/on-target-branch: "[main]"
    pipelinesascode.tekton.dev/max-keep-runs: "2"
    pipelinesascode.tekton.dev/pipeline: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/pipelines/docker-build-ai-rhdh.yaml"
    pipelinesascode.tekton.dev/task-0: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/init.yaml"
    pipelinesascode.tekton.dev/task-1: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/git-clone.yaml"
    pipelinesascode.tekton.dev/task-2: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/buildah-ai-rhdh.yaml"
    pipelinesascode.tekton.dev/task-3: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/update-deployment.yaml"
    pipelinesascode.tekton.dev/task-4: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/show-sbom-rhdh.yaml"
    pipelinesascode.tekton.dev/task-5: "https://raw.githubusercontent.com/redhat-ai-dev/rhdh-pipelines/main/pac/tasks/summary.yaml"
  labels:
    argocd/app-name: test-newscript-nov8
    janus-idp.io/tekton: test-newscript-nov8
    backstage.io/kubernetes-id: test-newscript-nov8
    backstage.io/kubernetes-namespace: rhdh-app
    app.kubernetes.io/part-of: test-newscript-nov8
spec:
  params:
    - name: dockerfile
      value: Containerfile
    - name: git-url
      value: '{{repo_url}}'
    - name: image-expires-after
      value: 5d
    - name: output-image
      value: quay.io/jdubrick-ai/test-newscript-nov8:{{revision}}
    - name: path-context
      value: .
    - name: revision
      value: '{{revision}}'
    - name: event-type
      value: '{{event_type}}'
    - name: gitops-auth-secret-name
      value: gitops-auth-secret
  pipelineRef:
    name: docker-build-ai-rhdh
  workspaces:
    - name: git-auth
      secret:
        secretName: "{{ git_auth_secret }}"
    - name: workspace
      volumeClaimTemplate:
        spec:
          accessModes:
            - ReadWriteOnce
          resources:
            requests:
              storage: 1Gi
8 changes: 8 additions & 0 deletions Containerfile
@@ -0,0 +1,8 @@
FROM registry.access.redhat.com/ubi9/python-311:1-77.1726664316
WORKDIR /chat
COPY requirements.txt .
RUN pip install --upgrade pip
RUN pip install --no-cache-dir --upgrade -r /chat/requirements.txt
COPY chatbot_ui.py .
EXPOSE 8501
ENTRYPOINT [ "streamlit", "run", "chatbot_ui.py" ]
23 changes: 23 additions & 0 deletions catalog-info.yaml
@@ -0,0 +1,23 @@
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: test-newscript-nov8
  description: Secure Supply Chain Example for Chatbot Application
  links:
    - url: https://www.redhat.com/en/solutions/trusted-software-supply-chain
      title: Trusted Secure Supply Chain
      icon: dashboard
      type: admin-dashboard
  annotations:
    # ArgoCD apps from this template use rhtap/gitops as the grouping label
    argocd/app-selector: rhtap/gitops=test-newscript-nov8
    janus-idp.io/tekton: test-newscript-nov8
    backstage.io/kubernetes-id: test-newscript-nov8
    backstage.io/techdocs-ref: dir:.
    quay.io/repository-slug: jdubrick-ai/test-newscript-nov8
  tags: ["ai", "llamacpp", "vllm", "python"]
spec:
  type: service
  owner: user:guest
  lifecycle: experimental

108 changes: 108 additions & 0 deletions chatbot_ui.py
@@ -0,0 +1,108 @@
from langchain_openai import ChatOpenAI
from langchain.chains import LLMChain
from langchain_community.callbacks import StreamlitCallbackHandler
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.memory import ConversationBufferWindowMemory
import streamlit as st
import requests
import time
import json
import os

model_service = os.getenv("MODEL_ENDPOINT",
                          "http://localhost:8001")
model_service = f"{model_service}/v1"
model_service_bearer = os.getenv("MODEL_ENDPOINT_BEARER")
request_kwargs = {}
if model_service_bearer is not None:
    request_kwargs = {"headers": {"Authorization": f"Bearer {model_service_bearer}"}}

@st.cache_resource(show_spinner=False)
def checking_model_service():
    """Poll the model service until it responds and detect which server is running."""
    start = time.time()
    print("Checking Model Service Availability...")
    ready = False
    while not ready:
        try:
            request_cpp = requests.get(f'{model_service}/models', **request_kwargs)
            request_ollama = requests.get(f'{model_service[:-2]}api/tags', **request_kwargs)
            if request_cpp.status_code == 200:
                server = "Llamacpp_Python"
                ready = True
            elif request_ollama.status_code == 200:
                server = "Ollama"
                ready = True
        except requests.exceptions.RequestException:
            # Service not reachable yet; retry after a short delay.
            pass
        time.sleep(1)
    print(f"{server} Model Service Available")
    print(f"{time.time()-start} seconds")
    return server

def get_models():
    """Return the model names exposed by an Ollama server, or None on failure."""
    try:
        response = requests.get(f"{model_service[:-2]}api/tags", **request_kwargs)
        return [i["name"].split(":")[0] for i in
                json.loads(response.content)["models"]]
    except (requests.exceptions.RequestException, KeyError, json.JSONDecodeError):
        return None

with st.spinner("Checking Model Service Availability..."):
    server = checking_model_service()

def enableInput():
    st.session_state["input_disabled"] = False

def disableInput():
    st.session_state["input_disabled"] = True

st.title("💬 Chatbot")
if "messages" not in st.session_state:
    st.session_state["messages"] = [{"role": "assistant",
                                     "content": "How can I help you?"}]
if "input_disabled" not in st.session_state:
    enableInput()

for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])

@st.cache_resource()
def memory():
    # Keep only the last 3 exchanges in the conversation window.
    memory = ConversationBufferWindowMemory(return_messages=True, k=3)
    return memory

model_name = os.getenv("MODEL_NAME", "")

if server == "Ollama":
    models = get_models()
    with st.sidebar:
        model_name = st.radio(label="Select Model",
                              options=models)

llm = ChatOpenAI(base_url=model_service,
                 api_key="sk-no-key-required" if model_service_bearer is None else model_service_bearer,
                 model=model_name,
                 streaming=True,
                 callbacks=[StreamlitCallbackHandler(st.empty(),
                                                     expand_new_thoughts=True,
                                                     collapse_completed_thoughts=True)])

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are world class technical advisor."),
    MessagesPlaceholder(variable_name="history"),
    ("user", "{input}")
])

chain = LLMChain(llm=llm,
                 prompt=prompt,
                 verbose=False,
                 memory=memory())

if prompt := st.chat_input(disabled=st.session_state["input_disabled"], on_submit=disableInput):
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").markdown(prompt)
    response = chain.invoke(prompt)
    st.chat_message("assistant").markdown(response["text"])
    st.session_state.messages.append({"role": "assistant", "content": response["text"]})
    enableInput()
    st.rerun()
15 changes: 15 additions & 0 deletions docs/gitops-application.md
@@ -0,0 +1,15 @@
# ai-lab-template-gitops

## GitOps Repo Patterns

This repository contains a component in the HTTP GitOps repository format, for use with the AI-Lab GitOps template.

## HTTP

This contains a deployment with the following characteristics:

**Model service image** `quay.io/ai-lab/llamacpp_python:latest` **listening on port** `8001`.

**App interface image** `quay.io/redhat-ai-dev/ai-template-bootstrap-app:latest` **listening on port** `8501` for service and routing.

This matches the current AI-Lab software template default deployment.
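
As a rough, hypothetical sketch (resource names are assumptions, the actual GitOps repository may split these into separate Deployments, Services, and a Route, and the `MODEL_ENDPOINT` wiring mirrors the default in `chatbot_ui.py`), the deployment described above could be expressed as:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: chatbot-app            # hypothetical name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: chatbot-app
  template:
    metadata:
      labels:
        app: chatbot-app
    spec:
      containers:
        - name: model-service
          image: quay.io/ai-lab/llamacpp_python:latest
          ports:
            - containerPort: 8001      # model service port
        - name: app-interface
          image: quay.io/redhat-ai-dev/ai-template-bootstrap-app:latest
          ports:
            - containerPort: 8501      # app interface port
          env:
            - name: MODEL_ENDPOINT
              value: http://localhost:8001
---
apiVersion: v1
kind: Service
metadata:
  name: chatbot-app            # hypothetical name
spec:
  selector:
    app: chatbot-app
  ports:
    - name: http
      port: 8501
      targetPort: 8501
```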
Binary file added docs/images/access-openshift-ai.png
Binary file added docs/images/access-workbench.png
Binary file added docs/images/data-science-projects.png
Binary file added docs/images/open-terminal.png
Binary file added docs/images/workbench-name.png
20 changes: 20 additions & 0 deletions docs/index.md
@@ -0,0 +1,20 @@
# AI Software Template

This application, test-newscript-nov8, was created from an AI Software Template. AI Software Templates create a new source code repository as well as a new GitOps deployment repository.

The chosen sample source application is included in the source code repository.

## Sample Source Application

A Large Language Model (LLM)-enabled Streamlit chat application. The bot replies with AI-generated responses.

## Repositories

The source code for your application can be found at [https://github.com/jdubrick-ai/test-newscript-nov8](https://github.com/jdubrick-ai/test-newscript-nov8).

The GitOps repository, which contains the Kubernetes manifests for the application, can be found at [https://github.com/jdubrick-ai/test-newscript-nov8-gitops](https://github.com/jdubrick-ai/test-newscript-nov8-gitops).

## Application namespaces

The default application is deployed in the **`rhdh-app`** namespace. Each application can be deployed into its own namespace, or multiple software templates can deploy their applications into a shared namespace.
22 changes: 22 additions & 0 deletions docs/pipelines.md
@@ -0,0 +1,22 @@
# docker-build-ai-rhdh

## Shared Git resolver model for shared pipeline and tasks

This pipeline creates Containerfile-based SSCS (Software Supply Chain Security) builds. Each run clones the source, builds an image with an SBOM (Software Bill of Materials), attests the build, and pushes the results to the user's image registry.

The pipeline and task references come from this [repository](https://github.com/redhat-ai-dev/rhdh-pipelines): the pipeline is defined under `pac/pipelines` and the tasks under `pac/tasks`. The tasks are referenced by URL using the Tekton git resolver.

When the pipeline and task definitions in that repository are updated, all future runs of existing pipelines automatically pick up the shared changes.

A developer can override these tasks with a local copy and updated annotations.

Example

To override the `git-clone` task, copy the referenced task definition into your `.tekton` directory and point the remote task annotation at the local file:

`pipelinesascode.tekton.dev/task-0: ".tekton/git-clone.yaml"`

## Templates
These pipelines are in template format. The reference to this repository in the PaC template is `{{values.rawUrl}}`, which is updated to point to this repository or a fork of it.
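
As a hypothetical illustration (the exact paths are assumptions based on the rendered annotations in `.tekton/`), the remote references in template form might look like this before `{{values.rawUrl}}` is substituted:

```yaml
# Sketch only: template form of the remote references, before substitution.
pipelinesascode.tekton.dev/pipeline: "{{values.rawUrl}}/pac/pipelines/docker-build-ai-rhdh.yaml"
pipelinesascode.tekton.dev/task-0: "{{values.rawUrl}}/pac/tasks/init.yaml"
pipelinesascode.tekton.dev/task-1: "{{values.rawUrl}}/pac/tasks/git-clone.yaml"
```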

The intended use of the template is to fork this repository and update the reference to it in the Developer Hub templates directory.