diff --git a/.github/workflows/_example-workflow.yml b/.github/workflows/_example-workflow.yml index 6081ed9b1d..f66a2a323d 100644 --- a/.github/workflows/_example-workflow.yml +++ b/.github/workflows/_example-workflow.yml @@ -79,6 +79,7 @@ jobs: fi if [[ $(grep -c "vllm-gaudi:" ${docker_compose_path}) != 0 ]]; then git clone https://github.com/HabanaAI/vllm-fork.git + cd vllm-fork && git checkout v0.6.4.post2+Gaudi-1.19.0 && cd ../ fi git clone https://github.com/opea-project/GenAIComps.git cd GenAIComps && git checkout ${{ inputs.opea_branch }} && git rev-parse HEAD && cd ../ diff --git a/AgentQnA/README.md b/AgentQnA/README.md index de36805b11..d45b14ef55 100644 --- a/AgentQnA/README.md +++ b/AgentQnA/README.md @@ -2,8 +2,8 @@ ## Overview -This example showcases a hierarchical multi-agent system for question-answering applications. The architecture diagram is shown below. The supervisor agent interfaces with the user and dispatch tasks to the worker agent and other tools to gather information and come up with answers. The worker agent uses the retrieval tool to generate answers to the queries posted by the supervisor agent. Other tools used by the supervisor agent may include APIs to interface knowledge graphs, SQL databases, external knowledge bases, etc. -![Architecture Overview](assets/agent_qna_arch.png) +This example showcases a hierarchical multi-agent system for question-answering applications. The architecture diagram is shown below. The supervisor agent interfaces with the user and dispatches tasks to two worker agents to gather information and come up with answers. The worker RAG agent uses the retrieval tool to retrieve relevant documents from the knowledge base (a vector database). The worker SQL agent retrieves relevant data from the SQL database. Although not included in this example, other tools such as a web search tool or a knowledge graph query tool can be used by the supervisor agent to gather information from additional sources. +![Architecture Overview](assets/img/agent_qna_arch.png) The AgentQnA example is implemented using the component-level microservices defined in [GenAIComps](https://github.com/opea-project/GenAIComps). The flow chart below shows the information flow between different microservices for this example. @@ -38,6 +38,7 @@ flowchart LR end AG_REACT([Agent MicroService - react]):::blue AG_RAG([Agent MicroService - rag]):::blue + AG_SQL([Agent MicroService - sql]):::blue LLM_gen{{LLM Service
}} DP([Data Preparation MicroService]):::blue TEI_RER{{Reranking service
}} @@ -51,6 +52,7 @@ flowchart LR direction LR a[User Input Query] --> AG_REACT AG_REACT --> AG_RAG + AG_REACT --> AG_SQL AG_RAG --> DocIndexRetriever-MegaService EM ==> RET RET ==> RER @@ -59,6 +61,7 @@ flowchart LR %% Embedding service flow direction LR AG_RAG <-.-> LLM_gen + AG_SQL <-.-> LLM_gen AG_REACT <-.-> LLM_gen EM <-.-> TEI_EM RET <-.-> R_RET @@ -75,11 +78,11 @@ flowchart LR ### Why Agent for question answering? 1. Improve relevancy of retrieved context. - Agent can rephrase user queries, decompose user queries, and iterate to get the most relevant context for answering user's questions. Compared to conventional RAG, RAG agent can significantly improve the correctness and relevancy of the answer. -2. Use tools to get additional knowledge. - For example, knowledge graphs and SQL databases can be exposed as APIs for Agents to gather knowledge that may be missing in the retrieval vector database. -3. Hierarchical agent can further improve performance. - Expert worker agents, such as retrieval agent, knowledge graph agent, SQL agent, etc., can provide high-quality output for different aspects of a complex query, and the supervisor agent can aggregate the information together to provide a comprehensive answer. + The RAG agent can rephrase user queries, decompose user queries, and iterate to get the most relevant context for answering the user's questions. Compared to conventional RAG, the RAG agent can significantly improve the correctness and relevancy of the answer. +2. Expand the scope of the agent. + The supervisor agent can interact with multiple worker agents that specialize in different domains with different skills (e.g., retrieve documents, write SQL queries, etc.), and thus can answer questions in multiple domains. +3. Hierarchical multi-agent systems can improve performance. + Expert worker agents, such as the RAG agent and the SQL agent, can provide high-quality output for different aspects of a complex query, and the supervisor agent can aggregate the information together to provide a comprehensive answer. If we only use one agent and provide all the tools to this single agent, it may get overwhelmed and be unable to provide accurate answers. ## Deployment with docker @@ -148,28 +151,55 @@ docker build -t opea/agent:latest --build-arg https_proxy=$https_proxy --build-a bash run_ingest_data.sh ``` -4. Launch other tools.
+4. Prepare SQL database + In this example, we will use the Chinook SQLite database. Run the commands below. + + ``` + # Download data + cd $WORKDIR + git clone https://github.com/lerocha/chinook-database.git + cp chinook-database/ChinookDatabase/DataSources/Chinook_Sqlite.sqlite $WORKDIR/GenAIExamples/AgentQnA/tests/ + ``` + +5. Launch other tools.
In this example, we will use some of the mock APIs provided in the Meta CRAG KDD Challenge to demonstrate the benefits of gaining additional context from mock knowledge graphs. ``` docker run -d -p=8080:8000 docker.io/aicrowd/kdd-cup-24-crag-mock-api:v0 ``` -5. Launch agent services
- We provide two options for `llm_engine` of the agents: 1. open-source LLMs, 2. OpenAI models via API calls. - - Deploy it on Gaudi or Xeon respectively +6. Launch multi-agent system.
+ We provide two options for `llm_engine` of the agents: 1. open-source LLMs on Intel Gaudi2, 2. OpenAI models via API calls. ::::{tab-set} :::{tab-item} Gaudi :sync: Gaudi - To use open-source LLMs on Gaudi2, run commands below. + On Gaudi2 we will serve `meta-llama/Meta-Llama-3.1-70B-Instruct` using vllm. + First build vllm-gaudi docker image. + + ```bash + cd $WORKDIR + git clone https://github.com/vllm-project/vllm.git + cd ./vllm + git checkout v0.6.6 + docker build --no-cache -f Dockerfile.hpu -t opea/vllm-gaudi:latest --shm-size=128g . --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy ``` - cd $WORKDIR/GenAIExamples/AgentQnA/docker_compose/intel/hpu/gaudi - bash launch_tgi_gaudi.sh - bash launch_agent_service_tgi_gaudi.sh + + Then launch vllm on Gaudi2 with the command below. + + ```bash + vllm_port=8086 + model="meta-llama/Meta-Llama-3.1-70B-Instruct" + docker run -d --runtime=habana --rm --name "vllm-gaudi-server" -e HABANA_VISIBLE_DEVICES=0,1,2,3 -p $vllm_port:8000 -v $vllm_volume:/data -e HF_TOKEN=$HF_TOKEN -e HUGGING_FACE_HUB_TOKEN=$HF_TOKEN -e HF_HOME=/data -e OMPI_MCA_btl_vader_single_copy_mechanism=none -e PT_HPU_ENABLE_LAZY_COLLECTIVES=true -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e no_proxy=$no_proxy -e VLLM_SKIP_WARMUP=true --cap-add=sys_nice --ipc=host opea/vllm-gaudi:latest --model ${model} --max-seq-len-to-capture 16384 --tensor-parallel-size 4 + ``` + + Then launch Agent microservices. + + ```bash + cd $WORKDIR/GenAIExamples/AgentQnA/docker_compose/intel/hpu/gaudi/ + bash launch_agent_service_gaudi.sh ``` ::: @@ -179,6 +209,7 @@ docker build -t opea/agent:latest --build-arg https_proxy=$https_proxy --build-a To use OpenAI models, run commands below. ``` + export OPENAI_API_KEY= cd $WORKDIR/GenAIExamples/AgentQnA/docker_compose/intel/cpu/xeon bash launch_agent_service_openai.sh ``` @@ -195,8 +226,11 @@ Refer to the [AgentQnA helm chart](./kubernetes/helm/README.md) for instructions First look at logs of the agent docker containers: ``` -# worker agent +# worker RAG agent docker logs rag-agent-endpoint + +# worker SQL agent +docker logs sql-agent-endpoint ``` ``` @@ -206,22 +240,36 @@ docker logs react-agent-endpoint You should see something like "HTTP server setup successful" if the docker containers are started successfully.

-Second, validate worker agent: +Second, validate worker RAG agent: ``` curl http://${host_ip}:9095/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ - "query": "Most recent album by Taylor Swift" + "messages": "Michael Jackson song Thriller" }' ``` -Third, validate supervisor agent: +Third, validate worker SQL agent: + +``` +curl http://${host_ip}:9096/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ + "messages": "How many employees are in the company" + }' +``` + +Finally, validate supervisor agent: ``` curl http://${host_ip}:9090/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ - "query": "Most recent album by Taylor Swift" + "messages": "How many albums does Iron Maiden have?" }' ``` +## Deploy AgentQnA UI + +The AgentQnA UI can be deployed locally or using Docker. + +For detailed instructions on deploying AgentQnA UI, refer to the [AgentQnA UI Guide](./ui/svelte/README.md). + ## How to register your own tools with agent You can take a look at the tools yaml and python files in this example. For more details, please refer to the "Provide your own tools" section in the instructions [here](https://github.com/opea-project/GenAIComps/tree/main/comps/agent/src/README.md). diff --git a/AgentQnA/assets/agent_qna_arch.png b/AgentQnA/assets/agent_qna_arch.png deleted file mode 100644 index 3bebb1d997..0000000000 Binary files a/AgentQnA/assets/agent_qna_arch.png and /dev/null differ diff --git a/AgentQnA/assets/img/agent_qna_arch.png b/AgentQnA/assets/img/agent_qna_arch.png new file mode 100644 index 0000000000..31a2039fdc Binary files /dev/null and b/AgentQnA/assets/img/agent_qna_arch.png differ diff --git a/AgentQnA/assets/img/agent_ui.png b/AgentQnA/assets/img/agent_ui.png new file mode 100644 index 0000000000..154c85be83 Binary files /dev/null and b/AgentQnA/assets/img/agent_ui.png differ diff --git a/AgentQnA/assets/img/agent_ui_result.png b/AgentQnA/assets/img/agent_ui_result.png new file mode 100644 index 0000000000..813248d6a3 Binary files /dev/null and b/AgentQnA/assets/img/agent_ui_result.png differ diff --git a/AgentQnA/docker_compose/intel/cpu/xeon/README.md b/AgentQnA/docker_compose/intel/cpu/xeon/README.md index 1da7685349..dde535f2ae 100644 --- a/AgentQnA/docker_compose/intel/cpu/xeon/README.md +++ b/AgentQnA/docker_compose/intel/cpu/xeon/README.md @@ -41,21 +41,33 @@ This example showcases a hierarchical multi-agent system for question-answering bash run_ingest_data.sh ``` -4. Launch Tool service +4. Prepare SQL database + In this example, we will use the SQLite database provided in the [TAG-Bench](https://github.com/TAG-Research/TAG-Bench/tree/main). Run the commands below. + + ``` + # Download data + cd $WORKDIR + git clone https://github.com/TAG-Research/TAG-Bench.git + cd TAG-Bench/setup + chmod +x get_dbs.sh + ./get_dbs.sh + ``` + +5. Launch Tool service In this example, we will use some of the mock APIs provided in the Meta CRAG KDD Challenge to demonstrate the benefits of gaining additional context from mock knowledge graphs. ``` docker run -d -p=8080:8000 docker.io/aicrowd/kdd-cup-24-crag-mock-api:v0 ``` -5. Launch `Agent` service +6. Launch multi-agent system - The configurations of the supervisor agent and the worker agent are defined in the docker-compose yaml file. We currently use openAI GPT-4o-mini as LLM, and llama3.1-70B-instruct (served by TGI-Gaudi) in Gaudi example. To use openai llm, run command below. 
+ The configurations of the supervisor agent and the worker agents are defined in the docker-compose yaml file. We currently use openAI GPT-4o-mini as LLM. ``` cd $WORKDIR/GenAIExamples/AgentQnA/docker_compose/intel/cpu/xeon bash launch_agent_service_openai.sh ``` -6. [Optional] Build `Agent` docker image if pulling images failed. +7. [Optional] Build `Agent` docker image if pulling images failed. ``` git clone https://github.com/opea-project/GenAIComps.git @@ -68,8 +80,11 @@ This example showcases a hierarchical multi-agent system for question-answering First look at logs of the agent docker containers: ``` -# worker agent +# worker RAG agent docker logs rag-agent-endpoint + +# worker SQL agent +docker logs sql-agent-endpoint ``` ``` @@ -79,19 +94,27 @@ docker logs react-agent-endpoint You should see something like "HTTP server setup successful" if the docker containers are started successfully.

-Second, validate worker agent: +Second, validate worker RAG agent: + +``` +curl http://${host_ip}:9095/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ + "messages": "Michael Jackson song Thriller" + }' +``` + +Third, validate worker SQL agent: ``` curl http://${host_ip}:9095/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ - "query": "Most recent album by Taylor Swift" + "messages": "How many employees are in the company?" }' ``` -Third, validate supervisor agent: +Finally, validate supervisor agent: ``` curl http://${host_ip}:9090/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ - "query": "Most recent album by Taylor Swift" + "messages": "How many albums does Iron Maiden have?" }' ``` diff --git a/AgentQnA/docker_compose/intel/cpu/xeon/compose_openai.yaml b/AgentQnA/docker_compose/intel/cpu/xeon/compose_openai.yaml index 3b78f97a1c..09bde26bde 100644 --- a/AgentQnA/docker_compose/intel/cpu/xeon/compose_openai.yaml +++ b/AgentQnA/docker_compose/intel/cpu/xeon/compose_openai.yaml @@ -31,6 +31,33 @@ services: LANGCHAIN_PROJECT: "opea-worker-agent-service" port: 9095 + worker-sql-agent: + image: opea/agent:latest + container_name: sql-agent-endpoint + volumes: + - ${WORKDIR}/TAG-Bench/:/home/user/TAG-Bench # SQL database + ports: + - "9096:9096" + ipc: host + environment: + ip_address: ${ip_address} + strategy: sql_agent + db_name: ${db_name} + db_path: ${db_path} + use_hints: false + hints_file: /home/user/TAG-Bench/${db_name}_hints.csv + recursion_limit: ${recursion_limit_worker} + llm_engine: openai + OPENAI_API_KEY: ${OPENAI_API_KEY} + model: ${model} + temperature: 0 + max_new_tokens: ${max_new_tokens} + stream: false + require_human_feedback: false + no_proxy: ${no_proxy} + http_proxy: ${http_proxy} + https_proxy: ${https_proxy} + port: 9096 supervisor-react-agent: image: opea/agent:latest diff --git a/AgentQnA/docker_compose/intel/cpu/xeon/launch_agent_service_openai.sh b/AgentQnA/docker_compose/intel/cpu/xeon/launch_agent_service_openai.sh index de5c2e34c3..7b4e86a781 100644 --- a/AgentQnA/docker_compose/intel/cpu/xeon/launch_agent_service_openai.sh +++ b/AgentQnA/docker_compose/intel/cpu/xeon/launch_agent_service_openai.sh @@ -13,7 +13,10 @@ export temperature=0 export max_new_tokens=4096 export OPENAI_API_KEY=${OPENAI_API_KEY} export WORKER_AGENT_URL="http://${ip_address}:9095/v1/chat/completions" +export SQL_AGENT_URL="http://${ip_address}:9096/v1/chat/completions" export RETRIEVAL_TOOL_URL="http://${ip_address}:8889/v1/retrievaltool" export CRAG_SERVER=http://${ip_address}:8080 +export db_name=california_schools +export db_path="sqlite:////home/user/TAG-Bench/dev_folder/dev_databases/${db_name}/${db_name}.sqlite" docker compose -f compose_openai.yaml up -d diff --git a/AgentQnA/docker_compose/intel/hpu/gaudi/README.md b/AgentQnA/docker_compose/intel/hpu/gaudi/README.md index dcf8adfdc6..b920dff804 100644 --- a/AgentQnA/docker_compose/intel/hpu/gaudi/README.md +++ b/AgentQnA/docker_compose/intel/hpu/gaudi/README.md @@ -1,6 +1,6 @@ # Single node on-prem deployment AgentQnA on Gaudi -This example showcases a hierarchical multi-agent system for question-answering applications. We deploy the example on Gaudi using open-source LLMs, +This example showcases a hierarchical multi-agent system for question-answering applications. We deploy the example on Gaudi using open-source LLMs. For more details, please refer to the deployment guide [here](../../../../README.md). 
## Deployment with docker @@ -45,22 +45,53 @@ For more details, please refer to the deployment guide [here](../../../../README bash run_ingest_data.sh ``` -4. Launch Tool service +4. Prepare SQL database + In this example, we will use the Chinook SQLite database. Run the commands below. + + ``` + # Download data + cd $WORKDIR + git clone https://github.com/lerocha/chinook-database.git + cp chinook-database/ChinookDatabase/DataSources/Chinook_Sqlite.sqlite $WORKDIR/GenAIExamples/AgentQnA/tests/ + ``` + +5. Launch Tool service In this example, we will use some of the mock APIs provided in the Meta CRAG KDD Challenge to demonstrate the benefits of gaining additional context from mock knowledge graphs. ``` docker run -d -p=8080:8000 docker.io/aicrowd/kdd-cup-24-crag-mock-api:v0 ``` -5. Launch `Agent` service +6. Launch multi-agent system + + On Gaudi2 we will serve `meta-llama/Meta-Llama-3.1-70B-Instruct` using vllm. + + First build vllm-gaudi docker image. + + ```bash + cd $WORKDIR + git clone https://github.com/vllm-project/vllm.git + cd ./vllm + git checkout v0.6.6 + docker build --no-cache -f Dockerfile.hpu -t opea/vllm-gaudi:latest --shm-size=128g . --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy + ``` - To use open-source LLMs on Gaudi2, run commands below. + Then launch vllm on Gaudi2 with the command below. + ```bash + vllm_port=8086 + model="meta-llama/Meta-Llama-3.1-70B-Instruct" + docker run -d --runtime=habana --rm --name "vllm-gaudi-server" -e HABANA_VISIBLE_DEVICES=0,1,2,3 -p $vllm_port:8000 -v $vllm_volume:/data -e HF_TOKEN=$HF_TOKEN -e HUGGING_FACE_HUB_TOKEN=$HF_TOKEN -e HF_HOME=/data -e OMPI_MCA_btl_vader_single_copy_mechanism=none -e PT_HPU_ENABLE_LAZY_COLLECTIVES=true -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e no_proxy=$no_proxy -e VLLM_SKIP_WARMUP=true --cap-add=sys_nice --ipc=host opea/vllm-gaudi:latest --model ${model} --max-seq-len-to-capture 16384 --tensor-parallel-size 4 ``` - cd $WORKDIR/GenAIExamples/AgentQnA/docker_compose/intel/hpu/gaudi - bash launch_tgi_gaudi.sh - bash launch_agent_service_tgi_gaudi.sh + + Then launch Agent microservices. + + ```bash + cd $WORKDIR/GenAIExamples/AgentQnA/docker_compose/intel/hpu/gaudi/ + bash launch_agent_service_gaudi.sh ``` -6. [Optional] Build `Agent` docker image if pulling images failed. +7. [Optional] Build `Agent` docker image if pulling images failed. + + If docker image pulling failed in Step 6 above, build the agent docker image with the commands below. After image build, try Step 6 again. ``` git clone https://github.com/opea-project/GenAIComps.git @@ -73,8 +104,11 @@ For more details, please refer to the deployment guide [here](../../../../README First look at logs of the agent docker containers: ``` -# worker agent +# worker RAG agent docker logs rag-agent-endpoint + +# worker SQL agent +docker logs sql-agent-endpoint ``` ``` @@ -84,19 +118,27 @@ docker logs react-agent-endpoint You should see something like "HTTP server setup successful" if the docker containers are started successfully.

-Second, validate worker agent: +Second, validate worker RAG agent: + +``` +curl http://${host_ip}:9095/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ + "messages": "Michael Jackson song Thriller" + }' +``` + +Third, validate worker SQL agent: ``` curl http://${host_ip}:9095/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ - "query": "Most recent album by Taylor Swift" + "messages": "How many employees are in the company?" }' ``` -Third, validate supervisor agent: +Finally, validate supervisor agent: ``` curl http://${host_ip}:9090/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ - "query": "Most recent album by Taylor Swift" + "messages": "How many albums does Iron Maiden have?" }' ``` diff --git a/AgentQnA/docker_compose/intel/hpu/gaudi/compose.yaml b/AgentQnA/docker_compose/intel/hpu/gaudi/compose.yaml index a586ffd520..4895722c93 100644 --- a/AgentQnA/docker_compose/intel/hpu/gaudi/compose.yaml +++ b/AgentQnA/docker_compose/intel/hpu/gaudi/compose.yaml @@ -6,7 +6,6 @@ services: image: opea/agent:latest container_name: rag-agent-endpoint volumes: - # - ${WORKDIR}/GenAIExamples/AgentQnA/docker_image_build/GenAIComps/comps/agent/langchain/:/home/user/comps/agent/langchain/ - ${TOOLSET_PATH}:/home/user/tools/ ports: - "9095:9095" @@ -15,7 +14,7 @@ services: ip_address: ${ip_address} strategy: rag_agent_llama recursion_limit: ${recursion_limit_worker} - llm_engine: tgi + llm_engine: vllm HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} llm_endpoint_url: ${LLM_ENDPOINT_URL} model: ${LLM_MODEL_ID} @@ -33,14 +32,41 @@ services: LANGCHAIN_PROJECT: "opea-worker-agent-service" port: 9095 + worker-sql-agent: + image: opea/agent:latest + container_name: sql-agent-endpoint + volumes: + - ${WORKDIR}/GenAIExamples/AgentQnA/tests:/home/user/chinook-db # test db + ports: + - "9096:9096" + ipc: host + environment: + ip_address: ${ip_address} + strategy: sql_agent_llama + db_name: ${db_name} + db_path: ${db_path} + use_hints: false + recursion_limit: ${recursion_limit_worker} + llm_engine: vllm + HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} + llm_endpoint_url: ${LLM_ENDPOINT_URL} + model: ${LLM_MODEL_ID} + temperature: ${temperature} + max_new_tokens: ${max_new_tokens} + stream: false + require_human_feedback: false + no_proxy: ${no_proxy} + http_proxy: ${http_proxy} + https_proxy: ${https_proxy} + port: 9096 supervisor-react-agent: image: opea/agent:latest container_name: react-agent-endpoint depends_on: - worker-rag-agent + - worker-sql-agent volumes: - # - ${WORKDIR}/GenAIExamples/AgentQnA/docker_image_build/GenAIComps/comps/agent/langchain/:/home/user/comps/agent/langchain/ - ${TOOLSET_PATH}:/home/user/tools/ ports: - "9090:9090" @@ -49,7 +75,7 @@ services: ip_address: ${ip_address} strategy: react_llama recursion_limit: ${recursion_limit_supervisor} - llm_engine: tgi + llm_engine: vllm HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN} llm_endpoint_url: ${LLM_ENDPOINT_URL} model: ${LLM_MODEL_ID} @@ -66,4 +92,5 @@ services: LANGCHAIN_PROJECT: "opea-supervisor-agent-service" CRAG_SERVER: $CRAG_SERVER WORKER_AGENT_URL: $WORKER_AGENT_URL + SQL_AGENT_URL: $SQL_AGENT_URL port: 9090 diff --git a/AgentQnA/docker_compose/intel/hpu/gaudi/launch_agent_service_tgi_gaudi.sh b/AgentQnA/docker_compose/intel/hpu/gaudi/launch_agent_service_gaudi.sh similarity index 81% rename from AgentQnA/docker_compose/intel/hpu/gaudi/launch_agent_service_tgi_gaudi.sh rename to AgentQnA/docker_compose/intel/hpu/gaudi/launch_agent_service_gaudi.sh 
index 38f7d592b5..fff5d53f8d 100644 --- a/AgentQnA/docker_compose/intel/hpu/gaudi/launch_agent_service_tgi_gaudi.sh +++ b/AgentQnA/docker_compose/intel/hpu/gaudi/launch_agent_service_gaudi.sh @@ -16,8 +16,8 @@ ls $HF_CACHE_DIR export HUGGINGFACEHUB_API_TOKEN=${HUGGINGFACEHUB_API_TOKEN} export LLM_MODEL_ID="meta-llama/Meta-Llama-3.1-70B-Instruct" export NUM_SHARDS=4 -export LLM_ENDPOINT_URL="http://${ip_address}:8085" -export temperature=0.01 +export LLM_ENDPOINT_URL="http://${ip_address}:8086" +export temperature=0 export max_new_tokens=4096 # agent related environment variables @@ -26,7 +26,11 @@ echo "TOOLSET_PATH=${TOOLSET_PATH}" export recursion_limit_worker=12 export recursion_limit_supervisor=10 export WORKER_AGENT_URL="http://${ip_address}:9095/v1/chat/completions" +export SQL_AGENT_URL="http://${ip_address}:9096/v1/chat/completions" export RETRIEVAL_TOOL_URL="http://${ip_address}:8889/v1/retrievaltool" export CRAG_SERVER=http://${ip_address}:8080 +export db_name=Chinook +export db_path="sqlite:////home/user/chinook-db/Chinook_Sqlite.sqlite" + docker compose -f compose.yaml up -d diff --git a/AgentQnA/docker_image_build/build.yaml b/AgentQnA/docker_image_build/build.yaml index 61f2b0dda5..723a61c873 100644 --- a/AgentQnA/docker_image_build/build.yaml +++ b/AgentQnA/docker_image_build/build.yaml @@ -11,3 +11,9 @@ services: https_proxy: ${https_proxy} no_proxy: ${no_proxy} image: ${REGISTRY:-opea}/agent:${TAG:-latest} + agent-ui: + build: + context: ../ui + dockerfile: ./docker/Dockerfile + extends: agent + image: ${REGISTRY:-opea}/agent-ui:${TAG:-latest} diff --git a/AgentQnA/retrieval_tool/launch_retrieval_tool.sh b/AgentQnA/retrieval_tool/launch_retrieval_tool.sh index 1a142abf8e..b0c22fea41 100644 --- a/AgentQnA/retrieval_tool/launch_retrieval_tool.sh +++ b/AgentQnA/retrieval_tool/launch_retrieval_tool.sh @@ -13,6 +13,7 @@ export TEI_EMBEDDING_ENDPOINT="http://${host_ip}:6006" export TEI_RERANKING_ENDPOINT="http://${host_ip}:8808" export REDIS_URL="redis://${host_ip}:6379" export INDEX_NAME="rag-redis" +export RERANK_TYPE="tei" export MEGA_SERVICE_HOST_IP=${host_ip} export EMBEDDING_SERVICE_HOST_IP=${host_ip} export RETRIEVER_SERVICE_HOST_IP=${host_ip} diff --git a/AgentQnA/tests/sql_agent_test/run_data_split.sh b/AgentQnA/tests/sql_agent_test/run_data_split.sh new file mode 100644 index 0000000000..2fc2dfcb0e --- /dev/null +++ b/AgentQnA/tests/sql_agent_test/run_data_split.sh @@ -0,0 +1,6 @@ +# Copyright (C) 2024 Intel Corporation +# SPDX-License-Identifier: Apache-2.0 + +DATAPATH=$WORKDIR/TAG-Bench/tag_queries.csv +OUTFOLDER=$WORKDIR/TAG-Bench/query_by_db +python3 split_data.py --path $DATAPATH --output $OUTFOLDER diff --git a/AgentQnA/tests/sql_agent_test/split_data.py b/AgentQnA/tests/sql_agent_test/split_data.py new file mode 100644 index 0000000000..1b3f5cfc79 --- /dev/null +++ b/AgentQnA/tests/sql_agent_test/split_data.py @@ -0,0 +1,27 @@ +# Copyright (C) 2024 Intel Corporation +# SPDX-License-Identifier: Apache-2.0 + +import argparse +import os + +import pandas as pd + +if __name__ == "__main__": + parser = argparse.ArgumentParser() + parser.add_argument("--path", type=str, required=True) + parser.add_argument("--output", type=str, required=True) + args = parser.parse_args() + + # if output folder does not exist, create it + if not os.path.exists(args.output): + os.makedirs(args.output) + + # Load the data + data = pd.read_csv(args.path) + + # Split the data by domain + domains = data["DB used"].unique() + for domain in domains: + domain_data = data[data["DB used"] == 
domain] + out = os.path.join(args.output, f"query_{domain}.csv") + domain_data.to_csv(out, index=False) diff --git a/AgentQnA/tests/step1_build_images.sh b/AgentQnA/tests/step1_build_images.sh index 43000f6630..4782da677a 100644 --- a/AgentQnA/tests/step1_build_images.sh +++ b/AgentQnA/tests/step1_build_images.sh @@ -21,7 +21,7 @@ function build_docker_images_for_retrieval_tool(){ # git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../ get_genai_comps echo "Build all the images with --no-cache..." - service_list="doc-index-retriever dataprep-redis embedding retriever-redis reranking" + service_list="doc-index-retriever dataprep-redis embedding retriever reranking" docker compose -f build.yaml build ${service_list} --no-cache docker pull ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 @@ -35,6 +35,27 @@ function build_agent_docker_image() { docker compose -f build.yaml build --no-cache } +function build_vllm_docker_image() { + echo "Building the vllm docker image" + cd $WORKPATH + echo $WORKPATH + if [ ! -d "./vllm" ]; then + echo "clone vllm repo...." + git clone https://github.com/vllm-project/vllm.git + fi + cd ./vllm + echo "Checking out latest stable release of vllm" + git checkout v0.6.6 + docker build --no-cache -f Dockerfile.hpu -t opea/vllm-gaudi:comps --shm-size=128g . --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy + if [ $? -ne 0 ]; then + echo "opea/vllm-gaudi:comps failed" + exit 1 + else + echo "opea/vllm-gaudi:comps successful" + fi +} + + function main() { echo "==================== Build docker images for retrieval tool ====================" build_docker_images_for_retrieval_tool @@ -43,6 +64,12 @@ function main() { echo "==================== Build agent docker image ====================" build_agent_docker_image echo "==================== Build agent docker image completed ====================" + + echo "==================== Build vllm docker image ====================" + build_vllm_docker_image + echo "==================== Build vllm docker image completed ====================" + + docker image ls | grep vllm } main diff --git a/AgentQnA/tests/step4_launch_and_validate_agent_tgi.sh b/AgentQnA/tests/step4_launch_and_validate_agent_tgi.sh index fde46e0d5a..c99e212ff6 100644 --- a/AgentQnA/tests/step4_launch_and_validate_agent_tgi.sh +++ b/AgentQnA/tests/step4_launch_and_validate_agent_tgi.sh @@ -10,6 +10,8 @@ echo "WORKDIR=${WORKDIR}" export ip_address=$(hostname -I | awk '{print $1}') export TOOLSET_PATH=$WORKDIR/GenAIExamples/AgentQnA/tools/ export HUGGINGFACEHUB_API_TOKEN=${HUGGINGFACEHUB_API_TOKEN} +HF_TOKEN=${HUGGINGFACEHUB_API_TOKEN} +model="meta-llama/Meta-Llama-3.1-70B-Instruct" export HF_CACHE_DIR=$WORKDIR/hf_cache if [ ! -d "$HF_CACHE_DIR" ]; then @@ -17,6 +19,9 @@ if [ ! 
-d "$HF_CACHE_DIR" ]; then fi ls $HF_CACHE_DIR +vllm_port=8086 +vllm_volume=${HF_CACHE_DIR} + function start_tgi(){ echo "Starting tgi-gaudi server" cd $WORKDIR/GenAIExamples/AgentQnA/docker_compose/intel/hpu/gaudi @@ -24,14 +29,67 @@ function start_tgi(){ } +function start_vllm_service_70B() { + + echo "token is ${HF_TOKEN}" + + echo "start vllm gaudi service" + echo "**************model is $model**************" + vllm_image=opea/vllm-gaudi:comps + docker run -d --runtime=habana --rm --name "vllm-gaudi-server" -e HABANA_VISIBLE_DEVICES=0,1,2,3 -p $vllm_port:8000 -v $vllm_volume:/data -e HF_TOKEN=$HF_TOKEN -e HUGGING_FACE_HUB_TOKEN=$HF_TOKEN -e HF_HOME=/data -e OMPI_MCA_btl_vader_single_copy_mechanism=none -e PT_HPU_ENABLE_LAZY_COLLECTIVES=true -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e no_proxy=$no_proxy -e VLLM_SKIP_WARMUP=true --cap-add=sys_nice --ipc=host $vllm_image --model ${model} --max-seq-len-to-capture 16384 --tensor-parallel-size 4 + sleep 5s + echo "Waiting vllm gaudi ready" + n=0 + LOG_PATH=$PWD + until [[ "$n" -ge 100 ]] || [[ $ready == true ]]; do + docker logs vllm-gaudi-server + docker logs vllm-gaudi-server &> ${LOG_PATH}/vllm-gaudi-service.log + n=$((n+1)) + if grep -q "Uvicorn running on" ${LOG_PATH}/vllm-gaudi-service.log; then + break + fi + if grep -q "No such container" ${LOG_PATH}/vllm-gaudi-service.log; then + echo "container vllm-gaudi-server not found" + exit 1 + fi + sleep 5s + done + sleep 5s + echo "Service started successfully" +} + + +function prepare_data() { + cd $WORKDIR + + echo "Downloading data..." + git clone https://github.com/TAG-Research/TAG-Bench.git + cd TAG-Bench/setup + chmod +x get_dbs.sh + ./get_dbs.sh + + echo "Split data..." + cd $WORKPATH/tests/sql_agent_test + bash run_data_split.sh + + echo "Data preparation done!" +} + +function download_chinook_data(){ + echo "Downloading chinook data..." 
+ cd $WORKDIR + git clone https://github.com/lerocha/chinook-database.git + cp chinook-database/ChinookDatabase/DataSources/Chinook_Sqlite.sqlite $WORKDIR/GenAIExamples/AgentQnA/tests/ +} + function start_agent_and_api_server() { echo "Starting CRAG server" docker run -d --runtime=runc --name=kdd-cup-24-crag-service -p=8080:8000 docker.io/aicrowd/kdd-cup-24-crag-mock-api:v0 echo "Starting Agent services" cd $WORKDIR/GenAIExamples/AgentQnA/docker_compose/intel/hpu/gaudi - bash launch_agent_service_tgi_gaudi.sh - sleep 10 + bash launch_agent_service_gaudi.sh + sleep 2m } function validate() { @@ -49,35 +107,76 @@ function validate() { } function validate_agent_service() { - echo "----------------Test agent ----------------" - # local CONTENT=$(http_proxy="" curl http://${ip_address}:9095/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ - # "query": "Tell me about Michael Jackson song thriller" - # }') + # # test worker rag agent + echo "======================Testing worker rag agent======================" export agent_port="9095" - local CONTENT=$(python3 $WORKDIR/GenAIExamples/AgentQnA/tests/test.py) + prompt="Tell me about Michael Jackson song Thriller" + local CONTENT=$(python3 $WORKDIR/GenAIExamples/AgentQnA/tests/test.py --prompt "$prompt") + # echo $CONTENT local EXIT_CODE=$(validate "$CONTENT" "Thriller" "rag-agent-endpoint") - docker logs rag-agent-endpoint + echo $EXIT_CODE + local EXIT_CODE="${EXIT_CODE:0-1}" if [ "$EXIT_CODE" == "1" ]; then + docker logs rag-agent-endpoint exit 1 fi - # local CONTENT=$(http_proxy="" curl http://${ip_address}:9090/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{ - # "query": "Tell me about Michael Jackson song thriller" - # }') + # # test worker sql agent + echo "======================Testing worker sql agent======================" + export agent_port="9096" + prompt="How many employees are there in the company?" + local CONTENT=$(python3 $WORKDIR/GenAIExamples/AgentQnA/tests/test.py --prompt "$prompt") + local EXIT_CODE=$(validate "$CONTENT" "8" "sql-agent-endpoint") + echo $CONTENT + # echo $EXIT_CODE + local EXIT_CODE="${EXIT_CODE:0-1}" + if [ "$EXIT_CODE" == "1" ]; then + docker logs sql-agent-endpoint + exit 1 + fi + + # test supervisor react agent + echo "======================Testing supervisor react agent======================" export agent_port="9090" - local CONTENT=$(python3 $WORKDIR/GenAIExamples/AgentQnA/tests/test.py) - local EXIT_CODE=$(validate "$CONTENT" "Thriller" "react-agent-endpoint") - docker logs react-agent-endpoint + prompt="How many albums does Iron Maiden have?" + local CONTENT=$(python3 $WORKDIR/GenAIExamples/AgentQnA/tests/test.py --prompt "$prompt") + local EXIT_CODE=$(validate "$CONTENT" "21" "react-agent-endpoint") + # echo $CONTENT + echo $EXIT_CODE + local EXIT_CODE="${EXIT_CODE:0-1}" if [ "$EXIT_CODE" == "1" ]; then + docker logs react-agent-endpoint exit 1 fi } +function remove_data() { + echo "Removing data..." + cd $WORKDIR + if [ -d "TAG-Bench" ]; then + rm -rf TAG-Bench + fi + echo "Data removed!" +} + +function remove_chinook_data(){ + echo "Removing chinook data..." + cd $WORKDIR + if [ -d "chinook-database" ]; then + rm -rf chinook-database + fi + echo "Chinook data removed!" 
+} + function main() { - echo "==================== Start TGI ====================" - start_tgi - echo "==================== TGI started ====================" + echo "==================== Prepare data ====================" + download_chinook_data + echo "==================== Data prepare done ====================" + + echo "==================== Start VLLM service ====================" + start_vllm_service_70B + echo "==================== VLLM service started ====================" echo "==================== Start agent ====================" start_agent_and_api_server @@ -88,4 +187,8 @@ function main() { echo "==================== Agent service validated ====================" } +remove_data +remove_chinook_data main +remove_data +remove_chinook_data diff --git a/AgentQnA/tests/test.py b/AgentQnA/tests/test.py index f0ef934412..400684ffd6 100644 --- a/AgentQnA/tests/test.py +++ b/AgentQnA/tests/test.py @@ -1,6 +1,7 @@ # Copyright (C) 2024 Intel Corporation # SPDX-License-Identifier: Apache-2.0 +import argparse import os import requests @@ -9,17 +10,47 @@ def generate_answer_agent_api(url, prompt): proxies = {"http": ""} payload = { - "query": prompt, + "messages": prompt, } response = requests.post(url, json=payload, proxies=proxies) answer = response.json()["text"] return answer +def process_request(url, query, is_stream=False): + proxies = {"http": ""} + + payload = { + "messages": query, + } + + try: + resp = requests.post(url=url, json=payload, proxies=proxies, stream=is_stream) + if not is_stream: + ret = resp.json()["text"] + print(ret) + else: + for line in resp.iter_lines(decode_unicode=True): + print(line) + ret = None + + resp.raise_for_status() # Raise an exception for unsuccessful HTTP status codes + return ret + except requests.exceptions.RequestException as e: + ret = f"An error occurred:{e}" + print(ret) + return False + + if __name__ == "__main__": + parser = argparse.ArgumentParser() + parser.add_argument("--prompt", type=str) + parser.add_argument("--stream", action="store_true") + args = parser.parse_args() + ip_address = os.getenv("ip_address", "localhost") - agent_port = os.getenv("agent_port", "9095") + agent_port = os.getenv("agent_port", "9090") url = f"http://{ip_address}:{agent_port}/v1/chat/completions" - prompt = "Tell me about Michael Jackson song thriller" - answer = generate_answer_agent_api(url, prompt) - print(answer) + prompt = args.prompt + + process_request(url, prompt, args.stream) diff --git a/AgentQnA/tests/test_compose_on_gaudi.sh b/AgentQnA/tests/test_compose_on_gaudi.sh index 0720a9b2b9..cf224b6aa1 100644 --- a/AgentQnA/tests/test_compose_on_gaudi.sh +++ b/AgentQnA/tests/test_compose_on_gaudi.sh @@ -4,6 +4,9 @@ set -xe +echo "All running containers" +docker ps + WORKPATH=$(dirname "$PWD") export WORKDIR=$WORKPATH/../../ echo "WORKDIR=${WORKDIR}" @@ -27,7 +30,7 @@ function stop_agent_docker() { done } -function stop_tgi(){ +function stop_llm(){ cd $WORKPATH/docker_compose/intel/hpu/gaudi/ container_list=$(cat tgi_gaudi.yaml | grep container_name | cut -d':' -f2) for container_name in $container_list; do @@ -36,6 +39,14 @@ function stop_tgi(){ if [[ ! -z "$cid" ]]; then docker rm $cid -f && sleep 1s; fi done + cid=$(docker ps -aq --filter "name=vllm-gaudi-server") + echo "Stopping container $cid" + if [[ ! -z "$cid" ]]; then docker rm $cid -f && sleep 1s; fi + + cid=$(docker ps -aq --filter "name=test-comps-vllm-gaudi-service") + echo "Stopping container $cid" + if [[ ! 
-z "$cid" ]]; then docker rm $cid -f && sleep 1s; fi + } function stop_retrieval_tool() { @@ -52,7 +63,7 @@ function stop_retrieval_tool() { echo "workpath: $WORKPATH" echo "=================== Stop containers ====================" stop_crag -stop_tgi +stop_llm stop_agent_docker stop_retrieval_tool @@ -78,6 +89,7 @@ echo "=================== #5 Stop agent and API server====================" stop_crag stop_agent_docker stop_retrieval_tool +stop_llm echo "=================== #5 Agent and API server stopped====================" echo y | docker system prune diff --git a/AgentQnA/tools/supervisor_agent_tools.yaml b/AgentQnA/tools/supervisor_agent_tools.yaml index 4b53cc9f9f..bfe6f970d4 100644 --- a/AgentQnA/tools/supervisor_agent_tools.yaml +++ b/AgentQnA/tools/supervisor_agent_tools.yaml @@ -2,7 +2,7 @@ # SPDX-License-Identifier: Apache-2.0 search_knowledge_base: - description: Search knowledge base for a given query. Returns text related to the query. + description: Search a knowledge base for a given query. Returns text related to the query. callable_api: tools.py:search_knowledge_base args_schema: query: @@ -10,6 +10,15 @@ search_knowledge_base: description: query return_output: retrieved_data +search_artist_database: + description: Search a SQL database on artists and their music with a natural language query. Returns text related to the query. + callable_api: tools.py:search_sql_database + args_schema: + query: + type: str + description: natural language query + return_output: retrieved_data + get_artist_birth_place: description: Get the birth place of an artist. callable_api: tools.py:get_artist_birth_place diff --git a/AgentQnA/tools/tools.py b/AgentQnA/tools/tools.py index 94a44f6787..67137c3c18 100644 --- a/AgentQnA/tools/tools.py +++ b/AgentQnA/tools/tools.py @@ -8,13 +8,30 @@ def search_knowledge_base(query: str) -> str: - """Search the knowledge base for a specific query.""" - # use worker agent (DocGrader) to search the knowledge base + """Search a knowledge base about music and singers for a given query. + + Returns text related to the query. + """ url = os.environ.get("WORKER_AGENT_URL") print(url) proxies = {"http": ""} payload = { - "query": query, + "messages": query, + } + response = requests.post(url, json=payload, proxies=proxies) + return response.json()["text"] + + +def search_sql_database(query: str) -> str: + """Search a SQL database on artists and their music with a natural language query. + + Returns text related to the query. 
+ """ + url = os.environ.get("SQL_AGENT_URL") + print(url) + proxies = {"http": ""} + payload = { + "messages": query, } response = requests.post(url, json=payload, proxies=proxies) return response.json()["text"] diff --git a/AgentQnA/tools/worker_agent_tools.py b/AgentQnA/tools/worker_agent_tools.py index fded38ec3a..9fe40f11f0 100644 --- a/AgentQnA/tools/worker_agent_tools.py +++ b/AgentQnA/tools/worker_agent_tools.py @@ -12,7 +12,7 @@ def search_knowledge_base(query: str) -> str: print(url) proxies = {"http": ""} payload = { - "messages": query, + "text": query, } response = requests.post(url, json=payload, proxies=proxies) print(response) diff --git a/AgentQnA/ui/docker/Dockerfile b/AgentQnA/ui/docker/Dockerfile new file mode 100644 index 0000000000..1d5115f4b5 --- /dev/null +++ b/AgentQnA/ui/docker/Dockerfile @@ -0,0 +1,26 @@ +# Copyright (C) 2024 Intel Corporation +# SPDX-License-Identifier: Apache-2.0 + +# Use node 20.11.1 as the base image +FROM node:20.11.1 + +# Update package manager and install Git +RUN apt-get update -y && apt-get install -y git + +# Copy the front-end code repository +COPY svelte /home/user/svelte + +# Set the working directory +WORKDIR /home/user/svelte + +# Install front-end dependencies +RUN npm install + +# Build the front-end application +RUN npm run build + +# Expose the port of the front-end application +EXPOSE 5173 + +# Run the front-end application in preview mode +CMD ["npm", "run", "preview", "--", "--port", "5173", "--host", "0.0.0.0"] diff --git a/AgentQnA/ui/svelte/.editorconfig b/AgentQnA/ui/svelte/.editorconfig new file mode 100644 index 0000000000..2b7a6637f7 --- /dev/null +++ b/AgentQnA/ui/svelte/.editorconfig @@ -0,0 +1,10 @@ +[*] +indent_style = tab + +[package.json] +indent_style = space +indent_size = 2 + +[*.md] +indent_style = space +indent_size = 2 diff --git a/AgentQnA/ui/svelte/.env b/AgentQnA/ui/svelte/.env new file mode 100644 index 0000000000..260701a6d0 --- /dev/null +++ b/AgentQnA/ui/svelte/.env @@ -0,0 +1 @@ +AGENT_URL = '/v1/chat/completions' diff --git a/AgentQnA/ui/svelte/.eslintignore b/AgentQnA/ui/svelte/.eslintignore new file mode 100644 index 0000000000..38972655fa --- /dev/null +++ b/AgentQnA/ui/svelte/.eslintignore @@ -0,0 +1,13 @@ +.DS_Store +node_modules +/build +/.svelte-kit +/package +.env +.env.* +!.env.example + +# Ignore files for PNPM, NPM and YARN +pnpm-lock.yaml +package-lock.json +yarn.lock diff --git a/AgentQnA/ui/svelte/.eslintrc.cjs b/AgentQnA/ui/svelte/.eslintrc.cjs new file mode 100644 index 0000000000..cfe2be4d4d --- /dev/null +++ b/AgentQnA/ui/svelte/.eslintrc.cjs @@ -0,0 +1,20 @@ +module.exports = { + root: true, + parser: "@typescript-eslint/parser", + extends: ["eslint:recommended", "plugin:@typescript-eslint/recommended", "prettier"], + plugins: ["svelte3", "@typescript-eslint", "neverthrow"], + ignorePatterns: ["*.cjs"], + overrides: [{ files: ["*.svelte"], processor: "svelte3/svelte3" }], + settings: { + "svelte3/typescript": () => require("typescript"), + }, + parserOptions: { + sourceType: "module", + ecmaVersion: 2020, + }, + env: { + browser: true, + es2017: true, + node: true, + }, +}; diff --git a/AgentQnA/ui/svelte/.prettierignore b/AgentQnA/ui/svelte/.prettierignore new file mode 100644 index 0000000000..38972655fa --- /dev/null +++ b/AgentQnA/ui/svelte/.prettierignore @@ -0,0 +1,13 @@ +.DS_Store +node_modules +/build +/.svelte-kit +/package +.env +.env.* +!.env.example + +# Ignore files for PNPM, NPM and YARN +pnpm-lock.yaml +package-lock.json +yarn.lock diff --git 
a/AgentQnA/ui/svelte/.prettierrc b/AgentQnA/ui/svelte/.prettierrc new file mode 100644 index 0000000000..3b2006102e --- /dev/null +++ b/AgentQnA/ui/svelte/.prettierrc @@ -0,0 +1,13 @@ +{ + "pluginSearchDirs": [ + "." + ], + "overrides": [ + { + "files": "*.svelte", + "options": { + "parser": "svelte" + } + } + ] +} \ No newline at end of file diff --git a/AgentQnA/ui/svelte/README.md b/AgentQnA/ui/svelte/README.md new file mode 100644 index 0000000000..bd0ae2da10 --- /dev/null +++ b/AgentQnA/ui/svelte/README.md @@ -0,0 +1,60 @@ +# AgentQnA + +## 📸 Project Screenshots + +![project-screenshot](../../assets/img/agent_ui.png) +![project-screenshot](../../assets/img/agent_ui_result.png) + +## 🧐 Features + +Here are some of the project's features: + +- Create Agent: Provide more precise answers based on user queries, showcase the high-quality output of complex queries across different dimensions, and consolidate information to present comprehensive answers. + +## 🛠️ Get it Running + +1. Clone the repo. + +2. Change directory to the UI folder. + + ``` + cd AgentQnA/ui + ``` + +3. Modify the required .env variables. + + ``` + AGENT_URL = '' + ``` + +4. **For Local Development:** + +- Install the dependencies: + + ``` + npm install + ``` + +- Start the development server: + + ``` + npm run dev + ``` + +- The application will be available at `http://localhost:3000`. + +5. **For Docker Setup:** + +- Build the Docker image: + + ``` + docker build -t opea:agent-ui -f docker/Dockerfile . + ``` + +- Run the Docker container: + + ``` + docker run -d -p 3000:5173 --name agent-ui opea:agent-ui + ``` + +- The application will be available at `http://localhost:3000`. diff --git a/AgentQnA/ui/svelte/package.json b/AgentQnA/ui/svelte/package.json new file mode 100644 index 0000000000..b778040bec --- /dev/null +++ b/AgentQnA/ui/svelte/package.json @@ -0,0 +1,60 @@ +{ + "name": "agent-example", + "version": "0.0.1", + "private": true, + "scripts": { + "dev": "vite dev --host 0.0.0.0", + "build": "vite build", + "preview": "vite preview", + "check": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json", + "check:watch": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json --watch", + "lint": "prettier --check . && eslint .", + "format": "prettier --write ."
+ }, + "devDependencies": { + "@fortawesome/free-solid-svg-icons": "6.2.0", + "@sveltejs/adapter-auto": "1.0.0-next.75", + "@sveltejs/kit": "^1.20.1", + "@tailwindcss/typography": "0.5.7", + "@types/debug": "4.1.7", + "@typescript-eslint/eslint-plugin": "^5.27.0", + "@typescript-eslint/parser": "^5.27.0", + "autoprefixer": "^10.4.7", + "daisyui": "^2.52.0", + "debug": "4.3.4", + "eslint": "^8.16.0", + "eslint-config-prettier": "^8.3.0", + "eslint-plugin-neverthrow": "1.1.4", + "eslint-plugin-svelte3": "^4.0.0", + "neverthrow": "5.0.0", + "pocketbase": "0.7.0", + "postcss": "^8.4.23", + "postcss-load-config": "^4.0.1", + "postcss-preset-env": "^8.3.2", + "prettier": "^2.8.8", + "prettier-plugin-svelte": "^2.7.0", + "prettier-plugin-tailwindcss": "^0.3.0", + "svelte": "^3.59.1", + "svelte-check": "^2.7.1", + "svelte-fa": "3.0.3", + "svelte-preprocess": "^4.10.7", + "tailwindcss": "^3.1.5", + "ts-pattern": "4.0.5", + "tslib": "^2.3.1", + "typescript": "^4.7.4", + "vite": "^4.3.9" + }, + "type": "module", + "dependencies": { + "@heroicons/vue": "^2.1.5", + "echarts": "^5.4.2", + "flowbite-svelte": "^0.38.5", + "flowbite-svelte-icons": "^0.3.6", + "fuse.js": "^6.6.2", + "marked": "^15.0.0", + "ramda": "^0.29.0", + "sjcl": "^1.0.8", + "sse.js": "^0.6.1", + "svelte-notifications": "^0.9.98" + } +} diff --git a/AgentQnA/ui/svelte/postcss.config.cjs b/AgentQnA/ui/svelte/postcss.config.cjs new file mode 100644 index 0000000000..e68d4de268 --- /dev/null +++ b/AgentQnA/ui/svelte/postcss.config.cjs @@ -0,0 +1,13 @@ +const tailwindcss = require("tailwindcss"); +const autoprefixer = require("autoprefixer"); + +const config = { + plugins: [ + //Some plugins, like tailwindcss/nesting, need to run before Tailwind, + tailwindcss(), + //But others, like autoprefixer, need to run after, + autoprefixer, + ], +}; + +module.exports = config; diff --git a/AgentQnA/ui/svelte/src/app.d.ts b/AgentQnA/ui/svelte/src/app.d.ts new file mode 100644 index 0000000000..76f5cae98c --- /dev/null +++ b/AgentQnA/ui/svelte/src/app.d.ts @@ -0,0 +1,50 @@ +// Copyright (C) 2025 Intel Corporation +// SPDX-License-Identifier: Apache-2.0 + +// See: https://kit.svelte.dev/docs/types#app +// import { Result} from "neverthrow"; + +declare namespace App { + interface Locals { + user?: User; + } + // interface PageData { } + // interface PageError {} + // interface Platform {} +} + +interface User { + id?: string; + email: string; + password?: string; + token?: string; + [key: string]: any; +} + +type AuthResponse = Result; + +interface AuthAdapter { + login(props: { email: string; password: string }): Promise; + signup(props: { email: string; password: string; password_confirm: string }): Promise; + validate_session(props: { token: string }): Promise; + logout(props: { token: string; email: string }): Promise>; + forgotPassword(props: { email: string; password: string }): Promise>; +} + +interface ChatAdapter { + modelList(props: {}): Promise>; + txt2img(props: {}): Promise>; +} + +interface ChatMessage { + role: string; + content: string; +} + +interface ChatMessageType { + model: string; + knowledge: string; + temperature: string; + max_new_tokens: string; + topk: string; +} diff --git a/AgentQnA/ui/svelte/src/app.html b/AgentQnA/ui/svelte/src/app.html new file mode 100644 index 0000000000..5baaf1750e --- /dev/null +++ b/AgentQnA/ui/svelte/src/app.html @@ -0,0 +1,17 @@ + + + + + + + + + %sveltekit.head% + + +
%sveltekit.body%
+ + diff --git a/AgentQnA/ui/svelte/src/app.postcss b/AgentQnA/ui/svelte/src/app.postcss new file mode 100644 index 0000000000..c3e0519c6a --- /dev/null +++ b/AgentQnA/ui/svelte/src/app.postcss @@ -0,0 +1,82 @@ +/* Write your global styles here, in PostCSS syntax */ +@tailwind base; +@tailwind components; +@tailwind utilities; + +.btn { + @apply flex-nowrap; +} +a.btn { + @apply no-underline; +} +.input { + @apply text-base; +} + +.bg-dark-blue { + background-color: #004a86; +} + +.bg-light-blue { + background-color: #0068b5; +} + +.bg-turquoise { + background-color: #00a3f6; +} + +.bg-header { + background-color: #ffffff; +} + +.bg-button { + background-color: #0068b5; +} + +.bg-title { + background-color: #f7f7f7; +} + +.text-header { + color: #0068b5; +} + +.text-button { + color: #0071c5; +} + +.text-title-color { + color: rgb(38,38,38); +} + +.font-intel { + font-family: "intel-clear","tahoma",Helvetica,"helvetica",Arial,sans-serif; +} + +.font-title-intel { + font-family: "intel-one","intel-clear",Helvetica,Arial,sans-serif; +} + +.bg-footer { + background-color: #e7e7e7; +} + +.bg-light-green { + background-color: #d7f3a1; +} + +.bg-purple { + background-color: #653171; +} + +.bg-dark-blue { + background-color: #224678; +} + +.border-input-color { + border-color: #605e5c; +} + +.w-12\/12 { + width: 100% +} \ No newline at end of file diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/createSub.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/createSub.svelte new file mode 100644 index 0000000000..b31044d0fc --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/createSub.svelte @@ -0,0 +1,25 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/download.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/download.svelte new file mode 100644 index 0000000000..da8bcefb3d --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/download.svelte @@ -0,0 +1,9 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/eye.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/eye.svelte new file mode 100644 index 0000000000..06f9a821e4 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/eye.svelte @@ -0,0 +1,16 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/newAI.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/newAI.svelte new file mode 100644 index 0000000000..6fc1179daf --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/newAI.svelte @@ -0,0 +1,97 @@ + + + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/resource.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/resource.svelte new file mode 100644 index 0000000000..6460bb34d2 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/resource.svelte @@ -0,0 +1,8 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/search.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/search.svelte new file mode 100644 index 0000000000..79c22c7b2f --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/search.svelte @@ -0,0 +1,13 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/searchDelete.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/searchDelete.svelte new file mode 100644 index 0000000000..e6907c21df --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/searchDelete.svelte @@ -0,0 +1,17 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/searchResult.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/searchResult.svelte new file mode 100644 index 0000000000..378f3cdf50 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/searchResult.svelte @@ -0,0 
+1,20 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/star.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/star.svelte new file mode 100644 index 0000000000..1a0e4175cc --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/star.svelte @@ -0,0 +1,22 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/summary.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/summary.svelte new file mode 100644 index 0000000000..952c986061 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/summary.svelte @@ -0,0 +1,44 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/taskIcon.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/taskIcon.svelte new file mode 100644 index 0000000000..737b7a8cbc --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/taskIcon.svelte @@ -0,0 +1,24 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/taskResult.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/taskResult.svelte new file mode 100644 index 0000000000..f4b3833002 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/taskResult.svelte @@ -0,0 +1,60 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/time.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/time.svelte new file mode 100644 index 0000000000..140f544c82 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/time.svelte @@ -0,0 +1,8 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/Agent/toolIcon.svelte b/AgentQnA/ui/svelte/src/lib/assets/Agent/toolIcon.svelte new file mode 100644 index 0000000000..342f5df387 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/Agent/toolIcon.svelte @@ -0,0 +1,36 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/icons/ActiveDislikeButtonIcon.svelte b/AgentQnA/ui/svelte/src/lib/assets/icons/ActiveDislikeButtonIcon.svelte new file mode 100644 index 0000000000..b5c1ea06b0 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/icons/ActiveDislikeButtonIcon.svelte @@ -0,0 +1,28 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/icons/ActiveLikeButtonIcon.svelte b/AgentQnA/ui/svelte/src/lib/assets/icons/ActiveLikeButtonIcon.svelte new file mode 100644 index 0000000000..b410c73386 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/icons/ActiveLikeButtonIcon.svelte @@ -0,0 +1,24 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/icons/Folder.svelte b/AgentQnA/ui/svelte/src/lib/assets/icons/Folder.svelte new file mode 100644 index 0000000000..c338962899 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/icons/Folder.svelte @@ -0,0 +1,28 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/icons/Knowledge.svelte b/AgentQnA/ui/svelte/src/lib/assets/icons/Knowledge.svelte new file mode 100644 index 0000000000..7b1593d757 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/icons/Knowledge.svelte @@ -0,0 +1,38 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/icons/NoTranslate.svelte b/AgentQnA/ui/svelte/src/lib/assets/icons/NoTranslate.svelte new file mode 100644 index 0000000000..7f94dcef9c --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/icons/NoTranslate.svelte @@ -0,0 +1,32 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/icons/OldHelp.svelte b/AgentQnA/ui/svelte/src/lib/assets/icons/OldHelp.svelte new file mode 100644 index 0000000000..71c3cfb132 --- /dev/null +++ b/AgentQnA/ui/svelte/src/lib/assets/icons/OldHelp.svelte @@ -0,0 +1,41 @@ + + + diff --git a/AgentQnA/ui/svelte/src/lib/assets/icons/Question.svelte b/AgentQnA/ui/svelte/src/lib/assets/icons/Question.svelte new file mode 100644 index 
[27 new icon assets added under AgentQnA/ui/svelte/src/lib/assets/icons/: Question.svelte, addKnowledge.svelte, adminKnowledge.svelte, arrow-path-icon.svelte, assistant.svelte, chat-bubble-left-icon.svelte, chat.svelte, check-icon.svelte, csv.svg, dislikeButtonIcon.svelte, download-directory.svelte, likeButtonIcon.svelte, loading-button-spinner-icon.svelte, message-avatar.svelte (switches its graphic on role === "Assistant"), no-file.svelte, paper-airplane.svelte, paste-link.svelte, pencil-square-icon.svelte, plus-icon.svelte, portrait.svelte, translateIcon.svelte, trash-icon.svelte, upload-directory.svelte, upload-files.svelte, upoadKnowledge.svelte, warning.svelte, x-mark-icon.svelte. The SVG/Svelte markup of these files was stripped during extraction and is not reproduced here.]

diff --git a/AgentQnA/ui/svelte/src/lib/common/sse.d.ts b/AgentQnA/ui/svelte/src/lib/common/sse.d.ts
new file mode 100644
index 0000000000..c3f8ed69d6
--- /dev/null
+++ b/AgentQnA/ui/svelte/src/lib/common/sse.d.ts
@@ -0,0 +1,15 @@
+// Copyright (C) 2025 Intel Corporation
+// SPDX-License-Identifier: Apache-2.0
+
+declare module "sse.js" {
+  export type SSEOptions = EventSourceInit & {
+    headers?: Record<string, string>;
+    payload?: string;
+    method?: string;
+  };
+
+  export class SSE extends EventSource {
+    constructor(url: string | URL, sseOptions?: SSEOptions);
+    stream(): void;
+  }
+}

diff --git a/AgentQnA/ui/svelte/src/lib/common/timediff.ts b/AgentQnA/ui/svelte/src/lib/common/timediff.ts
new file mode 100644
index 0000000000..09d191d8ae
--- /dev/null
+++ b/AgentQnA/ui/svelte/src/lib/common/timediff.ts
@@ -0,0 +1,26 @@
+// Copyright (C) 2025 Intel Corporation
+// SPDX-License-Identifier: Apache-2.0
+
+export default function timeDifference(current: number, previous: number) {
+  const msPerMinute = 60 * 1000;
+  const msPerHour = msPerMinute * 60;
+  const msPerDay = msPerHour * 24;
+  const msPerMonth = msPerDay * 30;
+  const msPerYear = msPerDay * 365;
+
+  const elapsed = current - previous;
+
+  if (elapsed < msPerMinute) {
+    return Math.round(elapsed / 1000) + " seconds ago";
+  } else if (elapsed < msPerHour) {
+    return Math.round(elapsed / msPerMinute) + " minutes ago";
+  } else if (elapsed < msPerDay) {
+    return Math.round(elapsed / msPerHour) + " hours ago";
+  } else if (elapsed < msPerMonth) {
+    return "approximately " + Math.round(elapsed / msPerDay) + " days ago";
+  } else if (elapsed < msPerYear) {
+    return "approximately " + Math.round(elapsed / msPerMonth) + " months ago";
+  } else {
+    return "approximately " + Math.round(elapsed / msPerYear) + " years ago";
+  }
+}
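For context, a minimal usage sketch of the helper above. The call site and the SvelteKit "$lib" import alias are assumptions; the components that actually consume this function are not shown in this extract.

```ts
import timeDifference from "$lib/common/timediff"; // "$lib" alias assumed

// Label a chat entry created 90 seconds ago.
// 90 s falls in the minutes branch and Math.round(1.5) = 2, so this prints "2 minutes ago".
const createdAt = Date.now() - 90_000;
const label = timeDifference(Date.now(), createdAt);
console.log(label);
```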
diff --git a/AgentQnA/ui/svelte/src/lib/components/agent/loadingStatic.svelte b/AgentQnA/ui/svelte/src/lib/components/agent/loadingStatic.svelte
new file mode 100644
index 0000000000..e100fce92a
--- /dev/null
+++ b/AgentQnA/ui/svelte/src/lib/components/agent/loadingStatic.svelte
@@ -0,0 +1,16 @@
[16-line static loading indicator component; its <div> markup was stripped during extraction and is not reproduced here.]
diff --git a/AgentQnA/ui/svelte/src/lib/components/chat/chat.svelte b/AgentQnA/ui/svelte/src/lib/components/chat/chat.svelte
new file mode 100644
index 0000000000..b140e7d1b4
--- /dev/null
+++ b/AgentQnA/ui/svelte/src/lib/components/chat/chat.svelte
@@ -0,0 +1,239 @@
[239-line main chat component with the page title "AI Agent"; most of its markup and <script> block were stripped during extraction. Surviving template fragments show an empty-state branch ({#if chatMessages.length === 0 && query === ""}), a branch shown once an agent is selected or messages exist ({:else if showAgent || chatMessages.length > 0}) that renders {agentName} and {agentDescripe}, and a {#if loading}/{:else} toggle around the input controls.]
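The chat component's script is not recoverable here. As a rough, hypothetical sketch of how the sse.js module declared earlier in this diff is typically wired up to stream a response into the UI (the endpoint path, payload shape, and "[DONE]" sentinel are assumptions, not taken from this PR):

```ts
import { SSE } from "sse.js";

let answer = "";

// Hypothetical endpoint and request body; the real values live in the
// stripped chat.svelte <script> block.
const eventSource = new SSE("/v1/chat/completions", {
  headers: { "Content-Type": "application/json" },
  payload: JSON.stringify({
    messages: [{ role: "user", content: "What is OPEA?" }],
    stream: true,
  }),
});

eventSource.addEventListener("message", (e: MessageEvent) => {
  if (e.data === "[DONE]") return; // assumed end-of-stream sentinel
  const chunk = JSON.parse(e.data);
  // Append each streamed token to the assistant's answer.
  answer += chunk.choices?.[0]?.delta?.content ?? "";
});

eventSource.stream();
```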
diff --git a/AgentQnA/ui/svelte/src/lib/components/chat/history.svelte b/AgentQnA/ui/svelte/src/lib/components/chat/history.svelte
new file mode 100644
index 0000000000..81d9c277f4
--- /dev/null
+++ b/AgentQnA/ui/svelte/src/lib/components/chat/history.svelte
@@ -0,0 +1,166 @@
[166-line chat history component; its markup and script were stripped during extraction and are not reproduced here.]

diff --git a/AgentQnA/ui/svelte/src/lib/components/chat/loadingAnimation.svelte b/AgentQnA/ui/svelte/src/lib/components/chat/loadingAnimation.svelte
new file mode 100644
index 0000000000..5adffb1d0a
--- /dev/null
+++ b/AgentQnA/ui/svelte/src/lib/components/chat/loadingAnimation.svelte
@@ -0,0 +1,37 @@
[37-line animated loading indicator built from <div> elements; its markup was stripped during extraction and is not reproduced here.]
diff --git a/AgentQnA/ui/svelte/src/lib/components/content.svelte b/AgentQnA/ui/svelte/src/lib/components/content.svelte
new file mode 100644
index 0000000000..b09f31fadd
--- /dev/null
+++ b/AgentQnA/ui/svelte/src/lib/components/content.svelte
@@ -0,0 +1,33 @@
[33-line layout component; most of its markup was stripped during extraction. The surviving {#key currentChatID} ... {/key} block indicates the chat pane is torn down and re-created whenever the active conversation ID changes.]
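The surviving {#key currentChatID} fragment implies some piece of state tracks the active conversation. A minimal sketch of what such a store could look like (the store name, shape, and location are assumptions; the PR's actual store code is not visible in this extract):

```ts
import { writable } from "svelte/store";

// Hypothetical store tracking which conversation is currently open.
export const currentChatID = writable<string>("");

// Switching chats only updates the store; because content.svelte keys its
// chat pane on this value, the chat component is re-mounted with fresh state.
export function openChat(id: string) {
  currentChatID.set(id);
}
```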
diff --git a/AgentQnA/ui/svelte/src/lib/components/create.svelte b/AgentQnA/ui/svelte/src/lib/components/create.svelte
new file mode 100644
index 0000000000..4d7595b1e4
--- /dev/null
+++ b/AgentQnA/ui/svelte/src/lib/components/create.svelte
@@ -0,0 +1,238 @@
[238-line "Create Agent" form component; its markup and script were stripped during extraction and are not reproduced here.]