Actions: opea-project/GenAIExamples

Showing runs from all workflows
29,432 workflow runs

Align OpenAI API for FaqGen, DocSum
E2E test with docker compose #3162: Pull request #1401 synchronize by XinyaoWa
January 16, 2025 15:47 · 3s · llm_openai_api

Align OpenAI API for FaqGen, DocSum
Check hyperlinks and relative path validity #1059: Pull request #1401 synchronize by XinyaoWa
January 16, 2025 15:47 · 15s · llm_openai_api

Align OpenAI API for FaqGen, DocSum
Code Scan #3451: Pull request #1401 synchronize by XinyaoWa
January 16, 2025 15:47 · 45s · llm_openai_api

Align OpenAI API for FaqGen, DocSum
Compose file and dockerfile path checking #1057: Pull request #1401 synchronize by XinyaoWa
January 16, 2025 15:47 · 20s · llm_openai_api

Align OpenAI API for FaqGen, DocSum
E2E test with docker compose #3161: Pull request #1401 synchronize by XinyaoWa
January 16, 2025 15:47 · Queued

Align OpenAI API for FaqGen, DocSum
Dependency Review #2687: Pull request #1401 synchronize by XinyaoWa
January 16, 2025 15:47 · 14s

PR #1401
CodeQL #2816: by XinyaoWa
January 16, 2025 15:47 · 1m 44s · refs/pull/1401/head

Push on main
CodeQL #2815: by chensuyue
January 16, 2025 15:10 · 1m 39s · main

Standardize name for LLM comps
Check hyperlinks and relative path validity #1058: Pull request #1402 synchronize by XinyaoWa
January 16, 2025 15:09 · 22s · llm_fix_name

Standardize name for LLM comps
Compose file and dockerfile path checking #1056: Pull request #1402 synchronize by XinyaoWa
January 16, 2025 15:09 · 18s · llm_fix_name

Standardize name for LLM comps
Code Scan #3450: Pull request #1402 synchronize by XinyaoWa
January 16, 2025 15:09 · 52s · llm_fix_name

Standardize name for LLM comps
Check Online Document Building #1128: Pull request #1402 synchronize by XinyaoWa
January 16, 2025 15:09 · 2m 56s · llm_fix_name

Standardize name for LLM comps
E2E test with docker compose #3160: Pull request #1402 synchronize by XinyaoWa
January 16, 2025 15:09 · 39m 12s

Standardize name for LLM comps
Dependency Review #2686: Pull request #1402 synchronize by XinyaoWa
January 16, 2025 15:09 · 14s

PR #1402
CodeQL #2814: by XinyaoWa
January 16, 2025 15:09 · 1m 37s · refs/pull/1402/head

Nightly build/publish latest docker images
Nightly build/publish latest docker images #89: Scheduled
January 16, 2025 14:38 · Queued · main

[ChatQnA] Switch to vLLM as default llm backend on Xeon
Check Online Document Building #1127: Pull request #1403 synchronize by wangkl2
January 16, 2025 14:06 · 2m 59s · wangkl2:vllm-default

[ChatQnA] Switch to vLLM as default llm backend on Xeon
Check hyperlinks and relative path validity #1057: Pull request #1403 synchronize by wangkl2
January 16, 2025 14:06 · 20s · wangkl2:vllm-default

[ChatQnA] Switch to vLLM as default llm backend on Xeon
Code Scan #3449: Pull request #1403 synchronize by wangkl2
January 16, 2025 14:06 · 45s · wangkl2:vllm-default

[ChatQnA] Switch to vLLM as default llm backend on Xeon
Compose file and dockerfile path checking #1055: Pull request #1403 synchronize by wangkl2
January 16, 2025 14:06 · 18s · wangkl2:vllm-default

[ChatQnA] Switch to vLLM as default llm backend on Xeon
E2E test with docker compose #3159: Pull request #1403 synchronize by wangkl2
January 16, 2025 14:06 · 1h 25m 50s

[ChatQnA] Switch to vLLM as default llm backend on Xeon
Dependency Review #2685: Pull request #1403 synchronize by wangkl2
January 16, 2025 14:06 · 13s

[ChatQnA] Switch to vLLM as default llm backend on Xeon
Code Scan #3448: Pull request #1403 synchronize by wangkl2
January 16, 2025 14:04 · 56s · wangkl2:vllm-default

[ChatQnA] Switch to vLLM as default llm backend on Xeon
Check Online Document Building #1126: Pull request #1403 synchronize by wangkl2
January 16, 2025 14:04 · 2m 55s · wangkl2:vllm-default

[ChatQnA] Switch to vLLM as default llm backend on Xeon
Compose file and dockerfile path checking #1054: Pull request #1403 synchronize by wangkl2
January 16, 2025 14:04 · 18s · wangkl2:vllm-default