From 71182fbbb2067a8cfb65b63545848df358826b5f Mon Sep 17 00:00:00 2001
From: alexsin368
Date: Fri, 15 Nov 2024 15:02:37 -0800
Subject: [PATCH] remove mention of vllm

Signed-off-by: alexsin368
---
 examples/CodeGen/deploy/gaudi.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/examples/CodeGen/deploy/gaudi.md b/examples/CodeGen/deploy/gaudi.md
index 8877a166..b5a9a78c 100644
--- a/examples/CodeGen/deploy/gaudi.md
+++ b/examples/CodeGen/deploy/gaudi.md
@@ -1,4 +1,4 @@
-# Single node on-prem deployment with vLLM or TGI on Gaudi AI Accelerator
+# Single node on-prem deployment with TGI on Gaudi AI Accelerator
 
 This deployment section covers single-node on-prem deployment of the CodeGen example with OPEA comps to deploy using the TGI service. We will be showcasing how