From d430604dfe8598854f5c486ad1b342ce3fa46796 Mon Sep 17 00:00:00 2001
From: "Randy, Huang"
Date: Sun, 21 Apr 2024 15:22:19 +0900
Subject: [PATCH] Add Groq Llama3 model configuration instructions to usage
 guide

---
 docs/docs/usage-guide/additional_configurations.md | 13 +++++++++++++
 1 file changed, 13 insertions(+)

diff --git a/docs/docs/usage-guide/additional_configurations.md b/docs/docs/usage-guide/additional_configurations.md
index e11288177..8785a5801 100644
--- a/docs/docs/usage-guide/additional_configurations.md
+++ b/docs/docs/usage-guide/additional_configurations.md
@@ -125,6 +125,19 @@ key = ...
 
 Also, review the [AiHandler](https://github.com/Codium-ai/pr-agent/blob/main/pr_agent/algo/ai_handler.py) file for instructions on how to set keys for other models.
 
+### Groq
+
+To use the Llama3 model with Groq, for example, set:
+```
+[config] # in configuration.toml
+model = "llama3-70b-8192"
+model_turbo = "llama3-70b-8192"
+fallback_models = ["groq/llama3-70b-8192"]
+[groq] # in .secrets.toml
+key = ... # your Groq API key
+```
+(you can obtain a Groq key from [here](https://console.groq.com/keys))
+
 ### Vertex AI
 
 To use Google's Vertex AI platform and its associated models (chat-bison/codechat-bison) set: