Commit
DH-5775/removing the LLM Model from env vars (#478)
MohammadrezaPourreza authored May 6, 2024
1 parent d4d6f4e commit 514e498
Showing 5 changed files with 0 additions and 10 deletions.
2 changes: 0 additions & 2 deletions .env.example
@@ -1,8 +1,6 @@
# Openai info. All these fields are required for the engine to work.
OPENAI_API_KEY = #This field is required for the engine to work.
ORG_ID =
-# what is the LLM model to be used
-LLM_MODEL = "gpt-4-turbo-preview"

# All of our SQL generation agents are using different tools to generate SQL queries, in order to limit the number of times that agents can
# use different tools you can set the "AGENT_MAX_ITERATIONS" env variable. By default it is set to 20 iterations.
1 change: 0 additions & 1 deletion .test.env
@@ -2,7 +2,6 @@ SQL_GENERATOR='dataherald.tests.sql_generator.test_generator.TestGenerator'
EVALUATOR='dataherald.tests.evaluator.test_eval.TestEvaluator'
DB='dataherald.tests.db.test_db.TestDB'
OPENAI_API_KEY='foo'
-LLM_MODEL='gpt_test'
PINECONE_API_KEY='foo2'
PINECONE_ENVIRONMENT='bar'
GOLDEN_RECORD_COLLECTION='bar2'
4 changes: 0 additions & 4 deletions README.md
@@ -68,14 +68,10 @@ cp .env.example .env

Specifically the following fields must be manually set before the engine is started.

-LLM_MODEL is employed by the engine to generate SQL from natural language. You can use the default model (gpt-4-turbo-preview) or use your own deployed model.

```
#OpenAI credentials and model
# mainly used for embedding models and finetunung
OPENAI_API_KEY =
-# Used for the reasoning LLM or the main LLM which chooses the tools to generate SQL
-LLM_MODEL =
ORG_ID =
#Encryption key for storing DB connection data in Mongo
2 changes: 0 additions & 2 deletions docs/envars.rst
@@ -8,7 +8,6 @@ provided in the .env.example file with the default values.
OPENAI_API_KEY =
ORG_ID =
-LLM_MODEL = 'gpt-4-turbo-preview'
GOLDEN_RECORD_COLLECTION = 'my-golden-records'
@@ -51,7 +50,6 @@ provided in the .env.example file with the default values.

"OPENAI_API_KEY", "The OpenAI key used by the Dataherald Engine", "None", "Yes"
"ORG_ID", "The OpenAI Organization ID used by the Dataherald Engine", "None", "Yes"
-"LLM_MODEL", "The Language Model used by the Dataherald Engine. Supported values include gpt-4-32k, gpt-4, gpt-3.5-turbo, gpt-3.5-turbo-16k", "``gpt-4-32k``", "No"
"GOLDEN_RECORD_COLLECTION", "The name of the collection in Mongo where golden records will be stored", "``my-golden-records``", "No"
"PINECONE_API_KEY", "The Pinecone API key used", "None", "Yes if using the Pinecone vector store"
"PINECONE_ENVIRONMENT", "The Pinecone environment", "None", "Yes if using the Pinecone vector store"
1 change: 0 additions & 1 deletion docs/quickstart.rst
@@ -17,7 +17,6 @@ The following environment variables must be set manually before the engine is started.
#OpenAI credentials and model
OPENAI_API_KEY =
-LLM_MODEL =
ORG_ID =
#Encryption key for storing DB connection data in Mongo
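Taken together, this commit leaves OPENAI_API_KEY as the only required OpenAI setting read from the environment. A minimal sketch (not from the repository; the helper name `check_required_env` is hypothetical) of validating the variables that remain after LLM_MODEL is dropped:

```python
import os


def check_required_env(env: dict) -> list:
    """Return the names of required variables missing or empty in `env`.

    Per .env.example, OPENAI_API_KEY is required for the engine to work;
    ORG_ID and LLM_MODEL are no longer in the required set after this commit.
    """
    required = ["OPENAI_API_KEY"]
    return [name for name in required if not env.get(name)]


# Typically called against the real process environment:
missing = check_required_env(dict(os.environ))
if missing:
    print(f"Missing required settings: {missing}")
```

This mirrors the commit's intent: the model choice is no longer configured via an env var, so startup validation only needs to cover the OpenAI credentials.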