Additional guidance on configuring the pandasai.json file in the LLM setup process. #1407
To configure PandasAI, you can use a `pandasai.json` file with the following default structure:

```json
{
  "save_logs": true,
  "verbose": false,
  "enforce_privacy": false,
  "enable_cache": true,
  "use_error_correction_framework": true,
  "max_retries": 3,
  "open_charts": true,
  "save_charts": false,
  "save_charts_path": "exports/charts",
  "custom_whitelisted_dependencies": [],
  "llm": "BambooLLM",
  "llm_options": null
}
```

This configuration includes options such as enabling logs, setting verbosity, enforcing privacy, enabling the cache, using the error correction framework, setting the maximum number of retries, and configuring chart options. You can adjust these settings according to your requirements [1].
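As a sanity check on the shape of this file, the defaults above can be round-tripped with nothing but the standard library. This is only an illustration of the JSON structure, not part of the PandasAI API (note that Python `True`/`None` serialize to JSON `true`/`null`):

```python
import json

# Default pandasai.json contents, mirroring the example above
default_config = {
    "save_logs": True,
    "verbose": False,
    "enforce_privacy": False,
    "enable_cache": True,
    "use_error_correction_framework": True,
    "max_retries": 3,
    "open_charts": True,
    "save_charts": False,
    "save_charts_path": "exports/charts",
    "custom_whitelisted_dependencies": [],
    "llm": "BambooLLM",
    "llm_options": None,
}

# Serialize to the JSON text that would live in pandasai.json,
# then parse it back to confirm the round trip is lossless
text = json.dumps(default_config, indent=2)
loaded = json.loads(text)

print(loaded["llm"])  # → BambooLLM
```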
@dosu How can I use this file in the code, and can I use it for every LLM or just for BambooLLM?
The `pandasai.json` file can also specify which LLM to use and its parameters:

```json
{
  "llm": "BambooLLM",
  "llm_options": {
    "api_key": "API_KEY_GOES_HERE"
  }
}
```

This configuration file allows you to specify the LLM and any necessary parameters, making it applicable to any LLM supported by PandasAI, not just BambooLLM.
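For example, switching backends is a matter of changing those same two keys. The variant below is only illustrative; the exact `llm` value must match a backend name your installed version of PandasAI supports:

```json
{
  "llm": "OpenAI",
  "llm_options": {
    "api_token": "YOUR_OPENAI_API_KEY"
  }
}
```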
@dosu Can you provide example code and explain each and every aspect of this code?
Here's an example of how you can use the `pandasai.json` configuration file in your code:

```python
import pandas as pd
from pandasai import SmartDataframe
from pandasai.config import load_config_from_json

# Load configuration from pandasai.json
config = load_config_from_json()

# Create a sample pandas DataFrame
data = {
    'Name': ['Alice', 'Bob', 'Charlie'],
    'Age': [25, 30, 35],
    'City': ['New York', 'Los Angeles', 'Chicago']
}
df = pd.DataFrame(data)

# Initialize a SmartDataframe with the loaded configuration
smart_df = SmartDataframe(df, config=config)

# Example query using natural language
response = smart_df.chat("What is the average age?")

# Print the response
print(response)
```

Explanation: this example demonstrates how to set up and use PandasAI with a configuration file to perform data analysis through natural language queries.
@dosu
These configurations allow you to customize the behavior of PandasAI to suit your needs, from logging and privacy to performance and model selection.
Also, tell me what the structure of the pandasai.json file will be.
The structure of the `pandasai.json` file, using Azure OpenAI as an example, looks like this:

```json
{
  "llm": "AzureOpenAI",
  "llm_options": {
    "api_token": "MY_API_TOKEN",
    "azure_endpoint": "https://YOUR_RESOURCE_NAME.openai.azure.com/",
    "api_version": "MY_API_VERSION",
    "deployment_name": "CUSTOM_NAME"
  }
}
```

Here, `llm` names the model backend, and `llm_options` holds the parameters that backend needs (for Azure: the API token, resource endpoint, API version, and deployment name).
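Before handing such a file to PandasAI, it can help to verify that the Azure-specific keys are all present. The key set below simply mirrors the example above and is not an official schema; the check itself uses only the standard library:

```python
import json

# Keys the AzureOpenAI example above expects under "llm_options"
# (illustrative list, not an official PandasAI schema)
REQUIRED_AZURE_KEYS = {"api_token", "azure_endpoint", "api_version", "deployment_name"}

config = json.loads("""
{
  "llm": "AzureOpenAI",
  "llm_options": {
    "api_token": "MY_API_TOKEN",
    "azure_endpoint": "https://YOUR_RESOURCE_NAME.openai.azure.com/",
    "api_version": "MY_API_VERSION",
    "deployment_name": "CUSTOM_NAME"
  }
}
""")

# Report any required option that is missing from llm_options
missing = REQUIRED_AZURE_KEYS - set(config.get("llm_options", {}))
if missing:
    raise ValueError(f"pandasai.json is missing llm_options keys: {missing}")

print("config OK:", config["llm"])  # → config OK: AzureOpenAI
```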
@Muhammad-Adam1 You can find further examples of LLM usage here: https://docs.pandas-ai.com/llms. If you run into issues, be more specific, and consider using GitHub Discussions or our Discord channel for Q&A and learning.