feat: 🚀 Integrate Amazon Bedrock Support2 #733
base: main
Conversation
…y extendable to include more!)
Add support for Amazon Bedrock models, including:
- Implement AWS credentials retrieval for Bedrock
- Add Bedrock model initialization and handling
- Include Claude 3 models (Opus, Sonnet, Haiku) for Bedrock
- Adjust token limits for Bedrock models
- Update chat action to support model selection
- Add @ai-sdk/amazon-bedrock dependency

Key changes:
- app/lib/.server/llm/api-key.ts: Add getAWSCredentials function
- app/lib/.server/llm/constants.ts: Define MAX_TOKENS_BEDROCK
- app/lib/.server/llm/model.ts: Implement getBedrockModel function
- app/lib/.server/llm/stream-text.ts: Use Bedrock-specific token limit
- app/routes/api.chat.ts: Update to support model selection
- app/utils/constants.ts: Add Bedrock model options
- package.json: Add @ai-sdk/amazon-bedrock dependency
- pnpm-lock.yaml: Update with new dependencies
- Translate comments to English for consistency
- Add explanatory comment for AWS credentials function
- Refactor default region assignment with inline comment
I changed the branch name, but the content is the same as #689!
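The credentials retrieval and default-region refactor described in the commits above can be sketched roughly as follows. This is a minimal sketch, not the PR's exact code: the `AWSEnv` shape and the `us-east-1` default region are assumptions for illustration; only the `getAWSCredentials` name and the env variable names come from this PR.

```typescript
// Sketch of an AWS-credentials helper for Bedrock, modeled on the
// getAWSCredentials function described in this PR. The AWSEnv shape
// stands in for the worker's env binding and is an assumption.
interface AWSEnv {
  AWS_ACCESS_KEY_ID?: string;
  AWS_SECRET_ACCESS_KEY?: string;
  AWS_REGION?: string;
}

export function getAWSCredentials(env: AWSEnv) {
  const accessKeyId = env.AWS_ACCESS_KEY_ID;
  const secretAccessKey = env.AWS_SECRET_ACCESS_KEY;

  if (!accessKeyId || !secretAccessKey) {
    throw new Error('Missing AWS credentials for Bedrock');
  }

  return {
    accessKeyId,
    secretAccessKey,
    // Fall back to a default region when none is configured
    // (the specific default shown here is an assumption).
    region: env.AWS_REGION || 'us-east-1',
  };
}
```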
- Deleted references to fork by Cole Medin
- Removed information about choosing LLM models
- Maintained focus on original Bolt.new project description
README.md (Outdated)
- **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask Bolt to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go, saving you time and significantly reducing API credit consumption.

```bash
git clone https://github.com/coleam00/bolt.new-any-llm.git
```
The repository name is pointing to coleam00's GitHub.
I responded with e141171
```diff
-  return anthropic('claude-3-5-sonnet-20240620');
+  return anthropic(model);
 }
```
Using OpenRouter you can use Gemini, but if you want to use Gemini directly you need to add a Gemini provider:

```typescript
import { createGoogleGenerativeAI } from '@ai-sdk/google';

export function getGeminiModel(apiKey: string, model: string = 'gemini-1.5-pro-latest') {
  const gemini = createGoogleGenerativeAI({
    apiKey,
  });
  return gemini(model);
}
```
I responded with e5d16df
```typescript
case 'Anthropic':
  return getAnthropicModel(apiKey, model);
case 'OpenAI':
  return getOpenAIModel(apiKey, model);
```
Also add the Gemini provider here.
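The extended switch could look like the sketch below. The `get*Model` helpers here are stand-ins returning plain strings so the dispatch logic is self-contained; the `'Google'` case name is an assumption (the repo may use a different provider key).

```typescript
// Sketch of the provider switch with a Gemini case added, as suggested
// in this review. The factories below are simplified stand-ins for the
// project's real helpers.
type ModelFactory = (apiKey: string, model: string) => string;

const getAnthropicModel: ModelFactory = (_apiKey, model) => `anthropic:${model}`;
const getOpenAIModel: ModelFactory = (_apiKey, model) => `openai:${model}`;
const getGeminiModel: ModelFactory = (_apiKey, model) => `gemini:${model}`;

export function getModel(provider: string, apiKey: string, model: string): string {
  switch (provider) {
    case 'Anthropic':
      return getAnthropicModel(apiKey, model);
    case 'OpenAI':
      return getOpenAIModel(apiKey, model);
    case 'Google':
      // New case dispatching to the Gemini factory.
      return getGeminiModel(apiKey, model);
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}
```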
I responded with e5d16df
```typescript
switch (provider) {
  case 'Anthropic':
    return env.ANTHROPIC_API_KEY || cloudflareEnv.ANTHROPIC_API_KEY;
  case 'OpenAI':
```
Add the API key for the Gemini provider.
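A Gemini entry in the key lookup could look like the sketch below. `GEMINI_API_KEY` matches the variable name requested later in this thread; the trimmed-down `Env` shape and the `'Google'` case name are assumptions for illustration.

```typescript
// Sketch of getAPIKey extended with a Gemini case, per this review
// comment. Keys fall back from env to cloudflareEnv, mirroring the
// existing Anthropic/OpenAI cases in the snippet above.
interface Env {
  ANTHROPIC_API_KEY?: string;
  OPENAI_API_KEY?: string;
  GEMINI_API_KEY?: string;
}

export function getAPIKey(provider: string, env: Env, cloudflareEnv: Env = {}): string | undefined {
  switch (provider) {
    case 'Anthropic':
      return env.ANTHROPIC_API_KEY || cloudflareEnv.ANTHROPIC_API_KEY;
    case 'OpenAI':
      return env.OPENAI_API_KEY || cloudflareEnv.OPENAI_API_KEY;
    case 'Google':
      // New case: read the Gemini key from either environment.
      return env.GEMINI_API_KEY || cloudflareEnv.GEMINI_API_KEY;
    default:
      return undefined;
  }
}
```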
I responded with 7e0287f
- Include Gemini API key in the getAPIKey function
- Allow retrieval of Gemini API key from environment variables

- Import Google Generative AI SDK
- Add getGeminiModel function to create Gemini model instances
- Update getModel function to support Gemini provider

- Include Gemini 1.5 Pro and Flash models in the available model options
- Add latest and stable versions for both Gemini 1.5 Pro and Flash

- Include @ai-sdk/google package version 0.0.52 for Gemini integration

- Add @ai-sdk/google package and its dependencies to the lock file
- Ensure consistent package versions across the project

- Replace old URL (https://github.com/coleam00/bolt.new-any-llm.git) with new URL (https://github.com/stackblitz/bolt.new)
- Improve documentation accuracy for users setting up the project
```diff
@@ -1,3 +1,6 @@
 interface Env {
   ANTHROPIC_API_KEY: string;
   OPENAI_API_KEY: string;
   GROQ_API_KEY: string;
```
One last thing to approve:
Add GEMINI_API_KEY here and also in .env.example.
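With the keys requested in this thread and the AWS variables added later in the PR, the interface could end up looking like this sketch; the exact final shape in the repo may differ.

```typescript
// Sketch of the worker Env interface with GEMINI_API_KEY (this review
// comment) and the AWS variables (added later in this PR) included.
interface Env {
  ANTHROPIC_API_KEY: string;
  OPENAI_API_KEY: string;
  GROQ_API_KEY: string;
  GEMINI_API_KEY: string;
  AWS_ACCESS_KEY_ID: string;
  AWS_SECRET_ACCESS_KEY: string;
  AWS_REGION: string;
}

// Example binding object conforming to the interface (placeholder values).
const exampleEnv: Env = {
  ANTHROPIC_API_KEY: 'sk-ant-...',
  OPENAI_API_KEY: 'sk-...',
  GROQ_API_KEY: 'gsk_...',
  GEMINI_API_KEY: 'AIza...',
  AWS_ACCESS_KEY_ID: 'AKIA...',
  AWS_SECRET_ACCESS_KEY: 'secret',
  AWS_REGION: 'us-east-1',
};
```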
thank you!
I responded with 9e5e73a
- Added AWS credential environment variables:
  - AWS_ACCESS_KEY_ID
  - AWS_SECRET_ACCESS_KEY
  - AWS_REGION
- Enables AWS service integration capabilities
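These AWS variables, together with the GEMINI_API_KEY discussed earlier in the thread, would also belong in .env.example. A possible fragment, with placeholder values (the default region shown is an assumption, not from the PR):

```bash
# Gemini
GEMINI_API_KEY=

# Amazon Bedrock
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=us-east-1
```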
Overview
This PR adds support for Amazon Bedrock models to our LLM integration, enhancing our AI capabilities with Claude 3 models (Opus, Sonnet, Haiku).
Key Changes
- Implement AWS credentials retrieval for Bedrock
- Add Bedrock model initialization and handling
- Include Claude 3 models (Opus, Sonnet, Haiku) for Bedrock
- Adjust token limits for Bedrock models
- Update chat action to support model selection
- Add `@ai-sdk/amazon-bedrock` dependency

Detailed Changes
- `app/lib/.server/llm/api-key.ts`: Add `getAWSCredentials` function to fetch AWS access keys and region
- `app/lib/.server/llm/constants.ts`: Define `MAX_TOKENS_BEDROCK` constant (4096) for Bedrock models
- `app/lib/.server/llm/model.ts`: Implement `getBedrockModel` function for Bedrock model initialization; update `getModel` function to handle the Bedrock provider
- `app/lib/.server/llm/stream-text.ts`: Use Bedrock-specific token limit (`MAX_TOKENS_BEDROCK`)
- `app/routes/api.chat.ts`: Update to support model selection
- `app/utils/constants.ts`: Add Bedrock model options
- `package.json`: Add `@ai-sdk/amazon-bedrock` dependency (version 0.0.30)
- `pnpm-lock.yaml`: Update with new dependencies
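The token-limit selection in `stream-text.ts` can be sketched as below. `MAX_TOKENS_BEDROCK` (4096) is from this PR; the existing `MAX_TOKENS` value of 8192 and the `maxTokensFor` helper name are assumptions for illustration.

```typescript
// Sketch of choosing the per-provider token limit: Bedrock models get
// the tighter MAX_TOKENS_BEDROCK cap; other providers keep the default.
const MAX_TOKENS = 8192;          // assumed existing default, not from the PR
const MAX_TOKENS_BEDROCK = 4096;  // defined in app/lib/.server/llm/constants.ts

export function maxTokensFor(provider: string): number {
  return provider === 'Bedrock' ? MAX_TOKENS_BEDROCK : MAX_TOKENS;
}
```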