AWS Bedrock Client for LLMs #102

Open
ThePyProgrammer opened this issue Aug 6, 2024 · 0 comments · May be fixed by #112
Labels
llm-support Support for LLMs

@ThePyProgrammer (Member)

Something like a wrapper for this:

import boto3

session = boto3.Session(
    aws_access_key_id='<insert id>',
    aws_secret_access_key='<insert key>',
    region_name='<insert region>'  # e.g. 'us-east-1'
)

client = session.client('bedrock-runtime', '<insert region>')  # e.g. 'us-east-1'

def sut(prompt):
    # Wrap the prompt in the Converse API's message format.
    conversation = [
        {
            "role": "user",
            "content": [{"text": prompt}],
        }
    ]

    response = client.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # or e.g. "meta.llama2-13b-chat-v1"
        messages=conversation,
        inferenceConfig={"maxTokens": 10, "temperature": 0.5, "topP": 0.9},
        additionalModelRequestFields={}
    )
    # Extract and return the response text.
    response_text = response["output"]["message"]["content"][0]["text"]
    return response_text

but with the chat and complete methods.
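For reference, a minimal sketch of what that wrapper could look like, assuming the target interface is a class exposing complete(prompt) and chat(messages). The class name BedrockClient, its constructor parameters, the message format accepted by chat, and the default model ID are placeholders for illustration, not a settled API:

import boto3


class BedrockClient:
    # Hypothetical wrapper; names and defaults are assumptions, not the repo's API.
    def __init__(self, region_name,
                 model_id="anthropic.claude-3-sonnet-20240229-v1:0",
                 max_tokens=512, temperature=0.5, top_p=0.9,
                 **session_kwargs):
        # session_kwargs can carry aws_access_key_id / aws_secret_access_key;
        # otherwise boto3 falls back to its default credential chain.
        session = boto3.Session(region_name=region_name, **session_kwargs)
        self._client = session.client("bedrock-runtime")
        self._model_id = model_id
        self._inference_config = {
            "maxTokens": max_tokens,
            "temperature": temperature,
            "topP": top_p,
        }

    def chat(self, messages):
        # messages: list of {"role": "user" | "assistant", "content": "..."} dicts,
        # converted here into the Converse API's content-block format.
        conversation = [
            {"role": m["role"], "content": [{"text": m["content"]}]}
            for m in messages
        ]
        response = self._client.converse(
            modelId=self._model_id,
            messages=conversation,
            inferenceConfig=self._inference_config,
        )
        return response["output"]["message"]["content"][0]["text"]

    def complete(self, prompt):
        # A single-turn completion is just a one-message chat, so all the
        # Converse-API plumbing stays in one place.
        return self.chat([{"role": "user", "content": prompt}])


# Example usage:
# wrapper = BedrockClient(region_name="us-east-1")
# print(wrapper.complete("Hello!"))
# print(wrapper.chat([{"role": "user", "content": "Hello!"}]))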

@ThePyProgrammer ThePyProgrammer added the llm-support Support for LLMs label Aug 6, 2024
@ThePyProgrammer ThePyProgrammer linked a pull request Aug 8, 2024 that will close this issue