From 6a1e0756a39aefaebe9a555ca40219d18d3b76bc Mon Sep 17 00:00:00 2001
From: Izabela <82568642+izabelapawlik@users.noreply.github.com>
Date: Mon, 22 Jul 2024 14:46:02 +0200
Subject: [PATCH] =?UTF-8?q?Update=20Blog=20=E2=80=9Caws-cdk-bedrock-basics?=
 =?UTF-8?q?=E2=80=9D?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 content/blog/aws-cdk-bedrock-basics.md | 15 ++++++---------
 1 file changed, 6 insertions(+), 9 deletions(-)

diff --git a/content/blog/aws-cdk-bedrock-basics.md b/content/blog/aws-cdk-bedrock-basics.md
index fff764aa98..4035f37881 100644
--- a/content/blog/aws-cdk-bedrock-basics.md
+++ b/content/blog/aws-cdk-bedrock-basics.md
@@ -15,8 +15,7 @@ comments: true
 published: true
 language: en
 ---
-
-AI is taking over the world. At Bright Inventions, we've already helped several clients with generative AI.
+AI is taking over the world. At Bright Inventions, we've already helped several clients with [generative AI](/our-areas/ai-software-development/).\
 In this blog post, we'll see how to use aws-cdk to create a simple API that responds to prompts.
 
 ## Request Bedrock model access
@@ -108,7 +107,6 @@ curl -s -X POST --location "https://${YOUR_LAMBDA_ID}.lambda-url.eu-central-1.on
     }
   ]
 }
-
 ```
 
 ## Titan Text Express configuration
@@ -117,10 +115,10 @@ We can control and tweak some the aspects of how the model responds to our promp
 
 we can configure:
 
-- temperature: Float value to control randomness in the response (0 to 1, default 0). Lower values decrease randomness.
-- topP: Float value to control the diversity of options (0 to 1, default 1). Lower values ignore less probable options.
-- maxTokenCount: Integer specifying the maximum number of tokens in the generated response (0 to 8,000, default 512).
-- stopSequences: Array of strings indicating where the model should stop generating text. Use the pipe character (|) to
+* temperature: Float value to control randomness in the response (0 to 1, default 0). Lower values decrease randomness.
+* topP: Float value to control the diversity of options (0 to 1, default 1). Lower values ignore less probable options.
+* maxTokenCount: Integer specifying the maximum number of tokens in the generated response (0 to 8,000, default 512).
+* stopSequences: Array of strings indicating where the model should stop generating text. Use the pipe character (|) to
   separate different sequences (up to 20 characters).
 
 Let's modify our lambda to allow controlling the parameters.
@@ -181,5 +179,4 @@ curl -X POST --location "https://${YOUR_LAMBDA_ID}.lambda-url.eu-central-1.on.aw
 ## Summary
 
 As you see, it is straightforward to get started with AWS Bedrock. The full example of this blog post is available in
-[GitHub repo](https://github.com/bright/bright-aws-cdk-bedrock).
-
+[GitHub repo](https://github.com/bright/bright-aws-cdk-bedrock).
\ No newline at end of file
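Note for reviewers: the Titan Text Express parameters restyled in the list above can be sketched as a small request-body builder. `build_titan_body` is a hypothetical helper, not part of the patch or the blog post; the parameter names, ranges, and defaults are taken from the bullet list in the diff.

```python
import json


def build_titan_body(prompt, temperature=0.0, top_p=1.0,
                     max_token_count=512, stop_sequences=None):
    """Build an InvokeModel body for amazon.titan-text-express-v1.

    Defaults and valid ranges mirror the bullet list in the patch:
    temperature 0-1 (default 0), topP 0-1 (default 1),
    maxTokenCount 0-8,000 (default 512).
    """
    if not 0 <= temperature <= 1:
        raise ValueError("temperature must be between 0 and 1")
    if not 0 <= top_p <= 1:
        raise ValueError("topP must be between 0 and 1")
    if not 0 <= max_token_count <= 8000:
        raise ValueError("maxTokenCount must be between 0 and 8,000")
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "temperature": temperature,
            "topP": top_p,
            "maxTokenCount": max_token_count,
            "stopSequences": stop_sequences or [],
        },
    })
```

The resulting JSON string is what the lambda would pass as the `body` of a Bedrock `InvokeModel` call for this model.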