To use Amazon Bedrock in a Python application, you can leverage the AWS SDK for Python (Boto3). Bedrock is fully integrated with Boto3, allowing direct interaction with its APIs for invoking foundation models (FMs) like Anthropic Claude, AI21 Jurassic, or Amazon Titan. Here’s how to get started:
- Boto3 Setup and Authentication
First, install Boto3 (`pip install boto3`) and configure AWS credentials using IAM roles, access keys, or AWS CLI profiles. Ensure your IAM policy grants permissions for Bedrock actions such as `bedrock:InvokeModel`. Then create a Bedrock Runtime client to invoke models:
```python
import boto3

# Create a Bedrock Runtime client in a region where Bedrock is available
client = boto3.client(service_name='bedrock-runtime', region_name='us-east-1')
```
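As a quick sanity check that credentials and permissions are in place, you can list the foundation models visible to your account with the control-plane `bedrock` client (a minimal sketch; it assumes your IAM policy also grants `bedrock:ListFoundationModels`):

```python
import boto3

# Control-plane client ('bedrock') is separate from the runtime client ('bedrock-runtime')
bedrock = boto3.client(service_name='bedrock', region_name='us-east-1')

# Print the IDs of foundation models available in this region
models = bedrock.list_foundation_models()
for summary in models['modelSummaries']:
    print(summary['modelId'])
```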
- Invoking Models
Bedrock requires specifying the model ID (e.g., `anthropic.claude-v2`) and a structured input payload. For example, to use Claude:
```python
import json

# Claude v2 text completions expect the prompt to use "\n\nHuman: ... \n\nAssistant:" turn markers
response = client.invoke_model(
    modelId='anthropic.claude-v2',
    body=json.dumps({
        "prompt": "\n\nHuman: Explain quantum computing in one sentence.\n\nAssistant:",
        "max_tokens_to_sample": 300
    })
)

# The response body is a streaming object; read it and parse the JSON payload
result = json.loads(response['body'].read())
print(result['completion'])
```
Each model family (Anthropic, AI21, etc.) has its own input format. For Titan, you'd structure the payload with `inputText` and `textGenerationConfig`, as sketched below.
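For illustration, here is a hedged sketch of a Titan text request; the model ID `amazon.titan-text-express-v1` is one Titan variant and is an assumption here, so check the console for the exact ID you have access to:

```python
import json

# Amazon Titan text models take inputText plus a textGenerationConfig block
response = client.invoke_model(
    modelId='amazon.titan-text-express-v1',  # assumed Titan variant; adjust to your enabled model
    body=json.dumps({
        "inputText": "Explain quantum computing in one sentence.",
        "textGenerationConfig": {
            "maxTokenCount": 300,
            "temperature": 0.5
        }
    })
)

# Titan returns a list of results; print the generated text of the first one
result = json.loads(response['body'].read())
print(result['results'][0]['outputText'])
```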
- Additional Considerations
- Model Access: Enable access to specific models in the AWS Bedrock console before invoking them.
- Regions: Check AWS documentation for model availability by region (e.g., Claude is not available in all regions).
- Alternatives: If you use a framework like LangChain, its `ChatBedrock` class (from the `langchain-aws` package) simplifies integration; see the sketch after this list.
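A minimal sketch of the LangChain route, assuming the `langchain-aws` package is installed and the chosen model ID is enabled in your account:

```python
from langchain_aws import ChatBedrock

# ChatBedrock wraps the Bedrock Runtime API behind LangChain's chat-model interface
llm = ChatBedrock(
    model_id="anthropic.claude-v2",
    region_name="us-east-1",
    model_kwargs={"max_tokens_to_sample": 300},
)

# invoke() accepts a plain string and returns a message object with the model's reply
message = llm.invoke("Explain quantum computing in one sentence.")
print(message.content)
```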
Boto3 is the primary method: its low-level Bedrock and Bedrock Runtime clients give you granular control, while packages like `langchain-aws` provide higher-level abstractions. Always validate responses and handle API rate limits and errors (e.g., `ThrottlingException`, or `AccessDeniedException` when a model hasn't been enabled) in your code, as in the sketch below.
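A hedged error-handling sketch using botocore's `ClientError`; the retry count and backoff values are arbitrary choices for illustration:

```python
import json
import time
import boto3
from botocore.exceptions import ClientError

client = boto3.client(service_name='bedrock-runtime', region_name='us-east-1')

def invoke_with_retry(model_id, body, max_attempts=3):
    """Invoke a Bedrock model, retrying on throttling with simple exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            response = client.invoke_model(modelId=model_id, body=json.dumps(body))
            return json.loads(response['body'].read())
        except ClientError as err:
            code = err.response['Error']['Code']
            if code == 'ThrottlingException' and attempt < max_attempts:
                time.sleep(2 ** attempt)  # back off and retry on rate limits
            elif code == 'AccessDeniedException':
                raise RuntimeError(f"Model access not enabled for {model_id}") from err
            else:
                raise
```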