---
title: "AWS Bedrock"
description: "Instructions for integrating AWS Bedrock models with PraisonAI, including API setup and agent configuration"
icon: "aws"
---

# Add AWS Bedrock to PraisonAI

AWS Bedrock provides access to high-performing foundation models from leading AI companies such as Anthropic, Cohere, Meta, Stability AI, and Amazon through a single API.

## Setup

### Prerequisites

Install the `boto3` package, which is required for Bedrock access:

```bash
pip install boto3
```

### Environment Variables

Set your AWS credentials and region:

```bash
export AWS_ACCESS_KEY_ID=your_access_key_id
export AWS_SECRET_ACCESS_KEY=your_secret_access_key
export AWS_REGION=us-east-1
```
| 30 | + |
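Before launching an agent, it can help to confirm these variables are actually set, so a misconfiguration fails fast instead of surfacing inside the first model call. A minimal sketch (the helper name is illustrative, not part of PraisonAI):

```python
import os

# Variables Bedrock access depends on, per the setup above.
REQUIRED_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_REGION")

def missing_aws_vars(env=None):
    """Return the names of any required AWS variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example: warn early instead of debugging an opaque auth error later.
missing = missing_aws_vars()
if missing:
    print("Missing AWS settings:", ", ".join(missing))
```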
## Using AWS Bedrock Models

### Available Models

AWS Bedrock supports various model providers:

- **Anthropic Claude**: `bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0`
- **Anthropic Claude Instant**: `bedrock/anthropic.claude-instant-v1`
- **Amazon Titan**: `bedrock/amazon.titan-text-express-v1`
- **Cohere Command**: `bedrock/cohere.command-text-v14`
- **Meta Llama**: `bedrock/meta.llama2-70b-chat-v1`

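As the list above shows, each model string follows the pattern `bedrock/<provider>.<model-id>`. A small helper to pull out the two parts (illustrative only, not part of the PraisonAI API):

```python
def split_bedrock_model(model_string):
    """Split a 'bedrock/<provider>.<model-id>' string into (provider, model_id)."""
    prefix, _, rest = model_string.partition("/")
    if prefix != "bedrock" or not rest:
        raise ValueError(f"not a Bedrock model string: {model_string!r}")
    # The provider name precedes the first dot; the rest is the model id.
    provider, _, model_id = rest.partition(".")
    return provider, model_id

provider, model_id = split_bedrock_model(
    "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0"
)
```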
### agents.yaml Configuration

```yaml
framework: crewai
topic: create movie script about cat in mars
roles:
  researcher:
    backstory: Skilled in finding and organizing information, with a focus on research efficiency.
    goal: Gather information about Mars and cats
    role: Researcher
    llm:
      model: "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0"
      temperature: 0.7
    tasks:
      gather_research:
        description: Research and gather information about Mars, its environment, and cats, including their behavior and characteristics.
        expected_output: Document with research findings, including interesting facts and information.
    tools:
    - ''
```

### Python Code Example

```python
from praisonaiagents import Agent

# Using Anthropic Claude via Bedrock
agent = Agent(
    instructions="You are a helpful assistant",
    llm={
        "model": "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
        "temperature": 0.7
    }
)

# Using Amazon Titan via Bedrock
titan_agent = Agent(
    instructions="You are a helpful assistant",
    llm={
        "model": "bedrock/amazon.titan-text-express-v1",
        "temperature": 0.7
    }
)

response = agent.start("What is artificial intelligence?")
print(response)
```

## IAM Permissions

Ensure your AWS IAM user/role has the necessary permissions to access Bedrock:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```
| 110 | + |
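The policy above grants access to every model (`"Resource": "*"`). To scope it down to specific models, Bedrock foundation-model ARNs generally take the form `arn:aws:bedrock:<region>::foundation-model/<model-id>`; verify this against the AWS documentation for your account before relying on it. A hedged helper to build such ARNs from the model strings used elsewhere in this page:

```python
def foundation_model_arn(region, model_string):
    """Build a Bedrock foundation-model ARN for a policy's Resource list.

    Assumes the common layout arn:aws:bedrock:<region>::foundation-model/<id>;
    check the AWS docs before using it in a real policy. The 'bedrock/'
    prefix used by PraisonAI model strings is stripped first.
    """
    model_id = model_string.removeprefix("bedrock/")
    return f"arn:aws:bedrock:{region}::foundation-model/{model_id}"

arn = foundation_model_arn("us-east-1", "bedrock/amazon.titan-text-express-v1")
```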
## Regional Availability

AWS Bedrock is available in regions including:
- `us-east-1` (N. Virginia)
- `us-west-2` (Oregon)
- `ap-southeast-1` (Singapore)
- `ap-northeast-1` (Tokyo)
- `eu-central-1` (Frankfurt)
- `eu-west-3` (Paris)

Make sure to set your `AWS_REGION` environment variable to a supported region.

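A quick check of `AWS_REGION` against the list above can catch a typo before any agent is constructed. Bedrock availability changes over time, so treat the set below as a snapshot, not an authoritative list:

```python
import os

# Regions from the list above; refresh from AWS docs as availability expands.
BEDROCK_REGIONS = {
    "us-east-1", "us-west-2", "ap-southeast-1",
    "ap-northeast-1", "eu-central-1", "eu-west-3",
}

def region_supported(region):
    """Return True if the region is in the known Bedrock region set."""
    return region in BEDROCK_REGIONS

# Example: fail fast on an unsupported or misspelled region.
region = os.environ.get("AWS_REGION", "")
if not region_supported(region):
    print(f"AWS_REGION={region!r} is not in the known Bedrock region list")
```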
## Cost Optimization

AWS Bedrock charges are based on:
- **Input tokens**: Text sent to the model
- **Output tokens**: Text generated by the model

Consider using smaller models for development and testing to optimize costs.

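Since billing is split between input and output tokens, a per-request estimate is just two multiplications. The rates in the example below are placeholders, not real prices; look up the current per-1K-token rates for your model on the AWS Bedrock pricing page:

```python
def estimate_cost(input_tokens, output_tokens, price_in_per_1k, price_out_per_1k):
    """Estimate a request's cost in dollars from token counts.

    Prices vary by model and region; pass the current per-1K-token rates
    from the AWS Bedrock pricing page.
    """
    return (input_tokens / 1000) * price_in_per_1k \
         + (output_tokens / 1000) * price_out_per_1k

# Placeholder rates for illustration only.
cost = estimate_cost(2000, 500, price_in_per_1k=0.003, price_out_per_1k=0.015)
```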
## Error Handling

Common errors and solutions:

- **AccessDeniedException**: Check your IAM permissions
- **ResourceNotFoundException**: Verify the model ID is correct and available in your region
- **ThrottlingException**: Implement retry logic with exponential backoff
- **ValidationException**: Check your input parameters and format

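For `ThrottlingException` in particular, the retry-with-exponential-backoff advice above can be sketched as a small wrapper. This is illustrative; production code might instead rely on botocore's built-in retry configuration:

```python
import random
import time

def with_backoff(call, retries=5, base_delay=1.0):
    """Retry `call` on throttling errors, doubling the delay each attempt.

    `call` is any zero-argument function, e.g. a lambda wrapping agent.start(...).
    This sketch treats any exception whose type name contains 'Throttling'
    as retryable; everything else is re-raised immediately.
    """
    for attempt in range(retries):
        try:
            return call()
        except Exception as exc:
            if "Throttling" not in type(exc).__name__ or attempt == retries - 1:
                raise
            # Exponential backoff with jitter: 1s, 2s, 4s, ... plus noise.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.5))
```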
## Advanced Configuration

### Custom Region

To pin an agent to a specific region, override it in the `llm` configuration:

```python
agent = Agent(
    instructions="You are a helpful assistant",
    llm={
        "model": "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
        "aws_region": "us-west-2",
        "temperature": 0.7
    }
)
```

### Streaming Responses

For real-time responses:

```python
agent = Agent(
    instructions="You are a helpful assistant",
    llm={
        "model": "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
        "stream": True
    }
)
```

| PraisonAI Chat | PraisonAI Code | PraisonAI (Multi-Agents) |
| --- | --- | --- |
| [LiteLLM](https://litellm.vercel.app/docs/providers) | [LiteLLM](https://litellm.vercel.app/docs/providers) | [Models](../models.md) |