🤖 Build powerful AI agents with TypeScript. Agenite makes it easy to create, compose, and control AI agents with first-class support for tools, streaming, and multi-agent architectures. Switch seamlessly between providers like OpenAI, Anthropic, AWS Bedrock, and Ollama.
npm install @agenite/bedrock
| Period | Downloads |
| ---------- | --------- |
| Last day   | 30        |
| Last week  | 42        |
| Last month | 371       |
| Last year  | 939       |
| Total      | 939       |
MIT License · 30 stars · 2 forks · 1 watcher · 70 commits · 2 branches · 2 contributors · updated on Mar 12, 2025
| Field          | Value                    |
| -------------- | ------------------------ |
| Latest version | 0.3.1                    |
| Package ID     | @agenite/bedrock@0.3.1   |
| Unpacked size  | 98.77 kB                 |
| Tarball size   | 15.00 kB                 |
| File count     | 10                       |
| npm version    | 11.1.0                   |
| Node version   | 22.13.0                  |
| Published on   | Feb 25, 2025             |
AWS Bedrock provider for Agenite, enabling seamless integration with Amazon's foundation models through the Bedrock runtime.
```bash
npm install @agenite/bedrock
```
```typescript
import { BedrockProvider } from '@agenite/bedrock';

// Initialize the provider
const provider = new BedrockProvider({
  model: 'anthropic.claude-3-5-haiku-20241022-v1:0',
  region: 'us-west-2',
});

// Generate a simple response
const result = await provider.generate(
  'What are the main features of Llama 2?'
);
console.log(result);

// Use streaming for real-time responses
const generator = provider.stream('Tell me about AWS Bedrock.');
for await (const chunk of generator) {
  if (chunk.type === 'text') {
    process.stdout.write(chunk.text);
  }
}
```
```typescript
interface BedrockProviderOptions {
  model: string; // Bedrock model ID
  region: string; // AWS region
  temperature?: number; // Sampling temperature for response generation
  maxTokens?: number; // Maximum tokens in the response
  topP?: number; // Top-p (nucleus) sampling parameter
}
```
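All three sampling fields are optional. For illustration, a fully-specified options object might look like the following (the numeric values are arbitrary examples, not documented defaults):

```typescript
// Illustrative, fully-specified options object. The numeric values
// are examples only, not the library's defaults.
const options = {
  model: 'anthropic.claude-3-5-haiku-20241022-v1:0',
  region: 'us-west-2',
  temperature: 0.7, // lower values make output more deterministic
  maxTokens: 1024, // cap on generated tokens
  topP: 0.9, // nucleus-sampling cutoff
};
```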
Supported model IDs include:

- `anthropic.claude-3-5-haiku-20241022-v1:0`
- `anthropic.claude-3-sonnet-20240229-v1:0`
- `anthropic.claude-instant-v1`
- `amazon.titan-text-express-v1`
- `amazon.titan-text-lite-v1`
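Because the provider takes the model as a plain string, switching models is just a configuration change. One way to keep the long IDs manageable is a small alias map; this helper is purely hypothetical and not part of @agenite/bedrock:

```typescript
// Hypothetical convenience map from short aliases to the Bedrock
// model IDs listed above. Not part of the @agenite/bedrock API.
const MODEL_IDS: Record<string, string> = {
  'claude-3.5-haiku': 'anthropic.claude-3-5-haiku-20241022-v1:0',
  'claude-3-sonnet': 'anthropic.claude-3-sonnet-20240229-v1:0',
  'claude-instant': 'anthropic.claude-instant-v1',
  'titan-express': 'amazon.titan-text-express-v1',
  'titan-lite': 'amazon.titan-text-lite-v1',
};

function resolveModelId(alias: string): string {
  const id = MODEL_IDS[alias];
  if (!id) {
    throw new Error(`Unknown model alias: ${alias}`);
  }
  return id;
}
```

The resolved ID can then be passed straight into `new BedrockProvider({ model: resolveModelId('titan-lite'), region: '...' })`.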
```typescript
const provider = new BedrockProvider({
  model: 'us.anthropic.claude-3-7-sonnet-20250219-v1:0',
  region: 'us-east-2',
  converseCommandConfig: {
    additionalModelRequestFields: {
      reasoning_config: {
        type: 'enabled',
        budget_tokens: 1024,
      },
    },
    inferenceConfig: {
      temperature: 1,
    },
  },
});
```
```typescript
import { BedrockProvider } from '@agenite/bedrock';
import type { ToolDefinition } from '@agenite/llm';

// Define a calculator tool
const calculatorTool: ToolDefinition = {
  name: 'calculator',
  description: 'Perform basic arithmetic operations',
  inputSchema: {
    type: 'object',
    properties: {
      operation: {
        type: 'string',
        enum: ['add', 'subtract', 'multiply', 'divide'],
      },
      a: { type: 'number' },
      b: { type: 'number' },
    },
    required: ['operation', 'a', 'b'],
  },
};

// Initialize provider with tool support
const provider = new BedrockProvider({
  model: 'anthropic.claude-3-5-haiku-20241022-v1:0',
  region: 'us-west-2',
});

// Use tool in conversation
const messages = [
  {
    role: 'user',
    content: [{ type: 'text', text: 'What is 123 multiplied by 456?' }],
  },
];

const generator = provider.iterate(messages, {
  tools: [calculatorTool],
  stream: true,
  systemPrompt:
    'You are a helpful AI assistant with access to a calculator tool.',
});

// Process streaming response with tool usage
for await (const chunk of generator) {
  if (chunk.type === 'text') {
    process.stdout.write(chunk.text);
  } else if (chunk.type === 'toolUse') {
    console.log('Tool Use:', chunk);
  }
}
```
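The example above only surfaces `toolUse` chunks; actually running the tool and feeding the result back is left to the caller. A minimal local executor matching the calculator tool's input schema might look like this (the `CalculatorInput` type is our own, mirroring the JSON schema above, not a type exported by the library):

```typescript
// Hypothetical local executor for the calculator tool defined above.
// The input type mirrors the tool's JSON schema.
interface CalculatorInput {
  operation: 'add' | 'subtract' | 'multiply' | 'divide';
  a: number;
  b: number;
}

function runCalculator({ operation, a, b }: CalculatorInput): number {
  switch (operation) {
    case 'add':
      return a + b;
    case 'subtract':
      return a - b;
    case 'multiply':
      return a * b;
    case 'divide':
      if (b === 0) throw new Error('Division by zero');
      return a / b;
  }
}
```

When a `toolUse` chunk arrives, you would call `runCalculator(chunk.input)` and send the result back as a tool-result message in the next turn.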
```typescript
// Maintain conversation history: start with the user's message,
// then append the assistant's reply so the next turn has full context
const messages = [
  { role: 'user', content: [{ type: 'text', text: 'Hello!' }] },
];
const result = await provider.generate(messages, {
  systemPrompt: 'You are a helpful AI assistant.',
});
messages.push(result);
```
```typescript
class BedrockProvider implements LLMProvider {
  constructor(options: BedrockProviderOptions);

  generate(
    messages: string | BaseMessage[],
    options?: GenerateOptions
  ): Promise<BaseMessage>;

  stream(
    messages: string | BaseMessage[],
    options?: StreamOptions
  ): AsyncGenerator<StreamChunk>;

  iterate(
    messages: string | BaseMessage[],
    options?: StreamOptions
  ): AsyncGenerator<StreamChunk>;
}
```
```typescript
interface BaseMessage {
  role: 'user' | 'assistant' | 'system';
  content: ContentBlock[];
}

type ContentBlock = TextBlock | ToolUseBlock | ToolResultBlock;
```
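Since `ContentBlock` is a discriminated union on `type`, a type guard keeps downstream code type-safe. A sketch follows; only `TextBlock`'s shape is confirmed by the examples above, and the tool block fields here are assumptions for illustration:

```typescript
// Illustrative block shapes and a type guard. TextBlock matches the
// usage in the examples; the tool block fields are assumptions, not
// the library's exact definitions.
interface TextBlock {
  type: 'text';
  text: string;
}

interface ToolUseBlock {
  type: 'toolUse';
  name: string;
  input: unknown;
}

interface ToolResultBlock {
  type: 'toolResult';
  toolName: string;
  result: unknown;
}

type ContentBlock = TextBlock | ToolUseBlock | ToolResultBlock;

function isText(block: ContentBlock): block is TextBlock {
  return block.type === 'text';
}

const blocks: ContentBlock[] = [
  { type: 'text', text: 'Hello' },
  { type: 'toolUse', name: 'calculator', input: { a: 1, b: 2 } },
];

// Collect only the text content, fully typed
const textOnly = blocks.filter(isText).map((b) => b.text);
```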
Check out the examples directory for more:

- `basic-chat.ts` - simple chat interaction
- `tool-usage.ts` - advanced tool integration example

Configure your AWS credentials:

```bash
aws configure
```
Ensure your AWS account has access to Bedrock and the required models.
Set up IAM permissions for Bedrock access:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```
Contributions are welcome! Please feel free to submit a Pull Request.
MIT
No vulnerabilities found.