The core foundation of the Mastra framework, providing essential components and interfaces for building AI-powered applications.
npm install @mastra/core
@mastra/core is the foundational package of the Mastra framework, providing the building blocks described in the sections below: agents, workflows, memory, tools, evals, logging, and telemetry.
For comprehensive guides and API references, visit the official Mastra documentation.
Agents (/agent)
Agents are autonomous AI entities that can understand instructions, use tools, and complete tasks. They encapsulate LLM interactions and can maintain conversation history, use provided tools, and follow specific behavioral guidelines through instructions.
import { Agent } from '@mastra/core/agent';
import { openai } from '@ai-sdk/openai';

const agent = new Agent({
  name: 'my-agent',
  instructions: 'Your task-specific instructions',
  model: openai('gpt-4o-mini'),
  tools: {}, // Optional tools
});
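Once constructed, the agent can be invoked directly. As a minimal usage sketch continuing the snippet above (the prompt text is illustrative):

// Generate a one-off response from the agent defined above.
const response = await agent.generate('Summarize the latest project status.');
console.log(response.text);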
Workflows (/workflows)
Workflows orchestrate complex AI tasks by combining multiple actions into a coherent sequence. They handle state management and error recovery, and can include conditional logic and parallel execution.
import { Workflow } from '@mastra/core';

const workflow = new Workflow({
  name: 'my-workflow',
  steps: [
    // Workflow steps
  ],
});
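As a hedged sketch of what the commented-out steps might look like, the example below assumes the step-based workflow API exported from @mastra/core/workflows (Step, step(), commit(), createRun()); the step name, trigger schema, and payloads are illustrative, and the exact surface may differ between Mastra versions.

import { Step, Workflow } from '@mastra/core/workflows';
import { z } from 'zod';

// Illustrative step: fetch some data and return it as the step output.
const fetchData = new Step({
  id: 'fetchData',
  execute: async () => {
    return { items: ['a', 'b', 'c'] };
  },
});

const dataWorkflow = new Workflow({
  name: 'data-workflow',
  triggerSchema: z.object({ query: z.string() }),
});

// Register the step and finalize the workflow definition.
dataWorkflow.step(fetchData).commit();

// Start a run with trigger data matching the trigger schema.
const { start } = dataWorkflow.createRun();
const result = await start({ triggerData: { query: 'reports' } });
console.log(result);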
Memory (/memory)
Memory management provides persistent storage and retrieval of AI interactions. It supports different storage backends and enables context-aware conversations and long-term learning.
import { Memory } from '@mastra/memory';
import { Agent } from '@mastra/core/agent';
import { openai } from '@ai-sdk/openai';

const agent = new Agent({
  name: 'Project Manager',
  instructions: 'You are a project manager assistant.',
  model: openai('gpt-4o-mini'),
  memory: new Memory({
    options: {
      lastMessages: 20,
      semanticRecall: {
        topK: 3,
        messageRange: { before: 2, after: 1 },
      },
    },
  }),
});
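Continuing the snippet above, a hedged usage sketch: memory is scoped by a resource and thread identifier, so passing the same IDs on each call lets the agent recall earlier turns from the same conversation. The IDs and prompts are illustrative.

// Messages sent with the same resourceId/threadId share conversation history.
await agent.generate('We decided to launch on Friday.', {
  resourceId: 'user-123',
  threadId: 'project-planning',
});

const reply = await agent.generate('When are we launching?', {
  resourceId: 'user-123',
  threadId: 'project-planning',
});
console.log(reply.text); // Should draw on the stored decision above.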
Tools (/tools)
Tools are functions that agents can use to interact with external systems or perform specific tasks. Each tool has a clear description and schema, making it easy for AI to understand and use them effectively.
import { createTool } from '@mastra/core/tools';
import { z } from 'zod';

const weatherInfo = createTool({
  id: 'Get Weather Information',
  inputSchema: z.object({
    city: z.string(),
  }),
  description: 'Fetches the current weather information for a given city',
  execute: async ({ context: { city } }) => {
    // Tool implementation
  },
});
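To let a model call the tool, register it on an agent under the tools map; the key is the name the model uses to invoke it. The agent below is an illustrative sketch.

import { Agent } from '@mastra/core/agent';
import { openai } from '@ai-sdk/openai';

// The agent can now call weatherInfo when a prompt asks about current conditions.
const weatherAgent = new Agent({
  name: 'Weather Agent',
  instructions: 'Answer weather questions, using the weatherInfo tool for current conditions.',
  model: openai('gpt-4o-mini'),
  tools: { weatherInfo },
});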
Evals (/eval)
The evaluation system enables quantitative assessment of AI outputs. Create custom metrics to measure specific aspects of AI performance, from response quality to task completion accuracy.
import { Agent } from '@mastra/core/agent';
import { openai } from '@ai-sdk/openai';
import { SummarizationMetric } from '@mastra/evals/llm';
import { ContentSimilarityMetric, ToneConsistencyMetric } from '@mastra/evals/nlp';

const model = openai('gpt-4o');

const agent = new Agent({
  name: 'ContentWriter',
  instructions: 'You are a content writer that creates accurate summaries',
  model,
  evals: {
    summarization: new SummarizationMetric(model),
    contentSimilarity: new ContentSimilarityMetric(),
    tone: new ToneConsistencyMetric(),
  },
});
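Metrics can also be run directly, outside of an agent, which is useful for tests and offline scoring. A hedged sketch, assuming each metric exposes an async measure(input, output) method that resolves to an object with a numeric score:

// Score how consistent the output's tone is with the input's tone.
const tone = new ToneConsistencyMetric();
const result = await tone.measure(
  'Our quarterly results were strong across every region.',
  'The quarter went very well overall.',
);
console.log(result.score); // Higher generally means more consistent tone.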
Logger (/logger)
The logging system provides structured, leveled logging with multiple transport options. It supports debug information, performance monitoring, and error tracking across your AI applications.
import { createLogger, LogLevel } from '@mastra/core';

const logger = createLogger({
  name: 'MyApp',
  level: LogLevel.INFO,
});
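Continuing the snippet above: entries below the configured level are suppressed, and the logger can be shared with the Mastra instance (a hedged assumption about the logger constructor option) so framework internals write through the same transport.

import { Mastra } from '@mastra/core';

logger.debug('Resolving tools');      // suppressed at LogLevel.INFO
logger.info('Agent run started');     // emitted
logger.error('Model request failed'); // emitted

// Route framework-internal logs through the same logger.
const mastra = new Mastra({ logger });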
Telemetry (/telemetry)
Telemetry provides OpenTelemetry (OTel) integration for comprehensive monitoring of your AI systems. Track latency, success rates, and system health with distributed tracing and metrics collection.
import { Mastra } from '@mastra/core';

const mastra = new Mastra({
  telemetry: {
    serviceName: 'my-service',
    enabled: true,
    sampling: {
      type: 'ratio',
      probability: 0.5,
    },
    export: {
      type: 'otlp',
      endpoint: 'https://otel-collector.example.com/v1/traces',
    },
  },
});
More Telemetry documentation →