Detailed insights and metrics for @spiceai/spice-ai-provider
npm install @spiceai/spice-ai-provider
Version history:
- @spiceai/spice-ai-provider@0.0.5 (published 01 Feb 2025)
- @spiceai/spice-ai-provider@0.0.4 (published 31 Jan 2025)
- @spiceai/spice-ai-provider@0.0.3 (published 02 Nov 2024)
- @spiceai/spice-ai-provider@0.0.2 (published 23 Sept 2024)
- @spiceai/spice-ai-provider@0.0.1 (published 23 Sept 2024)
Languages: TypeScript (74.06%), JavaScript (25.94%)
Repository: 42 commits, 6 branches, 2 contributors
Latest version: 0.0.5
Package id: @spiceai/spice-ai-provider@0.0.5
Unpacked size: 12.23 kB
Size: 3.45 kB
File count: 7
NPM version: 10.8.2
Node version: 20.18.2
Published on: 01 Feb 2025
Cumulative downloads: 4,845
- Last day: 22 (-76.6% vs. previous day)
- Last week: 580 (+7.4% vs. previous week)
- Last month: 1,406 (+4.8% vs. previous month)
- Last year: 4,845 (0% vs. previous year)
Vercel AI Provider for Spice runtime
```shell
npm install @spiceai/spice-ai-provider
```
Check Spice.ai to learn how to install and get started with the local runtime.
You can import the default Spice provider instance `spice` from `@spiceai/spice-ai-provider`:

```typescript
import { spice } from "@spiceai/spice-ai-provider";
```
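Vercel AI SDK provider instances are typically callable: you invoke the provider with a model id to get a model handle to pass to the SDK. The sketch below models that pattern in a self-contained way; every name here (`makeProvider`, `ModelHandle`, `spiceSketch`) is illustrative and not the package's actual implementation, though `gpt-4o` is a model the sample runtime below actually serves.

```typescript
// Illustrative sketch of the callable-provider pattern used by
// Vercel AI SDK providers; NOT the package's real internals.
type ModelHandle = { provider: string; modelId: string };

function makeProvider(providerName: string) {
  // The provider instance is itself a function from model id to handle.
  return (modelId: string): ModelHandle => ({
    provider: providerName,
    modelId,
  });
}

const spiceSketch = makeProvider("spice");

// A model id served by the Spice runtime (gpt-4o appears in the
// runtime logs in the example below).
const model = spiceSketch("gpt-4o");
console.log(model.provider, model.modelId);
```

The real `spice` instance would be used the same way, with the returned handle passed to SDK functions such as `generateText`.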
If you need a custom setup, use `createSpice` and create a provider with your own settings:

```typescript
import { createSpice } from "@spiceai/spice-ai-provider";

const spice = createSpice({
  // ...
});
```
Provider settings include the runtime base URL; for a local runtime this is `http://localhost:8090/v1`.
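As a self-contained sketch of how such a settings object with a defaulted base URL can work: the `SpiceProviderSettings` interface and `createSpiceSketch` factory below are hypothetical simplifications, not the package's source; only the local runtime URL (`http://localhost:8090/v1`) comes from the docs above.

```typescript
// Hypothetical simplified sketch of a createSpice-style factory.
interface SpiceProviderSettings {
  baseURL?: string; // URL of the Spice runtime's OpenAI-compatible API
}

const DEFAULT_BASE_URL = "http://localhost:8090/v1";

function createSpiceSketch(settings: SpiceProviderSettings = {}) {
  // Fall back to the local runtime when no base URL is given.
  const baseURL = settings.baseURL ?? DEFAULT_BASE_URL;
  return { baseURL };
}

// Default: points at the local runtime started with `spice run`.
const local = createSpiceSketch();
// Custom: point the provider at a different runtime instead.
const remote = createSpiceSketch({ baseURL: "http://spice.example.com:8090/v1" });
console.log(local.baseURL, remote.baseURL);
```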
Navigate to `examples/sample-app`
```shell
curl https://install.spiceai.org | /bin/bash
```
Copy `.env.example` to `.env`, and set your OpenAI API key and GitHub PAT for the data connector.
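The variable names below are illustrative assumptions; check `.env.example` in the sample app for the actual keys it expects:

```shell
# Hypothetical .env contents; the real variable names come from .env.example.
OPENAI_API_KEY=sk-...    # OpenAI API key used by the runtime's gpt-4o model
GITHUB_TOKEN=ghp_...     # GitHub PAT for the github data connector
```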
Start the Spice runtime:
```shell
spice run
Checking for latest Spice runtime release...
Spice.ai runtime starting...
2024-09-10T14:17:37.598982Z INFO runtime::flight: Spice Runtime Flight listening on 127.0.0.1:50051
2024-09-10T14:17:37.599070Z INFO runtime::metrics_server: Spice Runtime Metrics listening on 127.0.0.1:9090
2024-09-10T14:17:37.599263Z INFO runtime::http: Spice Runtime HTTP listening on 127.0.0.1:8090
2024-09-10T14:17:37.600010Z INFO runtime::opentelemetry: Spice Runtime OpenTelemetry listening on 127.0.0.1:50052
2024-09-10T14:17:38.175204Z INFO runtime: Embedding [openai_embeddings] ready to embed
2024-09-10T14:17:38.175396Z INFO runtime: Tool [document_similarity] ready to use
2024-09-10T14:17:38.175416Z INFO runtime: Tool [table_schema] ready to use
2024-09-10T14:17:38.175421Z INFO runtime: Tool [sql] ready to use
2024-09-10T14:17:38.175428Z INFO runtime: Tool [list_datasets] ready to use
2024-09-10T14:17:38.179149Z INFO runtime: Initialized results cache; max size: 128.00 MiB, item ttl: 1s
2024-09-10T14:17:38.179779Z INFO runtime: Loading model [gpt-4o] from openai:gpt-4o...
2024-09-10T14:17:38.686558Z INFO runtime: Model [gpt-4o] deployed, ready for inferencing
2024-09-10T14:17:39.174429Z INFO runtime: Dataset vercel_ai_docs registered (github:github.com/vercel/ai/files/main), acceleration (arrow), results cache enabled.
2024-09-10T14:17:39.175632Z INFO runtime::accelerated_table::refresh_task: Loading data for dataset vercel_ai_docs
2024-09-10T14:17:51.810106Z INFO runtime::accelerated_table::refresh_task: Loaded 143 rows (2.29 MiB) for dataset vercel_ai_docs in 12s 634ms.
```
```shell
npm run dev
```
Navigate to http://localhost:3000; you can now chat with the Vercel AI SDK docs.
No vulnerabilities found.