Detailed insights and metrics for @surebob/ollama-local-provider
npm install @surebob/ollama-local-provider
Supply Chain: 31.7
Quality: 75.5
Maintenance: 81.1
Vulnerability: 50
License: 98.9
Total Downloads: 65
Last Day: 1 (-66.7% vs previous day)
Last Week: 2 (-81.8% vs previous week)
Last Month: 65 (0% vs previous month)
Last Year: 65 (0% vs previous year)
Latest Version: 0.1.0
Package Id: @surebob/ollama-local-provider@0.1.0
Unpacked Size: 11.75 kB
Size: 3.25 kB
File Count: 6
NPM Version: 11.0.0
Node Version: 22.11.0
Published on: Feb 24, 2025
A client-side Ollama provider for the Vercel AI SDK that makes API calls directly from the browser to your local Ollama instance.
```bash
npm install @surebob/ollama-local-provider
```
```tsx
// In your Next.js page or component
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
    initialMessages: [],
  });

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>{m.content}</div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
```
```ts
// app/api/chat/route.ts
import { StreamingTextResponse } from 'ai';
import { ollama } from '@surebob/ollama-local-provider';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const model = ollama('deepscaler:1.5b', {
    temperature: 0.7,
    top_p: 0.9,
    num_ctx: 4096,
    repeat_penalty: 1.1,
  });

  const response = await model.doStream({
    inputFormat: 'messages',
    mode: { type: 'regular' },
    prompt: messages,
  });

  return new StreamingTextResponse(response.stream);
}
```
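Since the model object exposes `doStream` with the shape shown above, it appears to implement the Vercel AI SDK's LanguageModelV1 interface, in which case you can also let the SDK drive the stream via `streamText` instead of calling `doStream` yourself. A minimal sketch, assuming an `ai` 3.x release where the response helper is named `toAIStreamResponse` (newer releases rename it to `toDataStreamResponse`):

```ts
// app/api/chat/route.ts (alternative sketch, assuming the model is a
// standard LanguageModelV1 and the ai package is v3.x)
import { streamText } from 'ai';
import { ollama } from '@surebob/ollama-local-provider';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // streamText handles the doStream plumbing shown above internally.
  const result = await streamText({
    model: ollama('deepscaler:1.5b', { temperature: 0.7 }),
    messages,
  });

  return result.toAIStreamResponse();
}
```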
This provider uses the `ollama/browser` client to make API calls directly from the browser to your local Ollama instance. This means:

- When running locally: the browser talks to the Ollama instance on your own machine (`localhost:11434`)
- When deployed: each user's browser connects to the Ollama instance running on their own machine (`localhost:11434`)
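Because the calls originate in the browser, you can also stream from a client component without any API route. A sketch reusing the `doStream` shape from the route example above; the model name is illustrative, and the stream handling assumes the AI SDK's standard LanguageModelV1 prompt format and `text-delta` stream parts. Note that when the page is served from a non-localhost origin, Ollama must be started with the `OLLAMA_ORIGINS` environment variable allowing that origin, or the browser's CORS checks will block the requests.

```tsx
'use client';

import { useState } from 'react';
import { ollama } from '@surebob/ollama-local-provider';

export default function LocalChat() {
  const [output, setOutput] = useState('');

  async function ask(question: string) {
    // Illustrative model name; any model pulled into local Ollama works.
    const model = ollama('deepscaler:1.5b', { temperature: 0.7 });

    // Same doStream call as the route example, but issued from the browser.
    const { stream } = await model.doStream({
      inputFormat: 'messages',
      mode: { type: 'regular' },
      prompt: [{ role: 'user', content: [{ type: 'text', text: question }] }],
    });

    const reader = stream.getReader();
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      // LanguageModelV1 streams emit typed parts; text arrives as text-delta.
      if (value.type === 'text-delta') setOutput((prev) => prev + value.textDelta);
    }
  }

  return (
    <div>
      <button onClick={() => ask('Hello!')}>Ask</button>
      <pre>{output}</pre>
    </div>
  );
}
```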
License: MIT
No security vulnerabilities found.