@upstash/rag-chat-widget
A customizable React chat widget that combines Upstash Vector for similarity search, Together AI for the LLM, and the Vercel AI SDK for streaming responses. This ready-to-use component provides an out-of-the-box solution for adding RAG-powered chat interfaces to your Next.js application.
Screenshots: closed state | open state
⚡ Streaming responses support
💻 Server actions
📱 Responsive design
🔍 Real-time context retrieval
💾 Persistent chat history
🎨 Fully customizable UI components
🎨 Dark/light mode support
```shell
# Using npm
npm install @upstash/rag-chat-widget

# Using pnpm
pnpm add @upstash/rag-chat-widget

# Using yarn
yarn add @upstash/rag-chat-widget
```
Create an Upstash Vector database and set the environment variables shown below. If you don't have an account, you can get started at the Upstash Console.
Choose an embedding model when creating the index in Upstash Vector.
```shell
UPSTASH_VECTOR_REST_URL=
UPSTASH_VECTOR_REST_TOKEN=
OPENAI_API_KEY=
TOGETHER_API_KEY=

# Optional
TOGETHER_MODEL=
```
In your `tailwind.config.ts` file, add the configuration below:
```typescript
import type { Config } from "tailwindcss";

export default {
  content: ["./node_modules/@upstash/rag-chat-widget/**/*.{js,mjs}"],
} satisfies Config;
```
The RAG Chat Widget can be integrated into your application using two straightforward approaches. Choose the method that best fits your project structure:
Create a separate component file with the `use client` directive, then import and use it anywhere in your application.
```tsx
// components/widget.tsx
"use client";

import { ChatWidget } from "@upstash/rag-chat-widget";

export const Widget = () => {
  return <ChatWidget />;
};
```
```tsx
// page.tsx
import { Widget } from "./components/widget";

export default function Home() {
  return (
    <>
      <Widget />
      <p>Home</p>
    </>
  );
}
```
Alternatively, import and use the `ChatWidget` directly in your client-side pages.
```tsx
// page.tsx
"use client";

import { ChatWidget } from "@upstash/rag-chat-widget";

export default function Home() {
  return (
    <>
      <ChatWidget />
      <p>Home</p>
    </>
  );
}
```
You can choose one of the Together AI models for the chat. The default model is `meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo`. You can configure it through the `TOGETHER_MODEL` environment variable:
```shell
TOGETHER_MODEL="deepseek-ai/DeepSeek-V3"
```
You can add content to your RAG Chat widget in several ways:
The SDK provides methods to add various types of content programmatically:
```typescript
import { RAGChat, openai } from "@upstash/rag-chat";

export const ragChat = new RAGChat({
  model: openai("gpt-4-turbo"),
});

// Add text content
await ragChat.context.add("Your text content here");

// Add PDF documents
await ragChat.context.add({
  type: "pdf",
  fileSource: "./path/to/document.pdf",
});

// Add web content
await ragChat.context.add({
  type: "html",
  source: "https://your-website.com",
});
```
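Long documents may exceed the embedding model's input size, so a common pattern is to split them into overlapping chunks and add each chunk as its own context entry. A minimal sketch, using a hypothetical `chunkText` helper (not part of the SDK):

```typescript
// Hypothetical helper (not part of @upstash/rag-chat): split a long
// document into fixed-size, overlapping chunks so each piece can be
// embedded and added as its own context entry.
function chunkText(text: string, size = 500, overlap = 50): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last chunk reached
  }
  return chunks;
}

// Usage with the SDK (network call, requires credentials):
// for (const chunk of chunkText(longDocument)) {
//   await ragChat.context.add(chunk);
// }
```

The overlap keeps sentences that straddle a chunk boundary retrievable from both neighboring chunks; tune `size` and `overlap` to your embedding model's limits.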
For more detailed examples and options, check out the RAG Chat documentation.
You can also manage your content directly through the Upstash Vector Console.
We welcome contributions! Please see our contributing guidelines for more details.
MIT License - see the LICENSE file for details.