- `@node-llama-cpp/linux-arm64`: Prebuilt binary for node-llama-cpp for Linux arm64
- `@node-llama-cpp/linux-x64`: Prebuilt binary for node-llama-cpp for Linux x64
- `@node-llama-cpp/linux-armv7l`: Prebuilt binary for node-llama-cpp for Linux armv7l
- `@llama-node/llama-cpp`: This repo is for one of the backends: [llama.cpp](https://github.com/ggerganov/llama.cpp)
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
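As a rough illustration of that description, here is a minimal TypeScript sketch of loading a local GGUF model and constraining the output with a JSON schema grammar. It assumes the node-llama-cpp v3 API as documented (`getLlama`, `loadModel`, `LlamaChatSession`, `createGrammarForJsonSchema`); the model path and the example schema are placeholders, so verify the exact option names against the library's docs before relying on them.

```typescript
import {getLlama, LlamaChatSession} from "node-llama-cpp";

// Load the native llama.cpp bindings (a matching prebuilt binary is used when available)
const llama = await getLlama();

// Load a local GGUF model from disk (placeholder path, adjust to your setup)
const model = await llama.loadModel({modelPath: "models/model.gguf"});

// Create a context and a chat session on one of its sequences
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

// Plain text generation
const answer = await session.prompt("Summarize what llama.cpp is in one sentence.");
console.log(answer);

// Enforce a JSON schema at the generation level via a grammar
const grammar = await llama.createGrammarForJsonSchema({
    type: "object",
    properties: {
        name: {type: "string"},
        stars: {type: "number"}
    }
} as const);

const jsonAnswer = await session.prompt(
    "Describe this repository as JSON with a name and a star count.",
    {grammar}
);
console.log(grammar.parse(jsonAnswer));
```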
Languages: TypeScript (91.21%), C++ (4.35%), CSS (1.72%), Vue (0.96%), JavaScript (0.79%), CMake (0.71%), Shell (0.25%), C (0.01%), HTML (0.01%)
Repository:
- License: MIT
- Stars: 1,577
- Commits: 203
- Forks: 137
- Watchers: 17
- Branches: 1
- Contributors: 7
- Updated: Jul 08, 2025
Latest release:
- Latest Version: 3.9.0
- Package Id: node-llama-cpp@3.9.0
- Unpacked Size: 25.72 MB
- Size: 23.03 MB
- File Count: 851
- NPM Version: 10.9.2
- Node Version: 20.19.1
- Published on: Jun 04, 2025