@huggingface/hub
Related packages:
- @huggingface/tasks: List of ML tasks for huggingface.co/tasks
- @huggingface/gguf: a GGUF parser that works on remotely hosted files
- @modelpark/huggingface-hub-api: Hugging Face Hub API for JavaScript
- @huggingface/ollama-utils: Various utilities for maintaining Ollama compatibility with models on the Hugging Face Hub
```shell
npm install @huggingface/hub
```
Languages: TypeScript (85.86%), JavaScript (10.57%), Python (1.54%), Jinja (0.85%), Rust (0.79%), Shell (0.35%), Svelte (0.03%), HTML (0.01%)
MIT License · 2,174 Stars · 1,700 Commits · 461 Forks · 50 Watchers · 90 Branches · 334 Contributors
Updated on Jul 15, 2025
Latest Version: 2.4.0
Package Id: @huggingface/hub@2.4.0
Unpacked Size: 1.25 MB
Size: 287.09 kB
File Count: 443
NPM Version: 10.8.2
Node Version: 20.19.3
Published on: Jul 08, 2025
Official utilities to use the Hugging Face Hub API.
```shell
pnpm add @huggingface/hub

npm add @huggingface/hub

yarn add @huggingface/hub
```
```js
// esm.sh
import { uploadFiles, listModels } from "https://esm.sh/@huggingface/hub"
// or npm:
import { uploadFiles, listModels } from "npm:@huggingface/hub"
```
Check out the full documentation.
For some of the calls, you need to create an account and generate an access token.
Learn how to find free models using the hub package in this interactive tutorial.
```ts
import * as hub from "@huggingface/hub";
import type { RepoDesignation } from "@huggingface/hub";
import { pathToFileURL } from "node:url";

const repo: RepoDesignation = { type: "model", name: "myname/some-model" };

const { name: username } = await hub.whoAmI({ accessToken: "hf_..." });

for await (const model of hub.listModels({ search: { owner: username }, accessToken: "hf_..." })) {
  console.log("My model:", model);
}

const specificModel = await hub.modelInfo({ name: "openai-community/gpt2" });
await hub.checkRepoAccess({ repo, accessToken: "hf_..." });

await hub.createRepo({ repo, accessToken: "hf_...", license: "mit" });

await hub.uploadFiles({
  repo,
  accessToken: "hf_...",
  files: [
    // path + blob content
    {
      path: "file.txt",
      content: new Blob(["Hello World"]),
    },
    // Local file URL
    pathToFileURL("./pytorch-model.bin"),
    // Local folder URL
    pathToFileURL("./models"),
    // Web URL
    new URL("https://huggingface.co/xlm-roberta-base/resolve/main/tokenizer.json"),
    // Path + Web URL
    {
      path: "myfile.bin",
      content: new URL("https://huggingface.co/bert-base-uncased/resolve/main/pytorch_model.bin"),
    },
    // Can also work with native File in browsers
  ],
});

// or

for await (const progressEvent of await hub.uploadFilesWithProgress({
  repo,
  accessToken: "hf_...",
  files: [
    ...
  ],
})) {
  console.log(progressEvent);
}

await hub.deleteFile({ repo, accessToken: "hf_...", path: "myfile.bin" });

await (await hub.downloadFile({ repo, path: "README.md" })).text();

for await (const fileInfo of hub.listFiles({ repo })) {
  console.log(fileInfo);
}

await hub.deleteRepo({ repo, accessToken: "hf_..." });
```
You can use @huggingface/hub in CLI mode to upload files and folders to your repo.
```shell
npx @huggingface/hub upload coyotte508/test-model .
npx @huggingface/hub upload datasets/coyotte508/test-dataset .
# Same thing
npx @huggingface/hub upload --repo-type dataset coyotte508/test-dataset .
# Upload new data with 0 history in a separate branch
npx @huggingface/hub branch create coyotte508/test-model release --empty
npx @huggingface/hub upload coyotte508/test-model . --revision release

npx @huggingface/hub --help
npx @huggingface/hub upload --help
```
You can also install globally with npm install -g @huggingface/hub. Then you can do:
```shell
hfjs upload coyotte508/test-model .

hfjs branch create --repo-type dataset coyotte508/test-dataset release --empty
hfjs upload --repo-type dataset coyotte508/test-dataset . --revision release

hfjs --help
hfjs upload --help
```
It's possible to log in using OAuth ("Sign in with HF").
This will allow you to get an access token to use some of the API, depending on the scopes set inside the Space or the OAuth App.
```ts
import { oauthLoginUrl, oauthHandleRedirectIfPresent } from "@huggingface/hub";

const oauthResult = await oauthHandleRedirectIfPresent();

if (!oauthResult) {
  // If the user is not logged in, redirect to the login page
  window.location.href = await oauthLoginUrl();
}

// You can use oauthResult.accessToken, oauthResult.accessTokenExpiresAt and oauthResult.userInfo
console.log(oauthResult);
```
Check out the demo: https://huggingface.co/spaces/huggingfacejs/client-side-oauth
The @huggingface/hub package provides basic capabilities to scan the cache directory. Learn more in the Manage huggingface_hub cache-system guide.
scanCacheDir
You can get the list of cached repositories using the scanCacheDir function.
```ts
import { scanCacheDir } from "@huggingface/hub";

const result = await scanCacheDir();

console.log(result);
```
Note: this does not work in the browser
downloadFileToCacheDir
You can cache a file of a repository using the downloadFileToCacheDir function.
```ts
import { downloadFileToCacheDir } from "@huggingface/hub";

const file = await downloadFileToCacheDir({
  repo: 'foo/bar',
  path: 'README.md'
});

console.log(file);
```
Note: this does not work in the browser
snapshotDownload
You can download an entire repository at a given revision into the cache directory using the snapshotDownload function.
```ts
import { snapshotDownload } from "@huggingface/hub";

const directory = await snapshotDownload({
  repo: 'foo/bar',
});

console.log(directory);
```
Internally, this uses the downloadFileToCacheDir function.
Note: this does not work in the browser
When uploading large files, you may want to run the commit calls inside a worker, to offload the SHA-256 computations.
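As an illustrative sketch only (the worker setup and message protocol below are assumptions, not @huggingface/hub APIs), here is how SHA-256 hashing could be offloaded to a worker thread in Node:

```js
// Hypothetical sketch: offload SHA-256 hashing to a worker thread so the
// main thread stays responsive during large uploads. Everything here is
// illustrative; @huggingface/hub does not expose this helper.
import { Worker } from "node:worker_threads";

// Inline CommonJS worker source, evaluated with { eval: true }.
const workerSource = `
const { parentPort } = require("node:worker_threads");
const { createHash } = require("node:crypto");
parentPort.on("message", (chunk) => {
  parentPort.postMessage(createHash("sha256").update(chunk).digest("hex"));
});
`;

function sha256InWorker(data) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(workerSource, { eval: true });
    worker.once("message", (hex) => {
      worker.terminate();
      resolve(hex);
    });
    worker.once("error", reject);
    worker.postMessage(data);
  });
}

const hex = await sha256InWorker(Buffer.from("Hello World"));
console.log(hex); // hex digest of SHA-256("Hello World")
```

In a browser, the same idea applies with a Web Worker and `crypto.subtle.digest` instead of `node:crypto`.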
Remote resources and local files should be passed as URL whenever possible, so they can be lazy-loaded in chunks to reduce RAM usage. Passing a File inside the browser's context is fine, because it natively behaves as a Blob.
Under the hood, @huggingface/hub uses a lazy blob implementation to load the file.
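To illustrate the difference, here is a minimal sketch (the temp file is purely illustrative): building a Blob from a file's bytes reads everything into memory up front, while a file URL defers reading until the upload streams it.

```js
// Illustrative sketch of eager vs lazy upload inputs (the temp file is an
// assumption for demonstration, not part of the @huggingface/hub API).
import { pathToFileURL } from "node:url";
import { readFileSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

const filePath = join(tmpdir(), "demo-weights.bin");
writeFileSync(filePath, Buffer.alloc(1024 * 1024)); // 1 MiB of zeros

// Eager: the entire file is read into memory before any upload starts.
const eager = new Blob([readFileSync(filePath)]);

// Lazy: only a URL is passed; the content can be streamed chunk by chunk.
const lazy = pathToFileURL(filePath);

console.log(eager.size, lazy.protocol);
```

For a 1 MiB file the difference is negligible, but for multi-gigabyte model weights the eager path holds the whole file in RAM, which is exactly what the lazy blob implementation avoids.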
Dependencies:
- @huggingface/tasks: Typings only
- @huggingface/lz4: URL join utility
No security vulnerabilities found.