Recursive version of fs.readdir with small RAM & CPU footprint.
npm install readdirp
Languages: JavaScript (58.96%), TypeScript (41.04%)
Repository: MIT License, 387 Stars, 463 Commits, 54 Forks, 8 Watchers, 3 Branches, 35 Contributors. Updated on Jun 17, 2025.
Latest Version: 4.1.2
Package Id: readdirp@4.1.2
Unpacked Size: 35.28 kB
Size: 7.86 kB
File Count: 8
NPM Version: 10.9.2
Node Version: 22.13.0
Published on: Feb 14, 2025
Recursive version of fs.readdir. Exposes a stream API (with small RAM & CPU footprint) and a promise API.
```sh
npm install readdirp
jsr add jsr:@paulmillr/readdirp
```
```js
// Use streams to achieve small RAM & CPU footprint.
// 1) Streams example with for-await.
import readdirp from 'readdirp';
for await (const entry of readdirp('.')) {
  const {path} = entry;
  console.log(`${JSON.stringify({path})}`);
}

// 2) Streams example, non for-await.
// Print out all JS files along with their size within the current folder & subfolders.
import readdirp from 'readdirp';
readdirp('.', {alwaysStat: true, fileFilter: (f) => f.basename.endsWith('.js')})
  .on('data', (entry) => {
    const {path, stats: {size}} = entry;
    console.log(`${JSON.stringify({path, size})}`);
  })
  // Optionally call stream.destroy() in `warn()` in order to abort and cause 'close' to be emitted
  .on('warn', error => console.error('non-fatal error', error))
  .on('error', error => console.error('fatal error', error))
  .on('end', () => console.log('done'));

// 3) Promise example. More RAM and CPU than streams / for-await.
import { readdirpPromise } from 'readdirp';
const files = await readdirpPromise('.');
console.log(files.map(file => file.path));

// Other options.
import readdirp from 'readdirp';
readdirp('test', {
  fileFilter: (f) => f.basename.endsWith('.js'),
  directoryFilter: (d) => d.basename !== '.git',
  // directoryFilter: (di) => di.basename.length === 9
  type: 'files_directories',
  depth: 1
});
```
const stream = readdirp(root[, options]) — Stream API

- Returns a stream of entry infos.
- Optionally can be used like for await (const entry of stream) with node.js 10+ (asyncIterator).
- on('data', (entry) => {}): entry info for every file / dir.
- on('warn', (error) => {}): non-fatal Error that prevents a file / dir from being processed. Example: inaccessible to the user.
- on('error', (error) => {}): fatal Error which also ends the stream. Example: illegal options were passed.
- on('end'): we are done. Called when all entries were found and no more will be emitted.
- on('close'): stream is destroyed via stream.destroy(). Could be useful if you want to manually abort even on a non-fatal error. At that point the stream is no longer readable and no more entries, warnings or errors are emitted. A short abort sketch is shown after the Promise API line below.

const entries = await readdirp.promise(root[, options]) — Promise API. Returns a list of entry infos.
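As a concrete illustration of the manual abort described for the stream API, here is a minimal sketch, assuming you want to give up after a fixed number of warnings (the limit of 10 is an arbitrary choice for illustration, not part of readdirp):

```js
import readdirp from 'readdirp';

// Walk the current directory; abort once too many non-fatal warnings occur.
let warnings = 0;
const stream = readdirp('.', { type: 'files' });
stream
  .on('data', (entry) => console.log(entry.path))
  .on('warn', (error) => {
    console.error('non-fatal error', error);
    warnings += 1;
    // Arbitrary threshold for illustration: destroy() stops reading and causes 'close' to be emitted.
    if (warnings > 10) stream.destroy();
  })
  .on('error', (error) => console.error('fatal error', error))
  .on('close', () => console.log('walk aborted'))
  .on('end', () => console.log('walk finished'));
```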
First argument is always root, the path in which to start reading and recursing into subdirectories.

Options:

- fileFilter: filter to include or exclude files
- directoryFilter: filter to include/exclude directories found and to recurse into. Directories that do not pass a filter will not be recursed into.
- depth: 5: depth at which to stop recursing even if more subdirectories are found
- type: 'files': determines if data events on the stream should be emitted for 'files' (default), 'directories', 'files_directories', or 'all'. Setting to 'all' will also include entries for other types of file descriptors like character devices, unix sockets and named pipes.
- alwaysStat: false: always return a stats property for every file. Default is false; readdirp will return Dirent entries instead. Setting it to true can double readdir execution time - use it only when you need file size, mtime etc. Cannot be enabled on node <10.10.0.
- lstat: false: include symlink entries in the stream along with files. When true, fs.lstat would be used instead of fs.stat.
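A rough sketch of how several of these options combine; the excluded directory names and the depth value below are arbitrary choices for illustration:

```js
import readdirp from 'readdirp';

// Emit markdown files and directories, at most two levels deep,
// skipping a couple of commonly vendored folders.
const skipped = new Set(['node_modules', '.git']);
const stream = readdirp('.', {
  fileFilter: (f) => f.basename.endsWith('.md'),
  directoryFilter: (d) => !skipped.has(d.basename),
  type: 'files_directories',
  depth: 2,
});

for await (const entry of stream) {
  console.log(entry.path);
}
```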
EntryInfo

Has the following properties:

- path: 'assets/javascripts/react.js': path to the file/directory (relative to given root)
- fullPath: '/Users/dev/projects/app/assets/javascripts/react.js': full path to the file/directory found
- basename: 'react.js': name of the file/directory
- dirent: fs.Dirent: built-in dir entry object - only with alwaysStat: false
- stats: fs.Stats: built-in stat object - only with alwaysStat: true
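For example, a small sketch that reads these properties, assuming alwaysStat: true so that stats is populated (with the default alwaysStat: false you would read dirent instead); the './src' root is just a placeholder:

```js
import { readdirpPromise } from 'readdirp';

// List every file under ./src with its size, relative path and absolute path.
const entries = await readdirpPromise('./src', { type: 'files', alwaysStat: true });
for (const entry of entries) {
  const { path, fullPath, basename, stats } = entry;
  console.log(`${basename}: ${stats.size} bytes (${path} -> ${fullPath})`);
}
```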
Changelog highlights:

- Allows let {readdirp} = require('readdirp') in common.js.
- Enables performance management with the highWaterMark option. Fixes race conditions related to for-await looping.
- Brings bigint support to stat output on Windows. This is backwards-incompatible for some cases. Be careful. If you use it incorrectly, you'll see "TypeError: Cannot mix BigInt and other types, use explicit conversions".
- Upgrading 2.x to 3.x (a before/after sketch follows this list):
  - Signature changed from readdirp(options) to readdirp(root, options)
  - Renamed the entryType option to type
  - Renamed entryType: 'both' to 'files_directories'
  - EntryInfo changes:
    - Renamed stat to stats; stats is emitted only with alwaysStat: true, and dirent is emitted instead of stats by default with alwaysStat: false
    - Renamed name to basename
    - Removed the parentDir and fullParentDir properties
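A minimal before/after sketch of that upgrade, assuming the legacy 2.x stream form took a single options object with a root property (as the renames above imply); treat the 2.x half as an approximation of the old API rather than exact legacy code:

```js
import readdirp from 'readdirp';

// 2.x (legacy, approximate): root lived inside the options object,
// entryType selected the entry kinds, and entries exposed a name property.
// readdirp({ root: 'test', entryType: 'both' })
//   .on('data', (entry) => console.log(entry.name));

// 3.x and later: root is the first argument, entryType became type,
// 'both' became 'files_directories', and entry.name became entry.basename.
readdirp('test', { type: 'files_directories' })
  .on('data', (entry) => console.log(entry.basename));
```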
Copyright (c) 2012-2019 Thorsten Lorenz, Paul Miller (https://paulmillr.com)
MIT License, see LICENSE file.
No vulnerabilities found.

OpenSSF Scorecard results (last scanned on 2025-07-07):

- no binaries found in the repo
- no dangerous workflow patterns detected
- all dependencies are pinned
- license file detected
- 1 existing vulnerability detected
- Found 2/29 approved changesets -- score normalized to 0
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- detected GitHub workflow tokens with excessive permissions
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- project is not fuzzed
- branch protection not enabled on development/release branches
- SAST tool is not run on all commits -- score normalized to 0

The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.