Gathering detailed insights and metrics for stream-json
The micro-library of Node.js stream components for creating custom JSON processing pipelines with a minimal memory footprint. It can parse JSON files far exceeding available memory, streaming individual primitives using a SAX-inspired API.
npm install stream-json
Supply Chain: 99.8
Quality: 100
Maintenance: 77.8
Vulnerability: 100
License: 100
JavaScript (100%)
Total Downloads: 165,928,794
Last Day: 109,743
Last Week: 2,041,065
Last Month: 8,809,781
Last Year: 78,719,329
NOASSERTION License
1,074 Stars
381 Commits
48 Forks
14 Watchers
5 Branches
7 Contributors
Updated on Jul 01, 2025
Latest Version: 1.9.1
Package Id: stream-json@1.9.1
Unpacked Size: 88.21 kB
Size: 16.32 kB
File Count: 25
NPM Version: 10.9.0
Node Version: 23.1.0
Published on Nov 12, 2024
Cumulative downloads
Last Day: 109,743 (2% compared to previous day)
Last Week: 2,041,065 (-7.9% compared to previous week)
Last Month: 8,809,781 (5.3% compared to previous month)
Last Year: 78,719,329 (62.4% compared to previous year)
stream-json is a micro-library of node.js stream components with minimal dependencies for creating custom data processors oriented toward processing huge JSON files while requiring a minimal memory footprint. It can parse JSON files far exceeding available memory. Even individual primitive data items (keys, strings, and numbers) can be streamed piece-wise. A streaming, SAX-inspired, event-based API is included as well.
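To see the SAX-style side in action, here is a minimal sketch (the inline JSON document is just an illustration) that prints the tokens produced by the parser:

const {parser} = require('stream-json');
const {Readable} = require('stream');

// Feed a tiny JSON document to the parser and print each token it emits.
Readable.from(['{"answer": 42}'])
  .pipe(parser())
  .on('data', token => console.log(token.name, token.value !== undefined ? token.value : ''))
  .on('end', () => console.log('done'));

// Typical output includes startObject, keyValue "answer", numberValue "42" and endObject;
// the exact token set depends on the parser's packing/streaming options.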
Available components:

- Parser: a streaming JSON parser that produces a SAX-like stream of tokens; it can optionally pack keys, strings, and numbers into complete values.
- Filters to edit the token stream: Pick selects desired sub-objects, Replace substitutes them with replacements, Ignore removes them, and Filter filters tokens while keeping the stream valid.
- Streamers to produce JavaScript objects: StreamValues handles a stream of JSON values (useful with sub-objects selected by Pick, or generated by other means) and supports the JSON Streaming protocol, where individual values are separated semantically (like in "{}[]") or with white spaces (like in "true 1 null"); StreamArray streams the elements of a top-level array; StreamObject streams the key/value pairs of a top-level object.
- Essentials: Assembler interprets a token stream to build JavaScript objects, Disassembler produces a token stream from JavaScript objects, Stringer converts a token stream back into JSON text, and Emitter re-emits tokens as events.
- Utilities: utils/Utf8Stream sanitizes multibyte utf8 text input, Verifier checks that a stream is valid JSON, and Batch groups items into arrays for simpler processing.
- JSONL (AKA JSON Lines, AKA NDJSON) helpers: jsonl/Parser parses newline-delimited JSON more efficiently than Parser({jsonStreaming: true}) + StreamValues, and jsonl/Stringer produces JSONL output more efficiently than Disassembler + Stringer.

All components are meant to be building blocks to create flexible custom data processing pipelines. They can be extended and/or combined with custom code. They can be used together with stream-chain to simplify data processing.
This toolkit is distributed under New BSD license.
const {chain} = require('stream-chain');

const {parser} = require('stream-json');
const {pick} = require('stream-json/filters/Pick');
const {ignore} = require('stream-json/filters/Ignore');
const {streamValues} = require('stream-json/streamers/StreamValues');

const fs = require('fs');
const zlib = require('zlib');

const pipeline = chain([
  fs.createReadStream('sample.json.gz'),
  zlib.createGunzip(),
  parser(),
  pick({filter: 'data'}),
  ignore({filter: /\b_meta\b/i}),
  streamValues(),
  data => {
    const value = data.value;
    // keep data only for the accounting department
    return value && value.department === 'accounting' ? data : null;
  }
]);

let counter = 0;
pipeline.on('data', () => ++counter);
pipeline.on('end', () =>
  console.log(`The accounting department has ${counter} employees.`));
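When the whole file is a single top-level array, a shorter sketch (the file name here is hypothetical) can use the withParser() shortcut that streamers and filters provide:

const fs = require('fs');
const StreamArray = require('stream-json/streamers/StreamArray');

// Combines a parser and StreamArray into one pipeline.
const pipeline = fs.createReadStream('huge-array.json').pipe(StreamArray.withParser());

let count = 0;
pipeline.on('data', () => ++count);   // each array element arrives as a {key, value} pair
pipeline.on('end', () => console.log(`Streamed ${count} array items.`));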
See the full documentation in the project's Wiki.
Companion projects:

- stream-csv-as-json: streams huge CSV files in a format compatible with stream-json: rows as arrays of string values. If a header row is used, it can stream rows as objects with named fields.

Installation:

npm install --save stream-json
# or: yarn add stream-json
The whole library is organized as a set of small components, which can be combined to produce the most effective pipeline. All components are based on node.js streams and events. They implement all required standard APIs. It is easy to add your own components to solve your unique tasks.
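Because the components are regular node.js streams, they can also be wired together with plain .pipe() calls instead of stream-chain. A hedged sketch (the file name and the 'rows' key are assumptions made for illustration):

const fs = require('fs');
const {parser} = require('stream-json');
const {pick} = require('stream-json/filters/Pick');
const {streamArray} = require('stream-json/streamers/StreamArray');

fs.createReadStream('records.json')
  .pipe(parser())
  .pipe(pick({filter: 'rows'}))   // keep only the "rows" sub-tree
  .pipe(streamArray())            // emit each array element as {key, value}
  .on('data', ({value}) => console.log(value))
  .on('end', () => console.log('all rows processed'));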
The code of all components is compact and simple. Please take a look at their source code to see how things are implemented, so you can produce your own components in no time.
If you find a bug, see a way to simplify existing components, or create new generic components that can be reused in a variety of projects, don't hesitate to open a ticket and/or create a pull request.
Release history highlights:

- Updated the dependency (stream-chain); bugfix: inconsistent object/array braces. Thx Xiao Li.
- Added utils/Utf8Stream to sanitize utf8 input; all parsers support it automatically. Thx john30 for the suggestion.
- Added jsonl/Parser and jsonl/Stringer.

The rest can be consulted in the project's wiki Release history.
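As a hedged sketch of the JSONL helpers mentioned above (the file name is an assumption; see the wiki for exact options), jsonl/Parser reads newline-delimited JSON directly:

const fs = require('fs');
const {parser: jsonlParser} = require('stream-json/jsonl/Parser');

let items = 0;
fs.createReadStream('events.jsonl')
  .pipe(jsonlParser())            // each line is parsed into a {key, value} pair
  .on('data', () => ++items)
  .on('end', () => console.log(`Parsed ${items} JSONL records.`));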
No vulnerabilities found.
OpenSSF Scorecard check results:

- no dangerous workflow patterns detected
- no binaries found in the repo
- 0 existing vulnerabilities detected
- license file detected
- dependency not pinned by hash detected -- score normalized to 3
- Found 0/30 approved changesets -- score normalized to 0
- no effort to earn an OpenSSF best practices badge detected
- no SAST tool detected
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- detected GitHub workflow tokens with excessive permissions
- security policy file not detected
- project is not fuzzed
- branch protection not enabled on development/release branches
Last Scanned on 2025-06-23
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.