Gathering detailed insights and metrics for @discoveryjs/json-ext
A set of performant and memory efficient utilities that extend the use of JSON
```
npm install @discoveryjs/json-ext
```
Downloads:

- Total: 1,829,597,897
- Last day: 2,202,370
- Last week: 13,486,996
- Last month: 56,946,977
- Last year: 630,617,071
Repository:

- License: MIT
- 166 stars, 213 commits, 6 forks, 4 watchers, 2 branches, 8 contributors
- Updated on Apr 24, 2025
Package:

- Latest version: 0.6.3
- Package Id: @discoveryjs/json-ext@0.6.3
- Unpacked size: 144.53 kB
- Size: 35.66 kB
- File count: 20
- NPM version: 10.8.3
- Node version: 22.9.0
- Published on: Oct 24, 2024
A set of utilities designed to extend JSON's capabilities, especially for handling large JSON data (over 100MB) efficiently:

- parseChunked() – functions like JSON.parse(), but processes JSON data in chunks.
- stringifyChunked() – functions like JSON.stringify(), but returns a generator that yields JSON strings in parts.
- stringifyInfo() – estimates the size of the JSON.stringify() result and identifies circular references without generating the JSON.

JSON.parse() and JSON.stringify() require loading the entire data into memory, leading to high memory consumption and increased garbage collection pressure. parseChunked() and stringifyChunked() process data incrementally, optimizing memory usage. stringifyInfo() allows estimating the size of the resulting JSON before generating it, enabling better decision-making for JSON generation strategies.

Install:

```
npm install @discoveryjs/json-ext
```
## parseChunked()

Functions like JSON.parse(), iterating over chunks to reconstruct the result object, and returns a Promise.

Note: the reviver parameter is not supported yet.
```ts
function parseChunked(input: Iterable<Chunk> | AsyncIterable<Chunk>): Promise<any>;
function parseChunked(input: () => (Iterable<Chunk> | AsyncIterable<Chunk>)): Promise<any>;

type Chunk = string | Buffer | Uint8Array;
```
Usage:
```js
import { parseChunked } from '@discoveryjs/json-ext';

const data = await parseChunked(chunkEmitter);
```
Parameter chunkEmitter can be an iterable or async iterable that iterates over chunks, or a function returning such a value. A chunk can be a string, Uint8Array, or Node.js Buffer.
Examples:
```js
parseChunked(function*() {
    yield '{ "hello":';
    yield Buffer.from(' "wor');                 // Node.js only
    yield new TextEncoder().encode('ld" }');   // returns Uint8Array
});
```
```js
parseChunked(async function*() {
    for await (const chunk of someAsyncSource) {
        yield chunk;
    }
});
```
```js
parseChunked(['{ "hello":', ' "world"}'])

parseChunked(() => ['{ "hello":', ' "world"}'])
```
Readable stream:

```js
import fs from 'node:fs';

parseChunked(fs.createReadStream('path/to/file.json'))
```
Note: Iterability for Web streams was added later to the Web platform and is not supported in all environments. Consider using parseFromWebStream() for broader compatibility.
```js
const response = await fetch('https://example.com/data.json');
const data = await parseChunked(response.body); // body is ReadableStream
```
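For illustration, here is a naive buffered equivalent of consuming such chunk sources; parseBuffered is a hypothetical name, not part of json-ext. Unlike parseChunked(), which parses incrementally, this sketch concatenates everything into one large string first, which is exactly the memory cost the library avoids:

```javascript
// Hypothetical baseline: buffer all chunks, then parse in one go.
// parseChunked() accepts the same inputs but parses incrementally,
// avoiding the large intermediate string built here.
async function parseBuffered(chunks) {
    const decoder = new TextDecoder(); // handles multi-byte characters split across chunks
    let json = '';

    for await (const chunk of chunks) {
        json += typeof chunk === 'string'
            ? chunk
            : decoder.decode(chunk, { stream: true });
    }

    return JSON.parse(json + decoder.decode()); // flush any trailing bytes
}
```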
## stringifyChunked()

Functions like JSON.stringify(), but returns a generator yielding strings instead of a single string.
Note: Returns "null" when JSON.stringify() returns undefined (since a chunk cannot be undefined).
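This mirrors the behavior of JSON.stringify() itself, which returns undefined rather than a string for values JSON cannot represent:

```javascript
// JSON.stringify() returns undefined (not a string) for unrepresentable values;
// per the note above, stringifyChunked() yields the string 'null' in these
// cases instead, since a generator chunk cannot be undefined.
console.log(JSON.stringify(undefined)); // undefined
console.log(JSON.stringify(() => {}));  // undefined
console.log(JSON.stringify(Symbol())); // undefined
```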
```ts
function stringifyChunked(value: any, replacer?: Replacer, space?: Space): Generator<string, void, unknown>;
function stringifyChunked(value: any, options: StringifyOptions): Generator<string, void, unknown>;

type Replacer =
    | ((this: any, key: string, value: any) => any)
    | (string | number)[]
    | null;
type Space = string | number | null;
type StringifyOptions = {
    replacer?: Replacer;
    space?: Space;
    highWaterMark?: number;
};
```
Usage:
Getting an array of chunks:
```js
const chunks = [...stringifyChunked(data)];
```
Iterating over chunks:
```js
for (const chunk of stringifyChunked(data)) {
    console.log(chunk);
}
```
Specifying the minimum size of a chunk with the highWaterMark option:

```js
const data = [1, "hello world", 42];

console.log([...stringifyChunked(data)]); // default 16kB
// ['[1,"hello world",42]']

console.log([...stringifyChunked(data, { highWaterMark: 16 })]);
// ['[1,"hello world"', ',42]']

console.log([...stringifyChunked(data, { highWaterMark: 1 })]);
// ['[1', ',"hello world"', ',42', ']']
```
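Whatever highWaterMark is, the chunks always concatenate to exactly what JSON.stringify() would produce; only the chunk boundaries change. This can be checked against the chunk arrays shown above without the library installed:

```javascript
// Chunk boundaries vary with highWaterMark, but the joined output does not.
const data = [1, "hello world", 42];
const chunked = ['[1', ',"hello world"', ',42', ']']; // the highWaterMark: 1 output above

console.log(chunked.join('') === JSON.stringify(data)); // true
```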
Streaming into a stream with a Promise (modern Node.js):

```js
import { pipeline } from 'node:stream/promises';
import fs from 'node:fs';

await pipeline(
    stringifyChunked(data),
    fs.createWriteStream('path/to/file.json')
);
```
Wrapping into a Promise streaming into a stream (legacy Node.js):

```js
import { Readable } from 'node:stream';

new Promise((resolve, reject) => {
    Readable.from(stringifyChunked(data))
        .on('error', reject)
        .pipe(stream)
        .on('error', reject)
        .on('finish', resolve);
});
```
Writing into a file synchronously:

Note: Slower than JSON.stringify(), but uses much less heap space and has no limitation on string length.

```js
import fs from 'node:fs';

const fd = fs.openSync('output.json', 'w');

for (const chunk of stringifyChunked(data)) {
    fs.writeFileSync(fd, chunk);
}

fs.closeSync(fd);
```
Using with fetch (JSON streaming):

Note: This feature has limited support in browsers, see Streaming requests with the fetch API.

Note: ReadableStream.from() has limited support in browsers; use createStringifyWebStream() instead.

```js
fetch('http://example.com', {
    method: 'POST',
    duplex: 'half',
    body: ReadableStream.from(stringifyChunked(data))
});
```
Wrapping into ReadableStream:

Note: Use ReadableStream.from() or createStringifyWebStream() when no extra logic is needed.

```js
new ReadableStream({
    start() {
        this.generator = stringifyChunked(data);
    },
    pull(controller) {
        const { value, done } = this.generator.next();

        if (done) {
            controller.close();
        } else {
            controller.enqueue(value);
        }
    },
    cancel() {
        this.generator = null;
    }
});
```
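The same pull-based pattern works for any generator of strings, so it can be tried without the library installed. A self-contained sketch (streamFromGenerator is a hypothetical helper, not part of json-ext):

```javascript
// Generic pull-based wrapper: one generator step per pull() preserves
// backpressure, the same shape as the stringifyChunked() wrapping above.
function streamFromGenerator(generator) {
    return new ReadableStream({
        pull(controller) {
            const { value, done } = generator.next();

            if (done) {
                controller.close();
            } else {
                controller.enqueue(value);
            }
        },
        cancel() {
            generator.return?.(); // let the generator run any cleanup
        }
    });
}
```

Because pull() is only called when the consumer asks for data, the generator advances lazily instead of producing all chunks up front.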
## stringifyInfo()

```ts
export function stringifyInfo(value: any, replacer?: Replacer, space?: Space): StringifyInfoResult;
export function stringifyInfo(value: any, options?: StringifyInfoOptions): StringifyInfoResult;

type StringifyInfoOptions = {
    replacer?: Replacer;
    space?: Space;
    continueOnCircular?: boolean;
};
type StringifyInfoResult = {
    bytes: number;      // size of JSON in bytes
    spaceBytes: number; // size of white spaces in bytes (when the space option is used)
    circular: object[]; // list of circular references
};
```
Functions like JSON.stringify(), but returns an object with the expected overall size of the stringify operation and a list of circular references.
Example:
```js
import { stringifyInfo } from '@discoveryjs/json-ext';

console.log(stringifyInfo({ test: true }, null, 4));
// {
//   bytes: 20,      // Buffer.byteLength('{\n    "test": true\n}')
//   spaceBytes: 7,
//   circular: []
// }
```
### continueOnCircular

Type: Boolean
Default: false

Determines whether to continue collecting info for a value when a circular reference is found. Setting this option to true allows finding all circular references.
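Conceptually, circular-reference detection comes down to tracking the objects on the current traversal path; a simplified sketch of the idea (findCircular is a hypothetical name, and this is not the library's implementation):

```javascript
// Simplified sketch of circular-reference detection: keep the set of
// objects on the current traversal path; revisiting one means a cycle.
// Shared (non-circular) references are not flagged, since they are not
// on the path when revisited.
function findCircular(value, path = new Set(), found = []) {
    if (value === null || typeof value !== 'object') {
        return found;
    }

    if (path.has(value)) {
        found.push(value); // circular reference found
        return found;      // stop descending into the cycle
    }

    path.add(value);
    for (const key of Object.keys(value)) {
        findCircular(value[key], path, found);
    }
    path.delete(value);

    return found;
}
```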
## parseFromWebStream()

A helper function to consume JSON from a Web Stream. You can use parseChunked(stream) instead, but @@asyncIterator on ReadableStream has limited support in browsers (see the ReadableStream compatibility table).
```js
import { parseFromWebStream } from '@discoveryjs/json-ext';

const data = await parseFromWebStream(readableStream);
// equivalent to (when ReadableStream[@@asyncIterator] is supported):
// await parseChunked(readableStream);
```
## createStringifyWebStream()

A helper function to convert stringifyChunked() into a ReadableStream (Web Stream). You can use ReadableStream.from() instead, but this method has limited support in browsers (see the ReadableStream.from() compatibility table).
```js
import { createStringifyWebStream } from '@discoveryjs/json-ext';

createStringifyWebStream({ test: true });
// equivalent to (when ReadableStream.from() is supported):
// ReadableStream.from(stringifyChunked({ test: true }))
```
## License

MIT
No vulnerabilities found.

OpenSSF Scorecard (last scanned on 2025-05-05):

- no dangerous workflow patterns detected
- no binaries found in the repo
- license file detected
- 4 existing vulnerabilities detected
- dependency not pinned by hash detected -- score normalized to 4
- 2 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 1
- Found 1/30 approved changesets -- score normalized to 0
- detected GitHub workflow tokens with excessive permissions
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- project is not fuzzed
- branch protection not enabled on development/release branches
- SAST tool is not run on all commits -- score normalized to 0

The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.