Gathering detailed insights and metrics for @json2csv/node
npm install @json2csv/node
Supply Chain: 99.8
Quality: 99.5
Maintenance: 76
Vulnerability: 100
License: 100

Languages: TypeScript (94.51%), JavaScript (5.49%)
Total Downloads: 5,758,069
Last Day: 16,624
Last Week: 87,614
Last Month: 380,913
Last Year: 3,658,752
MIT License
331 Stars
685 Commits
33 Forks
5 Watchers
4 Branches
61 Contributors
Updated on May 09, 2025
Latest Version: 7.0.6
Package Id: @json2csv/node@7.0.6
Unpacked Size: 114.45 kB
Size: 31.64 kB
File Count: 34
NPM Version: 10.2.4
Node Version: 20.11.0
Published on: Feb 11, 2024
Cumulative downloads:

Last Day: 16,624 (28.5% compared to previous day)
Last Week: 87,614 (2.3% compared to previous week)
Last Month: 380,913 (-5.2% compared to previous month)
Last Year: 3,658,752 (93.1% compared to previous year)
Fast and highly configurable JSON to CSV converter. It fully supports conversion following the RFC 4180 specification, as well as other similar text-delimited formats such as TSV.

@json2csv/node exposes two modules to integrate json2csv with the Node.js Stream API for stream processing of JSON data.
This package includes two modules:

- Node Transform: a Node.js Transform Stream that converts incoming JSON into CSV records.
- Node Async Parser: wraps the Node Transform to offer a friendly promise-based API.

There are multiple flavours of json2csv:

- Parser API and a new StreamParser API which does the conversion in a streaming fashion in pure js.
- Node Transform and Node Async Parser APIs for Node users.
- WHATWG Transform Stream and WHATWG Async Parser APIs for users of WHATWG streams (browser, Node or Deno).
- CLI interface.

And a couple of libraries that enable additional configurations:

- transforms for json2csv (unwind and flatten), allowing the user to transform data before it is parsed.
- formatters for json2csv (one for each data type, an excel-specific one, etc.). Formatters convert JSON data types into CSV-compatible strings.

You can install json2csv as a dependency using NPM.
$ npm install --save @json2csv/node
You can install json2csv as a dependency using Yarn.
$ yarn add @json2csv/node
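The transforms library mentioned above can reshape records before conversion. A minimal sketch using the Node Async Parser described below, assuming the companion @json2csv/transforms package (part of the same json2csv monorepo) is also installed; the sample data and field names are hypothetical:

import { AsyncParser } from '@json2csv/node';
import { flatten, unwind } from '@json2csv/transforms';

const data = [
  { model: 'Audi A4', colors: ['blue', 'black'], specs: { doors: 4 } },
];

// unwind emits one row per array element; flatten turns nested objects
// into dot-separated columns (e.g. specs.doors). Transforms run in order.
const parser = new AsyncParser({
  transforms: [unwind({ paths: ['colors'] }), flatten()],
});

const csv = await parser.parse(data).promise();
console.log(csv);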
For Node.js users, the Streaming API is wrapped in a Node.js Stream Transform. This approach ensures a consistent memory footprint and avoids blocking JavaScript's event loop.

The async API takes a second options argument that is directly passed to the underlying streams. It accepts the same options as the standard Node.js streams, plus the options supported by the Stream Parser.

This Transform uses the StreamParser under the hood and supports similar events.
import { createReadStream, createWriteStream } from 'fs';
import { Transform } from '@json2csv/node';

const input = createReadStream(inputPath, { encoding: 'utf8' });
const output = createWriteStream(outputPath, { encoding: 'utf8' });

const opts = {};
const transformOpts = {};
const asyncOpts = {};
const parser = new Transform(opts, asyncOpts, transformOpts);

const processor = input.pipe(parser).pipe(output);

// You can also listen for events on the conversion and see how the header or the lines are coming out.
parser
  .on('header', (header) => console.log(header))
  .on('line', (line) => console.log(line));
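Since .pipe() does not forward errors between streams, a hedged variant of the same flow can use Node's stream/promises pipeline (available since Node 15) to await completion and surface failures; inputPath and outputPath are assumed from the example above:

import { createReadStream, createWriteStream } from 'fs';
import { pipeline } from 'stream/promises';
import { Transform } from '@json2csv/node';

// Same conversion as above, but pipeline() propagates errors from any
// stage and resolves once the CSV has been fully written.
try {
  await pipeline(
    createReadStream(inputPath, { encoding: 'utf8' }),
    new Transform({}),
    createWriteStream(outputPath, { encoding: 'utf8' }),
  );
} catch (err) {
  console.error('Conversion failed:', err);
}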
- ndjson <Boolean>: indicates that the data is in NDJSON format. Only effective when using the streaming API and not in object mode.
- fields <DataSelector[]>: defaults to top-level JSON attributes.
- transforms <Transform[]>: array of transforms to apply to the data. A transform is a function that receives a data record and returns a transformed record. Transforms are executed in order.
- formatters <Formatters>: object where each key is a Javascript data type and its associated value is a formatter for the given type.
- defaultValue <Any>: value to use when data is missing. Defaults to <empty> if not specified. (Overridden by fields[].default.)
- delimiter <String>: column delimiter. Defaults to , if not specified.
- eol <String>: overrides the default OS line ending (i.e. \n on Unix and \r\n on Windows).
- header <Boolean>: determines whether or not the CSV will contain a header row. Defaults to true if not specified.
- includeEmptyRows <Boolean>: includes empty rows. Defaults to false.
- withBOM <Boolean>: prepends a BOM character. Defaults to false.

See the Duplex stream options for more details.
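A minimal sketch combining several of these options; the field names and sample record are hypothetical:

import { Readable } from 'stream';
import { Transform } from '@json2csv/node';

// "price" is absent from the record, so defaultValue fills that column
// with "N/A"; columns are separated with ";" instead of ",".
const opts = {
  fields: ['make', 'model', 'price'],
  defaultValue: 'N/A',
  delimiter: ';',
};

const input = Readable.from(JSON.stringify([{ make: 'Audi', model: 'A4' }]));
input.pipe(new Transform(opts)).pipe(process.stdout);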
Options used by the underlying parsing library to process the binary or text stream. Not relevant when running in objectMode. Buffering is only relevant if you expect very large strings/numbers in your JSON. See @streamparser/json for more details about buffering.

- stringBufferSize <number>: size of the buffer used to parse strings. Defaults to 0, which means don't buffer. Min valid value is 4.
- numberBufferSize <number>: size of the buffer used to parse numbers. Defaults to 0, which means don't buffer.

See https://juanjodiaz.github.io/json2csv/#/parsers/node-transform.
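For example, the ndjson option described above lets the Transform consume newline-delimited JSON. A small sketch with inline sample data:

import { Readable } from 'stream';
import { Transform } from '@json2csv/node';

// Two NDJSON records, one JSON object per line.
const ndjson = '{"car":"Audi","price":40000}\n{"car":"BMW","price":35000}\n';

const parser = new Transform({ ndjson: true });
Readable.from(ndjson).pipe(parser).pipe(process.stdout);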
To facilitate usage, NodeAsyncParser wraps NodeTransform, exposing a single parse method similar to the sync API. This method accepts JSON arrays/objects, TypedArrays, strings and readable streams as input and returns a stream that produces the CSV.

NodeAsyncParser also exposes a convenience promise method which turns the stream into a promise that resolves to the whole CSV.
import { AsyncParser } from '@json2csv/node';

const opts = {};
const transformOpts = {};
const asyncOpts = {};
const parser = new AsyncParser(opts, asyncOpts, transformOpts);

const csv = await parser.parse(data).promise();

// The parse method returns the transform stream,
// so data can be passed to a writable stream (a file, http request, etc.)
parser.parse(data).pipe(writableStream);
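To make that concrete, a self-contained sketch with hypothetical sample data:

import { AsyncParser } from '@json2csv/node';

const data = [
  { car: 'Audi', price: 40000 },
  { car: 'BMW', price: 35000 },
];

// With no explicit fields, the columns default to the top-level
// attributes of the records (car, price).
const parser = new AsyncParser();
const csv = await parser.parse(data).promise();
console.log(csv);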
- ndjson <Boolean>: indicates that the data is in NDJSON format. Only effective when using the streaming API and not in object mode.
- fields <DataSelector[]>: defaults to top-level JSON attributes.
- transforms <Transform[]>: array of transforms to apply to the data. A transform is a function that receives a data record and returns a transformed record. Transforms are executed in order.
- formatters <Formatters>: object where each key is a Javascript data type and its associated value is a formatter for the given type.
- defaultValue <Any>: value to use when data is missing. Defaults to <empty> if not specified. (Overridden by fields[].default.)
- delimiter <String>: column delimiter. Defaults to , if not specified.
- eol <String>: overrides the default OS line ending (i.e. \n on Unix and \r\n on Windows).
- header <Boolean>: determines whether or not the CSV will contain a header row. Defaults to true if not specified.
- includeEmptyRows <Boolean>: includes empty rows. Defaults to false.
- withBOM <Boolean>: prepends a BOM character. Defaults to false.

See the Duplex stream options for more details.
Options used by the underlying parsing library to process the binary or text stream. Not relevant when running in objectMode. Buffering is only relevant if you expect very large strings/numbers in your JSON. See @streamparser/json for more details about buffering.

- stringBufferSize <number>: size of the buffer used to parse strings. Defaults to 0, which means don't buffer. Min valid value is 4.
- numberBufferSize <number>: size of the buffer used to parse numbers. Defaults to 0, which means don't buffer.

See https://juanjodiaz.github.io/json2csv/#/parsers/node-async-parser.
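For instance, stringBufferSize can be passed via the async options (the second constructor argument, as in the examples above) when the JSON contains very large string values. A hedged sketch:

import { AsyncParser } from '@json2csv/node';

// Parse very large string values through a 64 KiB buffer instead of
// holding each complete string token in memory at once.
const asyncOpts = { stringBufferSize: 64 * 1024 };
const parser = new AsyncParser({}, asyncOpts);

const csv = await parser
  .parse('[{"note":"imagine a very large string here"}]')
  .promise();
console.log(csv);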
See LICENSE.md.
No security vulnerabilities found.