through2
Related packages:

- through2-filter: A through2 to create an Array.prototype.filter analog for streams.
- through2-map: A through2 to create an Array.prototype.map analog for streams.
- @types/through2: TypeScript definitions for through2.
- through2-concurrent: Like through2 except runs in parallel with limited concurrency.
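To give a feel for how the filter and map companions are used, here is a minimal sketch assuming the factory-style APIs those packages document (each takes a function and returns a Transform stream); the file names are illustrative only:

```js
const fs = require('fs')
const filter = require('through2-filter')
const map = require('through2-map')

fs.createReadStream('words.txt')
  // keep only chunks that contain the letter 'a'
  .pipe(filter((chunk) => chunk.toString().includes('a')))
  // upper-case every chunk that made it through the filter
  .pipe(map((chunk) => Buffer.from(chunk.toString().toUpperCase())))
  .pipe(fs.createWriteStream('out.txt'))
```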
Tiny wrapper around Node streams2 Transform to avoid explicit subclassing noise
npm install through2
Languages: JavaScript (100%)
Total Downloads: 8,780,979,795
MIT License · 1,900 Stars · 125 Commits · 105 Forks · 20 Watchers · 2 Branches · 20 Contributors
Updated on May 09, 2025
Latest Version: 4.0.2
Package Id: through2@4.0.2
Size: 3.92 kB
NPM Version: 6.14.5
Node Version: 14.4.0
Published on: Jun 30, 2020
Cumulative downloads:

Last Day: 4,973,479 (+13% compared to previous day)
Last Week: 31,672,285 (+9.4% compared to previous week)
Last Month: 132,246,044 (-3.1% compared to previous month)
Last Year: 1,539,527,428 (+0.3% compared to previous year)
A tiny wrapper around Node.js streams.Transform (Streams2/3) to avoid explicit subclassing noise.

Inspired by Dominic Tarr's through in that it's so much easier to make a stream out of a function than it is to set up the prototype chain properly: through(function (chunk) { ... }).
```js
const fs = require('fs')
const through2 = require('through2')

fs.createReadStream('ex.txt')
  .pipe(through2(function (chunk, enc, callback) {
    for (let i = 0; i < chunk.length; i++)
      if (chunk[i] == 97)
        chunk[i] = 122 // swap 'a' for 'z'

    this.push(chunk)

    callback()
  }))
  .pipe(fs.createWriteStream('out.txt'))
  .on('finish', () => doSomethingSpecial())
```
Or object streams:
```js
const all = []

fs.createReadStream('data.csv')
  .pipe(csv2())
  .pipe(through2.obj(function (chunk, enc, callback) {
    const data = {
      name: chunk[0],
      address: chunk[3],
      phone: chunk[10]
    }
    this.push(data)

    callback()
  }))
  .on('data', (data) => {
    all.push(data)
  })
  .on('end', () => {
    doSomethingSpecial(all)
  })
```
Note that through2.obj(fn) is a convenience wrapper around through2({ objectMode: true }, fn).
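In other words, these two calls construct equivalent object-mode transforms (a trivial sketch):

```js
const through2 = require('through2')

// Long form: pass objectMode explicitly.
const viaOptions = through2({ objectMode: true }, function (obj, enc, callback) {
  callback(null, obj)
})

// Short form: the .obj() convenience wrapper.
const viaObj = through2.obj(function (obj, enc, callback) {
  callback(null, obj)
})
```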
Since Node.js introduced Simplified Stream Construction, many uses of through2 have become redundant. Consider whether you really need to use through2 or just want to use the 'readable-stream' package, or the core 'stream' package (which is derived from 'readable-stream'):
```js
const { Transform } = require('readable-stream')

const transformer = new Transform({
  transform(chunk, enc, callback) {
    // ...
  }
})
```
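For comparison, here is a sketch of the earlier 'a'-to-'z' example written against the core 'stream' module alone, with no through2 dependency (the swapAToZ name is illustrative only):

```js
const fs = require('fs')
const { Transform } = require('stream')

// Same behaviour as the first through2 example above,
// using Simplified Stream Construction.
const swapAToZ = new Transform({
  transform(chunk, enc, callback) {
    for (let i = 0; i < chunk.length; i++)
      if (chunk[i] === 97) chunk[i] = 122 // swap 'a' (0x61) for 'z' (0x7a)
    callback(null, chunk)
  }
})

fs.createReadStream('ex.txt')
  .pipe(swapAToZ)
  .pipe(fs.createWriteStream('out.txt'))
```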
through2([ options, ] [ transformFunction ] [, flushFunction ])
Consult the stream.Transform documentation for the exact rules of the transformFunction (i.e. this._transform) and the optional flushFunction (i.e. this._flush).
The options argument is optional and is passed straight through to stream.Transform. So you can use objectMode: true if you are processing non-binary streams (or just use through2.obj()).
The options argument is first, unlike standard convention, because if I'm passing in an anonymous function then I'd prefer for the options argument to not get lost at the end of the call:
```js
fs.createReadStream('/tmp/important.dat')
  .pipe(through2({ objectMode: true, allowHalfOpen: false },
    (chunk, enc, cb) => {
      cb(null, 'wut?') // note we can use the second argument on the callback
                       // to provide data as an alternative to this.push('wut?')
    }
  ))
  .pipe(fs.createWriteStream('/tmp/wut.txt'))
```
The transformFunction must have the following signature: function (chunk, encoding, callback) {}. A minimal implementation should call the callback function to indicate that the transformation is done, even if that transformation means discarding the chunk.

To queue a new chunk, call this.push(chunk); this can be called as many times as required before the callback() if you have multiple pieces to send on.

Alternatively, you may use callback(err, chunk) as shorthand for emitting a single chunk or an error.
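Both styles are sketched below (the double and upper names are illustrative only):

```js
const through2 = require('through2')

// Push the same chunk twice, then signal that this chunk is done.
const double = through2(function (chunk, enc, callback) {
  this.push(chunk)
  this.push(chunk)
  callback()
})

// Or emit a single, modified chunk via the callback shorthand.
const upper = through2((chunk, enc, callback) => {
  callback(null, Buffer.from(chunk.toString().toUpperCase()))
})
```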
If you do not provide a transformFunction then you will get a simple pass-through stream.
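For instance, calling through2() with no arguments gives a stream that forwards every chunk unchanged (a minimal sketch; file names are illustrative):

```js
const fs = require('fs')
const through2 = require('through2')

// No transformFunction supplied, so this behaves like a pass-through stream.
fs.createReadStream('ex.txt')
  .pipe(through2())
  .pipe(fs.createWriteStream('copy.txt'))
```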
The optional flushFunction is provided as the last argument (2nd or 3rd, depending on whether you've supplied options) and is called just prior to the stream ending. It can be used to finish up any processing that may be in progress.
```js
fs.createReadStream('/tmp/important.dat')
  .pipe(through2(
    (chunk, enc, cb) => cb(null, chunk), // transform is a noop
    function (cb) { // flush function
      this.push('tacking on an extra buffer to the end');
      cb();
    }
  ))
  .pipe(fs.createWriteStream('/tmp/wut.txt'));
```
through2.ctor([ options, ] transformFunction[, flushFunction ])
Instead of returning a stream.Transform instance, through2.ctor() returns a constructor for a custom Transform. This is useful when you want to use the same transform logic in multiple instances.
```js
const FToC = through2.ctor({ objectMode: true }, function (record, encoding, callback) {
  if (record.temp != null && record.unit == "F") {
    record.temp = ((record.temp - 32) * 5) / 9
    record.unit = "C"
  }
  this.push(record)
  callback()
})

// Create instances of FToC like so:
const converter = new FToC()
// Or:
const converter2 = FToC()
// Or specify/override options when you instantiate, if you prefer:
const converter3 = FToC({ objectMode: true })
```
through2 is Copyright © Rod Vagg and additional contributors and licensed under the MIT license. All rights not explicitly granted in the MIT license are reserved. See the included LICENSE file for more details.
No vulnerabilities found.
OpenSSF Scorecard (last scanned on 2025-05-05):

- Binary Artifacts: no binaries found in the repo
- Dangerous Workflow: no dangerous workflow patterns detected
- Vulnerabilities: 0 existing vulnerabilities detected
- License: license file detected
- Code Review: found 5/26 approved changesets -- score normalized to 1
- Maintained: 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- Pinned Dependencies: dependency not pinned by hash detected -- score normalized to 0
- Token Permissions: detected GitHub workflow tokens with excessive permissions
- CII Best Practices: no effort to earn an OpenSSF best practices badge detected
- Security Policy: security policy file not detected
- Fuzzing: project is not fuzzed
- Branch Protection: branch protection not enabled on development/release branches
- SAST: SAST tool is not run on all commits -- score normalized to 0
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.