concat-stream: writable stream that concatenates strings or binary data and calls a callback with the result
get-stream: Get a stream as a string, Buffer, ArrayBuffer or array
simple-concat: Super-minimalist version of `concat-stream`. Less than 15 lines!
@types/concat-stream: TypeScript definitions for concat-stream
npm install stream-concat
JavaScript (100%)
MIT License
25 Stars
33 Commits
5 Forks
1 Watchers
2 Branches
6 Contributors
Updated on Dec 24, 2024
Latest Version: 2.0.0
Package Id: stream-concat@2.0.0
Unpacked Size: 8.21 kB
Size: 3.35 kB
File Count: 4
NPM Version: 10.1.0
Node Version: 20.9.0
Published on: Apr 10, 2024
Simple and efficient node stream concatenation.
node-stream-concat concatenates several streams into one single readable stream. The input streams can either be existing streams or can be determined on the fly by a user-specified function. Because the library and tests use modern APIs, node-stream-concat supports Node LTS versions. Prior versions of the library (< 1.0.0) have been tested on Node versions v8.0.0 through v10.0.0, but should work with versions down to v0.12 (tests will fail on < 8.0.0 because of .destroy()).
npm install stream-concat
const StreamConcat = require('stream-concat');
const combinedStream = new StreamConcat(streams, [options]);
The simplest way to use StreamConcat is to supply an array of readable streams.
const fs = require('fs');
const StreamConcat = require('stream-concat');

const stream1 = fs.createReadStream('file1.csv');
const stream2 = fs.createReadStream('file2.csv');
const stream3 = fs.createReadStream('file3.csv');

const output = fs.createWriteStream('combined.csv');

const combinedStream = new StreamConcat([stream1, stream2, stream3]);
combinedStream.pipe(output);
However, when working with large amounts of data, this can lead to high memory usage and relatively poor performance (versus the original stream). This is because all streams' read queues are buffered and waiting to be read.
A better way is to defer opening a new stream until the moment it's needed. You can do this by passing a function into the constructor that returns the next available stream, or null if there are no more streams.
If we're reading from several large files, we can do the following.
const fs = require('fs');
const StreamConcat = require('stream-concat');

const fileNames = ['file1.csv', 'file2.csv', 'file3.csv'];
let fileIndex = 0;
const nextStream = () => {
  if (fileIndex === fileNames.length) {
    return null;
  }
  return fs.createReadStream(fileNames[fileIndex++]);
};

const combinedStream = new StreamConcat(nextStream);
Once StreamConcat is done with a stream it'll call nextStream and start using the returned stream (if not null).
Additionally, the function you pass to the constructor can return a Promise that resolves to a stream. If the function fails, its error will be forwarded in an error event in the outer StreamConcat instance.
const fs = require('fs');
const StreamConcat = require('stream-concat');

const fileNames = ['file1.csv', 'file2.csv', 'file3.csv'];
let fileIndex = 0;
const nextStreamAsync = () => {
  return new Promise((resolve) => {
    if (fileIndex === fileNames.length) {
      return resolve(null);
    }
    resolve(fs.createReadStream(fileNames[fileIndex++]));
  });
};

const combinedStream = new StreamConcat(nextStreamAsync);
Errors emitted in the provided streams will also be forwarded to the outer StreamConcat instance:
const stream = require('stream');
const StreamConcat = require('stream-concat');

const fileIndex = 0;
const nextStream = () => {
  if (fileIndex === 3) {
    return null;
  }
  return new stream.Readable({
    read() { throw new Error('Read failed'); }
  }).once('error', e => console.log('Got inner error: ', e));
};

const combinedStream = new StreamConcat(nextStream);
// will be called with the same "Read failed" error
combinedStream.once('error', e => console.log('Got outer error: ', e));
The following options are standard Stream options passed to the underlying Transform stream.
- highWaterMark (Number): The maximum number of bytes to store in the internal buffer before ceasing to read from the underlying resource. Default: 16kb
- encoding (String): If specified, buffers will be decoded to strings using the specified encoding. Default: null
- objectMode (Boolean): Whether this stream should behave as a stream of objects, meaning that stream.read(n) returns a single value instead of a Buffer of size n. Default: false

Additional options:

- advanceOnClose (Boolean): Controls whether the concatenation should move on to the next stream when the underlying stream emits a close event; useful when operating on Transform streams and calling destroy() on them to skip the remaining data (supported on Node >= 8). Default: false
If you've created the StreamConcat object from an array of streams, you can use addStream() as long as the last stream hasn't finished being read (StreamConcat hasn't emitted the end event).
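A minimal sketch of that, assuming addStream() takes the stream to append (the file names are placeholders):

const fs = require('fs');
const StreamConcat = require('stream-concat');

const combinedStream = new StreamConcat([
  fs.createReadStream('file1.csv'),
  fs.createReadStream('file2.csv'),
]);

// Assumption: addStream() accepts the readable stream to append.
// It must be called before the combined stream has emitted 'end'.
combinedStream.addStream(fs.createReadStream('file3.csv'));

combinedStream.pipe(fs.createWriteStream('combined.csv'));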
To add streams to a StreamConcat object created from a function, you should modify the underlying data that the function is accessing.
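For instance, with a nextStream function shaped like the earlier file-based example, a sketch of adding more work is simply to extend the array the function reads from (the file names are placeholders):

const fs = require('fs');
const StreamConcat = require('stream-concat');

// Same shape as the earlier example: nextStream() walks fileNames lazily.
const fileNames = ['file1.csv', 'file2.csv'];
let fileIndex = 0;
const nextStream = () =>
  fileIndex === fileNames.length ? null : fs.createReadStream(fileNames[fileIndex++]);

const combinedStream = new StreamConcat(nextStream);

// Because nextStream() reads fileNames lazily, pushing another name
// before the function has returned null adds that file to the output.
fileNames.push('file3.csv');

combinedStream.pipe(fs.createWriteStream('combined.csv'));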
npm run test
No vulnerabilities found.
Reasons:
- no binaries found in the repo
- 0 existing vulnerabilities detected
- license file detected
- Found 5/17 approved changesets -- score normalized to 2
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- project is not fuzzed
- branch protection not enabled on development/release branches
- SAST tool is not run on all commits -- score normalized to 0
Last Scanned on 2025-06-30
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.