Blazing fast and Comprehensive CSV Parser for Node.JS / Browser / Command Line.
npm install csvtojson
2,024 stars · 385 commits · 272 forks · 35 watching · 21 branches · 55 contributors · updated on 19 Nov 2024
Languages: TypeScript (67.83%), JavaScript (32.16%), Batchfile (0.01%)
Downloads:

| Period | Downloads | Change vs previous period |
| --- | --- | --- |
| Last day | 166,719 | -2.2% |
| Last week | 919,825 | +1.2% |
| Last month | 3,780,415 | +8.6% |
| Last year | 42,953,226 | +10.6% |
The csvtojson module is a comprehensive Node.js CSV parser that converts CSV to JSON or column arrays. It can be used as a Node.js library, a command line tool, or in the browser.
A free online CSV-to-JSON conversion service is available, built on the latest csvtojson module.

csvtojson has released version 2.0.0.
For v1 documentation, open the v1 page. It is still possible to use v1 with csvtojson@2.0.0:

```js
// v1
const csvtojsonV1 = require("csvtojson/v1");
// v2 (both paths resolve to the same module)
const csvtojsonV2 = require("csvtojson");
const csvtojsonV2Alias = require("csvtojson/v2");
```
npm i --save csvtojson
```js
/** csv file
a,b,c
1,2,3
4,5,6
*/
const csvFilePath = '<path to csv file>'
const csv = require('csvtojson')
csv()
.fromFile(csvFilePath)
.then((jsonObj) => {
    console.log(jsonObj);
    /**
     * [
     *   {a:"1", b:"2", c:"3"},
     *   {a:"4", b:"5", c:"6"}
     * ]
     */
})

// Async / await usage
const jsonArray = await csv().fromFile(csvFilePath);
```
```js
/**
csvStr:
1,2,3
4,5,6
7,8,9
*/
const csv = require('csvtojson')
csv({
    noheader: true,
    output: "csv"
})
.fromString(csvStr)
.then((csvRow) => {
    console.log(csvRow) // => [["1","2","3"], ["4","5","6"], ["7","8","9"]]
})
```
```js
const request = require('request')
const csv = require('csvtojson')

csv()
.fromStream(request.get('http://mywebsite.com/mycsvfile.csv'))
.subscribe((json) => {
    return new Promise((resolve, reject) => {
        // long operation for each json e.g. transform / write into database.
    })
}, onError, onComplete);
```
```js
/**
csvStr:
a,b,c
1,2,3
4,5,6
*/

const csv = require('csvtojson')
csv({output: "line"})
.fromString(csvStr)
.subscribe((csvLine) => {
    // csvLine => "1,2,3" and "4,5,6"
})
```
```js
const csv = require('csvtojson');
const request = require('request');

const readStream = require('fs').createReadStream(csvFilePath);

const writeStream = request.put('http://mysite.com/obj.json');

readStream.pipe(csv()).pipe(writeStream);
```
For more detailed usage, please see the API section.
$ npm i -g csvtojson
$ csvtojson [options] <csv file path>
Convert csv file and save result to json file:
$ csvtojson source.csv > converted.json
Pipe in csv data:
$ cat ./source.csv | csvtojson > converted.json
Print Help:
$ csvtojson
`require('csvtojson')` returns a constructor function which takes two arguments:
```js
const csv = require('csvtojson')
const converter = csv(parserParameters, streamOptions)
```
Both arguments are optional.
For `streamOptions`, please read about stream options in the Node.js documentation.
`parserParameters` is a JSON object like:
```js
const converter = csv({
    noheader: true,
    trim: true,
})
```
The following parameters are supported:

- `checkType`: Whether to infer field types from cell values. Default: `false` (`true` if version < 1.1.4).
- `colParser`: A map of `{headName: <String | Function | ColParser>}`, e.g. `{field1:'number'}` will use the built-in number parser to convert values of the `field1` column to numbers. For more information see the Column Parser section below.
- `alwaysSplitAtEOL`: Always interpret each `eol` (like `\n`) as a row. This will prevent `eol` characters from being used within a row (even inside a quoted field). Default is `false`. Change to `true` if you are confident there are no inline line breaks (like a line break in a cell that has multi-line text).
- `needEmitAll`: The parser aggregates all results into an array that is delivered when `.then` is called (or `await` is used). If this is not desired, set this to `false`. Default is `true`.
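To illustrate why `alwaysSplitAtEOL` matters, here is a small sketch (not csvtojson's actual implementation) contrasting a naive `eol` split with a quote-aware one: a quoted cell may legitimately contain line breaks, and splitting on every `eol` would break it apart.

```javascript
// Sketch: split CSV text into rows while respecting quoted fields.
// A naive csvStr.split(eol) would cut through quoted multi-line cells.
function splitRows(csvStr, eol) {
  const rows = [];
  let current = '';
  let inQuotes = false;
  for (let i = 0; i < csvStr.length; i++) {
    const ch = csvStr[i];
    if (ch === '"') inQuotes = !inQuotes;
    if (!inQuotes && csvStr.startsWith(eol, i)) {
      rows.push(current);   // end of a logical row
      current = '';
      i += eol.length - 1;  // skip the rest of a multi-character eol
    } else {
      current += ch;
    }
  }
  if (current) rows.push(current);
  return rows;
}
```

With `alwaysSplitAtEOL:true`, csvtojson skips the quote-awareness and treats every `eol` as a row boundary, which is faster but only safe when cells never contain line breaks.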
All parameters can be used in the command line tool.

Since v2.0.0, asynchronous processing has been fully supported.
e.g. Process each JSON result asynchronously.
```js
csv().fromFile(csvFile)
.subscribe((json) => {
    return new Promise((resolve, reject) => {
        // Async operation on the json
        // don't forget to call resolve and reject
    })
})
```
For more details, please read the API documentation.
The `Converter` class defines a series of events.
The `header` event is emitted once per CSV file. It passes an array containing the names from the header row.
```js
const csv = require('csvtojson')
csv()
.on('header', (header) => {
    // header => [header1, header2, header3]
})
```
`header` is always an array of strings, without type conversion.
The `data` event is emitted for each parsed CSV line. It passes a buffer of stringified JSON in ndjson format, unless `objectMode` is set to `true` in the stream options.
```js
const csv = require('csvtojson')
csv()
.on('data', (data) => {
    // data is a buffer object
    const jsonStr = data.toString('utf8')
})
```
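Since each `data` payload is ndjson (one JSON document per line), a consumer can recover plain objects with a small helper like this sketch (`parseNdjson` is an illustrative name, not part of csvtojson). Note the sketch assumes the chunk contains complete lines; real stream chunks may split a line across two events.

```javascript
// Sketch: convert an ndjson buffer/chunk back into an array of objects.
// Each non-empty line is parsed as an independent JSON document.
function parseNdjson(chunk) {
  return chunk
    .toString('utf8')
    .split('\n')
    .filter((line) => line.trim() !== '')
    .map((line) => JSON.parse(line));
}
```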
The `error` event is emitted if any error happens during parsing.
```js
const csv = require('csvtojson')
csv()
.on('error', (err) => {
    console.log(err)
})
```
Note that if an `error` event is emitted, the process will stop, as Node.js will automatically `unpipe()` the upstream and chained downstream. This causes the `end` event to never be emitted, because `end` is only emitted when all data has been consumed. If you need to know when parsing has finished, use the `done` event instead of `end`.
The `done` event is emitted either after parsing finishes successfully or after any error. It indicates that the processor has stopped.
```js
const csv = require('csvtojson')
csv()
.on('done', (error) => {
    // do some stuff
})
```
If any error occurred during parsing, it will be passed to the callback.
The `preRawData` hook is called with the raw CSV string before it is passed to the parser.
```js
const csv = require('csvtojson')
// synchronous
csv()
.preRawData((csvRawData) => {
    var newData = csvRawData.replace('some value', 'another value');
    return newData;
})

// asynchronous
csv()
.preRawData((csvRawData) => {
    return new Promise((resolve, reject) => {
        var newData = csvRawData.replace('some value', 'another value');
        resolve(newData);
    })
})
```
The `preFileLine` hook is called each time a file line has been parsed in the CSV stream. `lineIdx` is the line number in the file, starting from 0.
```js
const csv = require('csvtojson')
// synchronous
csv()
.preFileLine((fileLineString, lineIdx) => {
    if (lineIdx === 2) {
        return fileLineString.replace('some value', 'another value')
    }
    return fileLineString
})

// asynchronous
csv()
.preFileLine((fileLineString, lineIdx) => {
    return new Promise((resolve, reject) => {
        // async function processing the data.
        // call resolve(newFileLineString) when done
    })
})
```
To transform the result that is sent downstream, use the `.subscribe` method, which is called for each populated JSON object.
```js
const csv = require('csvtojson')
csv()
.subscribe((jsonObj, index) => {
    jsonObj.myNewKey = 'some value'
    // OR asynchronously
    return new Promise((resolve, reject) => {
        jsonObj.myNewKey = 'some value';
        resolve();
    })
})
.on('data', (jsonObj) => {
    console.log(jsonObj.myNewKey) // some value
});
```
csvtojson is able to convert a CSV line to nested JSON by correctly defining the CSV header row. This is a default, out-of-the-box feature.
Here is an example. Original CSV:
```csv
fieldA.title, fieldA.children.0.name, fieldA.children.0.id,fieldA.children.1.name, fieldA.children.1.employee.0.name,fieldA.children.1.employee.1.name, fieldA.address.0,fieldA.address.1, description
Food Factory, Oscar, 0023, Tikka, Tim, Joe, 3 Lame Road, Grantstown, A fresh new food factory
Kindom Garden, Ceil, 54, Pillow, Amst, Tom, 24 Shaker Street, HelloTown, Awesome castle
```
The data above contains nested JSON, including nested arrays of JSON objects and plain text. Using csvtojson to convert it, the result would be:
```json
[{
    "fieldA": {
        "title": "Food Factory",
        "children": [{
            "name": "Oscar",
            "id": "0023"
        }, {
            "name": "Tikka",
            "employee": [{
                "name": "Tim"
            }, {
                "name": "Joe"
            }]
        }],
        "address": ["3 Lame Road", "Grantstown"]
    },
    "description": "A fresh new food factory"
}, {
    "fieldA": {
        "title": "Kindom Garden",
        "children": [{
            "name": "Ceil",
            "id": "54"
        }, {
            "name": "Pillow",
            "employee": [{
                "name": "Amst"
            }, {
                "name": "Tom"
            }]
        }],
        "address": ["24 Shaker Street", "HelloTown"]
    },
    "description": "Awesome castle"
}]
```
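The header-to-nested-JSON mapping can be sketched as a path-setting helper (an illustration of the idea, not csvtojson's internal code): dots separate keys, and numeric segments produce array indices.

```javascript
// Sketch: expand a dotted header path like "fieldA.children.0.name" into
// nested objects/arrays on a result row.
function setPath(obj, path, value) {
  const keys = path.split('.');
  let cur = obj;
  for (let i = 0; i < keys.length - 1; i++) {
    const key = keys[i];
    // a numeric next segment means the current key holds an array
    const nextIsIndex = /^\d+$/.test(keys[i + 1]);
    if (cur[key] === undefined) {
      cur[key] = nextIsIndex ? [] : {};
    }
    cur = cur[key];
  }
  cur[keys[keys.length - 1]] = value;
  return obj;
}
```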
To avoid producing nested JSON, simply set `flatKeys:true` in the parameters.
```js
/**
csvStr:
a.b,a.c
1,2
*/
csv({flatKeys: true})
.fromString(csvStr)
.subscribe((jsonObj) => {
    // {"a.b":1,"a.c":2} rather than {"a":{"b":1,"c":2}}
});
```
csvtojson uses the CSV header row to generate JSON keys. However, it does not require the CSV source to contain a header row. There are four ways to define header rows:
1. The first row of the CSV source (the default).
2. `headers:[]` with `noheader:false`: replaces the header row (first row) of the original source.
3. `headers:[]` with `noheader:true`: the source has no header row, and the supplied headers are used.
4. `noheader:true` alone: this will automatically add `field1`, `field2`, ... `fieldN` headers to the CSV cells.

```js
// replace header row (first row) from original source with 'header1, header2'
csv({
    noheader: false,
    headers: ['header1','header2']
})

// original source has no header row. add 'field1' 'field2' ... 'fieldN' as csv header
csv({
    noheader: true
})

// original source has no header row. use 'header1' 'header2' as its header row
csv({
    noheader: true,
    headers: ['header1','header2']
})
```
`Column Parser` allows writing a custom parser for a column in CSV data.
What is a Column Parser? When csvtojson walks through CSV data, it converts the value in each cell to something else. For example, if `checkType` is `true`, csvtojson will attempt to find a proper type parser according to the cell value. That is, if a cell value is "5", a `numberParser` will be used, and all values under that column will be transformed by the `numberParser`.
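The `checkType` behavior described above can be sketched as a simple inference rule (an illustration, not csvtojson's exact logic, which also handles booleans and JSON): if a cell parses as a finite number, emit a number; otherwise keep the string.

```javascript
// Sketch: checkType-style inference for a single cell value.
function inferCell(value) {
  const trimmed = value.trim();
  if (trimmed !== '' && Number.isFinite(Number(trimmed))) {
    return Number(trimmed);
  }
  return value;
}
```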
There are currently the following built-in parsers: `string`, `number`, and `omit`. These will override types inferred from the `checkType:true` parameter. More built-in parsers will be added as requested on the issues page.
Example:
```js
/* csv string
column1,column2
hello,1234
*/
csv({
    colParser: {
        "column1": "omit",
        "column2": "string",
    },
    checkType: true
})
.fromString(csvString)
.subscribe((jsonObj) => {
    // jsonObj: {column2:"1234"}
})
```
Sometimes developers want to define a custom parser. This is done by passing a function for a specific column in `colParser`.
Example:
```js
/* csv data
name, birthday
Joe, 1970-01-01
*/
csv({
    colParser: {
        "birthday": function(item, head, resultRow, row, colIdx) {
            /*
                item - "1970-01-01"
                head - "birthday"
                resultRow - {name:"Joe"}
                row - ["Joe","1970-01-01"]
                colIdx - 1
            */
            return new Date(item);
        }
    }
})
```
The above example converts the `birthday` column into a JS `Date` object. The returned value will be used in the resulting JSON object; returning `undefined` will leave the result JSON object unchanged.
It is also possible to mark a column as `flat`:
```js
/* csv string
person.comment,person.number
hello,1234
*/
csv({
    colParser: {
        "person.number": {
            flat: true,
            cellParser: "number" // string or a function
        }
    }
})
.fromString(csvString)
.subscribe((jsonObj) => {
    // jsonObj: {"person.number":1234,"person":{"comment":"hello"}}
})
```
Any type of donation and support is very much appreciated.
csvtojson follows the GitHub convention for contributions. Among other steps, run `npm test` locally before pushing code back. Thanks to all the contributors!
Thank you to all our backers! [Become a backer]
Thank you to all our sponsors! (please ask your company to also support this open source project by becoming a sponsor)
Using csvtojson in the browser is quite simple. There are two ways:
1. Embed the script directly into a script tag

There is a pre-built script located at `browser/csvtojson.min.js`. Simply include that file in a `script` tag in the `index.html` page:
```html
<script src="node_modules/csvtojson/browser/csvtojson.min.js"></script>
<!-- or use cdn -->
<script src="https://cdn.rawgit.com/Keyang/node-csvtojson/d41f44aa/browser/csvtojson.min.js"></script>
```
then use the global `csv` function:
```html
<script>
csv({
    output: "csv"
})
.fromString("a,b,c\n1,2,3")
.then(function(result){

})
</script>
```
2. Use webpack or browserify

If a module packager is preferred, simply `require("csvtojson")`:
```js
var csv = require("csvtojson");

// or with import
// import { csv } from "csvtojson";

// then use csv as normal. You'll need to load the CSV text first; this example uses Fetch:
// https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch
fetch('http://mywebsite.com/mycsvfile.csv')
    .then(response => response.text())
    .then(text => csv().fromString(text))
    .then(function(result){

    })
```
No vulnerabilities found.
OpenSSF Scorecard results (last scanned on 2024-11-25):

- Binary Artifacts: no binaries found in the repo
- License: license file detected
- Code Review: found 4/7 approved changesets (score normalized to 5)
- Maintained: 0 commits and 0 issue activity found in the last 90 days (score normalized to 0)
- CII Best Practices: no effort to earn an OpenSSF best practices badge detected
- Security Policy: security policy file not detected
- Fuzzing: project is not fuzzed
- Branch Protection: branch protection not enabled on development/release branches
- SAST: SAST tool is not run on all commits (score normalized to 0)
- Vulnerabilities: 74 existing vulnerabilities detected

The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.