csv-node
v1.4.1
A parser for CSV files
CSV Node
Overview
The library reads and manages CSV tables in a human-friendly way, automating reading and writing and serializing CSV rows into objects.
Summary
Features
- Read a CSV file and serialize the data to JavaScript/TypeScript objects;
- Write CSV tables;
- Automatic casting of numbers and booleans;
- Aliases for the columns of a CSV table;
- Skip rows of a CSV;
- Limit the number of rows;
- Map rows;
- Aggregate functions (max, min, avg and sum) on a column;
- Filter rows.
Next Features
- Join CSV tables.
Install
npm install csv-node
or
yarn add csv-node
Usage
First, import the CSVReader class from the csv-node module.
import { CSVReader } from "csv-node"
// or
const { CSVReader } = require("csv-node")
The csv-node module exports:

|name|description|
|:-----|:-----|
|AliasMap|An object for mapping column-name aliases|
|FilterFunction|A function for filtering rows in CSV files|
|PredicateFunction|A function for applying a predicate to rows in CSV files|
|CSVReadOptions|The options for reading a CSV|
|CSVWriterOptions|The options for writing a CSV|
|CSVReader|The class for reading CSV files|
|CSVWriter|The class for writing CSV files|
|CSVNotFound|Error thrown if the file does not exist|
Basic usage
// names.csv
name,age
Joh,19
Mary,20
Nicoll,21
Ju,18
Let's create a file index.js and a loadCsv function for your tests.
const { CSVReader } = require("csv-node")
async function loadCsv() {
// let's go...
}
loadCsv()
.then()
.catch(console.error)
Run node index.js in your shell to test it.
CSVReader
The examples below show how to read a CSV; they can be found in the examples folder.
First read
Use path to build absolute paths in Node.js.
JS
const path = require("path")
const { CSVReader } = require("csv-node")
const fileName = path.resolve(__dirname, "names.csv")
async function loadCsv() {
const reader = new CSVReader(fileName)
const data = await reader.read()
console.log(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVReader } from "csv-node"
const fileName = path.resolve(__dirname, "names.csv")
interface SimplePerson {
name: string
age: string
}
async function loadCsv() {
const reader = new CSVReader<SimplePerson>(fileName)
const data = await reader.read()
console.log(data) // data is of type SimplePerson[]
}
loadCsv()
.then()
.catch(console.error)
Output.
[
{ "name": "Joh", "age": "19" },
{ "name": "Mary", "age": "20" },
{ "name": "Nicoll", "age": "21" },
{ "name": "Ju", "age": "18" }
]
Even though age is a number, it is loaded as a string. You can use a map function or enable the castNumbers option of CSVReader to fix this.
Options
The second parameter of the CSVReader constructor is an options object. The available options are:
|name|description|type|required|default|
|:-----|:-----|:---:|:-----|:-----|
|alias|An object that renames columns|object|false|{}|
|skipLines|The number of lines to skip|number|false|0|
|limit|The maximum number of rows|number|false|Infinity|
|delimiter|Delimiter between columns|string|false|,|
|castNumbers|Automatically cast numbers|boolean|false|false|
|castBooleans|Automatically cast booleans|boolean|false|false|
|filter|Filters rows, like Array.filter|FilterFunction|false|none|
|map|Maps rows, like Array.map|MapFunction|false|none|
Options usage
Alias
You don't need to rename all headers of the CSV table.
JS
const path = require("path")
const { CSVReader } = require("csv-node")
const fileName = path.resolve(__dirname, "names.csv")
async function loadCsv() {
const reader = new CSVReader(fileName, {
alias: {
name: 'Name',
age: 'Age'
}
})
const data = await reader.read()
console.log(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVReader } from "csv-node"
const fileName = path.resolve(__dirname, "names.csv")
interface SimplePerson {
Name: string
Age: string
}
async function loadCsv() {
const reader = new CSVReader<SimplePerson>(fileName, {
alias: {
name: 'Name',
age: 'Age'
}
})
const data = await reader.read()
console.log(data) // data is of type SimplePerson[]
}
loadCsv()
.then()
.catch(console.error)
Output.
[
{ "Name": "Joh", "Age": "19" },
{ "Name": "Mary", "Age": "20" },
{ "Name": "Nicoll", "Age": "21" },
{ "Name": "Ju", "Age": "18" }
]
Skip Lines
This option skips the given number of lines, like OFFSET in SQL.
JS
const path = require("path")
const { CSVReader } = require("csv-node")
const fileName = path.resolve(__dirname, "names.csv")
async function loadCsv() {
const reader = new CSVReader(fileName, {
skipLines: 1
})
const data = await reader.read()
console.log(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVReader } from "csv-node"
const fileName = path.resolve(__dirname, "names.csv")
interface SimplePerson {
name: string
age: string
}
async function loadCsv() {
const reader = new CSVReader<SimplePerson>(fileName, {
skipLines: 1
})
const data = await reader.read()
console.log(data) // data is of type SimplePerson[]
}
loadCsv()
.then()
.catch(console.error)
Output.
[
{ "name": "Mary", "age": "20" },
{ "name": "Nicoll", "age": "21" },
{ "name": "Ju", "age": "18" }
]
Limit
This option limits the result size, like LIMIT in SQL.
JS
const path = require("path")
const { CSVReader } = require("csv-node")
const fileName = path.resolve(__dirname, "names.csv")
async function loadCsv() {
const reader = new CSVReader(fileName, {
limit: 2
})
const data = await reader.read()
console.log(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVReader } from "csv-node"
const fileName = path.resolve(__dirname, "names.csv")
interface SimplePerson {
name: string
age: string
}
async function loadCsv() {
const reader = new CSVReader<SimplePerson>(fileName, {
limit: 2
})
const data = await reader.read()
console.log(data) // data is of type SimplePerson[]
}
loadCsv()
.then()
.catch(console.error)
Output.
[
{ "name": "Joh", "age": "19" },
{ "name": "Mary", "age": "20" }
]
Delimiter
This is the delimiter between columns.
Filter
Filters the rows of the CSV. The callback is of type FilterFunction; this feature works like Array.filter.
JS
const path = require("path")
const { CSVReader } = require("csv-node")
const fileName = path.resolve(__dirname, "names.csv")
async function loadCsv() {
const reader = new CSVReader(fileName, {
filter: (data) => data.age < 20
})
const data = await reader.read()
console.log(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVReader } from "csv-node"
const fileName = path.resolve(__dirname, "names.csv")
interface SimplePerson {
name: string
age: string
}
async function loadCsv() {
const reader = new CSVReader<SimplePerson>(fileName, {
// the `data` is of type SimplePerson
filter: (data) => Number(data.age) < 20
})
const data = await reader.read()
console.log(data) // data is of type SimplePerson[]
}
loadCsv()
.then()
.catch(console.error)
Output.
[
  { "name": "Joh", "age": "19" },
  { "name": "Ju", "age": "18" }
]
Map
This option maps each CSV row. The callback is of type MapFunction; this feature works like Array.map.
JS
const path = require("path")
const { CSVReader } = require("csv-node")
const fileName = path.resolve(__dirname, "names.csv")
async function loadCsv() {
const reader = new CSVReader(fileName, {
map: (data) => `${data.name}-${data.age}`
})
const data = await reader.read()
console.log(data)
}
loadCsv()
.then()
.catch(console.error)
Output.
[ "Joh-19", "Mary-20", "Nicoll-21", "Ju-18" ]
TS
import path from "path"
import { CSVReader } from "csv-node"
const fileName = path.resolve(__dirname, "names.csv")
interface SimplePerson {
name: string
age: string
}
interface Person {
name: string
age: number
}
async function loadCsv() {
const reader = new CSVReader<SimplePerson, Person>(fileName, {
// data is of type SimplePerson
map: (data) => ({
name: data.name,
age: Number(data.age)
})
})
const data = await reader.read()
console.log(data) // data is of type Person[]
}
loadCsv()
.then()
.catch(console.error)
Output.
[
{ "name": "Joh", "age": 19 },
{ "name": "Mary", "age": 20 },
{ "name": "Nicoll", "age": 21 },
{ "name": "Ju", "age": 18 }
]
Cast Numbers
Automatically casts numbers.
JS
const path = require("path")
const { CSVReader } = require("csv-node")
const fileName = path.resolve(__dirname, "names.csv")
async function loadCsv() {
const reader = new CSVReader(fileName, {
castNumbers: true
})
const data = await reader.read()
console.log(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVReader } from "csv-node"
const fileName = path.resolve(__dirname, "names.csv")
interface SimplePerson {
name: string
age: number
}
async function loadCsv() {
const reader = new CSVReader<SimplePerson>(fileName, {
castNumbers: true
})
const data = await reader.read()
console.log(data) // data is of type SimplePerson[]
}
loadCsv()
.then()
.catch(console.error)
Output.
[
{ "name": "Joh", "age": 19 },
{ "name": "Mary", "age": 20 },
{ "name": "Nicoll", "age": 21 },
{ "name": "Ju", "age": 18 }
]
Cast Booleans
Automatically casts booleans.
JS
const path = require("path")
const { CSVReader } = require("csv-node")
const fileName = path.resolve(__dirname, "todos.csv")
async function loadCsv() {
const reader = new CSVReader(fileName, {
castBooleans: true
})
const data = await reader.read()
console.log(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVReader } from "csv-node"
const fileName = path.resolve(__dirname, "todos.csv")
interface SimplePerson {
name: string
completed: boolean
}
async function loadCsv() {
const reader = new CSVReader<SimplePerson>(fileName, {
castBooleans: true
})
const data = await reader.read()
console.log(data) // data is of type SimplePerson[]
}
loadCsv()
.then()
.catch(console.error)
Output.
[
{ "name": "Todo 1", "completed": true },
{ "name": "Todo 2", "completed": true },
{ "name": "Todo 3", "completed": false },
{ "name": "Todo 4", "completed": true },
{ "name": "Todo 5", "completed": false }
]
The options can be combined.
Order in which the options are applied:
- Alias;
- Map;
- Skip Lines & Limit;
- Filter;
- Cast.
The filePath must be absolute; otherwise csv-node searches for the file starting from the root folder of the Node project.
CSVReader API
The CSVReader class provides the fields and methods below.
Fields
|name|description|type|
|:-----|:-----|:---:|
|headers|The header columns, with aliases applied|string[]|
|nativeHeaders|The original headers of the CSV table|string[]|
|data|The CSV data|T[]|
The data field is only available after calling the read function. The nativeHeaders and headers fields are available before calling any method.
Methods
|name|description|return|
|:-----|:-----|:---:|
|read()|Reads the CSV data|Promise<T[]>|
|min(column: string)|Returns the minimum value of a column|Promise<number \| undefined>|
|sum(column: string)|Returns the sum of a column|Promise<number \| undefined>|
|max(column: string)|Returns the maximum value of a column|Promise<number \| undefined>|
|avg(column: string)|Returns the average value of a column|Promise<number \| undefined>|
For testing, you can use the CSV Test file.
Read
The read function is already explained in Usage.
Min
The min function returns the minimum value of the column passed to min(column: string).
You can use the config options just like with the read() function.
async function loadCsv() {
const fileName = path.resolve(__dirname, "file3.csv")
const reader = new CSVReader(fileName)
const min = await reader.min("price")
console.log(min)
}
// 0.03
Max
The max function returns the maximum value of the column passed to max(column: string).
You can use the config options just like with the read() function.
async function loadCsv() {
const fileName = path.resolve(__dirname, "file3.csv")
const reader = new CSVReader(fileName)
const max = await reader.max("price")
console.log(max)
}
// 99.99
Avg
The avg function returns the average value of the column passed to avg(column: string).
You can use the config options just like with the read() function.
async function loadCsv() {
const fileName = path.resolve(__dirname, "file3.csv")
const reader = new CSVReader(fileName)
const avg = await reader.avg("price")
console.log(avg)
}
// 49.492769999999936
Sum
The sum function returns the sum of the column passed to sum(column: string).
You can use the config options just like with the read() function.
async function loadCsv() {
const fileName = path.resolve(__dirname, "file3.csv")
const reader = new CSVReader(fileName)
const sum = await reader.sum("price")
console.log(sum)
}
// 49492.76999999994
The read options can be used with these methods as well.
CSVWriter
Options
The second parameter of the CSVWriter constructor is an options object. The available options are:
|name|description|type|required|default|
|:-----|:-----|:---:|:-----|:-----|
|headers|An object that describes the columns|object|true|---|
|delimiter|Delimiter between columns|string|false|,|
|format|Functions for formatting columns|object|false|{}|
|defaultValue|An object with default values for empty columns|object|false|{}|
Options usage
Headers
You must provide the headers that will be written to the CSV file; you can rename the columns or not.
JS
const path = require("path")
const { CSVWriter } = require("csv-node")
const fileName = path.resolve(__dirname, "output.csv")
const data = [
{ name: 'David0', age: 18 },
{ name: 'David1', age: 18 },
{ name: 'David2', age: 18 },
{ name: 'David3', age: 18 },
{ name: 'David4', age: 18 }
]
async function loadCsv() {
const writer = new CSVWriter(fileName, {
headers: {
name: 'name',
age: 'age'
}
})
await writer.write(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVWriter } from "csv-node"
const fileName = path.resolve(__dirname, "output.csv")
interface Person {
name: string
age: number
}
const data: Person[] = [
{ name: 'David0', age: 18 },
{ name: 'David1', age: 18 },
{ name: 'David2', age: 18 },
{ name: 'David3', age: 18 },
{ name: 'David4', age: 18 }
]
async function loadCsv() {
const writer = new CSVWriter<Person>(fileName, {
headers: {
name: 'name',
age: 'age'
}
})
await writer.write(data)
}
loadCsv()
.then()
.catch(console.error)
Output.
name,age
David0,18
David1,18
David2,18
David3,18
David4,18
Delimiter
The delimiter between columns.
Format
This option applies a function to a column before saving. For example, if an object contains Dates, it may be useful to save only the time.
JS
const path = require("path")
const { CSVWriter } = require("csv-node")
const fileName = path.resolve(__dirname, "output.csv")
const data = [
{ name: 'David0', age: 18 },
{ name: 'David1', age: 18 },
{ name: 'David2', age: 18 },
{ name: 'David3', age: 18 },
{ name: 'David4', age: 18 }
]
async function loadCsv() {
const writer = new CSVWriter(fileName, {
headers: {
name: 'name',
age: 'age'
},
format: {
age: (age) => `${age} years`
}
})
await writer.write(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVWriter } from "csv-node"
const fileName = path.resolve(__dirname, "output.csv")
const data: Person[] = [
{ name: 'David0', age: 18 },
{ name: 'David1', age: 18 },
{ name: 'David2', age: 18 },
{ name: 'David3', age: 18 },
{ name: 'David4', age: 18 }
]
interface Person {
name: string
age: number
}
async function loadCsv() {
const writer = new CSVWriter<Person>(fileName, {
headers: {
name: 'name',
age: 'age'
},
format: {
age: (age) => `${age} years`
}
})
await writer.write(data)
}
loadCsv()
.then()
.catch(console.error)
Output.
name,age
David0,18 years
David1,18 years
David2,18 years
David3,18 years
David4,18 years
Default value
This option adds a fallback value when an object doesn't contain the column. The default value is NULL, but you can change it.
JS
const path = require("path")
const { CSVWriter } = require("csv-node")
const fileName = path.resolve(__dirname, "output.csv")
const data = [
{ name: 'David0' },
{ age: 18 },
{ name: 'David2', age: 18 },
{ name: 'David3'},
{ name: 'David4', age: 18 }
]
async function loadCsv() {
const writer = new CSVWriter(fileName, {
headers: {
name: 'name',
age: 'age'
},
defaultValue: {
name: 'None',
age: '0'
}
})
await writer.write(data)
}
loadCsv()
.then()
.catch(console.error)
TS
import path from "path"
import { CSVWriter } from "csv-node"
const fileName = path.resolve(__dirname, "output.csv")
const data = [
{ name: 'David0' },
{ age: 18 },
{ name: 'David2', age: 18 },
{ name: 'David3'},
{ name: 'David4', age: 18 }
]
interface Person {
name: string
age: number
}
async function loadCsv() {
const writer = new CSVWriter<Partial<Person>>(fileName, {
headers: {
name: 'name',
age: 'age'
},
defaultValue: {
name: 'None',
age: '0'
}
})
await writer.write(data)
}
loadCsv()
.then()
.catch(console.error)
Output.
name,age
David0,0
None,18
David2,18
David3,0
David4,18
The options can be combined.
Methods
|name|description|return|
|:-----|:-----|:---:|
|write(data: object[])|Writes the objects to the CSV|Promise<void>|