
fast-csv-delims

v0.0.211

Published

CSV parser for node.js. Doug Martin is the original author. This is merely a temporary fork to support alternate cell delimiters. Please support Mr Martin and his awesome work by using his module.

Downloads

7

Readme


Fast-csv

This library is aimed at providing fast CSV parsing. It accomplishes this by not handling some of the more complex edge cases, such as multi-line rows. However, it does support escaped values, embedded commas, and double and single quotes.

Installation

npm install fast-csv

Usage

To parse a file.

var csv = require("fast-csv");

csv("my.csv")
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log("done");
 })
 .parse();
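
This fork's stated purpose is to support alternate cell delimiters, though the README does not document how to select one. Below is a minimal sketch assuming the fork accepts a delimiter option; both the option name and the package name in the require are assumptions, so verify them against this fork's source before relying on them.

var csv = require("fast-csv-delims"); //assumed package name for this fork

//Hypothetical: parse a tab-delimited file by passing the delimiter to use.
csv("my.tsv", {delimiter: "\t"})
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log("done");
 })
 .parse();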

You may also parse a stream.

var fs = require("fs");

var stream = fs.createReadStream("my.csv");

csv(stream)
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log("done");
 })
 .parse();

If you expect the first line of your csv to be headers, you may pass in a headers option. Setting the headers option will cause each row to be emitted as an object rather than an array.

var stream = fs.createReadStream("my.csv");

csv(stream, {headers : true})
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log("done");
 })
 .parse();
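
For instance, given a my.csv whose contents were (hypothetically):

firstName,lastName,address
bob,smith,123 main st

each data event would receive an object such as { firstName: "bob", lastName: "smith", address: "123 main st" } instead of the array ["bob", "smith", "123 main st"].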

You may alternatively pass an array of header names, which must match the order of the columns in the csv; otherwise the data will not line up with the correct headers.

var stream = fs.createReadStream("my.csv");

csv(stream, {headers : ["firstName", "lastName", "address"]})
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log("done");
 })
 .parse();

If your data may include empty rows (the sort Excel might append to the end of a file, for instance), you can skip them by including the ignoreEmpty option.

Any rows consisting of nothing but empty strings and/or commas will be skipped, without emitting a 'data' or 'error' event.

var stream = fs.createReadStream("my.csv");

csv(stream, {ignoreEmpty: true})
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log("done");
 })
 .parse();

Validating

You can validate each row in the csv by providing a validate handler. If a row is invalid, a data-invalid event will be emitted with the row and its index.

var stream = fs.createReadStream("my.csv");

csv(stream, {headers : true})
 .validate(function(data){
     return data.age < 50; //all persons must be under the age of 50
 })
 .on("data-invalid", function(data){
     //do something with invalid row
 })
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log("done");
 })
 .parse();
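
Since the data-invalid event provides the rejected row, one common pattern is to collect those rows so they can be reported once parsing finishes. A minimal sketch, reusing the same age check:

var fs = require("fs");
var csv = require("fast-csv");

var invalid = [];
var stream = fs.createReadStream("my.csv");

csv(stream, {headers : true})
 .validate(function(data){
     return data.age < 50; //all persons must be under the age of 50
 })
 .on("data-invalid", function(data){
     invalid.push(data); //keep the rejected row for reporting later
 })
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log(invalid.length + " invalid rows");
 })
 .parse();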

Transforming

You can transform data by providing a transform function. Whatever is returned from the transform function will be provided to validate and emitted as a row.

var stream = fs.createReadStream("my.csv");

csv(stream)
 .transform(function(data){
     return data.reverse(); //reverse each row.
 })
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log("done");
 })
 .parse();
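
Because whatever the transform function returns becomes the row that is validated and emitted, it can also be used for per-cell cleanup. For example, a sketch that trims surrounding whitespace from every cell of an array row:

var fs = require("fs");
var csv = require("fast-csv");

var stream = fs.createReadStream("my.csv");

csv(stream)
 .transform(function(data){
     //return a new row with whitespace trimmed from each cell
     return data.map(function(cell){
         return cell.trim();
     });
 })
 .on("data", function(data){
     console.log(data);
 })
 .on("end", function(){
     console.log("done");
 })
 .parse();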

License

MIT https://github.com/C2FO/fast-csv/raw/master/LICENSE

Meta