
elastic-import

v1.3.0 · Published

CLI for importing data into Elasticsearch

Downloads: 5

Readme

elastic-import

A lightweight tool for importing data into Elasticsearch

Install

npm install elastic-import -g

Usage from CLI

You can see all the options by running elastic-import --help

Usage: elastic-import [options] <file> <host> <index> <type>

  Imports a file from different source formats

  Options:

    -h, --help                   output usage information
    -V, --version                output the version number
    -l, --log <level>            Elasticsearch log level. One of 'trace', 'debug', 'info', 'warn', 'error'. Default is 'info'
    -b, --bulk-size <size>       Records sent to the Elasticsearch server in each request. Default is 1000
    -i, --ignore <fields>        Comma-separated fields that will be ignored. You can use 'field.sub', 'field.sub[0].other' or 'field.sub[*].other'
    -w, --warn-errors            Warn on errors instead of killing the process
    -t, --transform-file <file>  Path to a file that exports a function to transform the object
    -f, --fields <fields>        Comma-separated names of the fields for CSV import
    -h, --header-fields          Try to use the first line to parse the name of the fields for CSV import
    -d, --delimiter <delimiter>  Field delimiter for CSV import. Defaults to comma. For tab use 'tab'
    -q, --quote <quote>          Character surrounding the fields for CSV import. Defaults to nothing
    -p, --csvParse               Parser will attempt to convert read data types to native types when using CSV import
    -T  --timeout <timeout>      Milliseconds before an Elastic request will be aborted and retried. Default is 30000
    --mongo                      Imports from mongo-export file
    --json                       Imports from json file
    --csv                        Imports from csv file

Transform function

You can use a function to transform any record before it is submitted to Elasticsearch.

Here's an example:

'use strict'

module.exports = function (record) {
  record.myfield = record.myfield.toLowerCase()
}

The argument of the function is the original JSON object.

You can also return a new object instead of the original:

'use strict'

module.exports = function (record) {
  return {
    newField : record.oldField
  }
}

Examples

Import from a mongoexport JSON file

elastic-import ~/tmp/data.json localhost:9200 myindex mytype --mongo
Import from a JSON file, ignoring the field ignoreMe and all the ignoreMe fields inside the array myArray

elastic-import ~/tmp/data.json localhost:9200 myindex mytype -i ignoreMe,myArray[*].ignoreMe --json
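To make the -i patterns above concrete, here is a minimal sketch of the ignore semantics. This is an illustration, not elastic-import's actual implementation, and it only handles the two pattern shapes used in the example (a plain field name and the 'array[*].field' wildcard form):

```javascript
// Sketch of the ignore-pattern semantics (not elastic-import's actual code).
function applyIgnore (record, patterns) {
  patterns.split(',').forEach(function (pattern) {
    var wildcard = pattern.match(/^(\w+)\[\*\]\.(\w+)$/)
    if (wildcard) {
      // 'array[*].field' form: drop the field from every element
      var arr = record[wildcard[1]]
      if (Array.isArray(arr)) {
        arr.forEach(function (item) { delete item[wildcard[2]] })
      }
    } else {
      // plain field name: drop it from the top-level record
      delete record[pattern]
    }
  })
  return record
}

// Both ignoreMe fields are removed; keep and myArray[*].keep survive:
applyIgnore(
  { ignoreMe: 1, keep: 2, myArray: [{ ignoreMe: 3, keep: 4 }] },
  'ignoreMe,myArray[*].ignoreMe'
)
```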

Import from a CSV file, using the function in transform.js to transform the records

elastic-import ~/tmp/data.csv localhost:9200 myindex mytype -t transform.js --csv -h -p

Usage from another module

var Importer = require('elastic-import')

// The options mirror the CLI flags
var importer = new Importer({
  host: 'localhost:9200',
  index: 'myindex',
  type: 'mytype',
  log: 'info',
  ignore: 'ignoreMe',   // fields to drop before indexing
  warnErrors: false,    // false: errors kill the process
  transform: function (record) {
    record.text = record.text.toUpperCase()
  }
})

importer.import([{text: 'Hello world', ignoreMe: 'ignore'}], function (err, response) {
  // ...
})