

@qfin/tools


This module is part of the Quantofin project.

The awesome @qfin/tools for Quantofin apps ❤

💾 Install

npm install --save @qfin/tools

⚙ Module documentation

base64.encode

Encode a string to base64

Type: module.base64.encode

Parameters

  • value string The string that needs to be encoded to base64

Examples

const { base64 } = require('@qfin/tools');

const encoded = base64.encode('Hello World');
// 'SGVsbG8gV29ybGQ='

Returns string the encoded value

base64.encodeObject

Encode an object to a base64 string by first serializing the object with JSON.stringify

Type: module.base64.encodeObject

Parameters

  • objectToEncode object The object to encode

Examples

const { base64 } = require('@qfin/tools');

const encoded = base64.encodeObject({ message: 'Hello World' });
// 'eyJtZXNzYWdlIjoiSGVsbG8gV29ybGQifQ=='

Returns string the encoded value

base64.decode

Decode a base64-encoded string to plain string data

Type: module.base64.decode

Parameters

  • encodedData string The base64-encoded string that needs to be decoded

Examples

const { base64 } = require('@qfin/tools');

const decoded = base64.decode('SGVsbG8gV29ybGQ=');
// 'Hello World'

Returns string the decoded value

base64.decodeAsObject

Decode a base64-encoded string into a JavaScript object by parsing the decoded string with JSON.parse

Type: module.base64.decodeAsObject

Parameters

  • encodedData string The base64-encoded string to decode

Examples

const { base64 } = require('@qfin/tools');

const decoded = base64.decodeAsObject('eyJtZXNzYWdlIjoiSGVsbG8gV29ybGQifQ==');
// { message: 'Hello World' }

Returns object the decoded value

BufferSplitter

The BufferSplitter class

This class helps transform a data stream into chunks split at a given delimiter

Type: module.BufferSplitter

Examples

const { BufferSplitter, transform } = require('@qfin/tools');

const lineSplitter = new BufferSplitter('\n');

// `headers` is the array of CSV column headers, defined elsewhere in your code
function transformer(lines) {
   return transform.csvToJsonTransformer(lines, headers);
}

// .... when new data comes in
const csvArrays = lineSplitter.transform(buffer, transformer);

// ... when the stream is flushed
const remainingCsvArrays = lineSplitter.finishTransform(transformer);

split

Split a given buffer by searching for the delimiting character

Parameters

  • buffer Buffer The buffer that has to be split
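
Examples

A minimal sketch of using split together with finishSplit, assuming split returns the data up to the last delimiter it finds and retains the trailing partial chunk internally until finishSplit is called (as the finishSplit description below suggests):

const { BufferSplitter } = require('@qfin/tools');

const splitter = new BufferSplitter('\n');

// returns the data up to the last '\n'; the trailing 'partial' is kept internally
const complete = splitter.split(Buffer.from('line1\nline2\npartial'));

// ... when the stream ends, collect whatever is left over ('partial' here)
const remainder = splitter.finishSplit();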

Returns Buffer The split buffer that can be safely read and parsed as required

finishSplit

Finish splitting the buffer and get the remaining chunk of data

Returns Buffer The remaining chunk of data that does not contain the delimiting character

transform

Transform a buffer using a transforming function

If a transforming function is not provided, the buffer is assumed to contain comma-separated values (CSV) delimited by the splitter's delimiter (defaults to newline), and the result is presented as an array of CSV data

Parameters

  • buffer Buffer The buffer that needs to be transformed
  • transformer function? The transforming function, should take a string parameter

Returns any

finishTransform

Finish the transform by calling this method. This runs the transforming function over the remaining buffer

Parameters

  • transformer function? The transforming function, should take a string parameter

Returns any

BufferSplitter.NewLineBufferSplitter

A BufferSplitter whose delimiter is the newline character \n

Type: module.BufferSplitter.NewLineBufferSplitter
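
Examples

A minimal sketch, assuming NewLineBufferSplitter is exposed directly on the BufferSplitter export and behaves like new BufferSplitter('\n'):

const { BufferSplitter } = require('@qfin/tools');

// `buffer` and `transformer` are defined elsewhere in your code
const csvArrays = BufferSplitter.NewLineBufferSplitter.transform(buffer, transformer);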

Returns BufferSplitter

dotObjectPath.getValue

Gets the object's value by a dot-separated key path

Type: module.dotObjectPath.getValue

Parameters

  • obj object The data object
  • keyPath string The dot separated key path

Examples

const { dotObjectPath } = require('@qfin/tools');

const obj = { a: { b: { c: 100 } } };

const val = dotObjectPath.getValue(obj, 'a.b');
// val = { c: 100 }

const nestedVal = dotObjectPath.getValue(obj, 'a.b.c');
// nestedVal = 100

Returns any The value at the path

dotObjectPath.setValue

Sets the value of the object at the given dot path

Type: module.dotObjectPath.setValue

Parameters

  • obj object The data object
  • keyPath string The dot separated key path
  • value any The value to be set at the path

Examples

const { dotObjectPath } = require('@qfin/tools');

const obj = { a: {} };

dotObjectPath.setValue(obj, 'a.b.c', 100);
// obj = { a: { b: { c: 100 } } };

filesystem.readFileAsStream

Creates a readable stream from a file

Type: module.filesystem.readFileAsStream

Parameters

  • filepath string The filepath without the filename
  • filename string The filename
  • options object NodeJS fs options (encoding defaults to utf-8)
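
Examples

A minimal sketch using the documented (filepath, filename, options) order; './data' and 'input.csv' are hypothetical paths used for illustration:

const { filesystem } = require('@qfin/tools');

const readStream = filesystem.readFileAsStream('./data', 'input.csv', { encoding: 'utf-8' });

readStream.on('data', (chunk) => console.log(chunk.length));
readStream.on('end', () => console.log('done reading'));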

Returns ReadableStream ReadableStream

filesystem.writeFileAsStream

Creates a writable stream to stream data to a file

Type: module.filesystem.writeFileAsStream

Parameters

  • filepath string The filepath without the filename
  • filename string The filename
  • options object NodeJS fs options (encoding defaults to utf-8)
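
Examples

A minimal sketch using the documented (filepath, filename, options) order; the output path is hypothetical:

const { filesystem } = require('@qfin/tools');

const writeStream = filesystem.writeFileAsStream('./data', 'output.txt', { encoding: 'utf-8' });

writeStream.write('first line\n');
writeStream.end('last line\n');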

Returns WritableStream WritableStream

filesystem.writeToFile

Writes data to a file by creating a writable stream internally and streaming the data through it

Type: module.filesystem.writeToFile

Parameters

  • data (Buffer | string) The data to write to the file
  • filepath string The filepath without the filename
  • filename string The filename
  • options object NodeJS fs options (encoding defaults to utf-8)
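
Examples

A minimal sketch; the data and paths are hypothetical, and the returned Promise is used as documented:

const { filesystem } = require('@qfin/tools');

filesystem.writeToFile('Hello World\n', './data', 'hello.txt', { encoding: 'utf-8' })
  .then(() => console.log('file written'))
  .catch((err) => console.error(err));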

Returns Promise<any> Resolves with no value, or rejects with an error

filesystem.streamToFile

Stream data to a file (useful for piping a stream)

Type: module.filesystem.streamToFile

Parameters

  • filepath string The filepath without the filename
  • filename string The filename
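
Examples

A minimal sketch that pipes an existing readable stream into a file; the source file and destination are hypothetical:

const { createReadStream } = require('fs');
const { filesystem } = require('@qfin/tools');

createReadStream('./data/input.csv')
  .pipe(filesystem.streamToFile('./data', 'copy.csv'))
  .on('error', (err) => console.error(err));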

Returns Transform Transform

murmurhash2

This hash function takes in a string and hashes it to a numeric value

murmurhash2 algorithm adapted from https://gist.github.com/raycmorgan/588423

Type: module.murmurhash2

Parameters

  • str string The string to hash
  • seed string? The seed value for this hash, defaults to the length of the string
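
Examples

A minimal sketch; the exact numeric output depends on the implementation and is not shown here:

const { murmurhash2 } = require('@qfin/tools');

const hash = murmurhash2('Hello World'); // seed defaults to the string length

console.log(typeof hash); // 'number'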

Returns number The hashed value of the string

transform.csvToJsonTransformer

Transforms CSV lines into their JSON counterparts

Type: module.transform.csvToJsonTransformer

Parameters

  • lines array The array of lines containing the csv data (required)
  • headers array The headers, which become the keys of the JSON objects (required). Header values can be dot-separated, e.g. 'profile.name' becomes { profile: { name: 'value' } }. A special header value of underscore (_) can be used to skip that header and value assignment
  • delimiter string? The line item delimiter (defaults to comma)
  • trim boolean? Whether to trim the values or not (defaults to true)
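
Examples

A minimal sketch, assuming the transform namespace used in the section headings; the CSV lines and headers are hypothetical, and the exact output shape is an assumption based on the description above:

const { transform } = require('@qfin/tools');

const lines = ['Ada,Lovelace,1815', 'Alan,Turing,1912'];
const headers = ['profile.firstName', '_', 'born'];

const json = transform.csvToJsonTransformer(lines, headers);
// e.g. [{ profile: { firstName: 'Ada' }, born: '1815' }, { profile: { firstName: 'Alan' }, born: '1912' }]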

Returns array An array of JSON objects

transform.csvToJsonStreamTransformer

Transform a CSV buffer Stream to a JSON stream

Type: module.transform.csvToJsonStreamTransformer

Parameters

  • headers array The headers, which become the keys of the JSON objects (required). Header values can be dot-separated, e.g. 'profile.name' becomes { profile: { name: 'value' } }. A special header value of underscore (_) can be used to skip that header and value assignment
  • callback function The callback that receives an error or the final array of JSON objects when the stream terminates (required)

Examples

const { transform } = require('@qfin/tools');

// `res.data` (a readable CSV stream), `headers` and `processData` are defined elsewhere in your code
res.data
   .pipe(
     transform.csvToJsonStreamTransformer(headers, async (err, json) => {
       if (err) {
         throw err;
       } else {
         await processData(json);
       }
     })
   )
   .on('close', () => console.log('Stream closed'))
   .on('error', (err) => console.error(err));

Returns Transform Transform