
wink-nlp

v2.3.0

Developer friendly Natural Language Processing ✨

Downloads: 154,819

Readme

winkNLP

Developer friendly Natural Language Processing ✨

WinkNLP is a JavaScript library for Natural Language Processing (NLP). Designed specifically to make development of NLP applications easier and faster, winkNLP is optimized for the right balance of performance and accuracy.

Its word embedding support unlocks deeper text analysis. Represent words and text as numerical vectors with ease, bringing higher accuracy in tasks like semantic similarity, text classification, and beyond – even within a browser.

It is built from the ground up with no external dependencies and has a lean code base of ~10Kb minified & gzipped. A test coverage of ~100% and compliance with the Open Source Security Foundation best practices make winkNLP the ideal tool for building production-grade systems with confidence.

WinkNLP comes with full TypeScript support and runs on Node.js, web browsers, and Deno.

Build amazing apps quickly

  • Wikipedia article timeline
  • Context aware word cloud
  • Key sentences detection

Head to live examples to explore further.

Blazing fast

WinkNLP can easily process large amounts of raw text at speeds over 650,000 tokens/second on an M1 MacBook Pro, in both browser and Node.js environments. It even runs smoothly on a low-end smartphone's browser.

| Environment | Benchmarking Command |
| --- | --- |
| Node.js | node benchmark/run |
| Browser | How to measure winkNLP's speed on browsers? |

Features

WinkNLP has a comprehensive natural language processing (NLP) pipeline covering tokenization, sentence boundary detection (sbd), negation handling, sentiment analysis, part-of-speech (pos) tagging, named entity recognition (ner), and custom entities recognition (cer), along with a rich set of utilities and tools.
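Most of these outputs are one helper away from a document. Here is a minimal sketch, assuming the setup from the Get started section below and the its helpers documented in the API Reference; the example sentence is illustrative:

// Assumes winkNLP, model, nlp and its are set up as in "Get started" below.
const doc = nlp.readDoc( 'I did not like the movie, though the $100 refund arrived on June 10.' );

// Part-of-speech tag for every token.
console.log( doc.tokens().out( its.pos ) );

// Detected entities along with their types (e.g. MONEY, DATE).
console.log( doc.entities().out( its.detail ) );

// Negation handling: tokens affected by a negation carry a flag.
console.log( doc.tokens().out( its.negationFlag ) );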

Documentation

  • Concepts — everything you need to know to get started.
  • API Reference — explains usage of APIs with examples.
  • Change log — version history along with the details of breaking changes, if any.
  • Examples — live examples with code to give you a head start.

Installation

Use npm install:

npm install wink-nlp --save

To use winkNLP after installation, you also need to install a language model according to the Node.js version you are using. The table below outlines the version-specific installation commands:

| Node.js Version | Installation |
| --- | --- |
| 16 or 18 | npm install wink-eng-lite-web-model --save |
| 14 or 12 | node -e "require('wink-nlp/models/install')" |

The wink-eng-lite-web-model is designed to work with Node.js version 16 or 18. It can also work on browsers as described in the next section. This is the recommended model.

The second command installs the wink-eng-lite-model, which works with Node.js version 14 or 12.

How to configure TypeScript project

Enable esModuleInterop and allowSyntheticDefaultImports in the tsconfig.json file:

"compilerOptions": {
    "esModuleInterop": true,
    "allowSyntheticDefaultImports": true,
    ...
}
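With these flags enabled, default-style imports of winkNLP and its model work as expected. A minimal sketch of a TypeScript (or ES module) source file under this configuration; if the model package ships without type declarations, a small declaration stub may also be needed:

// Default imports rely on esModuleInterop / allowSyntheticDefaultImports.
import winkNLP from 'wink-nlp';
import model from 'wink-eng-lite-web-model';

const nlp = winkNLP( model );
const its = nlp.its;

const doc = nlp.readDoc( 'Typed pipelines make refactoring easier.' );
console.log( doc.tokens().out( its.pos ) );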

How to install for Web Browser

If you’re using winkNLP in the browser, use the wink-eng-lite-web-model. Learn about its installation and usage in our guide to using winkNLP in the browser. Explore winkNLP recipes on Observable for live browser-based examples.

How to run on Deno

Follow the example on replit.
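Alternatively, a recent Deno release with npm: specifier support should be able to load the package directly. The sketch below is an assumption based on Deno's npm compatibility rather than an official winkNLP example; the Replit example remains the reference:

// Hypothetical sketch: relies on Deno's npm: specifiers (Deno 1.28+).
import winkNLP from 'npm:wink-nlp';
import model from 'npm:wink-eng-lite-web-model';

const nlp = winkNLP( model );
const doc = nlp.readDoc( 'Deno can run winkNLP too.' );
console.log( doc.tokens().out() );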

Get started

Here is the "Hello World!" of winkNLP:

// Load wink-nlp package.
const winkNLP = require( 'wink-nlp' );
// Load english language model.
const model = require( 'wink-eng-lite-web-model' );
// Instantiate winkNLP.
const nlp = winkNLP( model );
// Obtain "its" helper to extract item properties.
const its = nlp.its;
// Obtain "as" reducer helper to reduce a collection.
const as = nlp.as;
 
// NLP Code.
const text = 'Hello   World🌎! How are you?';
const doc = nlp.readDoc( text );
 
console.log( doc.out() );
// -> Hello   World🌎! How are you?
 
console.log( doc.sentences().out() );
// -> [ 'Hello   World🌎!', 'How are you?' ]
 
console.log( doc.entities().out( its.detail ) );
// -> [ { value: '🌎', type: 'EMOJI' } ]
 
console.log( doc.tokens().out() );
// -> [ 'Hello', 'World', '🌎', '!', 'How', 'are', 'you', '?' ]
 
console.log( doc.tokens().out( its.type, as.freqTable ) );
// -> [ [ 'word', 5 ], [ 'punctuation', 2 ], [ 'emoji', 1 ] ]

Experiment with winkNLP on RunKit.

Speed & Accuracy

winkNLP processes raw text at ~650,000 tokens per second with its wink-eng-lite-web-model, when benchmarked using "Ch 13 of Ulysses by James Joyce" on an M1 MacBook Pro with 16GB RAM. The processing covered the entire NLP pipeline: tokenization, sentence boundary detection, negation handling, sentiment analysis, part-of-speech tagging, and named entity extraction. This speed is well ahead of prevailing speed benchmarks.

The benchmark was conducted on Node.js versions 16 and 18.

It pos-tags a subset of the WSJ corpus with an accuracy of ~95%, and this includes tokenization of the raw text prior to pos tagging. The current state of the art is ~97% accuracy, but at lower speeds, and it is generally computed using gold-standard pre-tokenized corpora.

Its general purpose sentiment analysis delivers an f-score of ~84.5% when validated using the Amazon Product Review Sentiment Labelled Sentences Data Set at the UCI Machine Learning Repository. The current benchmark accuracy for specifically trained models is around 95%.
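For a quick feel of the sentiment API, here is a minimal sketch reusing the setup from the Get started section above; the review text and the exact scores are illustrative:

// its.sentiment yields a score in the range -1 (negative) to +1 (positive).
const review = nlp.readDoc( 'Loved the build quality, but the battery life is disappointing.' );

// Document-level sentiment score.
console.log( review.out( its.sentiment ) );

// Sentence-wise sentiment scores.
console.log( review.sentences().out( its.sentiment ) );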

Memory Requirement

Wink NLP delivers this performance with minimal load on RAM. For example, it processes the entire History of India Volume I with a peak memory requirement of under 80MB. The book has around 350 pages, which translates to over 125,000 tokens.

Need Help?

Usage query 👩🏽‍💻

Please ask at Stack Overflow, discuss at Wink JS GitHub Discussions, or chat with us in the Wink JS Gitter Lobby.

Bug report 🐛

If you spot a bug that has not yet been reported, raise a new issue or consider fixing it and sending a PR.

New feature 🌟

Looking for a new feature? Request it via the new features & ideas discussion forum, or consider becoming a contributor.

About winkJS

WinkJS is a family of open source packages for Natural Language Processing, Machine Learning, and Statistical Analysis in Node.js. The code is thoroughly documented for easy human comprehension and has a test coverage of ~100%, giving the reliability needed to build production-grade solutions.

Copyright & License

Wink NLP is copyright 2017-24 GRAYPE Systems Private Limited.

It is licensed under the terms of the MIT License.