
reusify

v1.0.4 · 171,104,024 downloads

Reuse objects and functions with style

Reuse your objects and functions for maximum speed. This technique can make a function run roughly 10% faster. You call your functions a lot, and the savings add up quickly in hot code paths.

$ node benchmarks/createNoCodeFunction.js
Total time 53133
Total iterations 100000000
Iteration/s 1882069.5236482036

$ node benchmarks/reuseNoCodeFunction.js
Total time 50617
Total iterations 100000000
Iteration/s 1975620.838848608

The benchmark above uses a fibonacci computation to simulate a real, CPU-heavy load. The absolute numbers will differ for your use case, but the relative difference should not.
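For the run above, that works out to 1,975,620 / 1,882,069 ≈ 1.05, i.e. roughly 5% more iterations per second for the reused version in this particular measurement.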

The benchmark was taken using Node v6.10.0.

This library was extracted from fastparallel.

Example

var reusify = require('reusify')
var fib = require('reusify/benchmarks/fib')
var instance = reusify(MyObject)

// get an object from the cache,
// or create a new one when the cache is empty
var obj = instance.get()

// set the state
obj.num = 100
obj.func()

// reset the state.
// if the state contains any external objects,
// do not use the delete operator (it is slow);
// set them to null instead
obj.num = 0

// store an object in the cache
instance.release(obj)

function MyObject () {
  // you need to define this property
  // so V8 can compile MyObject into
  // a hidden class
  this.next = null
  this.num = 0

  var that = this

  // this function is never reallocated,
  // so it can be optimized by V8
  this.func = function () {
    // calculates fibonacci
    fib(that.num)
  }
}

The example above is for synchronous code; let's look at an asynchronous one:

var reusify = require('reusify')
var instance = reusify(MyObject)

for (var i = 0; i < 100; i++) {
  getData(i, console.log)
}

function getData (value, cb) {
  var obj = instance.get()

  obj.value = value
  obj.cb = cb
  obj.run()
}

function MyObject () {
  // define all properties up front so V8 keeps a single hidden class
  this.next = null
  this.value = null
  this.cb = null

  var that = this

  this.run = function () {
    // asyncOperation stands in for any asynchronous call
    asyncOperation(that.value, that.handle)
  }

  this.handle = function (err, result) {
    that.cb(err, result)
    // reset the state before returning the object to the cache
    that.value = null
    that.cb = null
    instance.release(that)
  }
}

Also note how, in the examples above, the code that consumes an instance of MyObject resets its state to the initial condition just before storing it back in the cache. This is needed so that every subsequent request for an instance from the cache gets a clean instance.

Why

It is faster because V8 doesn't have to collect all the closures you would otherwise create on every call. In a short-lived benchmark it is about as fast as creating a nested function each time, but over a longer time frame it puts less pressure on the garbage collector.
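To make that concrete, here is a minimal sketch contrasting per-call closure allocation with the reuse pattern. The Task constructor and the allocating/reusing helpers are hypothetical names used only for illustration; only reusify's get/release API and the fib helper from the example above come from the library.

var reusify = require('reusify')
var fib = require('reusify/benchmarks/fib')

// allocating: every call creates a fresh closure that the
// garbage collector later has to reclaim
function allocating (n) {
  var task = function () { fib(n) }
  task()
}

// reusing: the closure is created once per pooled Task and lives
// for the lifetime of the pool, so steady-state calls allocate nothing
function Task () {
  this.next = null // required by reusify's internal linked list
  this.n = 0
  var that = this
  this.run = function () { fib(that.n) }
}

var pool = reusify(Task)

function reusing (n) {
  var obj = pool.get()
  obj.n = n
  obj.run()
  obj.n = 0 // reset before returning the object to the cache
  pool.release(obj)
}

Calling allocating in a tight loop forces V8 to allocate and later collect one closure per iteration, while reusing only touches objects that already exist.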

Other examples

If you want to see a more complex example, check out middie and steed.

Acknowledgements

Thanks to Trevor Norris for getting me down the rabbit hole of performance, and to Mathias Buus for suggesting that I share this trick.

License

MIT