
process-pool

v0.3.5

A process pool for efficient delegation of work over multiple CPU cores.

process-pool allows you to maintain a set of sub-processes with a cached state, creating a process pool that can be used to efficiently delegate work over multiple CPU cores.

Using process pool

var moment = require('moment')
var ProcessPool = require('process-pool')

// Limit number of running processes to two.
var pool = new ProcessPool({ processLimit: 2 })

function time() { return moment().diff(time.start, 'seconds') }
time.start = moment()

var func = pool.prepare(function() {
  // code here is run in the subprocess before it is first called, this allows you
  // to cache state in the subprocess so that it is immediately available.

  // this is the function run in the sub-process whenever the wrapping
  // function is called from the parent process.
  return function(value) {
    // the promise is used to keep the process active for a second, usually
    // promises would not be used for this purpose in a process pool.
    return new Promise(function(resolve) {
      console.log('begin %s: %s', time(), value)
      setTimeout(function() { resolve(value * 10) }, 1000)
    })
  }
})

for (var i = 1; i < 4; ++i) {
  func(i).then(function(returnValue) {
    console.log('end %s: %s', time(), returnValue)
  })
}

This would print:

begin 0: 1
begin 0: 2
end 1: 10
end 1: 20
begin 1: 3
end 2: 30

The process pool is set to run two processes concurrently, so the execution of the third call is delayed by a second.
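The queueing behaviour can be pictured with a plain in-process sketch. Note this is only an illustration of the scheduling idea with a hypothetical makeLimiter helper, not how process-pool is implemented (which uses real sub-processes):

```javascript
// Illustration only: a minimal in-process concurrency limiter mirroring the
// scheduling described above. It shows why the third call must wait for a
// free slot when the limit is two.
function makeLimiter(processLimit) {
  var running = 0
  var queue = []

  function next() {
    if (running >= processLimit || queue.length === 0) return
    running++
    var job = queue.shift()
    job.task().then(function (value) {
      running--
      job.resolve(value)
      next() // a finished job frees a slot for the next queued call
    })
  }

  // returns a function that queues a promise-returning task
  return function (task) {
    return new Promise(function (resolve) {
      queue.push({ task: task, resolve: resolve })
      next()
    })
  }
}
```

With a limit of two, scheduling three one-second tasks starts the first two immediately and the third only once a slot frees, matching the timing printed above.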

Functions passed to prepare are not closures and do not have access to the surrounding scope. The following would fail:

var ProcessPool = require('process-pool')
var global = 5

var pool = new ProcessPool
var pooled = pool.prepare(function() {
  return function(argument) {
    return argument + global
  }
})

global is not available within the call to prepare. To pass context to prepare, the two-argument version of prepare can be used:

var ProcessPool = require('process-pool')
var global1 = 2, global2 = 10

var pool = new ProcessPool
var pooled = pool.prepare(function(context) {
  // modules from the surrounding scope are not available here and must be
  // required again in the sub-process.
  var _ = require('lodash')

  return function(args) {
    return context.multiply * _.max(args) + context.add
  }
}, { multiply: global1, add: global2 })

pooled([1, 3]).then(function(value) {
  console.log('The value is %s', value) // prints "The value is 16"
})

Requiring modules in a sub-process

By default the module path data is inherited from module.parent, which is the module that first required process-pool; in many cases this is not the environment the sub-process should use. To use the current module's path data, the module option can be passed. In most cases the module variable provided by node should be used, which causes require in the sub-process to resolve modules relative to the current source file.

// In this case the 'pooler' module includes 'process-pool'; without the
// `module` option, require would resolve paths according to the 'pooler'
// module rather than this one.
var pooler = require('pooler')

var pooled = pooler.procPool.prepare(function() {
  var compiler = require('compiler')
  return function(data) {
    return compiler.compile(data)
  }
}, null, { module: module })

Running multiple functions with a single pool

Many functions can be wrapped to run in sub-processes by a single pool via multiple calls to prepare. By default processLimit copies of each prepared function are created, so up to processLimit * (number of calls to prepare) sub-processes can exist, but only processLimit sub-processes will be running code at any given time; the rest will be sleeping. The number of processes per function can be restricted with the processLimit option on a per-function basis:

var Promise = require('bluebird')
var ProcessPool = require('process-pool')
var pool = new ProcessPool({ processLimit: 3 })

var twoFunc = pool.prepare(function() {
  var nCalls = 0
  return function() {
    console.log("twoFunc", ++nCalls)
    return Promise.delay(1000)
  }
}, { processLimit: 2 })

var oneFunc = pool.prepare(function() {
  var nCalls = 0
  return function() {
    console.log("oneFunc", ++nCalls)
    return Promise.delay(1000)
  }
}, { processLimit: 1 })

twoFunc()
twoFunc()
twoFunc()
oneFunc()
oneFunc()

This would print:

twoFunc 1
twoFunc 2
oneFunc 1

followed by

twoFunc 3
oneFunc 2

a second later.

Future work

  • Killing a pooled function should drain the wait queue.