
@nicholaswmin/timerify

v0.6.5

tiny utility for measuring function performance


timerify

like performance.timerify(), but without the elaborate native setup: the recorded stats are attached directly to the returned function

Usage

Install

npm i @nicholaswmin/timerify

timerify(fn)

Instruments a function and returns it. Use the instrumented function as usual; every time it's called, its duration is recorded.

example: log the mean runtime duration of a fibonacci function computing the 10th Fibonacci number:

import { timerify } from '@nicholaswmin/timerify'

// function
const fibonacci = n => n < 1 ? 0 : n <= 2
  ? 1 : fibonacci(n - 1) + fibonacci(n - 2)

// same function but instrumented
const timed_fibonacci = timerify(fibonacci)

timed_fibonacci(10)  // recorded a run
timed_fibonacci(10)  // recorded another
timed_fibonacci(10)  // recorded another

console.log(timed_fibonacci.stats_ms.count)
// 3 (times called)

console.log(timed_fibonacci.stats_ms.mean)
// 2.94 (milliseconds, per run, on average)

Promise/async functions

same as above, just await the returned function:

const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))

const timed_sleep = timerify(sleep)

await timed_sleep(100)
await timed_sleep(100)
await timed_sleep(100)

console.log(timed_sleep.stats_ms.count)
// 3 (times called)

console.log(timed_sleep.stats_ms.mean)
// 100 (milliseconds)

Recorded data

Timerified functions contain recorded statistics in two units:

  • nanoseconds (ns): timerified.stats_ns

  • milliseconds (ms): timerified.stats_ms

Recorded values

Both contain the following:

| property    | description                         |
|-------------|-------------------------------------|
| count       | count of function invocations       |
| min         | fastest recorded duration           |
| mean        | statistical mean of all durations   |
| max         | slowest recorded duration           |
| stddev      | standard deviation of all durations |
| percentiles | k-th percentiles of all durations   |

example: log running time of foo, in nanoseconds:

const timed_foo = timerify(foo)

timed_foo()
timed_foo()
timed_foo()

console.log(timed_foo.stats_ns)

//  count: 3,
//  min: 3971072,
//  max: 4030463,
//  mean: 4002406.4,
//  exceeds: 0,
//  stddev: 24349.677891914707,
//  percentiles: { '75': 4020224, '100': 4028416, '87.5': 4028416 }

example: same as above, this time in milliseconds:

const timed_foo = timerify(foo)

timed_foo()
timed_foo()
timed_foo()

console.log(timed_foo.stats_ms)

//  count: 3,
//  min: 3.97,
//  max: 4.03,
//  mean: 4,
//  exceeds: 0,
//  stddev: 0.02,
//  percentiles: {  '75': 4.02, '100': 4.03, '87.5': 4.03 }

Both are derived from an internal perf_hooks Histogram.
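For context, roughly the same recording can be sketched with only the native APIs from node:perf_hooks. Nothing below is part of this module; fibonacci is just an illustrative workload:

```javascript
import { performance, createHistogram } from 'node:perf_hooks'

// any function to measure; used here purely for illustration
const fibonacci = n => n < 1 ? 0 : n <= 2
  ? 1 : fibonacci(n - 1) + fibonacci(n - 2)

// pass a histogram so timerify records each duration into it
const histogram = createHistogram()
const timed = performance.timerify(fibonacci, { histogram })

timed(10)
timed(10)
timed(10)

// native histogram values are in nanoseconds, unscaled
console.log(histogram.count) // 3
console.log(histogram.mean)  // mean duration, in ns
```

Note the native histogram only reports nanoseconds; converting to milliseconds is left to you.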

timerified.reset()

Resets recorded stats to zero.

example: run foo 2 times, reset recorded stats to 0 & continue recording:

const timed_foo = timerify(foo)

timed_foo()
timed_foo()

console.log(timed_foo.stats_ms.max)
// 2.01

timed_foo.reset()

console.log(timed_foo.stats_ms.max)
// 0

timed_foo()
timed_foo()

console.log(timed_foo.stats_ms.max)
// 1.99

log([fn,fn...])

Pretty-prints the recorded durations of one or more timerified functions.

example: pretty-print the stats of foo and bar:

import { timerify, log } from '@nicholaswmin/timerify'

const foo = () => new Promise(resolve => setTimeout(resolve, 5))
const bar = () => new Promise(resolve => setTimeout(resolve, 15))

const timed_foo = timerify(foo)
const timed_bar = timerify(bar)

for (let i = 0; i < 30; i++)
  await timed_foo()

for (let i = 0; i < 50; i++)
  await timed_bar()

log([ timed_foo, timed_bar ])

logs:

┌────────────────┬───────┬──────────┬───────────┬──────────┬─────────────┐
│ (index)        │ count │ min (ms) │ mean (ms) │ max (ms) │ stddev (ms) │
├────────────────┼───────┼──────────┼───────────┼──────────┼─────────────┤
│ timerified foo │ 30    │ 4.56     │ 5.68      │ 6.25     │ 0.25        │
│ timerified bar │ 50    │ 15.14    │ 16.04     │ 16.21    │ 0.23        │
└────────────────┴───────┴──────────┴───────────┴──────────┴─────────────┘

Usage with test runners

Just assert the recorded stats in any test runner, using any assertion library.

example: using the built-in node:test runner:

requires Node.js v20+

import test from 'node:test'
import assert from 'node:assert'

import { timerify } from '@nicholaswmin/timerify'

const fibonacci = n => n < 1 ? 0 : n <= 2
  ? 1 : fibonacci(n - 1) + fibonacci(n - 2)

const timed_fibonacci = timerify(fibonacci)

test('perf: #fibonacci(20) x 10 times', async t => {
  t.beforeEach(() => {
    // start each subtest from a clean slate
    timed_fibonacci.reset()

    for (let i = 0; i < 10; i++)
      timed_fibonacci(20)
  })

  await t.test('called 10 times', () => {
    const callCount = timed_fibonacci.stats_ms.count

    assert.strictEqual(callCount, 10)
  })

  await t.test('runs quickly, on average', () => {
    const mean = timed_fibonacci.stats_ms.mean

    assert.ok(mean < 30, `mean: ${mean} ms exceeded 30ms threshold`)
  })

  await t.test('has consistent running times', () => {
    const dev = timed_fibonacci.stats_ms.stddev

    assert.ok(dev < 2, `deviation: ${dev} ms exceeded 2ms threshold`)
  })
})

In the examples above, I deliberately avoid asserting on the statistical min/max, opting instead for the mean and standard deviation.

This is intentional. min/max aren't useful metrics unless you're building a pacemaker or the chronometer that launches the Space Shuttle, in which case you probably wouldn't be looking at this page. They are also very susceptible to environmental events outside your control, so asserting on them makes your tests brittle.

Performance tests shouldn't ever run as part of your unit tests. At best, keep them in a separate CI workflow and treat them as performance-regression canaries that you check every now and then.

Tests

Install deps

npm ci

Run unit tests

npm test

Run test coverage

npm run test:coverage

Authors

@nicholaswmin

License

MIT-0 "No Attribution" License

Footnotes

[^1]: This module assembles native performance-measurement utilities, such as performance.timerify and Histogram, into an easy-to-use unit that avoids repeated, elaborate test setups. You can skip this module entirely and use the native functions directly.