# timerify

> tiny utility for measuring function performance

like `performance.timerify()` but:

- metrics are included in the instrumented function
- metrics are in milliseconds as well as nanoseconds
## Usage

### Install

```bash
npm i @nicholaswmin/timerify
```
## `timerify(fn)`

Instruments a function and returns it. You then use the instrumented
function as usual and, every time it's called, its duration is recorded.

example: log the mean runtime duration of a `fibonacci` function,
computing the 10th Fibonacci number:
```js
import { timerify } from '@nicholaswmin/timerify'

// a recursive fibonacci function
const fibonacci = n => n < 1 ? 0 : n <= 2
  ? 1 : fibonacci(n - 1) + fibonacci(n - 2)

// same function but instrumented
const timed_fibonacci = timerify(fibonacci)

timed_fibonacci(10) // recorded a run
timed_fibonacci(10) // recorded another
timed_fibonacci(10) // recorded another

console.log(timed_fibonacci.stats_ms.count)
// 3 (times called)

console.log(timed_fibonacci.stats_ms.mean)
// 2.94 (milliseconds, per run, on average)
```
### Promise/async functions

same as above, just `await` the returned function:
```js
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))

const timed_sleep = timerify(sleep)

await timed_sleep(100)
await timed_sleep(100)
await timed_sleep(100)

console.log(timed_sleep.stats_ms.count)
// 3 (times called)

console.log(timed_sleep.stats_ms.mean)
// 100 (milliseconds)
```
## Recorded data

Timerified functions contain recorded statistics in:

- nanoseconds (ns): `timerified.stats_ns`
- milliseconds (ms): `timerified.stats_ms`
### Recorded values

Both contain the following:

| property      | description                         |
|---------------|-------------------------------------|
| `count`       | count of function invocations       |
| `min`         | fastest recorded duration           |
| `mean`        | statistical mean of all durations   |
| `max`         | slowest recorded duration           |
| `stddev`      | standard deviation of all durations |
| `percentiles` | k-th percentiles of all durations   |
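Individual percentiles can be read by key; as the outputs below show, the
keys are stringified percentile ranks. A small sketch, assuming `timed_foo`
has already recorded a few runs:

```js
// read the 75th percentile of all recorded durations, in milliseconds;
// percentile keys are strings such as '75', '87.5' and '100'
const p75 = timed_foo.stats_ms.percentiles['75']

console.log(p75) // e.g. 4.02
```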
example: log the running time of `foo`, in nanoseconds:
```js
const timed_foo = timerify(foo)

timed_foo()
timed_foo()
timed_foo()

console.log(timed_foo.stats_ns)
// {
//   count: 3,
//   min: 3971072,
//   max: 4030463,
//   mean: 4002406.4,
//   exceeds: 0,
//   stddev: 24349.677891914707,
//   percentiles: { '75': 4020224, '100': 4028416, '87.5': 4028416 }
// }
```
example: same as above, this time in milliseconds:
```js
const timed_foo = timerify(foo)

timed_foo()
timed_foo()
timed_foo()

console.log(timed_foo.stats_ms)
// {
//   count: 3,
//   min: 3.97,
//   max: 4.03,
//   mean: 4,
//   exceeds: 0,
//   stddev: 0.02,
//   percentiles: { '75': 4.02, '100': 4.03, '87.5': 4.03 }
// }
```
Both are derived from an internal [`perf_hooks` `Histogram`](https://nodejs.org/api/perf_hooks.html#class-histogram).
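For reference, here is a minimal sketch of the same idea using only the
native `node:perf_hooks` utilities (assuming `fibonacci` is defined as in
the earlier example):

```js
import { performance, createHistogram } from 'node:perf_hooks'

// a recordable histogram; performance.timerify() writes each
// call's duration into it, in nanoseconds
const histogram = createHistogram()
const timed = performance.timerify(fibonacci, { histogram })

timed(10)
timed(10)

// unlike this module, the native histogram reports nanoseconds only
// and lives outside the instrumented function
console.log(histogram.count) // 2
console.log(histogram.mean)  // mean duration, in nanoseconds
```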
## `timerified.reset()`

Resets recorded stats to zero.

example: run `foo` 2 times, reset the recorded stats to `0` and continue
recording:
```js
const timed_foo = timerify(foo)

timed_foo()
timed_foo()

console.log(timed_foo.stats_ms.max)
// 2.01

timed_foo.reset()

console.log(timed_foo.stats_ms.max)
// 0

timed_foo()
timed_foo()

console.log(timed_foo.stats_ms.max)
// 1.99
```
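A common use for this (a hypothetical sketch, not from the module docs):
reset between measurement rounds so each round's stats are independent:

```js
const timed_foo = timerify(foo)

// round 1
for (let i = 0; i < 10; i++) timed_foo()
console.log('round 1 mean:', timed_foo.stats_ms.mean)

// discard round 1 before measuring round 2
timed_foo.reset()

// round 2
for (let i = 0; i < 10; i++) timed_foo()
console.log('round 2 mean:', timed_foo.stats_ms.mean)
```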
## `log([fn, fn...])`

Pretty-prints the recorded durations of one or more timerified functions.

example: pretty-print the stats of `foo` and `bar`:
```js
import { timerify, log } from '@nicholaswmin/timerify'

const foo = () => new Promise(resolve => setTimeout(resolve, 5))
const bar = () => new Promise(resolve => setTimeout(resolve, 15))

const timed_foo = timerify(foo)
const timed_bar = timerify(bar)

for (let i = 0; i < 30; i++)
  await timed_foo()

for (let i = 0; i < 50; i++)
  await timed_bar()

log([ timed_foo, timed_bar ])
```
logs:

```console
┌────────────────┬───────┬──────────┬───────────┬──────────┬─────────────┐
│ (index)        │ count │ min (ms) │ mean (ms) │ max (ms) │ stddev (ms) │
├────────────────┼───────┼──────────┼───────────┼──────────┼─────────────┤
│ timerified foo │ 30    │ 4.56     │ 5.68     │ 6.25     │ 0.25        │
│ timerified bar │ 50    │ 15.14    │ 16.04    │ 16.21    │ 0.23        │
└────────────────┴───────┴──────────┴───────────┴──────────┴─────────────┘
```
## Usage with test runners

Just assert the recorded stats in any test runner, using any assertion
library.

example: using the `node:test` runner:

> requires Node.js v20+
```js
import test from 'node:test'
import assert from 'node:assert'
import { timerify } from '@nicholaswmin/timerify'

const fibonacci = n => n < 1 ? 0 : n <= 2
  ? 1 : fibonacci(n - 1) + fibonacci(n - 2)

const timed_fibonacci = timerify(fibonacci)

test('perf: #fibonacci(20) x 10 times', async t => {
  t.beforeEach(() => {
    // start each subtest from a clean slate
    timed_fibonacci.reset()

    for (let i = 0; i < 10; i++)
      timed_fibonacci(20)
  })

  await t.test('called 10 times', () => {
    const callCount = timed_fibonacci.stats_ms.count

    assert.strictEqual(callCount, 10)
  })

  await t.test('runs quickly, on average', () => {
    const mean = timed_fibonacci.stats_ms.mean

    assert.ok(mean < 30, `mean: ${mean} ms exceeded 30ms threshold`)
  })

  await t.test('has consistent running times', () => {
    const dev = timed_fibonacci.stats_ms.stddev

    assert.ok(dev < 2, `deviation: ${dev} ms exceeded 2ms threshold`)
  })
})
```
In the examples above, I intentionally omit testing the statistical
`min`/`max`, opting instead for `mean` and `stddev`.

`min`/`max` times aren't useful metrics unless you're building a pacemaker
or the chronometer that launches the Space Shuttle, in which case you
probably wouldn't be looking at this page. They are also very susceptible
to environmental events outside your control, hence they can make your
tests brittle.

Performance tests shouldn't ever be part of your unit-test suite. At best,
keep them in a separate CI workflow and treat them as performance-regression
canaries that you check every now and then.
## Tests

Install deps:

```bash
npm ci
```

Run unit tests:

```bash
npm test
```

Run test coverage:

```bash
npm run test:coverage
```
## Authors

## License

MIT-0 "No Attribution" License
## Footnotes

[^1]: This module assembles native performance-measurement utilities, such
as `performance.timerify` & `Histogram`, into an easy-to-use unit which
avoids repeated & elaborate test setups. You can skip this module entirely
and just use the native functions.