perf-table
v1.0.0
Compare the performance of functions.
Readme
Easily create a table comparing the performance of different functions. Useful for comparing npm packages and different implementations.
Install
npm i --save-dev perf-table
Usage
To use perf-table, simply provide an array of testing tuples. Each testing tuple should have a name or npm package name as its first member and a function to run as its second.
The following example tests the perf differences between looping strategies.
const table = require('perf-table'); // assuming the package's default CommonJS export

const data = new Array(45).fill(100);
function forLoop() {
let result = 1;
for (let i = 0; i < data.length; i++) {
result *= result + data[i];
}
}
function forOf() {
let result = 1;
for (const item of data) {
result *= result + item;
}
}
function whileLoop() {
let result = 1;
let i = 0;
while (i < data.length) {
result *= result + data[i++];
}
}
function forEach() {
let result = 1;
data.forEach(item => {
result *= result + item;
});
}
// table returns a promise that resolves when it has finished
// printing/writing the table; it resolves with the formatted table.
const output = await table([
['for', forLoop],
['for of', forOf],
['while', whileLoop],
['forEach', forEach]
]);
OUTPUT:
╔═════════╤════════════╤══════════════════════════╤═════════════╗
║ NAME │ OPS/SEC │ RELATIVE MARGIN OF ERROR │ SAMPLE SIZE ║
╟─────────┼────────────┼──────────────────────────┼─────────────╢
║ for │ 11,251,638 │ ± 0.71% │ 186 ║
╟─────────┼────────────┼──────────────────────────┼─────────────╢
║ while │ 10,804,607 │ ± 1.75% │ 185 ║
╟─────────┼────────────┼──────────────────────────┼─────────────╢
║ for of │ 7,223,835 │ ± 1.17% │ 167 ║
╟─────────┼────────────┼──────────────────────────┼─────────────╢
║ forEach │ 4,146,640 │ ± 1.72% │ 183 ║
╚═════════╧════════════╧══════════════════════════╧═════════════╝
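The OPS/SEC column is the number of completed calls per second. As a rough illustration of the unit (this is not perf-table's implementation; the real measurement is delegated to benchmark.js, which adds warm-up, sampling, and the margin-of-error column), ops/sec can be estimated by timing repeated calls:

```javascript
// Naive sketch of what the OPS/SEC column measures: completed calls
// per second. Illustration only; perf-table uses benchmark.js's
// statistical sampling rather than a single timed loop like this.
function naiveOpsPerSec(fn, iterations = 10000) {
  const start = process.hrtime.bigint();
  for (let i = 0; i < iterations; i++) {
    fn();
  }
  const elapsedSeconds = Number(process.hrtime.bigint() - start) / 1e9;
  return iterations / elapsedSeconds;
}

const data = new Array(45).fill(100);
function forLoop() {
  let result = 1;
  for (let i = 0; i < data.length; i++) {
    result *= result + data[i];
  }
  return result;
}

console.log(Math.round(naiveOpsPerSec(forLoop))); // some large, machine-dependent number
```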
Providing Tuples with Other Configuration
If you want to configure the table further, put the array demonstrated above under the compare key of your options object.
table({
compare: [
['for', forLoop],
['for of', forOf],
['while', whileLoop],
['forEach', forEach]
],
...
});
Benchmark Options
perf-table is built on top of benchmark.js, and you can configure each comparison run with benchmark.js options.
table({
compare: [ ... ],
options: {
minSamples: 500
}
});
Output Modes
perf-table comes with a few output modes preconfigured.
- cli (default) - output a table formatted for the CLI
- md - output a markdown table
- html - output an HTML table
- csv - output the table data as CSV
table({
compare: [ ... ],
renderer: 'html'
})
Custom Renderer
If you want to control how the table renders, pass a function as renderer; it will be used to render the perf-table data.
const renderCSV: IRenderFunction = data =>
data.reduce((text, line) => text + `\n${line.join(',')}`, '');
table({
compare: [ ... ],
renderer: renderCSV
})
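As a sketch, the same row-of-cells shape could drive a markdown renderer. This assumes the render function receives an array of rows with the header row first, each row an array of cell strings, which is the shape the CSV example above implies:

```javascript
// Sketch of a custom markdown renderer. Assumes the data argument is
// an array of rows (header first), each row an array of cell strings,
// matching the shape used by the CSV renderer example.
const renderMarkdown = data => {
  const [header, ...rows] = data;
  const line = cells => `| ${cells.join(' | ')} |`;
  const divider = `| ${header.map(() => '---').join(' | ')} |`;
  return [line(header), divider, ...rows.map(line)].join('\n');
};

// Hypothetical sample data; perf-table would supply the real rows.
const sample = [
  ['NAME', 'OPS/SEC'],
  ['for', '11,251,638'],
  ['while', '10,804,607']
];
console.log(renderMarkdown(sample));
// | NAME | OPS/SEC |
// | --- | --- |
// | for | 11,251,638 |
// | while | 10,804,607 |
```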
Writing to a file
By default, perf-table just prints the table to the console. If you provide a file path in the file option, the table output will be written to that path.
table({
compare: [ ... ],
file: 'test.txt'
})
Force Logging
When the file option is provided, the table will not be logged to the console. To override this behavior, also pass log: true.
table({
compare: [ ... ],
file: 'test.txt',
log: true
})
BundlePhobia Stats
You can display a BundlePhobia column alongside your perf table to really compare yourself against the competition.
table({
compare: [ ... ],
bundle: true
})
Link
To add a link to the BundlePhobia output, set bundle to 'link'.
table({
compare: [ ... ],
bundle: 'link'
})