rian
v0.4.0
Effective tracing for the edge and origins
Readme
A utility to simplify your tracing
This is free-to-use software, but if you do like it, consider supporting me ❤️
⚡ Features
🤔 Familiar — looks very much like OpenTelemetry.
✅ Simple — configure() an environment, create a tracer(), report() and done.
🏎 Performant — check the benchmarks.
🪶 Lightweight — a mere 1KB and next to no dependencies.
🚀 Usage
Visit /examples for more!
import { configure, tracer, report } from 'rian';
import { exporter } from 'rian/exporter.otel.http';
// ~> configure the environment, all tracers will inherit this
configure('my-service', {
'service.version': 'DEV',
});
// ~> create a tracer — typically "per request" or "per operation".
const trace = tracer('request');
function handler(req) {
// ~> start a span
return trace.span(`${req.method} ${req.path}`)(async (s) => {
// set some fields on this span's context
s.set_context({ user_id: req.params.user_id });
// ~> span again for `db::read`
const data = await s.span('db::read')(() => db_execute('SELECT * FROM users'));
// ~> maybe have some manual spanning
const processing_span = s.span('process records');
for (let row of data) {
processing_span.add_event('doing stuff', { id: row.id });
do_stuff(row);
}
// don't forget to end
processing_span.end();
return reply(200, { data });
});
}
const otel_exporter = exporter((payload) =>
fetch('/traces/otlp', {
method: 'POST',
body: JSON.stringify(payload),
}),
);
http.listen((req, executionCtx) => {
// ~> report all the spans once the response is sent
executionCtx.defer(() => report(otel_exporter));
return handler(req);
});
You only need to report once, somewhere in your application. All spans are collected into the same "bucket".
Using: examples/basic.ts
╭─ basic
│ ╭─────────────────────────────────────────────────────────────────╮
│ 95 ms │ ┣━━━━━━━━━━┫ │◗ setup
│ 41 ms │ ┣━━━━┫ │◗ bootstrap
│ 32 ms │ ┣━━━┫ │◗ building
│ 59 ms │ ┣━━━━━┫ │◗ precompile
│ 80 ms │ ┣━━━━━━━━┫ │◗ verify
│ 75 ms │ ┣━━━━━━━┫ │◗ spawn thread
│ 371 ms │ ┣╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍┫ │◗ doesnt finish
│ 347 ms │ ┣━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┫ │◗ running
│ 341 ms │ ┣━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┫ │◗ e2e
│ 38 ms │ ┣━━━┫ │◗ snapshot
│ 13 ms │ ┣━┫ │◗ url for page /my-product/sle…
│ ╰┼┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┼┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┼╯
│ 0 ms 318.500 ms 637 ms
│
│ one └┘ unit is less than: 10.443 ms
│ total time: 637 ms
╰─
╭─ thread #1
│ ╭──────────────────────────────────────────────────────────────────╮
│ 20 ms │ ┣━━━━━━━━━━━━━━━━━━━━┫ │◗ setup
│ 63 ms │ ┣━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┫ │◗ bootstrap
│ ╰┼┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┼┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┴┼╯
│ 0 ms 31.500 ms 63 ms
│
│ one └┘ unit is less than: 1.016 ms
│ total time: 63 ms
╰─
🔎 API
Module: rian
The main and default module responsible for creating and provisioning spans.
💡 Note ~> when providing span context values, you can use Semantic Conventions, but they won't be enforced.
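For instance, semantic-convention attribute names can sit alongside your own keys in a span's context. A minimal sketch (the http.* keys are illustrative OpenTelemetry conventions, not something rian requires):

import { configure, tracer } from 'rian';

configure('my-service');

const trace = tracer('request');

trace.span('GET /users/:id')((s) => {
	// semantic-convention keys mixed with application-specific ones
	s.set_context({
		'http.method': 'GET',
		'http.route': '/users/:id',
		user_id: 123,
	});
});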
Module: rian/async
A module that utilizes the async_hooks API to provide a tracer and spans that can be used where the current span isn't accessible.
💡 Note ~> this module should be used mutually exclusively with the main rian module.
import { configure, tracer, span, currentSpan, report } from 'rian/async';
import { exporter } from 'rian/exporter.otel.http';
function handler(req) {
return span(`${req.method} ${req.path}`)(async () => {
const s = currentSpan();
s.set_context({ user_id: req.params.user_id });
const data = await s.span('db::read')(() => db_execute('SELECT * FROM users'));
const processing_span = s.span('process records');
for (let row of data) {
processing_span.add_event('doing stuff', { id: row.id });
do_stuff(row);
}
processing_span.end();
return reply(200, { data });
});
}
const httpTrace = tracer('http');

// ~> construct an exporter, as in the main usage example
const otel_exporter = exporter((payload) => fetch('/traces/otlp', { method: 'POST', body: JSON.stringify(payload) }));

http.listen((req, executionCtx) => {
	executionCtx.defer(() => report(otel_exporter));
	return httpTrace(() => handler(req));
});
Module: rian/exporter.zipkin
Exports the spans created using the Zipkin protocol and leaves the shipping up to you.
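For example, the Zipkin-formatted payload can be posted to whatever collector you run. A minimal sketch, assuming a local Zipkin instance listening on its default endpoint (http://localhost:9411/api/v2/spans):

import { configure, tracer, report } from 'rian';
import { exporter } from 'rian/exporter.zipkin';

// ~> you own the shipping: here, a plain fetch to a local collector
const zipkin = exporter((payload) =>
	fetch('http://localhost:9411/api/v2/spans', {
		method: 'POST',
		headers: { 'content-type': 'application/json' },
		body: JSON.stringify(payload),
	}),
);

configure('my-service');

const trace = tracer('app');
// ... create spans with `trace` ...

await report(zipkin);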
Module: rian/exporter.otel.http
Implements the OpenTelemetry protocol for use with HTTP transports.
🧑‍🍳 Exporter Recipes
NewRelic (shipping Zipkin-format payloads):
import { configure, tracer, report } from 'rian';
import { exporter } from 'rian/exporter.zipkin';
const newrelic = exporter((payload) =>
fetch('https://trace-api.newrelic.com/trace/v1', {
method: 'POST',
headers: {
'api-key': '<your api key>',
'content-type': 'application/json',
'data-format': 'zipkin',
'data-format-version': '2',
},
body: JSON.stringify(payload),
}),
);
configure('my-service');
const trace = tracer('app');
await report(newrelic);
Lightstep (shipping OTLP payloads):
import { configure, tracer, report } from 'rian';
import { exporter } from 'rian/exporter.otel.http';
const lightstep = exporter((payload) =>
fetch('https://ingest.lightstep.com/traces/otlp/v0.9', {
method: 'POST',
headers: {
'lightstep-access-token': '<your api key>',
'content-type': 'application/json',
},
body: JSON.stringify(payload),
}),
);
configure('my-service');
const trace = tracer('app');
await report(lightstep);
🤔 Motivation
To clarify, rian is the Irish word for "trace".
In our efforts to be observant citizens, we often rely on tools such as NewRelic, Lightstep, and Datadog. However, these tools can be bloated and slow, often performing unnecessary work and driving up costs, since every span you send costs money.
This is where rian comes in as a lightweight, fast, and effective tracer inspired by industry giants OpenTracing and OpenTelemetry. These frameworks were designed to abstract the telemetry part from vendors, allowing libraries to be instrumented without needing to know about the vendor.
Rian does not intend to align or compete with them; its goals are slightly different. Rian aims to be used exclusively for instrumenting your application, particularly its critical business paths. While rian can scale to support more complex constructs, profiler tools are better suited for those jobs. Rian's primary design goal is to provide better insight into your application's behavior, particularly for edge or service workers where a lean tracer is favored.
By design, rian does not handle injecting W3C trace-context or propagating baggage, but it does expose APIs for achieving this.
💨 Benchmark
via the /bench directory with Node v17.2.0
Validation :: single span
✔ rian
✔ rian/async
✔ opentelemetry
Benchmark :: single span
rian x 277,283 ops/sec ±3.57% (90 runs sampled)
rian/async x 279,525 ops/sec ±2.33% (91 runs sampled)
opentelemetry x 155,019 ops/sec ±13.13% (70 runs sampled)
Validation :: child span
✔ rian
✔ rian/async
✔ opentelemetry
Benchmark :: child span
rian x 146,793 ops/sec ±3.38% (87 runs sampled)
rian/async x 180,488 ops/sec ±1.64% (92 runs sampled)
opentelemetry x 102,541 ops/sec ±9.77% (73 runs sampled)
And please... I know these results are anything but the full story, but they offer a number and a point of comparison.
License
MIT © Marais Rossouw
Disclaimer
- NewRelic is a registered trademark of https://newrelic.com/ and is not affiliated with this project.
- Datadog is a registered trademark of https://www.datadoghq.com/ and is not affiliated with this project.
- Lightstep is a registered trademark of https://lightstep.com/ and is not affiliated with this project.