
@solanafm/internal-ts-logger v2.0.0

Internal pino logger to push structured nd-json logs to any target

internal-ts-logger

A pino wrapper logging utility for JavaScript/TypeScript services

Features

  • Logs in nd-json format for easy parsing
  • Singleton logger instances per service, cached in a Map
  • Multiple configurable output targets: file, Loki, or stdout
  • Customizable log levels for each target
  • Loggers listen at trace level on initialization; log levels are filtered at each target
  • Defaults to log level info for all targets
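
The singleton-per-service behaviour in the list above can be sketched roughly like this (a hypothetical illustration; `getLogger` here is a stand-in, not the package's actual implementation):

```typescript
// A Map caches one logger per service name, so repeated getLogger calls
// for the same service return the same instance.
type Logger = { name: string; info: (msg: string) => void };

const registry = new Map<string, Logger>();

function getLogger(name: string): Logger {
  let logger = registry.get(name);
  if (!logger) {
    logger = { name, info: (msg) => console.log(JSON.stringify({ name, msg })) };
    registry.set(name, logger);
  }
  return logger;
}

const a = getLogger("MyService");
const b = getLogger("MyService");
console.log(a === b); // true — same cached instance
```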

Logging Discipline

+--------------------------------------------------+
|   0        1        2       3       4        5   |
| FATAL <- ERROR <- WARN <- INFO <- DEBUG <- TRACE |
|                                                  |
+--------------------------------------------------+

fatal (level 0 above) is the most severe level and will always be logged, regardless of the log level pino is listening at.

It's important to decide which log levels you want to capture. By default, targets consume at the info level; to capture debug or trace entries, set the target's log level to debug or trace respectively.
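
The filtering rule above can be sketched with pino's standard numeric levels (an assumption: this package may map levels differently, but stock pino uses trace=10 through fatal=60):

```typescript
// Pino assigns each level a number; a target listening at "info" only
// emits entries whose numeric level is at least info's (30).
const LEVELS: Record<string, number> = {
  trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60,
};

function shouldLog(entryLevel: string, targetLevel: string): boolean {
  return LEVELS[entryLevel] >= LEVELS[targetLevel];
}

console.log(shouldLog("debug", "info")); // false — filtered at the target
console.log(shouldLog("fatal", "trace")); // true — fatal always passes
```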

Here are some simple guidelines on what to log for each log level:

  • FATAL - The most severe issues in an application. This indicates a critical failure that prevents the application from functioning. These entries should be logged before the application crashes.
  • ERROR - Errors that should be investigated and resolved in due time. These entries indicate that the application is unable to perform a specific function or operation. After an error is logged, the application should still be able to function, albeit at a reduced level of functionality or performance.
    • However, if an exception or entry is expected behaviour and does not degrade application functionality or performance, it should not be logged as an error, but at a lower log level
    • Additionally, errors with the potential to recover can be logged as warn; if multiple recovery attempts fail, they can then be logged as error
  • WARN - Events logged at this level should indicate that something unexpected has occurred, but the application can continue to function normally without any performance impact or degradation. These entries should signify conditions that should be investigated promptly.
  • INFO - Events that are captured in this log level should show that the application is operating normally. These entries should be used to provide information about the application's state and its operations.
    • Events typically logged at this level include:
      • Successful completion of a specific operation
      • Progress update for a long-running operation
      • Information about the application's state
  • DEBUG - This log level can be used to log messages that aid developers in identifying issues during a debugging session. They usually contain detailed information to troubleshoot any problems efficiently. This can include various variables' states within the scope they are investigating.
  • TRACE - Events at this log level should only be used to trace the path of code execution within a program. Developers can primarily use this log level to trace network latencies between API calls or to trace how long a particular algorithm takes to execute. trace should be listened to sparingly as it will generate a significant output volume which can substantially increase log file size.
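
The warn-vs-error retry guideline above can be sketched as follows (a hypothetical helper, not part of this package):

```typescript
// Recoverable attempts log warn; only after retries are exhausted is the
// failure logged as error.
function attemptWithRetries(
  op: () => boolean,
  maxAttempts: number,
  log: (level: string, msg: string) => void
): boolean {
  for (let i = 1; i <= maxAttempts; i++) {
    if (op()) return true;
    if (i < maxAttempts) log("warn", `attempt ${i} failed, retrying`);
  }
  log("error", `all ${maxAttempts} attempts failed`);
  return false;
}

// Demo: fails twice, then succeeds on the third attempt.
let attempt = 0;
attemptWithRetries(() => ++attempt >= 3, 5, (level, msg) =>
  console.log(JSON.stringify({ level, msg }))
);
```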

Installation

npm install @solanafm/internal-ts-logger

yarn add @solanafm/internal-ts-logger

pnpm add @solanafm/internal-ts-logger

bun add @solanafm/internal-ts-logger

Usage

Basic Usage

By default, logs are pushed to stdout only.

import { LoggerFactory } from "@solanafm/internal-ts-logger";

const logger = LoggerFactory.getLogger("MyService");

logger.info("Hello, world!");

Pushing logs to Loki

import { LoggerFactory, StreamBuilder } from "@solanafm/internal-ts-logger";

// Create the streams first to push logs to loki
// Creating a logger to push to Loki and stdout
const streams = new StreamBuilder()
  .AddStdoutStream("debug")
  .AddLokiStream("debug", {
    // Replace LOKI_HOST with Loki URL without the path names
    host: LOKI_HOST,
    basicAuth: {
      username: LOKI_USERNAME,
      password: LOKI_PASSWORD
    },
    labels: {
      serviceName: "test-service",
    },
    batching: true,
  })
  .BuildStream();

const logger = LoggerFactory.getLogger("test-service", streams);

logger.info("Hello, world!");

Outputs

{
  "level": "info",
  "time": 1725952400439,
  "name": "test-service",
  "caller": "Object.<anonymous> (dist/index.js:76:14)",
  "msg": "Hello, world!"
}
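
Because each entry is a single nd-json line, a downstream consumer can parse the output line by line; a minimal sketch:

```typescript
// nd-json: one independent JSON object per line, so split on newlines
// and parse each line on its own.
const raw =
  '{"level":"info","time":1725952400439,"name":"test-service","msg":"Hello, world!"}\n' +
  '{"level":"error","time":1725952400440,"name":"test-service","msg":"boom"}';

const entries = raw
  .split("\n")
  .filter((line) => line.trim().length > 0)
  .map((line) => JSON.parse(line));

console.log(entries.length); // 2
console.log(entries[1].level); // "error"
```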

Pushing logs to Quickwit

import { LoggerFactory, StreamBuilder } from "@solanafm/internal-ts-logger";

// Create the streams first to push logs to Quickwit
// Creating a logger to push to Quickwit and stdout
const streams = new StreamBuilder()
  .AddStdoutStream("debug")
  .AddHttpStream("debug", {
    url: QUICKWIT_INGEST_URL,
    bodyType: "ndjson",
    headers: {
      "Content-Type": "application/json",
    },
    log: true,
  })
  .BuildStream();

const logger = LoggerFactory.getLogger("test-service", streams);

logger.info("Hello, world!");

Outputs

{
  "level": "info",
  "time": 1725952400439,
  "name": "test-service",
  "caller": "Object.<anonymous> (dist/index.js:76:14)",
  "msg": "Hello, world!"
}

Customizing Streams

You can customize the streams using LoggerOptionsBuilder.

The following options create a logger with three streams to push logs to: Quickwit, Loki, and stdout.

import {
  LoggerFactory,
  LoggerOptionsBuilder,
} from "@solanafm/internal-ts-logger";

const options = new LoggerOptionsBuilder()
  .AddStdoutStream("debug")
  .AddHttpStream("debug", {
    url: QUICKWIT_INGEST_URL,
    bodyType: "ndjson",
    headers: {
      "Content-Type": "application/json",
    },
    log: true,
  })
  .AddLokiStream("debug", {
    // Replace LOKI_HOST with Loki URL without the path names
    host: LOKI_HOST,
    basicAuth: {
      username: LOKI_USERNAME,
      password: LOKI_PASSWORD
    },
    labels: {
      serviceName: "test-service",
    },
    batching: true,
  })
  .BuildStream();

const logger = LoggerFactory.getLogger("test-service", options);

logger.info("Hello, world!");

Outputs

{
  "level": 30,
  "time": 1725952400439,
  "name": "test-service",
  "caller": "Object.<anonymous> (dist/index.js:76:14)",
  "msg": "Hello, world!"
}

Available Streams

Loki Stream

.AddLokiStream("debug", {
  // Replace LOKI_HOST with Loki URL without the path names
  host: LOKI_HOST,
  basicAuth: {
    username: LOKI_USERNAME,
    password: LOKI_PASSWORD
  },
  labels: {
    serviceName: "test-service",
  },
  batching: true,
})

Quickwit Stream

.AddHttpStream("debug", {
  url: QUICKWIT_INGEST_URL,
  bodyType: "ndjson",
  headers: {
    "Content-Type": "application/json",
  },
  log: true,
})

Stdout Stream

.AddStdoutStream("debug")

Logger Options

export type InternalLoggerOptions = {
  // When false, disables the msg prefix, so a log message will be "log" instead of "[msg-prefix] log"
  msgPrefix: boolean;
  // When false, disables level-label formatting, so an nd-json entry will carry level: 30 instead of level: "info"
  formatLevelLabels: boolean;
};

const logger = LoggerFactory.getLogger("test-service", streams, {
  msgPrefix: true,
  formatLevelLabels: true,
});