
trust-logger-ba (TrustLogger)

Logger implementation for digital platforms with a cloud-native architecture.

What it is

trust-logger-ba is a simple logging library built specifically for the practical part of a bachelor thesis on data sovereignty. The main idea is to use the logger on every backend component that interacts with or has access to data. An (at least) semi-trustworthy platform operator might add the logger to increase the transparency and traceability of the platform's data handling. The logs can then be provided to the associated user (e.g. the data owner), who can use the information they contain to make an informed assessment of the data protection the platform offers. The logs could also be used to hold an agent accountable in case of SLA violations.

What it is not

The TrustLogger is not a production-ready library. It is rather a rough implementation used to demonstrate the feasibility of the concept developed in the thesis. Please refer to winston if you are looking for a production-ready logger.

Usage

To use the TrustLogger, a new instance must be created. This requires declaring the format of the log (see formats), one or more transports to be used (see transports), and the source that is generating the log.

const TrustLogger = require("trust-logger-ba");

// create a new TrustLogger
// the required parameters are:
//  - format: The format the log should be formatted to (refer to formats)
//  - transports: The transport mechanism used to transport the logs (refer to transports)
//  - source: The unique name of the source that is generating the log
const Logger = new TrustLogger({
  format: "standardFormat",
  transports: [
    {
      name: "kafkaTransport",
      meta: {
        kafkaBroker: "kafka:9092",
        kafkaClientId: "data-management",
        logTopic: "logs",
      },
    },
    {
      name: "consoleTransport",
      meta: {},
    },
  ],
  source: "data-management",
});

// [...]

// the generation of a log entry
// Each call of the log method requires:
//  - category: Category of the log (defined in the format - e.g. 'debug')
//  - payload: Object containing all the further information 
//    (also based on the format - in this case standardFormat)
// ----
// The parameters are usually fetched from the request or created internally;
// this static implementation with strings is only for demonstration purposes
const logPayload = {
  user_name: "jwatson",
  user_ip: "203.0.113.254",
  session: "YWRtaW46YWRtaW4",
  status: "success",
  data_owner: "jwatson",
  data_id: "405ophkklw5s879",
  data_name: "ExampleData.png",
  reason: "data was uploaded",
};
Logger.log("Store", logPayload);

// Depending on the format, further information might be added to the log;
// in the case of the standard format this is: time, source_ip and priority

formats

The formats define the structure and content of a log entry. The format developed in the thesis can be accessed through the name "standardFormat".

standardFormat

| Field       | Example values                              |
| ----------- | ------------------------------------------- |
| source_name | FileService                                 |
| user_name   | jwatson                                     |
| user_ip     | 203.0.113.254                               |
| session     | YWRtaW46YWRtaW4                             |
| status      | success                                     |
| data_id     | 405ophkklw5s879                             |
| data_name   | ExampleData                                 |
| data_owner  | jwatson                                     |
| reason      | file upload                                 |
| category    | create (not added to payload)               |
| source_ip   | 192.0.2.10 (not added to payload)           |
| time        | 2021-12-02T11:12:13Z (not added to payload) |
| priority    | 1 (not added to payload)                    |
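
The README does not spell out what a finished entry looks like. Purely as an illustration, and assuming the entry is emitted as a flat object with the field names from the table above, the Logger.log("Store", logPayload) call from the usage example might produce something like:

// Hypothetical only: a possible standardFormat entry for the
// Logger.log("Store", logPayload) call from the usage example.
// Field names come from the table above; the concrete output shape and the
// source_ip, time and priority values are assumptions, not documented behavior.
const exampleEntry = {
  source_name: "data-management",   // source passed to the constructor
  user_name: "jwatson",
  user_ip: "203.0.113.254",
  session: "YWRtaW46YWRtaW4",
  status: "success",
  data_id: "405ophkklw5s879",
  data_name: "ExampleData.png",
  data_owner: "jwatson",
  reason: "data was uploaded",
  category: "Store",                // category passed to Logger.log
  source_ip: "192.0.2.10",          // added by the standard format
  time: "2021-12-02T11:12:13Z",     // added by the standard format
  priority: 1,                      // added by the standard format
};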

transports

The transports are classes which can be used to send the log to a data stream, a file, etc. At the moment, logs can be sent to a Kafka stream and to the console of the component using the logger. Each transport has a unique name and a meta object containing all of its parameters. The transports array must contain at least one of these transport objects. The content of the meta object depends on the transport mechanism.

// [...]
transports: [
  {
    name: "kafkaTransport",
    meta: {
      kafkaBroker: "kafka:9092",
      kafkaClientId: "data-management",
      logTopic: "logs",
    },
  },
  {
    name: "consoleTransport",
    meta: {},
  },
],
// [...]

It is also possible to use multiple transports.

kafkaTransport

Can be used to send the log to a Kafka stream. The meta object must contain the address of the kafkaBroker, a kafkaClientId, and the logTopic (the Kafka topic to log to).
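
How the logs are consumed afterwards is outside the scope of this package. As a rough sketch only, a separate service could read the entries back with the kafkajs client (an assumption; any Kafka consumer would work), using the same broker address and logTopic configured in the transport above:

const { Kafka } = require("kafkajs");

// Hypothetical consumer, not part of trust-logger-ba: reads the entries
// that the kafkaTransport above writes to the "logs" topic.
const kafka = new Kafka({ clientId: "log-viewer", brokers: ["kafka:9092"] });
const consumer = kafka.consumer({ groupId: "log-viewers" });

async function readLogs() {
  await consumer.connect();
  await consumer.subscribe({ topic: "logs", fromBeginning: true });
  await consumer.run({
    // each message value is expected to hold one formatted log entry
    eachMessage: async ({ message }) => {
      console.log(message.value.toString());
    },
  });
}

readLogs();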

consoleTransport

Just prints the logs to the console.

Installation

npm install trust-logger-ba