
loggme

v2.1.0

Published

LoggMe is a fool-proof JSON logging utility for Node.js with colored output, file logging, and Kafka broker support.

Downloads

247

Readme

LoggMe

LoggMe is a fool-proof JSON logging utility for Node.js with colored output, file logging, and Kafka broker support.

Installation

npm install loggme

Features

  • Fool-proof, simple configuration.
  • Stdout, file, and Kafka broker logging support.
  • Easy centralized logging for Node.js microservices.

Usage

In a very basic setup, import the logging library and call the debug method.

const loggMe = require('loggme');
loggMe.debug('Hi. You just logged this message in debug level.')

Output

DEBUG
{"level":"DEBUG","msg":"Hi. You just logged this message in debug level.","time":"2024-09-04T21:55:47.310Z"}

No Configuration

When no configuration is provided, the defaults below are used.

Default Configurations

  • logLevel = 'DEBUG'
  • timeFormat = 'IsoString'
  • logFormat = 'formatters.dev'
  • stream = createConsoleStream()
const loggMe = require('loggme');

/**
 * Log only string
 * 
 * syntax;
 * loggMe.<debug|info|warn|error|fatal>(string_Message)
 */
loggMe.debug('Log this debug message.')

/**
 * Log json fields with message
 *
 * syntax;
 * loggMe.<debug|info|warn|error|fatal>({jsonObj}, string_Message)
 */
loggMe.debug({field1:"value1"},'Log this debug message.')

/**
 * Log errors
 *
 * syntax;
 * loggMe.<debug|info|warn|error|fatal>(error_Object)
 */
loggMe.debug(new Error("Something unexpected happened."))

To experiment with all possible variations, see examples/1_no_configuration.js.

LogLevel

In production, only the selected logLevel and above are logged.

Example:

If logLevel is set to ERROR in production, only the loggMe.error and loggMe.fatal calls produce output.

debug, info, and warn can still be called, but they perform no operation.

Possible log levels:

  • ['DEBUG', 'INFO', 'WARN', 'ERROR', 'FATAL']

Operational levels after logLevel is set to ERROR:

  • ['~~DEBUG~~', '~~INFO~~', '~~WARN~~', 'ERROR', 'FATAL']
const loggMe = require('loggme');
loggMe.setLogLevel('ERROR');

loggMe.debug({field1:"value1"},'Log this debug message.')  // no operation (DEBUG index < ERROR index)
loggMe.fatal({field1:"value1"},'Log this fatal message.')  // logged in console (FATAL index >= ERROR index)

To experiment with all possible variations, see examples/2_set_logLevel.js.
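The filtering rule above can be sketched in plain JavaScript. This is a hypothetical illustration of the documented semantics, not loggMe's actual source:

```javascript
// Levels ordered from least to most severe, as documented above.
const LEVELS = ['DEBUG', 'INFO', 'WARN', 'ERROR', 'FATAL'];

// A logging call is operational when its level's index is at or
// above the configured logLevel's index.
function isOperational(callLevel, configuredLevel) {
  return LEVELS.indexOf(callLevel) >= LEVELS.indexOf(configuredLevel);
}

console.log(isOperational('DEBUG', 'ERROR')); // false -> no operation
console.log(isOperational('ERROR', 'ERROR')); // true  -> logged
console.log(isOperational('FATAL', 'ERROR')); // true  -> logged
```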

LogFormat

Possible Log Formats

  • ['dev', 'json']

By default, the dev logFormat is applied. It provides colorful console output with a logLevel prefix.

To use a file stream or Kafka stream, set logFormat to json.

const loggMe = require('loggme');

loggMe.setLogFormat('json')      // Option 1
loggMe.debug('Log this debug message.')

To experiment with all possible variations, see examples/3_setLogLevel.js.

TimeFormat

By default, the IsoString time format is applied.

Possible Time Formats

  • ['IsoString', 'unixTimestamp']
const loggMe = require('loggme');

/**
 * set timeFormat to isoString
 */
loggMe.setTimeFormat("IsoString")
loggMe.debug('Log this debug message.')
// Output: {.... , "time":"2024-09-04T14:10:42.277Z"}

/**
 * set timeFormat to unixTimestamp
 */
loggMe.setTimeFormat("unixTimestamp")
loggMe.info('Log this info message.')
// Output: {....., "time":1725459042277}

To experiment with all possible variations, see examples/4_set_time_format.js.

FileStream

Give the path of the file as an argument to createFileStream.

const loggMe = require('loggme');

/**
 * Create a json logging stream to a file. 
 * The colorful dev format does not render well in a file.
 */
loggMe.createFileStream('./logs.txt')
    .setLogFormat('json')

loggMe.info('Log this info message.')
loggMe.fatal(new Error("Something unexpected happened."))

To experiment with all possible variations, see examples/5_create_file_stream.js.

KafkaStream

loggMe uses the kafkajs library for Kafka streaming because, as of now, Node.js has no official Kafka client library. The Confluent team's client is still in an early-access development stage; as soon as it is ready, confluent-kafka-javascript will be used in later versions of loggMe.

Define the broker addresses in an array, the topic, and optionally the client ID. Then set logFormat to json as below.

const loggMe = require('loggme');

// create a kafka stream
loggMe.createKafkaStream(['localhost:9092'], 'your-kafka-topic', 'clientid')
    .setLogFormat('json') 

loggMe.debug('Log this debug message.')

To experiment with all possible variations, see examples/6_create_kafka_stream.js.

Centralized Logging

In a microservices architecture, it is advisable to aggregate the logs from all microservices and store them in a single location. Since LoggMe supports Kafka logging, you can produce all your microservice logs to Kafka and collect them in a centralized ELK stack. This way you can monitor all your Node.js microservices in one place.
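For aggregation to work, every service should emit the same JSON record shape so the central store can filter by origin. The sketch below illustrates one such shared shape; the makeLogRecord helper and the field names are illustrative assumptions, not part of loggMe's API:

```javascript
// Hypothetical shared record shape for centralized logging: every
// microservice writes the same JSON fields, so a central store (e.g. ELK)
// can filter and group by the "service" field.
function makeLogRecord(service, level, msg) {
  return JSON.stringify({ service, level, msg, time: Date.now() });
}

// Each microservice would produce records like this to the shared Kafka topic.
const record = JSON.parse(makeLogRecord('orders-api', 'INFO', 'order created'));
console.log(record.service); // orders-api
console.log(record.level);   // INFO
```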