
pino-kafka

This module provides a "transport" for pino that simply forwards messages to Kafka.

You should install pino-kafka globally for ease of use:

$ npm install --production -g pino-kafka
# or with yarn
$ yarn global add pino-kafka

Requirements

This library depends on node-rdkafka. Have a look at node-rdkafka requirements.

Usage

CLI

Given an application foo that logs via pino, and a Kafka broker listening on 10.10.10.5:9200, you would use pino-kafka as:

$ node foo | pino-kafka -b 10.10.10.5:9200

Programmatic Usage

Initialize pino-kafka and pass it to pino.

const pino = require('pino')
const pkafka = require('pino-kafka')

const logger = pino({}, pkafka({ brokers: "10.10.10.5:9200" }))

Options

  • --brokers (-b): broker list for the Kafka producer, as comma-separated hosts.
  • --defaultTopic (-d): default topic name for Kafka.
  • --timeout (-t): timeout for the initial broker connection, in milliseconds. Default: 10000.
  • --echo (-e): echo the received messages to stdout. Default: false.
  • --settings: path to a JSON config file. Have a look at the Settings JSON File section for details and examples.
  • --kafka.$config: any Kafka configuration can be passed with the kafka. prefix. Please visit the node-rdkafka configuration for available options. Note that only producer and global configuration properties will be used. Have a look at the Kafka Settings section for details and examples.

Settings JSON File

The --settings switch can be used to specify a JSON file that contains a hash of settings for the application. A full settings file is:

{
  "brokers": "10.6.25.11:9092, 10.6.25.12:9092",
  "defaultTopic": "blackbox",
  "kafka": {
    "compression.codec":"none",
    "enable.idempotence": "true",
    "max.in.flight.requests.per.connection": 4,
    "message.send.max.retries": 10000000,
    "acks": "all"
  }
}
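Since the settings file is plain JSON, its shape is easy to inspect with a quick parse. A minimal sketch using the example values above (not part of pino-kafka itself):

```javascript
// Parse a settings object shaped like the example above and read fields back.
const raw = `{
  "brokers": "10.6.25.11:9092, 10.6.25.12:9092",
  "defaultTopic": "blackbox",
  "kafka": { "acks": "all" }
}`;

const settings = JSON.parse(raw);
console.log(settings.defaultTopic); // → blackbox
console.log(settings.kafka.acks);   // → all
```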

Note that command line switches take precedence over settings in a settings file. For example, given the settings file:

{
  "brokers": "my.broker",
  "defaultTopic": "test"
}

And the command line:

$ yes | pino-kafka -s ./settings.json -b 10.10.10.11:9200

The connection will be made to address 10.10.10.11:9200 with the default topic test.
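That precedence behaves like a shallow merge in which CLI switches are applied last. A hypothetical sketch of the idea, not pino-kafka's actual implementation:

```javascript
// File settings are loaded first; CLI switches override matching keys.
const fileSettings = { brokers: "my.broker", defaultTopic: "test" };
const cliSettings = { brokers: "10.10.10.11:9200" };

// Object spread: later sources win on key conflicts.
const effective = { ...fileSettings, ...cliSettings };

console.log(effective.brokers);      // → 10.10.10.11:9200
console.log(effective.defaultTopic); // → test
```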

Kafka Settings

You can pass node-rdkafka producer configuration by prefixing the property with kafka. For example:

$ yes | pino-kafka --kafka.retries=5 --kafka.retry.backoff.ms=500

In the settings JSON file, you can use the following:

{
  "kafka": {
    "retries": "5",
    "retry.backoff.ms": "500"
  }
}

The following will also work:

{
  "kafka": {
    "retries": "5",
    "retry":{
      "backoff": {
        "ms":  "500"
      }
    }
  }
}
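Both forms describe the same keys: the nested object is equivalent to the dotted one once it is flattened. A hypothetical sketch of that flattening (the flatten helper below is illustrative, not pino-kafka's code):

```javascript
// Flatten a nested settings object into dotted rdkafka-style keys.
function flatten(obj, prefix = "") {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === "object") {
      Object.assign(out, flatten(value, path)); // recurse into nested objects
    } else {
      out[path] = value;
    }
  }
  return out;
}

const flat = flatten({ retries: "5", retry: { backoff: { ms: "500" } } });
console.log(flat["retry.backoff.ms"]); // → 500
console.log(flat.retries);             // → 5
```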

Accessing Internal Kafka Producer

You can access the node-rdkafka producer from the pino stream via _kafka.

For example:

const pino = require('pino')
const pkafka = require('pino-kafka')

const logger = pino({}, pkafka({ brokers: "10.10.10.5:9200" }))

logger[pino.symbols.streamSym]._kafka.getMetadata({}, (err, data) => {
    //...
})

Testing

To run the tests, make sure you have installed the dependencies with npm install or yarn and have a running Kafka broker. If you have docker and docker-compose installed, you can start one with the following:

$ cd pino-kafka
$ docker-compose up -d

Look at docker-compose file for more details.

Once everything is set up, run the tests with:

$ npm run test
# or with yarn
$ yarn test

NOTE: If you use your own Kafka setup, you may need to adjust the test configuration (IP, topic, etc.) to your needs.

License

MIT