
kfk

v0.4.0

Published

The high-level Node.js Kafka client, based on node-rdkafka.

Downloads

80

Readme

Node-Kfk


Why I need it

Kafka is not friendly enough for programmers who don't have a deep knowledge of it.

Since our use cases are similar most of the time, we want to provide a simple client for simple use cases on Kafka.

Usage

Install

npm i kfk -S

Kafka Producer

const crypto = require('crypto')
const _ = require('lodash')
const { KafkaProducer } = require('kfk') // assumes kfk uses named exports

const conf = {
  'client.id': 'kafka',
  'metadata.broker.list': '127.0.0.1:9092',
}
const topicConf = {}
const options = {
  debug: false,
}

const producer = new KafkaProducer(conf, topicConf, options)

await producer.connect()

console.log('connected')

// Produce a timestamped random message to a randomly chosen test topic, forever.
while (true) {
  const msg = `${new Date().getTime()}-${crypto.randomBytes(20).toString('hex')}`

  await producer.produce(_.sample([
    'rdkafka-test0',
    'rdkafka-test1',
    'rdkafka-test2',
  ]), null, msg)
}

Kafka ALO (At Least Once) Consumer

const { KafkaALOConsumer } = require('kfk') // assumes kfk uses named exports

const conf = {
  'group.id': 'alo-consumer-test-1',
  'metadata.broker.list': '127.0.0.1:9092',
}
const topicConf = {
  'auto.offset.reset': 'largest',
}
const options = {
  debug: false,
}

const consumer = new KafkaALOConsumer(conf, topicConf, options)
await consumer.connect()
await consumer.subscribe([
  'rdkafka-test0',
  'rdkafka-test1',
  'rdkafka-test2',
])

while (true) {
  await consumer.consume(message => {
    console.log(`topic: ${message.topic} offset: ${message.offset} val: ${message.value.toString('utf-8')}`)
  }, {
    size: 10,
    concurrency: 5,
  })
}

Kafka AMO (At Most Once) Consumer

const { KafkaAMOConsumer } = require('kfk') // assumes kfk uses named exports

const conf = {
  'group.id': 'amo-consumer-test-1',
  'metadata.broker.list': '127.0.0.1:9092',
}
const topicConf = {
  'auto.offset.reset': 'largest',
}
const options = {
  debug: false,
}

const consumer = new KafkaAMOConsumer(conf, topicConf, options)
await consumer.connect()
await consumer.subscribe([
  'rdkafka-test0',
  'rdkafka-test1',
  'rdkafka-test2',
])

while (true) {
  await consumer.consume(message => {
    console.log(`topic: ${message.topic} offset: ${message.offset} val: ${message.value.toString('utf-8')}`)
  }, {
    size: 10,
    concurrency: 5,
  })
}

Graceful Death

// Flush and disconnect the producer and consumer cleanly before exiting.
const gracefulDeath = async () => {
  await producer.die()
  await consumer.die()
  process.exit(0)
}
process.on('SIGINT', gracefulDeath)
process.on('SIGQUIT', gracefulDeath)
process.on('SIGTERM', gracefulDeath)

Deep Dive

Choose your right consumer

node-kfk provides two consumer choices: KafkaALOConsumer and KafkaAMOConsumer. ALO means At Least Once; AMO means At Most Once.

At Least Once

If you cannot tolerate any message loss, and your consumer function already handles repeated execution, you probably want an at-least-once guarantee.

KafkaALOConsumer monitors the execution state of your consume callback. If an Error is thrown in the callback (or the process crashes), consumption resumes from the offsets you last consumed successfully.
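Handling repeated execution usually means making the callback idempotent. A minimal sketch (a hypothetical helper, not part of node-kfk) that skips messages already seen, keyed by topic/partition/offset:

```javascript
// Hypothetical idempotency wrapper: remembers which messages were already
// handled, so a redelivered message after a crash becomes a no-op.
const processed = new Set()

function idempotent(handler) {
  return (message) => {
    const key = `${message.topic}:${message.partition}:${message.offset}`
    if (processed.has(key)) return // already handled; safe to skip on redelivery
    handler(message)
    processed.add(key) // mark done only after the handler succeeded
  }
}

// Usage: wrap the consume callback before passing it to consumer.consume().
const handle = idempotent(message => console.log(message.offset))
handle({ topic: 'rdkafka-test0', partition: 0, offset: 1 }) // processed
handle({ topic: 'rdkafka-test0', partition: 0, offset: 1 }) // skipped
```

In a real system the `processed` set would need to be bounded or persisted; an in-memory Set is only enough to illustrate the idea.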

At Most Once

If you can tolerate a small amount of message loss when problems happen, but want to make sure that every message is handled at most once, just use KafkaAMOConsumer.

KafkaAMOConsumer auto-commits the offsets as soon as messages are fetched. It has better performance than KafkaALOConsumer, but does not guarantee that every message will be consumed.

Offset Management Detail

In KafkaAMOConsumer, node-kfk sets enable.auto.commit=true and enable.auto.offset.store=true, relying entirely on librdkafka to manage the offsets: the latest offsets are committed automatically at a fixed interval (controlled by auto.commit.interval.ms, default 1000 ms).

In KafkaALOConsumer, we still let librdkafka commit automatically, but we control the offset store manually (enable.auto.commit=true and enable.auto.offset.store=false). Only once node-kfk has ensured that all fetched messages were handled successfully does it store the latest offsets in the offset store, where they wait to be committed by librdkafka.
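The difference boils down to two librdkafka properties. A sketch of the effective settings implied by the description above (these objects are illustrative, not a node-kfk API):

```javascript
// At-most-once: librdkafka stores offsets as soon as messages are fetched,
// then commits them periodically — a crash mid-handling can lose messages.
const amoOffsetConf = {
  'enable.auto.commit': true,
  'enable.auto.offset.store': true,
}

// At-least-once: commits still happen automatically, but offsets enter the
// store only after the callback succeeds — a crash causes redelivery instead.
const aloOffsetConf = {
  'enable.auto.commit': true,
  'enable.auto.offset.store': false,
}
```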

Others

The client has been tested on:

- os: linux
  env: KAFKA_VERSION=0.10.2.2
  node_js: 8
- os: linux
  env: KAFKA_VERSION=0.10.2.2
  node_js: 10
- os: linux
  env: KAFKA_VERSION=0.11.0.3
  node_js: 10
- os: linux
  env: KAFKA_VERSION=1.1.0
  node_js: 10
- os: linux
  env: KAFKA_VERSION=2.0.0
  node_js: 10

More detailed documentation for the conf and topicConf params can be found in librdkafka and node-rdkafka.