
jk-kafka-beta

v1.0.1

Published

jk-kafka-beta is a lightweight, reliable package that helps you easily consume and produce messages in AWS MSK (Managed Streaming for Apache Kafka) with Schema Registry support.

Downloads: 3

Jai Kisan Kafka Package (jk-kafka)

Jai Kisan Kafka Package is a simple and intuitive package that allows you to easily consume and produce messages in AWS MSK (Managed Streaming for Apache Kafka) and validate them against a JSON schema.

Installation

To install the jk-kafka package, you can use npm:

npm install jk-kafka

Define env file

To use the package, define a .env file in the project root. Here is a sample:

BROKERS=broker1:9092,broker2:9092
CLIENT_ID=kafka-producer
SCHEMA_REGISTRY_HOST=http://schema-registry:8081
PRODUCER_CLIENT_ID=kafka
PRODUCER_METADATA_BROKER_LIST=broker1:9092,broker2:9092
CONSUMER_GROUP_ID=my-group
CONSUMER_METADATA_BROKER_LIST=broker1:9092,broker2:9092
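
The package presumably reads these values from process.env (for example via dotenv). BROKERS is a comma-separated list; as an illustration only (how jk-kafka parses its configuration internally is an assumption here), such a value can be split into an array of broker addresses like this:

```javascript
// Illustration: parsing the comma-separated BROKERS value from the
// sample .env above into an array of broker addresses.
process.env.BROKERS = process.env.BROKERS || 'broker1:9092,broker2:9092';

const brokers = process.env.BROKERS.split(',').map((b) => b.trim());
console.log(brokers); // [ 'broker1:9092', 'broker2:9092' ]
```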

Producer

To use the producer, you can import kafkaProducer and call its sendMessage method. Here's an example:

const { kafkaProducer } = require('jk-kafka');

const producerExec = async () => {
  const requestBody = {
    topic: 'my-topic-1',
    message: {
      id: 1,
      name: 'Jai-Kisan',
      email: 'user@example.com',
    },
    partition: 0,
    isValidationEnabled: false,
    key: 'my-key',
    schemaId: 22,
  };

  await kafkaProducer.connect();
  const result = await kafkaProducer.sendMessage(requestBody);
  console.log(result);
  await kafkaProducer.disconnect();
};

producerExec();

In the example above, we first import kafkaProducer from jk-kafka. We then define an object with the topic, message, partition, isValidationEnabled, key, and schemaId fields and pass it to the sendMessage method. The method sends the message to the specified topic and returns an object with the message and the producer's response. Finally, the producer is disconnected using the disconnect method to free up resources.
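
When isValidationEnabled is true, the message is validated against the schema identified by schemaId in the Schema Registry. The sketch below is illustrative only: it shows the kind of shape check such validation performs, with a hard-coded schema standing in for one fetched from the registry (the field names and logic are assumptions, not the package's actual implementation):

```javascript
// Illustrative stand-in for a registry-fetched schema.
const schema = {
  required: ['id', 'name', 'email'],
  types: { id: 'number', name: 'string', email: 'string' },
};

// Check that required fields exist and have the expected types.
function validateMessage(message, schema) {
  const errors = [];
  for (const field of schema.required) {
    if (!(field in message)) errors.push(`missing field: ${field}`);
  }
  for (const [field, type] of Object.entries(schema.types)) {
    if (field in message && typeof message[field] !== type) {
      errors.push(`field ${field} should be ${type}`);
    }
  }
  return { valid: errors.length === 0, errors };
}

console.log(validateMessage({ id: 1, name: 'Jai-Kisan', email: 'user@example.com' }, schema));
// { valid: true, errors: [] }
```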

Consumer

To use the general consumer, you can import kafkaConsumer and call its connect, subscribe, and consumeMessage methods. Here's an example:

const { kafkaConsumer } = require('jk-kafka');

const config = {
  // Define the onMessage callback function
  onMessage: (message) => {
    console.log(`Received message: ${JSON.stringify(message)}`);
  },
};

const subscribe = {
  topic: 'my-topic-1',
};

const consumerExec = async () => {
  try {
    await kafkaConsumer.connect();
    await kafkaConsumer.subscribe(subscribe);
    await kafkaConsumer.consumeMessage(config);
  } catch (error) {
    console.error(error);
  } finally {
    await kafkaConsumer.disconnect();
  }
};

consumerExec();

In the example above, we first import kafkaConsumer from jk-kafka. We then call the connect method to connect to the broker and the subscribe method to subscribe to the topic. Finally, consumeMessage is called with the predefined onMessage callback, which runs for each message received.

Consumer (Lambda)

To use the Lambda consumer, you can import kafkaLambdaConsumer and call its consumeMessage method. Here's an example:

const { kafkaLambdaConsumer } = require('jk-kafka');

exports.handler = async (event, context) => {
  // Define the onMessage callback function
  const config = {
    event: event,
    onMessage: (message) => {
      console.log(`Received message: ${JSON.stringify(message)}`);
    }
  };
  
  try {
    await kafkaLambdaConsumer.consumeMessage(config);
  } catch (error) {
    console.error(error);
  }
};

In the example above, we first import kafkaLambdaConsumer from jk-kafka. The Lambda function acts as the consumer and is configured via an MSK trigger (the consumer group and topic are defined in the Lambda's MSK trigger configuration). consumeMessage is called with the incoming event and the predefined onMessage callback, which runs for each message received.
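
An MSK trigger delivers records in batches, with message values base64-encoded. The sketch below shows a hand-written sample event following AWS's documented MSK event shape and how a value can be decoded; jk-kafka presumably performs this decoding for you, so this is background illustration, not the package's code:

```javascript
// Hand-written sample event in the shape Lambda receives from an MSK
// trigger: records grouped by "topic-partition", values base64-encoded.
const event = {
  eventSource: 'aws:kafka',
  records: {
    'my-topic-1-0': [
      {
        topic: 'my-topic-1',
        partition: 0,
        offset: 15,
        value: Buffer.from(JSON.stringify({ id: 1, name: 'Jai-Kisan' })).toString('base64'),
      },
    ],
  },
};

// Decode every record value in the batch back into a JS object.
const messages = Object.values(event.records)
  .flat()
  .map((record) => JSON.parse(Buffer.from(record.value, 'base64').toString('utf8')));

console.log(messages); // [ { id: 1, name: 'Jai-Kisan' } ]
```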

Contributing

Contributions are welcome! If you have any issues or feature requests, please open an issue on the GitHub repository. If you'd like to contribute code, please fork the repository and submit a pull request.

Local Setup

To contribute to development or run the project locally, install the required npm packages:

npm install

Run Test

ts-node producer.ts
ts-node consumer.ts

License

Jai Kisan Kafka Package is MIT licensed.