

upstash-kafka

Serverless kafka client for upstash


This project is written for the Deno runtime, then transpiled to Node and published as a package on npm.

Requirements

Either Deno 1.x or Node 14.x or higher

Installation

Deno

import { Kafka } from "https://deno.land/x/upstash_kafka/mod.ts"

Node

npm install @chronark/upstash-kafka
yarn add @chronark/upstash-kafka
pnpm add @chronark/upstash-kafka

You get the idea.

Quickstart

Auth

  1. Go to Upstash and select your database.
  2. Copy the REST API secrets at the bottom of the page and use them to configure the client:

import { Kafka } from "@chronark/upstash-kafka";

const kafka = new Kafka({
  url: "<UPSTASH_KAFKA_REST_URL>",
  username: "<UPSTASH_KAFKA_REST_USERNAME>",
  password: "<UPSTASH_KAFKA_REST_PASSWORD>",
});
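
In practice you would probably load these values from environment variables rather than hardcoding them. A minimal sketch, assuming Node and that the variables are named after the placeholders above:

import { Kafka } from "@chronark/upstash-kafka";

// Variable names are assumed to match the placeholders above.
const kafka = new Kafka({
  url: process.env.UPSTASH_KAFKA_REST_URL,
  username: process.env.UPSTASH_KAFKA_REST_USERNAME,
  password: process.env.UPSTASH_KAFKA_REST_PASSWORD,
});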

Produce a single message

const p = kafka.producer();
const message = { hello: "world" }; // Objects will get serialized using `JSON.stringify`
const res = await p.produce("<my.topic>", message);

// Optionally pass metadata along with the message:
const resWithOptions = await p.produce("<my.topic>", message, {
  partition: 1,
  timestamp: 12345,
  key: "<custom key>",
  headers: [{ key: "traceId", value: "85a9f12" }],
});

Produce multiple messages

The same options from the example above can be set for every message.

const p = kafka.producer();
const res = await p.produceMany([
  {
    topic: "my.topic",
    value: "my message",
    // ...options
  },
  {
    topic: "another.topic",
    value: "another message",
    // ...options
  },
]);

Consume

The first time a consumer is created, it needs to figure out the group coordinator by asking the Kafka brokers, and then it joins the consumer group. This process takes some time to complete. That's why a newly created consumer instance may return empty messages until consumer group coordination is complete.

const c = kafka.consumer();
const messages = await c.consume({
  consumerGroupId: "group_1",
  instanceId: "instance_1",
  topics: ["test.topic"],
  autoOffsetReset: "earliest",
});
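
Because of the coordination delay described above, the first call may come back empty. A simple polling sketch (plain retry logic, not a library feature, and assuming consume resolves to an array of messages as the commit example below suggests) that waits until messages arrive:

const c = kafka.consumer();

const consumeOnce = () =>
  c.consume({
    consumerGroupId: "group_1",
    instanceId: "instance_1",
    topics: ["test.topic"],
    autoOffsetReset: "earliest",
  });

let messages = await consumeOnce();
while (messages.length === 0) {
  // Back off briefly while the consumer group finishes coordinating.
  await new Promise((resolve) => setTimeout(resolve, 1000));
  messages = await consumeOnce();
}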

More examples can be found in the docstrings.

Commit manually

While consume can handle committing automatically, you can also use Consumer.commit to commit manually.

const consumerGroupId = "mygroup";
const instanceId = "myinstance";
const topic = "my.topic";

const c = kafka.consumer();
const messages = await c.consume({
  consumerGroupId,
  instanceId,
  topics: [topic],
  autoCommit: false,
});

for (const message of messages) {
  // message handling logic

  await c.commit({
    consumerGroupId,
    instanceId,
    offset: {
      topic: message.topic,
      partition: message.partition,
      offset: message.offset,
    },
  });
}
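
If you process a whole batch before committing, you can also commit once per topic/partition with the highest offset seen, instead of once per message. A sketch, assuming messages carry the topic, partition, and offset fields used above:

const latest = new Map();

for (const message of messages) {
  // ... message handling logic

  // Track the highest offset seen per topic/partition.
  const key = `${message.topic}:${message.partition}`;
  const prev = latest.get(key);
  if (!prev || message.offset > prev.offset) {
    latest.set(key, {
      topic: message.topic,
      partition: message.partition,
      offset: message.offset,
    });
  }
}

for (const offset of latest.values()) {
  await c.commit({ consumerGroupId, instanceId, offset });
}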

Fetch

You can also manage offsets manually by using Consumer.fetch:

const c = kafka.consumer();
const messages = await c.fetch({
  topic: "greeting",
  partition: 3,
  offset: 42,
  timeout: 1000,
});
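
Since fetch does not involve a consumer group, you are responsible for advancing the offset between calls. A minimal sketch that pages through a partition, reusing the fields from the example above:

let offset = 42;
while (true) {
  const messages = await c.fetch({
    topic: "greeting",
    partition: 3,
    offset,
    timeout: 1000,
  });
  if (messages.length === 0) break;

  for (const message of messages) {
    // ... message handling logic
  }

  // Continue after the last message received.
  offset = messages[messages.length - 1].offset + 1;
}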

Examples

There is a minimal working example application available in /example as well as various examples in the docstrings of each method.

Contributing

Setup

  1. Create a Kafka instance on Upstash (see the docs).
  2. Create the following topics: blue, red, green (see the docs).
  3. Create a .env file with your Kafka secrets: cp .env.example .env (a sketch of plausible contents follows).
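
The secrets are the same three REST values used in the Quickstart. A plausible .env layout (the variable names are assumed from the placeholders above; .env.example is authoritative):

UPSTASH_KAFKA_REST_URL="https://..."
UPSTASH_KAFKA_REST_USERNAME="..."
UPSTASH_KAFKA_REST_PASSWORD="..."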

Running tests

make test

Building for node

make build

A /npm folder will be created with the built Node module. As part of the build process, the tests are run against your installed Node version. To disable this, you can configure the build pipeline in /cmd/build.ts:

// ...
await build({
  test: false, // <-- add this
  // ... remaining config
});