
hapi-sse-kafka (v0.2.0)

Expose any Kafka topic as an SSE (Server-Sent Events) stream

Installation

npm install hapi-sse-kafka --save

Getting started

In a nutshell, hapi-sse-kafka constructs Hapi request handlers which can be registered as part of a route.

const Hapi = require('hapi')
const hapiSSEKafka = require('hapi-sse-kafka')

// bootstrap a Kafka client adapter
const noKafkaAdapter = new hapiSSEKafka.adapters.NoKafkaAdapter()

// construct the handler with topic, partition and adapter
const sseKafkaHandler = hapiSSEKafka.createHandler({topic: 'all', partition: 0, adapter: noKafkaAdapter})

// register the handler as part of a route definition in Hapi
const server = new Hapi.Server()
server.connection({port: 9100})
server.register([require('susie')])
    .then(() => server.route({path: '/events/streaming', method: 'GET', handler: sseKafkaHandler}))
    .then(() => server.start())

That's pretty much all there is to it!

As you can tell, it's not very opinionated: it's entirely up to you to define the path, method and other metadata before adding the route.

From now on, any message added to the Kafka topic will be broadcast through the /events/streaming SSE endpoint. The only prerequisite is that topic messages are compatible with the standard message structure described below.
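
As a quick check that the stream is live, you can consume the endpoint from the command line; the host and port below simply match the snippet above and may differ in your setup (curl's -N flag disables output buffering so events show up as they arrive).

curl -N http://localhost:9100/events/streaming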

Standard Message Structure

Currently hapi-sse-kafka expects keyed messages and maps them to SSE objects in the following way:

offset -> id
message key -> event
message value -> data

Make sure messages fit these requirements if you want to expose them using this lib.
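
To make the mapping concrete, here is a minimal sketch of producing a keyed message with no-kafka directly and what the resulting SSE object would look like. The topic, partition and connection string match the snippets in this document; the key, payload and offset are made-up example values.

const Kafka = require('no-kafka')

// hypothetical producer publishing to the topic the handler consumes from
const producer = new Kafka.Producer({connectionString: '192.168.99.100:9092'})

producer.init()
  .then(() => producer.send({
    topic: 'all',
    partition: 0,
    message: {
      key: 'books.insert',                    // mapped to the SSE event
      value: JSON.stringify({title: 'Dune'})  // mapped to the SSE data
    }
  }))

// A client subscribed to /events/streaming would then receive roughly:
//
//   id: 42                  (Kafka offset)
//   event: books.insert     (message key)
//   data: {"title":"Dune"}  (message value)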

Filtering

The hapi-sse-kafka request handler supports filtering events through query parameters.

Given the example snippet at the start of this document, the following request path:

http://localhost:9100/events/streaming?filter[event]=books.insert  

would only return SSE objects whose event matches books.insert.

Multiple events may be specified:

?filter[event]=books.insert,dvds.insert  

as well as regex values:

?filter[event]=books.*  
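
On the client side, a filtered stream can be consumed with the browser's standard EventSource API. A minimal sketch, assuming the route from the earlier snippet; note that named SSE events are only delivered to listeners registered for that exact event name:

const source = new EventSource('http://localhost:9100/events/streaming?filter[event]=books.insert')

// each books.insert message arrives as a named SSE event; event.data carries the Kafka message value
source.addEventListener('books.insert', (event) => {
  console.log(event.data)
})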

Dependencies

hapi-sse-kafka requires susie to be registered as a plugin, so make sure this dependency is installed and added to your server bootstrap routine.

 npm install susie --save 
server.register([require('susie')])

Adapters

hapi-sse-kafka uses the concept of adapters to support different Kafka clients. The built-in adapters can be accessed via the .adapters namespace:

const hapiSSEKafka = require('hapi-sse-kafka')
const noKafkaAdapter = new hapiSSEKafka.adapters.NoKafkaAdapter()

Currently the only supported client is no-kafka; however, more implementations will be added going forward.

No-kafka

https://github.com/oleksiyk/kafka

The NoKafkaAdapter handles the interaction with Kafka using a SimpleConsumer.

Any options passed in when NoKafkaAdapter is initialised are passed on as-is when the SimpleConsumer is initialised:


const options = {connectionString: '192.168.99.100:9092', maxWaitTime: 300}
const noKafkaAdapter = new hapiSSEKafka.adapters.NoKafkaAdapter(options)

For a complete list of available options, check out the no-kafka docs.

Example

Coming soon! Meanwhile, have a look at the tests; the Kafka infrastructure is fully dockerized and can be brought up with a single docker-compose command.

docker-compose up -d
mocha ./test