hapi-sse-kafka
Expose any Kafka topic as an SSE (Server-Sent Events) stream
Installation
npm install hapi-sse-kafka --save
Getting started
In a nutshell, hapi-sse-kafka constructs Hapi request handlers which can be registered as part of a route.
const Hapi = require('hapi')
const hapiSSEKafka = require('hapi-sse-kafka')

// bootstrap a Kafka client adapter
const noKafkaAdapter = new hapiSSEKafka.adapters.NoKafkaAdapter()

// construct the handler with topic, partition and adapter
const sseKafkaHandler = hapiSSEKafka.createHandler({topic: 'all', partition: 0, adapter: noKafkaAdapter})

// register the handler as part of a route definition in Hapi
const server = new Hapi.Server()
server.connection({port: 9100})
server.register([require('susie')])
  .then(() => server.route({path: '/events/streaming', method: 'GET', handler: sseKafkaHandler}))
  .then(() => server.start())
That's pretty much all there is to it!
Not very opinionated, as you can tell; it's entirely up to you to define the path, method and other metadata before adding the route.
From now on, any message added to the Kafka topic will be broadcast through the /events/streaming SSE endpoint. The only prerequisite is that topic messages are compatible with the standard message structure described below.
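To see messages flow through, publish a keyed message to the topic with any Kafka producer. Below is a minimal sketch using no-kafka's Producer; the broker address, the topic ('all', matching the snippet above), the key and the payload are illustrative assumptions, not requirements.
const Kafka = require('no-kafka')

const producer = new Kafka.Producer({connectionString: '192.168.99.100:9092'})

producer.init()
  .then(() => producer.send({
    topic: 'all',    // the topic the handler above consumes
    partition: 0,
    message: {
      key: 'books.insert',                    // exposed as the SSE event
      value: JSON.stringify({title: 'Dune'})  // exposed as the SSE data
    }
  }))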
Standard Message Structure
Currently hapi-sse-kafka expects keyed messages and maps them to SSE objects in the following way:
offset -> id
message key -> event
message value -> data
Make sure messages fit these requirements if you want to expose them using this lib.
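For instance, under this mapping a hypothetical keyed message at offset 42 with key books.insert and a JSON value would come out of the endpoint as an SSE frame along these lines:
id: 42
event: books.insert
data: {"title":"Dune"}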
Filtering
The hapi-sse-kafka request handler supports filtering events through query parameters.
Given the example snippet at the start of this doc, the following request path:
http://localhost:9100/events/streaming?filter[event]=books.insert
would only return SSE objects whose event matches books.insert.
Multiple events may be specified:
?filter[event]=books.insert,dvds.insert
as well as regular expression patterns:
?filter[event]=books.*
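On the client side, the filtered stream can be consumed with the standard EventSource API. Here is a minimal browser-side sketch, assuming the server from the getting-started snippet and the illustrative books.insert event name:
const source = new EventSource('http://localhost:9100/events/streaming?filter[event]=books.insert')

// the Kafka message key is exposed as the SSE event name,
// so listen for it by name instead of using onmessage
source.addEventListener('books.insert', (e) => {
  console.log('id:', e.lastEventId)        // the Kafka offset
  console.log('data:', JSON.parse(e.data)) // the Kafka message value, assuming it is JSON
})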
Dependencies
hapi-sse-kafka requires susie to be registered as a plugin; make sure this dependency is installed and added to your server bootstrap routine.
npm install susie --save
server.register([require('susie')])
Adapters
hapi-sse-kafka uses the concept of adapters to support different Kafka clients. The built-in adapters can be accessed via the .adapters namespace:
const hapiSSEKafka = require('hapi-sse-kafka')
const noKafkaAdapter = new hapiSSEKafka.adapters.NoKafkaAdapter()
Currently the only supported client is no-kafka; more implementations will be added going forward.
No-kafka
https://github.com/oleksiyk/kafka
The NoKafkaAdapter handles the interaction with Kafka using a SimpleConsumer.
Any options passed in when the NoKafkaAdapter is initialised will be passed on as-is when the SimpleConsumer is initialised.
const options = {connectionString: `192.168.99.100:9092`, maxWaitTime: 300}
const noKafkaAdapter = new hapiSSEKafka.adapters.NoKafkaAdapter(options)
For a complete list of available options, check out the no-kafka docs.
Example
Coming soon! Meanwhile, have a look at the tests; the Kafka infrastructure is fully dockerized and can be brought up with a simple docker-compose up command.
docker-compose up -d
mocha ./test