@itentialopensource/kafka-consumer
v0.0.12
[Deprecated] This Pre-Built Automation contains a workflow designed to listen for a desired set of messages published to a specific Kafka topic.
Deprecation Notice
This Pre-Built was deprecated on 04-30-2024 and will reach end of life on 04-30-2025. Its capabilities have been replaced by the Apache - Kafka - Library - Example Pre-Built.
Kafka Consumer
Table of Contents
- Overview
- Installation Prerequisites
- Requirements
- Features
- How to Install
- How to Run
- Additional Information
Overview
This Pre-Built Automation listens for a desired set of messages published to a specific Kafka topic.
Estimated Run Time: < 10 secs (depends on when the desired message is received)
Installation Prerequisites
Users must satisfy the following prerequisites:
- Itential Automation Platform
^2023.1
- Kafka Adapter
Requirements
This Pre-Built requires the following:
- Kafka server
Features
The main benefits and features of the Pre-Built are outlined below.
- Subscribe to a Kafka topic and wait until the expected message is received for processing.
- A reusable workflow that can be altered to fit your needs.
- Supports a zero-touch mode of operation.
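Outside of IAP, the same subscribe-and-wait pattern can be sketched with the third-party kafka-python client. This is not the Pre-Built's internal implementation; the broker address, topic name, and expected message below are placeholders:

```python
import json


def is_expected(raw_value: bytes, expected: str) -> bool:
    """Return True when a consumed message matches the expected string.

    Object payloads are compared after JSON round-tripping so key order
    and whitespace differences do not cause false negatives.
    """
    text = raw_value.decode("utf-8")
    try:
        return json.loads(text) == json.loads(expected)
    except ValueError:
        # Not valid JSON on one side; fall back to a plain string compare.
        return text == expected


def wait_for_message(topic: str, expected: str,
                     bootstrap: str = "localhost:9092"):
    """Block until the expected message arrives on the given topic."""
    # kafka-python is a third-party client: pip install kafka-python
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(topic, bootstrap_servers=bootstrap)
    for message in consumer:  # blocks until messages arrive
        if is_expected(message.value, expected):
            return message
```

For example, `wait_for_message("my-topic", "hello")` would block until a `hello` message is published to `my-topic` on a broker reachable at `localhost:9092`.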
How to Install
To install the Pre-Built:
- Verify you are running a supported version of the Itential Automation Platform (IAP), as listed in the Installation Prerequisites section above.
- The Pre-Built can be installed from within App-Admin_Essential. Search for the name of the Pre-Built and click its install button.
How to Run
Use the following to run the Pre-Built:
- Navigate to Operations Manager in IAP and select the Kafka Consumer Pre-Built.
- Select the existing manual trigger and complete the form that displays with the following details.
| Form Element | Description |
| :----------- | :---------- |
| Zero Touch | Select checkbox to eliminate user interactions. |
| Kafka Adapter Id | Name of your Kafka adapter. |
| Topic | Name of the Kafka topic to which the messages are published. |
| Partition | Partition number of the topic to which the messages are published. |
| Offset | Offset number of the message. |
| Schema | An object used to validate the incoming messages against the expected message. |
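As an illustration, the form fields above might be filled in as follows. The adapter id, topic, and partition values are hypothetical and should be replaced with values from your environment:

```json
{
  "Zero Touch": true,
  "Kafka Adapter Id": "kafka-adapter",
  "Topic": "my-topic",
  "Partition": 0,
  "Offset": 0,
  "Schema": {
    "type": "object",
    "properties": {
      "value": { "type": "string", "const": "hello" }
    },
    "required": ["value"],
    "additionalProperties": true
  }
}
```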
Schema guidelines:
Use the sample schema below to validate the incoming messages against the expected message. Reference the expected message string with the `const` key in the schema object. For example, if you are expecting a `hello` message, specify `"const": "hello"` under the `value` object.
NOTE: If the expected message is an object, stringify the message object before referencing it with the `const` key. For example, if the expected message is `{"name": "bill"}`, reference it as `"const": "{\"name\": \"bill\"}"`. Make sure that `additionalProperties` is set to `true`.
```json
{
  "type": "object",
  "properties": {
    "value": {
      "type": "string",
      "const": "hello"
    }
  },
  "required": [
    "value"
  ],
  "additionalProperties": true
}
```
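As a rough illustration of what this schema checks (this is not IAP's actual validation code, which uses a full JSON Schema validator), a minimal hand-rolled check supporting only the keywords in the sample schema might look like:

```python
import json


def validate_message(message: str, schema: dict) -> bool:
    """Check a raw Kafka message against a small subset of JSON Schema.

    Supports only the keywords used in the sample schema above:
    type (object/string), properties, required, and const.
    """
    # The consumed message is wrapped as {"value": <payload string>}.
    instance = {"value": message}
    for key in schema.get("required", []):
        if key not in instance:
            return False
    for key, subschema in schema.get("properties", {}).items():
        if key not in instance:
            continue
        value = instance[key]
        if subschema.get("type") == "string" and not isinstance(value, str):
            return False
        if "const" in subschema and value != subschema["const"]:
            return False
    return True


sample_schema = {
    "type": "object",
    "properties": {"value": {"type": "string", "const": "hello"}},
    "required": ["value"],
    "additionalProperties": True,
}

# An object payload must be stringified before being used as the "const":
object_const = json.dumps({"name": "bill"})
```

With this sketch, `validate_message("hello", sample_schema)` returns `True` while any other string returns `False`, mirroring how the Pre-Built waits for the one expected message.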
Additional Information
By default, Kafka messages in IAP are published to the `kafka` RabbitMQ topic. If your application publishes Kafka messages to a different RabbitMQ topic in IAP, make sure to change the topic name in the `eventListenerJob` task.
To learn more about the Kafka consumer, refer to this documentation.