@itentialopensource/apache-kafkav2-library-example
v1.0.5-2023.2.1
Apache - Kafkav2 - Library - Example
Overview
This Pre-Built Automation bundle contains example use cases that are applicable when Itential Automation Platform is integrated with Apache Kafka using the kafkajs npm library. Because every environment is different, these use cases are fully functioning examples that can be easily modified to operate in your specific environment. These workflows have been written with modularity in mind to make them easy to understand and simple to modify to suit your needs.
Example Workflows
This example use case consumes Kafka messages on a given topic and triggers a job from an IAP Operations Manager event trigger, passing the message data into that job's context. The Apache - Kafkav2 - Library Pre-Built provides a modular workflow that writes to a topic in Apache Kafka, which in turn writes to a message queue in IAP's Event System.
To subscribe to the relevant event topic, the topic properties of the adapter-kafkav2 service configuration must be set correctly. The following `topics` property is an example that configures the adapter to subscribe to the Kafka event topic `kafkav2-example-topic`, apply no text-based message filters, and forward messages to the IAP Event System topic `iap-example-topic`:
"topics": [
{
"name": "kafkav2-example-topic",
"always": true,
"subscriberInfo": [
{
"subname": "default",
"filters": [],
"rabbit": "iap-example-topic",
"throttle": {}
}
]
}
]
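Before deploying a configuration like the one above, it can help to sanity-check the shape of each `topics` entry. The sketch below is plain Node.js (no kafkajs or adapter code required); the `validateTopicEntry` helper and its rules are illustrative assumptions based on the fields shown in this README, not part of the adapter itself.

```javascript
// Hypothetical helper: checks one entry of the adapter's "topics" array.
// The required fields mirror the example in this README; the validation
// rules are an assumption for illustration, not adapter behavior.
function validateTopicEntry(entry) {
  const errors = [];
  if (typeof entry.name !== 'string' || entry.name.length === 0) {
    errors.push('name must be a non-empty Kafka topic string');
  }
  if (typeof entry.always !== 'boolean') {
    errors.push('always must be a boolean');
  }
  if (!Array.isArray(entry.subscriberInfo) || entry.subscriberInfo.length === 0) {
    errors.push('subscriberInfo must be a non-empty array');
  } else {
    entry.subscriberInfo.forEach((sub, i) => {
      if (typeof sub.subname !== 'string') errors.push(`subscriberInfo[${i}].subname must be a string`);
      if (!Array.isArray(sub.filters)) errors.push(`subscriberInfo[${i}].filters must be an array`);
      if (typeof sub.rabbit !== 'string') errors.push(`subscriberInfo[${i}].rabbit must name an IAP Event System topic`);
    });
  }
  return errors;
}

// The example entry from this README passes with no errors.
const entry = {
  name: 'kafkav2-example-topic',
  always: true,
  subscriberInfo: [
    { subname: 'default', filters: [], rabbit: 'iap-example-topic', throttle: {} }
  ]
};
console.log(validateTopicEntry(entry)); // []
```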
Note: when a new topic is set in the service configuration, the topic is written to a file named `.topics-<kafkav2-adapter-name>.json` in the adapter-kafkav2 directory. If the Consume Message - Kafkav2 - Library workflow is not being triggered, this file may not yet have been updated with the expected topic.
After setting the adapter-kafkav2 service configuration as shown above, navigate to the Consume Message - Kafkav2 - Library - Example Operations Manager automation, click the Consume Message - Kafkav2 - Example - Trigger trigger, and under Event select the topic that was set in the adapter service configuration. With the service configuration above, the event topic would be `iap-example-topic - @itentialopensource/adapter-kafkav2`.
To then produce a message to the configured topic, run the Produce Message - Kafkav2 - Library modular workflow from the Apache - Kafkav2 - Library Pre-Built with inputs such as the following:
```json
{
  "adapterId": "kafkav2",
  "topic": "kafkav2-example-topic",
  "messages": [
    "This is a message sent to topic kafkav2-example-topic and will then be picked up eventually by IAP's Event System in the topic iap-example-topic"
  ],
  "suppressMessage": true
}
```
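The `messages` input above is an array of plain strings, while kafkajs's `producer.send()` expects a payload of the form `{ topic, messages: [{ value }] }`. The sketch below shows that mapping conceptually; the `toKafkajsSendPayload` helper is a hypothetical illustration, not the Pre-Built's or adapter's actual code, and it runs as plain Node.js without kafkajs installed.

```javascript
// Hypothetical helper: maps the Produce Message - Kafkav2 - Library input
// shape (topic plus messages as plain strings) onto the payload shape that
// kafkajs's producer.send() expects ({ topic, messages: [{ value }] }).
function toKafkajsSendPayload(input) {
  return {
    topic: input.topic,
    messages: input.messages.map((value) => ({ value }))
  };
}

const input = {
  adapterId: 'kafkav2',
  topic: 'kafkav2-example-topic',
  messages: ['hello from IAP'],
  suppressMessage: true
};

console.log(toKafkajsSendPayload(input));
// { topic: 'kafkav2-example-topic', messages: [ { value: 'hello from IAP' } ] }
```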
When the Send Message task in this workflow completes, a job is started from this Pre-Built's Consume Message - Kafkav2 - Library workflow, which can be viewed in the Jobs panel of Operations Manager.
Note that the Consume Message - Kafkav2 - Example - Trigger defaults to the `kafka - @itentialopensource/adapter-kafkav2` event topic, which corresponds to the following adapter-kafkav2 service configuration:
"topics": [
{
"name": "kafkav2-example-topic",
"always": true,
"subscriberInfo": [
{
"subname": "default",
"filters": [],
"rabbit": "kafka",
"throttle": {}
}
]
}
]
Additionally, the client property in the adapter-kafkav2 service configuration has a brokers property that denotes the Kafka instance and port to connect to, and a logLevel property within the client object that overrides the adapter's logging level. The following configuration connects adapter-kafkav2 to a Kafka instance at 172.1.1.200:9092 with the log level set to debug:
```json
{
  "host": "",
  "port": 1,
  "client": {
    "logLevel": "logLevel.DEBUG",
    "brokers": [
      "172.1.1.200:9092"
    ]
  }
}
```
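kafkajs exposes log levels as a numeric enum (`NOTHING`=0, `ERROR`=1, `WARN`=2, `INFO`=4, `DEBUG`=5), while the service configuration above stores the level as the string `"logLevel.DEBUG"`. A resolver along the following lines could map that string onto the kafkajs value; this is a hypothetical sketch for illustration (the adapter's actual resolution logic may differ), and it runs as plain Node.js without kafkajs installed.

```javascript
// kafkajs numeric log levels, reproduced here so the sketch runs without
// the kafkajs package: NOTHING=0, ERROR=1, WARN=2, INFO=4, DEBUG=5.
const KAFKAJS_LOG_LEVELS = { NOTHING: 0, ERROR: 1, WARN: 2, INFO: 4, DEBUG: 5 };

// Hypothetical resolver: accepts either "DEBUG" or the "logLevel.DEBUG"
// string form seen in the adapter service configuration above.
function resolveLogLevel(configValue) {
  const name = configValue.replace(/^logLevel\./, '').toUpperCase();
  if (!(name in KAFKAJS_LOG_LEVELS)) {
    throw new Error(`Unknown log level: ${configValue}`);
  }
  return KAFKAJS_LOG_LEVELS[name];
}

console.log(resolveLogLevel('logLevel.DEBUG')); // 5
```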
In the example above, the `host` and `port` properties, while required by the adapter configuration, are ignored; the `client.brokers` entries are used to connect to the Kafka instance.