Connect your Mendix app to Apache Kafka.
This module allows you to:
- Produce data to Kafka topics
- Consume data from Kafka topics
- Stream data between Kafka topics
Kafka Produce allows you to write data to a topic.
To produce data, you need to start a producer. All Kafka configuration options are available, but only two are required: the Kafka server to connect to and the topic to write to.
You can send a key/value pair to Kafka with the 'Send to Kafka' action.
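Conceptually, producing appends a key/value record to a topic, and each record is assigned the next sequential offset. The sketch below illustrates that model in plain Java; it is not the module's API, and all names in it (such as `TopicSketch`) are made up for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-in for a Kafka topic partition: an append-only log
// where each record receives the next sequential offset. This is NOT the
// module's API; all names here are made up for illustration.
class TopicSketch {
    static class Record {
        final long offset;
        final String key;   // null when no key is specified
        final String value;
        Record(long offset, String key, String value) {
            this.offset = offset;
            this.key = key;
            this.value = value;
        }
    }

    private final List<Record> log = new ArrayList<>();

    // Mirrors the idea behind 'Send to Kafka': write one key/value pair
    // and learn the offset it was assigned.
    long send(String key, String value) {
        long offset = log.size();
        log.add(new Record(offset, key, value));
        return offset;
    }

    List<Record> records() {
        return log;
    }
}
```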
Kafka Consume allows you to read data from a topic.
To consume data, you need to start a consumer. All Kafka configuration options are available, but only three are required: the Kafka server to connect to, the topic to read from, and a microflow to handle each record.
The microflow takes the following parameters:
- offset (Long): The position of this record in the corresponding Kafka partition.
- key (String): The key, or null if no key was specified.
- value (String): The value.
The consumer calls the microflow for each message in the topic. If you want to start the microflow only for certain messages, you can use a filtered processor.
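The consumer loop amounts to: for each record, invoke the configured microflow with that record's offset, key, and value. A self-contained sketch of that dispatch, assuming a simple handler interface (the names here are illustrative, not the module's API):

```java
import java.util.List;

class ConsumerSketch {
    // Mirrors the microflow signature: offset (Long), key (String), value (String).
    interface RecordHandler {
        void handle(long offset, String key, String value);
    }

    static class Record {
        final long offset;
        final String key;
        final String value;
        Record(long offset, String key, String value) {
            this.offset = offset;
            this.key = key;
            this.value = value;
        }
    }

    // Invokes the handler once per record, just as the consumer calls the
    // microflow once for each message in the topic.
    static void consume(List<Record> records, RecordHandler handler) {
        for (Record r : records) {
            handler.handle(r.offset, r.key, r.value);
        }
    }
}
```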
Kafka Streams allows you to read data from one topic and write data to another.
To stream data between two topics, you need to start a processor. All Kafka configuration options are available, but only three are required: the Kafka server to connect to, the topic to read from, and a microflow.
The microflow takes the following parameters:
- offset (Long): The position of this record in the corresponding Kafka partition.
- key (String): The key, or null if no key was specified.
- value (String): The value.
It may return one of the following:
- List of KeyValuePair: All these messages will be sent to the output topic.
- KeyValuePair: This message will be sent to the output topic.
- String: A message with that value (and no key) will be sent to the output topic.
- Nothing: No messages will be sent to the output topic.
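The four possible return values collapse into "zero or more key/value pairs to forward". A sketch of that dispatch in plain Java (an illustration of the rule above, not the module's implementation; the microflow actually returns Mendix objects rather than these types):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Collections;
import java.util.List;
import java.util.Map;

class ProcessorDispatch {
    // Normalizes a microflow result into the list of key/value messages
    // that would be sent to the output topic.
    @SuppressWarnings("unchecked")
    static List<Map.Entry<String, String>> toOutput(Object result) {
        if (result == null) {
            // Nothing: no messages are sent.
            return Collections.emptyList();
        }
        if (result instanceof List) {
            // List of KeyValuePair: all messages are sent.
            return (List<Map.Entry<String, String>>) result;
        }
        if (result instanceof Map.Entry) {
            // KeyValuePair: this single message is sent.
            return Collections.singletonList((Map.Entry<String, String>) result);
        }
        if (result instanceof String) {
            // String: a message with that value and no key is sent.
            return Collections.singletonList(new SimpleEntry<>(null, (String) result));
        }
        throw new IllegalArgumentException("Unsupported result type: " + result.getClass());
    }
}
```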
If you start a processor without an output topic, it behaves like a consumer in the sense that it only reads and does not write.
The processor calls the microflow for each message in the input topic.
You can filter the messages that cause the microflow to execute by starting a filtered processor. This works for JSON messages only, and only when you want to filter on a specific part of the JSON message, such as a property value.
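As a rough illustration of what filtering on one JSON property value means, the sketch below checks whether a message's JSON contains a given property with a given string value. This is a deliberately simplified regex stand-in, not the module's filter implementation: it ignores nesting, escaping, and non-string values.

```java
import java.util.regex.Pattern;

class JsonPropertyFilter {
    // Returns true when the JSON message contains "property": "expected".
    // Simplified stand-in for illustration only: it does not handle nested
    // objects, escaped quotes, or non-string values.
    static boolean matches(String json, String property, String expected) {
        Pattern p = Pattern.compile(
            "\"" + Pattern.quote(property) + "\"\\s*:\\s*\"" + Pattern.quote(expected) + "\"");
        return p.matcher(json).find();
    }
}
```

A filtered processor would run the microflow only for messages where such a check succeeds.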
Make sure you call the 'Stop all Kafka connections' action before the app stops, for instance in the 'Before shutdown' microflow.
Features:
- Supports the Producer, Consumer, and Streams APIs.
- Supports all Kafka configuration options.
- Allows basic filtering for streams.
Limitations:
- Not (yet) integrated with the app's SSL certificates
- More advanced filtering requires custom Java
Dependencies:
- Works with Mendix 6.10.9 and up