Using eMagiz as a consumer
In this microlearning, we will explore how to effectively use eMagiz as a consumer in the context of Event Streaming. We will guide you through the process of setting up eMagiz to consume and process data from specific topics within an external Kafka cluster. Whether you're integrating systems or setting up event processors, this guide will walk you through the essential steps and components needed to get started.
Should you have any questions, please contact academy@emagiz.com.
1. Prerequisites
- Basic knowledge of the eMagiz platform
- A Kafka cluster you can use to test against
- Event streaming license activated
2. Key concepts
This microlearning centers around using eMagiz as a consumer.
- By using, in this context, we mean being able to consume data from topics that are managed within an external Kafka Cluster, such as the eMagiz Kafka Cluster.
- Knowing how to easily use eMagiz as a consumer in these situations allows you to create event processors or hybrid configurations (e.g., a messaging to event streaming combination).
In this microlearning, we will zoom in on which components in eMagiz you would need to consume data from a topic. Note that, as we learned in the previous module, eMagiz will auto-generate these components when you create an event processor. If you want more information on event processors please revisit these microlearnings.
3. Using eMagiz as a consumer
When integrating several systems it can quickly happen that a subset of those systems can only be reached via traditional messaging patterns whereas others can write and/or read data directly on a topic. In this microlearning, we will focus on the scenario where an external party writes data to a topic. eMagiz will in turn consume data from the topic and distribute it to several legacy systems.
To make this work you need to identify the producing system and the consuming system (eMagiz), and create a messaging onramp that ensures the messaging pattern is followed towards the legacy systems. In the picture shown below, we have illustrated this in the Capture phase of eMagiz.
To complete the picture we need to ensure that the data is sent to a legacy system. Something along these lines:
In the Design phase, you can configure your integration as explained in several of our previous microlearnings. So we will, for now, assume that you have designed your integration as you had in mind.
3.1 View solution in Create
The first step after adding the integrations to Create is to see how this depiction of reality is represented by eMagiz in the Create phase. See below for the result of the Event Streaming part and the Messaging part:
As learned in previous microlearnings the Event Streaming part, in this case, the creation of the topic is done automatically by eMagiz. We will zoom in on how you can set up your messaging entry flow in such a way that it will consume data from the topic that is automatically created by eMagiz.
3.2 Consume data
When you open your entry flow you will see that eMagiz has auto-generated some components for you.
Now we need to make sure that we can consume data from the topic with the help of the eMagiz components that are available in the eMagiz flow designer. In this case, we will need the Kafka message-driven channel adapter.
As the name suggests this component waits for messages to arrive at the Kafka topic and when they do it will consume those messages. Secondly, we need a support object that will set up the connection between the eMagiz flow and the eMagiz Event Streaming cluster that hosts the topic. This component is called the Kafka message listener container.
Within this component, you need to fill in several pieces of information. The first is the bootstrap server URL, which is registered via a property that is built up as follows: ${nl.nameofyourparentcompany.nameofyourcompany.technicalnameofyourproject.bootstrapserver}.
Alongside the bootstrap server URL, you will need to specify the group ID. You can use any name you want. The advice is to give it a descriptive name that is recognizable later on.
The third setting you need to specify is the topic from which you want to consume data. Topic names are likewise governed by a naming convention:

nl.{nameofyourparentcompany}.{nameofyourcompany}.{technicalnameofyourproject}.${emagiz.runtime.environment}.{technicalnameofyourtopic}
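As a sketch, the two naming conventions above can be expressed as small helper functions. The company and project names used below (`acme`, `shipping`, `orders`) are hypothetical placeholders, and the `${emagiz.runtime.environment}` segment stays a property placeholder that eMagiz resolves per environment:

```python
def bootstrap_server_property(parent: str, company: str, project: str) -> str:
    """Build the eMagiz property key that holds the bootstrap server URL."""
    return "${nl." + f"{parent}.{company}.{project}" + ".bootstrapserver}"

def topic_name(parent: str, company: str, project: str, technical_topic: str) -> str:
    """Build the full topic name; the environment segment remains a
    placeholder so the same flow works on test, acceptance, and production."""
    return (f"nl.{parent}.{company}.{project}."
            + "${emagiz.runtime.environment}"
            + f".{technical_topic}")

# Hypothetical example values:
print(bootstrap_server_property("acme", "shipping", "orders"))
print(topic_name("acme", "shipping", "orders", "orderevents"))
```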
Last but not least, on the Basic tab you need to specify that the security protocol is SSL. If you have done all of this, it should look like this:
Now that we have filled in the basics for the support object we can fill in the details for our Kafka message-driven channel adapter. Here you select the output channel and link the support object to this component. The result should be similar to this:
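To summarize the Basic tab settings, here is a conceptual equivalent expressed as a plain Kafka consumer configuration. This is a sketch only, not eMagiz-generated code; all values are hypothetical examples of what you would fill in:

```python
# Conceptual equivalent of the eMagiz Kafka message listener container
# settings. In eMagiz these are entered on the Basic tab of the support object.
consumer_config = {
    # Bootstrap server URL, registered via an eMagiz property:
    "bootstrap.servers": "${nl.acme.shipping.orders.bootstrapserver}",
    # Group ID: any descriptive, recognizable name you choose:
    "group.id": "orders-to-legacy-entry",
    # Required to connect to the eMagiz Event Streaming cluster:
    "security.protocol": "SSL",
}

# The topic to consume from, following the naming convention:
topic = "nl.acme.shipping.orders.${emagiz.runtime.environment}.orderevents"
```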
As a result, your flow should look similar to what is shown below:
3.3 Set up SSL connection
As you probably noticed, we have so far skipped the part about configuring the SSL connection that ensures the eMagiz flow can connect to the eMagiz cluster. We can no longer avoid it, however, as we need it to correctly set up our connection with the topic. If you have an event processor within your project, you can simply re-use the keystore and truststore that eMagiz will have automatically generated for you. If you have no event processor within your project, you first need to configure an external user and perform some manual steps.
To reduce the complexity of this microlearning, we assume that the keystore and truststore you need are already available within your project. The first step is to add these resources to your flow via the Resources tab on the flow level.
Now that we have added these resources to the flow, we can navigate back to the support object, open the SSL tab, and fill it in accordingly. For security reasons, the passwords in the screenshot below have been changed.
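For reference, the SSL tab boils down to the following settings, again written as a conceptual configuration sketch. The filenames and property names are hypothetical placeholders; in practice the passwords come from properties that eMagiz manages for you:

```python
# Sketch of the SSL tab settings, assuming the keystore and truststore
# resources have already been added to the flow via the Resources tab.
ssl_config = {
    "ssl.keystore.location": "keystore.jks",            # resource added to the flow
    "ssl.keystore.password": "${keystore-password}",    # hypothetical property name
    "ssl.key.password": "${key-password}",              # hypothetical property name
    "ssl.truststore.location": "truststore.jks",        # resource added to the flow
    "ssl.truststore.password": "${truststore-password}" # hypothetical property name
}
```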
With this configured your flow is ready to consume data from the topic and from there send it via the messaging pattern to your legacy systems.
3.4 Managing consumer groups
In the Kafka message listener container component you will find a specific setting to identify the Group ID of this flow. A Group ID identifies a consumer group. A consumer group can have more than one consumer, and all consumers in the group share the offsets of the messages consumed from the topic. In cases where a flow is duplicated for fail-over purposes, the shared group ID ensures that each message is read from the latest offset only once across the duplicated flows (so messages don't get read twice). When consumers exist that need to read every message for a different usage or business process, you need to make sure that their Group IDs are different, because each group ID maintains its own offset.
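The offset behavior described above can be illustrated with a minimal in-memory simulation. This is not eMagiz or Kafka code, just a sketch of the semantics: each consumer group keeps its own offset on the topic, so every group sees each message exactly once, regardless of how many consumers share that group:

```python
from collections import defaultdict

class Topic:
    """Toy model of a Kafka topic with per-group committed offsets."""

    def __init__(self):
        self.log = []                      # append-only message log
        self.offsets = defaultdict(int)    # committed offset per group ID

    def produce(self, message):
        self.log.append(message)

    def poll(self, group_id):
        """Deliver the next unread message for this consumer group, if any."""
        offset = self.offsets[group_id]
        if offset >= len(self.log):
            return None
        self.offsets[group_id] = offset + 1
        return self.log[offset]

topic = Topic()
topic.produce("order-1")
topic.produce("order-2")

# Two flow instances sharing one group (fail-over): each message is read once.
assert topic.poll("entry-flow") == "order-1"
assert topic.poll("entry-flow") == "order-2"   # duplicated flow, same group

# A different group has its own offset and reads every message independently.
assert topic.poll("reporting") == "order-1"
```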
4. Key takeaways
- Utilizing eMagiz as a consumer is particularly useful in hybrid scenarios where both traditional messaging and event streaming are involved. In simpler cases, eMagiz automates much of the process for you.
- Ensure your eMagiz flow is correctly connected to the eMagiz Event Streaming cluster and configured with the necessary SSL connections to enable smooth data consumption.
- Properly manage consumer groups to ensure message offsets are handled correctly, avoiding duplicate reads and ensuring each consumer group gets the appropriate messages.
5. Suggested Additional Readings
If you are interested in this topic and want more information on it, please see the following links: