Using eMagiz as producer

Last modified by Danniar Firdausy on 2024/09/18 16:41

In this microlearning, we will explore how to effectively use eMagiz as a producer in the context of Event Streaming. We will guide you through setting up eMagiz to produce data on specific topics, which other systems can then process as they see fit.

Should you have any questions, please contact academy@emagiz.com.

1. Prerequisites

  • Basic knowledge of the eMagiz platform
  • A Kafka cluster you can test against
  • An activated Event Streaming license

2. Key concepts

This microlearning centers around using eMagiz as a producer.

  • By "using," in this context, we mean being able to produce data on topics that are managed within an external Kafka cluster, such as the eMagiz Kafka cluster.
  • Knowing how to easily use eMagiz as a producer in these situations lets you create event processors or hybrid configurations (e.g., a messaging-to-event-streaming combination).

In this microlearning, we will zoom in on which components in eMagiz you need to produce data on a topic. Note that, as we learned in the previous module, eMagiz auto-generates these components when you create an event processor. If you want more information on event processors, please revisit those microlearnings.

3. Using eMagiz as producer

When integrating several systems, it can quickly happen that a subset of those systems can only be reached via traditional messaging patterns, whereas others can write and/or read data directly on a topic. In this microlearning, we will focus on the scenario where eMagiz writes data to a topic. External parties will in turn consume data from the topic and use it within their application context.

To make this work, you need to identify the producing system (eMagiz) and the consuming system, and create a messaging offramp that ensures the messaging pattern is followed from the legacy systems to the topic. In the picture below, we have illustrated this in the Capture phase of eMagiz.

intermediate-event-streaming-connectors-emagiz-as-producer--capture-phase-es-solution.png

To complete the picture we need to ensure that the data is received from a legacy system. Something along these lines:

intermediate-event-streaming-connectors-emagiz-as-producer--capture-phase-msg-solution.png

In the Design phase, you can configure your integration as explained in various previous microlearnings. So we will, for now, assume that you designed your integration as you had in mind.

3.1 View solution in Create

The first step after adding the integrations to Create is to see how eMagiz represents this depiction of reality in the Create phase. See below for the result of the Event Streaming part and the Messaging part:

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-es-solution.png

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-msg-solution.png

As you learned in previous microlearnings, the Event Streaming part (in this case, the creation of the topic) is handled automatically by eMagiz. We will zoom in on how you can set up your messaging exit flow so that it produces data on the topic that eMagiz automatically created.

3.2 Produce data

When you open your exit flow, you will see that eMagiz has auto-generated some components for you.

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-auto-generation-exit.png

Now we need to make sure that we can produce data on the topic with the help of the eMagiz components that are available in the eMagiz flow designer. In this case, we will need the Kafka outbound channel adapter.

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-kafka-outbound.png

As the name suggests, this component can push data to a Kafka topic. In addition, we need a support object that sets up the connection between the eMagiz flow and the eMagiz Event Streaming cluster that hosts the topic. This component is called the Kafka template.

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-kafka-template.png

Within this component, you need to fill in several pieces of information. The first is the bootstrap server URL. The bootstrap server URL is registered via a property that is built up as follows: ${nl.nameofyourparentcompany.nameofyourcompany.technicalnameofyourproject.bootstrapserver}.

Alongside the bootstrap server URL, you will need to specify the client ID. You can use any name you want. The advice is to give it a descriptive name that is recognizable later on.

Last but not least, on the Basic tab you need to specify that the security protocol is SSL. If you have done all of this, it should look like this:

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-kafka-template-basic-filled-in.png
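Conceptually, the Basic tab of the Kafka template maps onto a small set of standard Kafka producer settings. The sketch below illustrates that mapping in Python; the property key and client ID are illustrative placeholders (in eMagiz, the actual property is resolved at deploy time):

```python
def build_kafka_template_basic(properties: dict) -> dict:
    """Return producer settings equivalent to the Kafka template's Basic tab."""
    return {
        # Bootstrap server URL, registered via an eMagiz property
        "bootstrap.servers": properties["nl.mycompany.myproject.bootstrapserver"],
        # Free-form, descriptive client ID so the producer is recognizable later
        "client.id": "my-exit-flow-producer",
        # The eMagiz Event Streaming cluster requires SSL
        "security.protocol": "SSL",
    }

# Example: resolve the (hypothetical) property and build the configuration
config = build_kafka_template_basic(
    {"nl.mycompany.myproject.bootstrapserver": "broker-1.example.com:9093"}
)
```

Since the client ID is free-form, the one constraint worth enforcing in practice is that it stays descriptive and unique per flow.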

Now that we have filled in the basics for the support object we can fill in the details for our Kafka outbound channel adapter. Here you select the output channel and link the support object to this component. Furthermore, you define the topic to which you want to produce the data.

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-kafka-outbound-filled-in.png
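To illustrate the role of the adapter, the sketch below models it as a component that takes each message arriving on its linked channel and produces it to a single configured topic. The class, topic name, and payload are hypothetical stand-ins for illustration, not eMagiz or Kafka APIs:

```python
class KafkaOutboundChannelAdapter:
    """Illustrative stand-in for the Kafka outbound channel adapter: it is
    linked to one Kafka template (support object) and one target topic, and
    sends every message it receives to that topic."""

    def __init__(self, template_config: dict, topic: str):
        self.template_config = template_config  # the linked Kafka template
        self.topic = topic                      # the topic to produce to
        self.produced = []                      # records "sent", for illustration

    def handle(self, payload: bytes) -> None:
        # In a real flow, this would delegate to a Kafka producer built
        # from the template configuration; here we just record the send.
        self.produced.append((self.topic, payload))

# Example: a message leaving the exit flow lands on the configured topic
adapter = KafkaOutboundChannelAdapter({"security.protocol": "SSL"}, topic="orders-topic")
adapter.handle(b'{"orderId": 1}')
```

The key design point is the separation of concerns: the template holds connection details, while the adapter only knows its channel and topic.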

As a result, your flow should look similar to what is shown below:

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-kafka-flow-messaging.png

3.3 Set up SSL connection

As you probably noticed, we skipped configuring the SSL connection that allows the eMagiz flow to connect to the eMagiz cluster. There is no escaping it anymore, however, as we need it to correctly set up our connection with the topic. If you have an event processor within your project, you can simply re-use the keystore and truststore that eMagiz automatically generated for you. If you have no event processor within your project, you first need to configure an external user and perform some manual steps.

To reduce the complexity of this microlearning, we assume that the keystore and truststore you need are already available within your project. The first step is to add these resources to your flow via the Resources tab at the flow level.

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-kafka-flow-messaging-ssl-resources.png

Now that we have added these resources to the flow, we can navigate back to the support object, open the SSL tab, and fill it in accordingly. For security reasons, we have changed the passwords in the screenshot below.

intermediate-event-streaming-connectors-emagiz-as-producer--create-phase-kafka-flow-messaging-ssl-config.png
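The SSL tab adds the keystore and truststore resources, plus their passwords, on top of the settings from the Basic tab. The sketch below shows that merge; the file names and passwords are placeholders, not real values:

```python
def add_ssl_settings(config: dict, keystore: str, truststore: str,
                     keystore_password: str, truststore_password: str,
                     key_password: str) -> dict:
    """Merge the SSL-tab settings into an existing producer configuration."""
    merged = dict(config)  # leave the Basic-tab settings untouched
    merged.update({
        "ssl.keystore.location": keystore,        # resource added to the flow
        "ssl.keystore.password": keystore_password,
        "ssl.key.password": key_password,
        "ssl.truststore.location": truststore,    # resource added to the flow
        "ssl.truststore.password": truststore_password,
    })
    return merged

# Example with placeholder resource names and passwords
ssl_config = add_ssl_settings(
    {"security.protocol": "SSL"},
    keystore="my-keystore.jks", truststore="my-truststore.jks",
    keystore_password="changeit", truststore_password="changeit",
    key_password="changeit",
)
```

Note that the SSL settings extend, rather than replace, the Basic-tab configuration: both sets of keys end up in the same producer configuration.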

With this configured, your flow is ready to produce data on the topic.

4. Key takeaways

  • Utilizing eMagiz as a producer is particularly useful in hybrid scenarios where you need to integrate traditional messaging patterns with event streaming.
  • In many straightforward cases, eMagiz automates the data production process for you, reducing manual configuration.
  • Ensure your eMagiz flow is correctly connected to the eMagiz Event Streaming cluster and configured with the necessary SSL connections to enable smooth data production.

5. Suggested Additional Readings

If you are interested in this topic and want more information on it, please see the following links: