Mendix -> Redshift - Data Pipeline

About

Data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This flow can also be used as a starting point for other data pipeline integrations.

Documentation

1. How to use

With the help of this store item, you have a data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This allows for the following (illustrated by the sketch after this list):

  • Reading data from Mendix using OData v3.
  • Writing data to Amazon Redshift using Amazon S3.
  • Merge behavior using staging tables.
  • Job dashboard (see Deploy -> Runtime Dashboard -> [Select Runtime] -> [Select Flow] -> Jobs).
  • Refresh an AWS Materialized View a minute after the job completes successfully.
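
To make the flow concrete, here is a minimal Python sketch of the same pattern, not the shipped implementation (the store item itself runs inside the eMagiz runtime). All names in it, such as the OData endpoint, the S3 bucket, the 'orders' tables, the materialized view, and the IAM role ARN, are hypothetical placeholders:

    import csv
    import io

    import boto3
    import psycopg2
    import requests

    # 1. Read objects from the Mendix 'Published OData service' (OData v3).
    #    Endpoint, entity, and credentials are hypothetical placeholders.
    resp = requests.get(
        "https://myapp.mendixcloud.com/odata/OrderService/Orders",
        params={"$format": "json"},
        auth=("odata_user", "<password>"),
        timeout=60,
    )
    resp.raise_for_status()
    # Depending on the service's JSON flavour, the payload is wrapped as
    # {"value": [...]} or {"d": {"results": [...]}}.
    rows = resp.json().get("value", [])

    # 2. Stage the extracted objects in Amazon S3 as CSV.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name", "amount"])
    writer.writeheader()
    for row in rows:
        writer.writerow({k: row.get(k) for k in ("id", "name", "amount")})
    boto3.client("s3").put_object(
        Bucket="example-bucket",
        Key="mendix/orders/orders.csv",
        Body=buf.getvalue().encode("utf-8"),
    )

    # 3. COPY into a staging table, then merge into the target table.
    conn = psycopg2.connect(
        host="example-cluster.abc123.eu-west-1.redshift.amazonaws.com",
        port=5439, dbname="analytics", user="pipeline_user", password="<password>",
    )
    with conn, conn.cursor() as cur:
        cur.execute("""
            COPY stg_orders
            FROM 's3://example-bucket/mendix/orders/orders.csv'
            IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-s3-access'
            FORMAT AS CSV IGNOREHEADER 1
        """)
        # Classic Redshift upsert: delete rows being replaced, insert new versions.
        cur.execute("DELETE FROM orders USING stg_orders WHERE orders.id = stg_orders.id")
        cur.execute("INSERT INTO orders SELECT * FROM stg_orders")
        # Clear staging with DELETE; TRUNCATE would implicitly commit in Redshift.
        cur.execute("DELETE FROM stg_orders")
    # Leaving the 'with conn' block commits the merge as one transaction.

    # 4. Refresh the dependent materialized view (the store item schedules this
    #    one minute after the job completes successfully).
    conn.autocommit = True
    with conn.cursor() as cur:
        cur.execute("REFRESH MATERIALIZED VIEW mv_orders")
    conn.close()

Because the COPY, delete, and insert run in a single transaction, a failed run leaves the target table untouched; this is the staging-table merge behavior referred to above.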

2. Keynotes & restrictions

  • Source data must be shared in Mendix using a 'Published OData service'. If applicable, set the association configuration in the settings tab to 'As an associated object id'.
  • It is highly recommended to configure Amazon Redshift to access the S3 bucket using an IAM role (see the AWS documentation).
  • The Mendix application must be reachable by the eMagiz runtime.
  • Before use, rename the job 'system.message.job' and the step 'system.message.step1' to match your system and integration names. Make sure to update 'support.job-launch-request' accordingly.
  • When the SQL statement fails, the step and job execution will still be registered as 'completed'. If data is missing in Amazon Redshift, look for log entries such as 'Exception while closing step execution resources in step'.
  • When using multiple pipeline flows on one runtime, an H2 'already in use' exception might occur. To prevent these errors, keep the name of the H2 database different for each flow on the runtime by renaming the 'dp.h2.message.database' deploy property in the 'support.h2-database' component.
  • When you want to import the flow multiple times for one system, change the names of the following properties by replacing 'message' with the message type name of each integration (see the example after this list):
    • dp.odata.message.cron (component: receive.cron)
    • dp.odata.message.url (component: system.message.job -> system.message.step1 -> Item reader)
    • dp.h2.message.database (component: support.h2-database)
    • dp.jdbc.message.tablename (component: transform.job-launch-request -> Expression)
    • dp.jdbc.message.primarykey (component: transform.job-launch-request -> Expression)
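
For example, if the message type of an integration were 'order' (a hypothetical name), the renamed properties would be:

  • dp.odata.order.cron
  • dp.odata.order.url
  • dp.h2.order.database
  • dp.jdbc.order.tablename
  • dp.jdbc.order.primarykey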

3. License Information

Part of the standard license agreement as agreed upon when using the store for the first time.

4. Relevant eMagiz Academy Microlearnings

Please consult the privacy policy of eMagiz at the following link