Changes for page Mendix -> Redshift - Data Pipeline
Last modified by Erik Bakker on 2024/02/21 21:51
From version 9.1
edited by Erik Bakker
on 2022/05/02 09:34
Change comment:
There is no comment for this version
To version 24.1
edited by Erik Bakker
on 2022/05/02 11:44
Change comment:
There is no comment for this version
Summary
Page properties (2 modified, 0 added, 0 removed)
Details
- Page properties
- Title
@@ -1,1 +1,1 @@
- eMagizDataSink
+ Mendix -> Redshift - Data Pipeline
- Content
@@ -2,31 +2,40 @@
{{container layoutStyle="columns"}}
(((
== About ==
- With the help of this functionality, you can sink your messages into the eMagiz data sink bucket. Once the message is stored in the eMagiz data sink bucket, you will have the option to view this message in the Manage phase.
+ Data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This flow can also be used as a starting point for other data pipeline integrations.

== Documentation ==

==== 1. How to use ====
- Once stored, eMagiz can retrieve all messages with the same ID attached during the sink action via the Manage phase. In this case, it is assumed that your eMagiz data sink is such that you can review them on the fly. Other eMagiz data sink variants are offered that require specific on-demand queries.
+ With the help of this store item, you have a data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This allows for the following:
+
+ * Reading data from Mendix using OData v3.
+ * Writing data to Amazon Redshift using Amazon S3 ([[https:~~/~~/docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html>>url:https://docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html]]).
+ * Merge behavior using staging tables (documentation).
+ * Job dashboard (see Deploy -> Runtime Dashboard -> [Select Runtime] -> [Select Flow] -> Jobs).
+ * Refresh of an AWS Materialized View a minute after the job completes successfully.

==== 2. Keynotes & restrictions ====

- * Note that you should think about when you want to archive your data to the eMagiz data sink. That choice determines the impact on the functional flow.
- * Don't use a wiretap to invoke this part of the flow, as that causes the sink's behavior to happen before the actual processing. If the sink fails, the message will not be delivered via the functional part of the flow.
- * Note that we define (among others) how the message is stored in the data sink bucket in this component. The file structure expression should not be altered, as changing this would break the functionality.
- * Note that the 'standard' eMagiz headers that define the source system (i.e. relevant systems) and the messageType need to be available upon sinking to make this functionality work.
+ * Before use, please rename the job 'system.message.job' and the step 'system.message.step1' to match your system and integration name. Make sure to update support.job-launch-request accordingly.
+ * When using multiple pipeline flows on one runtime, an H2 'already in use' exception might occur. To prevent these errors, keep the name of the H2 database different for each flow on one runtime by renaming the 'dp.h2.message.database' deploy property in the "support.h2-database" component.
+ * When you want to import the flow multiple times for one system, change the names of the following properties by replacing 'message' with the name of the message type of each integration:
+ ** dp.odata.message.cron (component: receive.cron)
+ ** dp.odata.message.url (component: system.message.job -> system.message.step1 -> Item reader)
+ ** dp.h2.message.database (component: support.h2-database)
+ ** dp.sftp.message.filename.prefix (component: transform.job-launch-request -> Expression)

==== 3. License Information ====

- To use this store item, you need to secure an additional license on the eMagiz platform. If you are interested in such a license, please contact us at productmanagement@emagiz.com.
+ Part of the standard license agreement as agreed upon when using the store for the first time.

- Would you please consult the privacy policy of eMagiz at the following link: [[https:~~/~~/www.emagiz.com/privacy-policy/?>>url:https://www.emagiz.com/privacy-policy/?]])))
+ ==== 4. Relevant eMagiz Academy Microlearnings ====
+
+ [[https:~~/~~/emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index>>url:https://emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index]]
+
+ //Would you please consult the privacy policy of eMagiz at the following link:// [[https:~~/~~/www.emagiz.com/privacy-policy/?>>url:https://www.emagiz.com/privacy-policy/?]])))
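The "merge behavior using staging tables" that the new version's feature list mentions follows the standard Redshift pattern from the linked AWS documentation: COPY the extract from S3 into a staging table, delete the overlapping rows from the target, insert the fresh rows, and empty the staging table. A minimal sketch of that sequence, assuming hypothetical table, column, bucket, and IAM role names (`orders`, `orders_staging`, `order_id`, etc.) rather than anything taken from the actual flow configuration:

```python
# Sketch of the COPY-then-merge statement sequence run against Redshift.
# All table, column, bucket, and role names here are hypothetical examples.

def build_merge_statements(target: str, staging: str, key: str,
                           s3_uri: str, iam_role: str) -> list:
    """Return the SQL statements for a staging-table merge into `target`."""
    return [
        # 1. Bulk-load the extracted data from S3 into the staging table.
        f"COPY {staging} FROM '{s3_uri}' IAM_ROLE '{iam_role}' CSV;",
        # 2. Remove target rows that the staging data replaces.
        f"DELETE FROM {target} USING {staging} WHERE {target}.{key} = {staging}.{key};",
        # 3. Insert the fresh rows.
        f"INSERT INTO {target} SELECT * FROM {staging};",
        # 4. Empty the staging table for the next job run.
        f"TRUNCATE {staging};",
    ]

stmts = build_merge_statements("orders", "orders_staging", "order_id",
                               "s3://example-bucket/orders/",
                               "arn:aws:iam::123456789012:role/redshift-load")
```

Running the delete and insert inside a single transaction keeps readers from observing the target table mid-merge.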