Changes for page Mendix -> Redshift - Data Pipeline
Last modified by Erik Bakker on 2024/02/21 21:51
From version 24.1
edited by Erik Bakker
on 2022/05/02 11:44
Change comment:
There is no comment for this version
To version 10.1
edited by Erik Bakker
on 2022/05/02 09:37
Change comment:
There is no comment for this version
Summary
Page properties (2 modified, 0 added, 0 removed)
Details
Page properties

Title
... ... @@ -1,1 +1,1 @@
- Mendix -> Redshift - Data Pipeline
+ eMagiz Data Sink

Content
... ... @@ -2,40 +2,35 @@
{{container layoutStyle="columns"}}
(((
== About ==
- Data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This flow can also be used as a starting point for other data pipeline integrations.
+ With the help of this functionality, you can sink your messages into the eMagiz data sink bucket. Once the message is stored in the eMagiz data sink bucket, you will have the option to view this message in the Manage phase.

== Documentation ==

==== 1. How to use====
- With the help of this store item, you have a data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This allows for the following:
-
- * Reading data from Mendix using OData v3 (a hedged sketch follows the diff).
- * Writing data to Amazon Redshift using Amazon S3 ([[https:~~/~~/docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html>>url:https://docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html]]).
- * Merge behavior using staging tables (documentation; see the sketch after this diff).
- * Job dashboard (see Deploy -> Runtime Dashboard -> [Select Runtime] -> [Select Flow] -> Jobs).
- * Refresh an AWS Materialized View a minute after the job completes successfully.
+ Once stored, eMagiz can retrieve all messages with the same ID attached during the sink action via the Manage phase. In this case, it is assumed that your eMagiz data sink is set up such that you can review the messages on the fly. Other eMagiz data sink variants are offered that require specific on-demand queries.

==== 2. Keynotes & restrictions====

- * Before use, please rename the job 'system.message.job' and the step 'system.message.step1' to match your system and integration name. Make sure to update support.job-launch-request accordingly.
- * When using multiple pipeline flows on one runtime, an H2 'already in use' exception might occur. To prevent these errors, keep the name of the H2 database different for each flow on one runtime by renaming the 'dp.h2.message.database' deploy property in the "support.h2-database" component.
- * When you want to import the flow multiple times for one system, change the names of the following properties by replacing 'message' with the name of the message type of each integration (an example substitution follows the diff):
- ** dp.odata.message.cron (component: receive.cron)
- ** dp.odata.message.url (component: system.message.job -> system.message.step1 -> Item reader)
- ** dp.h2.message.database (component: support.h2-database)
- ** dp.sftp.message.filename.prefix (component: transform.job-launch-request -> Expression)
+ * Note that you should think about when you want to archive your data to the eMagiz data sink, as that choice determines the impact on the functional flow.
+ * Don't use a wiretap to invoke this part of the flow, as that causes the sink behavior to happen before the actual processing. If the sink fails, the message will not be delivered via the functional part of the flow.
+ * Note that this component defines (among other things) how the message is stored in the data sink bucket. The file structure expression should not be altered, as changing it would break the functionality.
+ * Note that the 'standard' eMagiz headers that define the source system (i.e. relevant systems) and the messageType need to be available upon sinking to make this functionality work.

==== 3. License Information====
- Part of the standard license agreement as agreed upon when using the store for the first time.
+ To use this store item, you need to secure an additional license on the eMagiz platform. If you are interested in such a license, please contact us at productmanagement@emagiz.com.

==== 4. Relevant eMagiz Academy Microlearnings====

- [[https:~~/~~/emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index>>url:https://emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index]]
+ [[https:~~/~~/emagiz.github.io/docs/microlearning/advanced-data-management-data-sink>>url:https://emagiz.github.io/docs/microlearning/advanced-data-management-data-sink]]

- //Would you please consult the privacy policy of eMagiz at the following link:// [[https:~~/~~/www.emagiz.com/privacy-policy/?>>url:https://www.emagiz.com/privacy-policy/?]])))
+ Would you please consult the privacy policy of eMagiz at the following link: [[https:~~/~~/www.emagiz.com/privacy-policy/?>>url:https://www.emagiz.com/privacy-policy/?]])))

- ((()))
+ ((((% border="2" cellpadding="10" cellspacing="10" style="width:292px" %)
+ |=(% style="width: 45px;" %)#|=(% style="width: 241px;" %)Option
+ |(% style="width:45px" %) a|(% style="width:241px" %) XML message(s)
+ |(% style="width:45px" %) b|(% style="width:241px" %) JSON message(s)
+ )))

{{/container}}
{{/container}}
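For the removed keynote about importing the flow multiple times for one system, the renaming pattern is to substitute the message type name for 'message' in each property key. Assuming a hypothetical message type called 'order' (the name is only an illustration, not part of the store item), the properties listed above would become:

* dp.odata.order.cron
* dp.odata.order.url
* dp.h2.order.database
* dp.sftp.order.filename.prefix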
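The removed bullet "Reading data from Mendix using OData v3" refers to pulling object data from a published Mendix OData endpoint page by page. Below is a minimal sketch of that general pattern, assuming a hypothetical endpoint URL and basic-auth credentials; it is not the store item's actual item reader, whose endpoint comes from the dp.odata.message.url property.

{{code language="python"}}
import requests

# Hypothetical published Mendix OData v3 endpoint and credentials; these are
# assumptions for illustration, not values taken from the store item.
BASE_URL = "https://myapp.mendixcloud.com/odata/published/v1/Orders"
AUTH = ("odata_user", "odata_password")
PAGE_SIZE = 1000


def read_all_objects():
    """Page through an OData v3 collection with $top/$skip and collect all rows."""
    rows, skip = [], 0
    while True:
        resp = requests.get(
            BASE_URL,
            params={"$format": "json", "$top": PAGE_SIZE, "$skip": skip},
            auth=AUTH,
            timeout=60,
        )
        resp.raise_for_status()
        body = resp.json()
        # OData v3 services return the collection either as "value" (JSON light)
        # or wrapped in "d"/"d.results" (verbose JSON), depending on the service.
        if "value" in body:
            page = body["value"]
        else:
            d = body.get("d", {})
            page = d.get("results", d) if isinstance(d, dict) else d
        rows.extend(page)
        if len(page) < PAGE_SIZE:
            return rows
        skip += PAGE_SIZE


if __name__ == "__main__":
    print(f"Fetched {len(read_all_objects())} Mendix objects")
{{/code}}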
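The removed bullets about writing data to Amazon Redshift using Amazon S3 and about merge behavior using staging tables describe the common Redshift load pattern: COPY a file from S3 into a staging table, then merge the staged rows into the target table. Below is a minimal sketch of that pattern with psycopg2, assuming hypothetical table names, an S3 location, and an IAM role; it is not the flow's actual implementation.

{{code language="python"}}
import psycopg2

# Hypothetical connection string, table names, S3 location, and IAM role;
# all values are assumptions for illustration, not taken from the store item.
CONN_DSN = (
    "host=my-cluster.example.redshift.amazonaws.com port=5439 "
    "dbname=dwh user=etl password=secret"
)

STATEMENTS = [
    # 1. Stage the exported Mendix data from S3 into an empty copy of the target table.
    "CREATE TEMP TABLE stage_orders (LIKE public.orders);",
    """COPY stage_orders
       FROM 's3://my-datapipeline-bucket/orders/latest.csv'
       IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
       FORMAT AS CSV IGNOREHEADER 1;""",
    # 2. Merge: delete rows that are being replaced, then insert the staged rows.
    """DELETE FROM public.orders
       USING stage_orders
       WHERE public.orders.order_id = stage_orders.order_id;""",
    "INSERT INTO public.orders SELECT * FROM stage_orders;",
]


def load_and_merge():
    """Run the COPY-from-S3 load plus staging-table merge as one transaction."""
    conn = psycopg2.connect(CONN_DSN)
    try:
        with conn.cursor() as cur:
            for stmt in STATEMENTS:
                cur.execute(stmt)
        conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    load_and_merge()
{{/code}}

The removed bullet about refreshing an AWS Materialized View would correspond to running a separate REFRESH MATERIALIZED VIEW statement once such a job completes successfully.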