Last modified by Erik Bakker on 2024/02/21 21:51

From version 25.1
edited by Erik Bakker
on 2022/05/02 11:47
Change comment: There is no comment for this version
To version 10.1
edited by Erik Bakker
on 2022/05/02 09:37
Change comment: There is no comment for this version

Summary

Details

Page properties
Title
... ... @@ -1,1 +1,1 @@
1 -Mendix -> Redshift - Data Pipeline
1 +eMagiz Data Sink
Content
... ... @@ -2,46 +2,35 @@
2 2  {{container layoutStyle="columns"}}
3 3  (((
4 4  == About ==
5 -Data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This flow can also be used as a starting point for other data pipeline integrations.
5 +With the help of this functionality, you can sink your messages into the eMagiz data sink bucket. Once the message is stored in the eMagiz data sink bucket, you will have the option to view this message in the Manage phase.
6 6  
7 7  == Documentation ==
8 8  
9 9  ==== 1. How to use====
10 -With the help of this store item, you have a data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This allows for the following:
11 -
12 -* Reading data from Mendix using OData v3.
13 -* Writing data to Amazon Redshift using Amazon S3 ([[https:~~/~~/docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html>>url:https://docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html]]).
14 -* Merge behavior using staging tables ([[https:~~/~~/docs.aws.amazon.com/redshift/latest/dg/t_updating-inserting-using-staging-tables-.html>>url:https://docs.aws.amazon.com/redshift/latest/dg/t_updating-inserting-using-staging-tables-.html]]).
15 -* Job dashboard (See Deploy -> Runtime Dashboard -> [Select Runtime] -> [Select Flow] -> Jobs).
16 -* Refresh an AWS Materialized View a minute after the job completes successfully.
10 +Once stored, eMagiz can retrieve all messages that carry the ID attached during the sink action via the Manage phase. This assumes that your eMagiz data sink variant lets you review these messages on the fly; other eMagiz data sink variants are offered that require specific on-demand queries.
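A minimal sketch of the retrieval idea described above, assuming the data sink is an S3-compatible bucket and that the message ID is part of the object key; the bucket name and key layout below are assumptions, and in practice the Manage phase performs this lookup for you.

{{code language="python"}}
# Conceptual sketch only: bucket name and key layout are assumptions,
# not the actual eMagiz data sink implementation.
import boto3

s3 = boto3.client("s3")

def find_sunk_messages(bucket: str, message_id: str) -> list[str]:
    """Return the keys of all sunk objects whose key contains the given message ID."""
    keys = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            if message_id in obj["Key"]:
                keys.append(obj["Key"])
    return keys

# Example usage (hypothetical bucket name and message ID):
# print(find_sunk_messages("emagiz-data-sink-example", "3fa85f64-5717"))
{{/code}}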
17 17  
18 18  ==== 2. Keynotes & restrictions====
19 19  
20 -* Source data must be shared in Mendix using a 'Published OData service'. If applicable in the settings tab set the association configuration to 'As an associated object id'.
21 -* It is highly recommended to configure Amazon Redshift to access the S3 bucket using an IAM role (documentation).
22 -* Mendix application must be reachable by the eMagiz runtime.
23 -* Before use, please rename job 'system.message.job' and step 'system.message.step1' to match your system and integration name. Make sure to update support.job-launch-request accordingly.
24 -* When the SQL statement fails, the step and job execution will still be registered as being 'completed'. Please look for log entries like 'Exception while closing step execution resources in step' if data is missing in Amazon Redshift.
25 -* When using multiple pipeline flows on one runtime, an h2 'already in use' exception might occur. To prevent these errors please keep the name of the h2 database different for each flow on one runtime by renaming the 'dp.h2.message.database' deploy property in the "support.h2-database" component.
26 -* When you want to import the flow multiple times for one system, you should change the names of the following properties by replacing 'message' with the name of the message type of each integration:
27 - ** dp.odata.message.cron (component: receive.cron)
28 - ** dp.odata.message.url (component: system.message.job -> system.message.step1 -> Item reader)
29 - ** dp.h2.message.database (component: support.h2-database)
30 - ** dp.jdbc.message.tablename (component: transform.job-launch-request -> Expression)
31 - ** dp.jdbc.message.primarykey (component: transform.job-launch-request -> Expression)
14 +* Think about when in the flow you want to archive your data to the eMagiz data sink; that choice determines the impact on the functional flow.
15 +* Don't use a wiretap to invoke this part of the flow, as that would run the sink action before the actual processing. If the sink then fails, the message will not be delivered via the functional part of the flow.
16 +* Note that this component defines, among other things, how the message is stored in the data sink bucket. The file structure expression should not be altered, as changing it would break the functionality.
17 +* Note that the 'standard' eMagiz headers that define the source system (i.e., the relevant systems) and the messageType need to be available upon sinking for this functionality to work (see the sketch below this list).
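As a conceptual sketch of the last keynote, the snippet below shows why the source system and messageType headers must be present before sinking: without them there is no way to build the storage location for the message. The header names and key layout are assumptions for illustration, not the actual file structure expression used by eMagiz.

{{code language="python"}}
# Hypothetical illustration: builds an object key for the data sink bucket from
# the message headers. Header names and key layout are assumptions.
from datetime import datetime, timezone

def build_sink_key(headers: dict, message_id: str) -> str:
    """Derive a storage key from the required eMagiz-style headers."""
    missing = [h for h in ("sourceSystem", "messageType") if h not in headers]
    if missing:
        # Without these headers the message cannot be placed in the bucket correctly.
        raise ValueError(f"Cannot sink message, missing headers: {missing}")
    day = datetime.now(timezone.utc).strftime("%Y/%m/%d")
    return f"{headers['sourceSystem']}/{headers['messageType']}/{day}/{message_id}.xml"

# Example usage with hypothetical header values:
print(build_sink_key({"sourceSystem": "crm", "messageType": "Order"}, "12345"))
{{/code}}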
32 32  
33 33  ==== 3. License Information====
34 -Part of the standard license agreement as agreed upon when using the store for the first time.
20 +To use this store item, you need to secure an additional license on the eMagiz platform. If you are interested in such a license, please contact us at productmanagement@emagiz.com.
35 35  
36 36  ==== 4. Relevant eMagiz Academy Microlearnings====
37 37  
38 -[[https:~~/~~/emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index>>url:https://emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index]]
39 -[[https:~~/~~/emagiz.github.io/docs/microlearning/intermediate-data-pipelines-datapipeline-mendix-to-aws-redshift>>url:https://emagiz.github.io/docs/microlearning/intermediate-data-pipelines-datapipeline-mendix-to-aws-redshift]]
24 +[[https:~~/~~/emagiz.github.io/docs/microlearning/advanced-data-management-data-sink>>url:https://emagiz.github.io/docs/microlearning/advanced-data-management-data-sink]]
40 40  
41 -//Would you please consult the privacy policy of eMagiz at the following link:// [[https:~~/~~/www.emagiz.com/privacy-policy/?>>url:https://www.emagiz.com/privacy-policy/?]])))
26 +Please consult the privacy policy of eMagiz via the following link: [[https:~~/~~/www.emagiz.com/privacy-policy/?>>url:https://www.emagiz.com/privacy-policy/?]])))
42 42  
43 43  
44 44  
45 -((()))
30 +((((% border="2" cellpadding="10" cellspacing="10" style="width:292px" %)
31 +|=(% style="width: 45px;" %)#|=(% style="width: 241px;" %)Option
32 +|(% style="width:45px" %) a|(% style="width:241px" %) XML message(s)
33 +|(% style="width:45px" %) b|(% style="width:241px" %) JSON message(s)
34 +)))
46 46  {{/container}}
47 47  {{/container}}