Version 24.1 by Erik Bakker on 2022/05/02 11:44

{{container}}
{{container layoutStyle="columns"}}
(((
== About ==
A data pipeline configuration inside an entry connector that transfers Mendix App object data to an Amazon Redshift table. This flow can also be used as a starting point for other data pipeline integrations.
== Documentation ==

==== 1. How to use ====
With the help of this store item, you get a data pipeline configuration inside an entry connector that transfers Mendix App object data to an Amazon Redshift table. This allows for the following:

* Reading data from Mendix using OData v3.
* Writing data to Amazon Redshift via Amazon S3 ([[https:~~/~~/docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html>>url:https://docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html]]).
* Merging data using staging tables (documentation).
* Monitoring jobs via the job dashboard (see Deploy -> Runtime Dashboard -> [Select Runtime] -> [Select Flow] -> Jobs).
* Refreshing an AWS materialized view one minute after the job completes successfully.

==== 2. Keynotes & restrictions ====

* Before use, rename the job 'system.message.job' and the step 'system.message.step1' to match your system and integration name. Make sure to update support.job-launch-request accordingly.
* When using multiple pipeline flows on one runtime, an H2 'already in use' exception might occur. To prevent these errors, give the H2 database of each flow on the runtime a distinct name by renaming the 'dp.h2.message.database' deploy property in the "support.h2-database" component.
* When you want to import the flow multiple times for one system, change the names of the following properties by replacing 'message' with the name of the message type of each integration:
** dp.odata.message.cron (component: receive.cron)
** dp.odata.message.url (component: system.message.job -> system.message.step1 -> Item reader)
** dp.h2.message.database (component: support.h2-database)
** dp.sftp.message.filename.prefix (component: transform.job-launch-request -> Expression)

==== 3. License Information ====
Part of the standard license agreement as agreed upon when using the store for the first time.

==== 4. Relevant eMagiz Academy Microlearnings ====

[[https:~~/~~/emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index>>url:https://emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index]]

//Please consult the privacy policy of eMagiz at the following link:// [[https:~~/~~/www.emagiz.com/privacy-policy/?>>url:https://www.emagiz.com/privacy-policy/?]])))
((()))
{{/container}}
{{/container}}