Changes for page Mendix -> Redshift - Data Pipeline
Last modified by Erik Bakker on 2024/02/21 21:51
From version 18.1
edited by Erik Bakker
on 2022/05/02 11:10
Change comment:
There is no comment for this version
To version 24.1
edited by Erik Bakker
on 2022/05/02 11:44
Change comment:
There is no comment for this version
Summary
Page properties (2 modified, 0 added, 0 removed)
Details
- Page properties
- Title
... ... @@ -1,1 +1,1 @@
1 -DeltaService - Data Pipeline
1 +Mendix -> Redshift - Data Pipeline
- Content
... ... @@ -2,20 +2,28 @@
2 2 {{container layoutStyle="columns"}}
3 3 (((
4 4 == About ==
5 -Data pipeline configuration inside an entry connector that creates a custom Delta view on your complete table. Note that this flow only holds the logic to create a Delta based on a full load. The actual retrieval of the full load and sending the data to another process in eMagiz are not part of the store item.
5 +Data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This flow can also be used as a starting point for other data pipeline integrations.
6 6
7 7 == Documentation ==
8 8
9 9 ==== 1. How to use ====
10 -With the help of this store item, you have a data pipeline configuration inside an entry connector that creates a custom Delta view on your complete table. This allows for the following:
10 +With the help of this store item, you have a data pipeline configuration inside an entry connector to transfer Mendix App object data to an Amazon Redshift table. This allows for the following:
11 11
12 -* Creating a delta from a full load.
12 +* Reading data from Mendix using OData v3.
13 +* Writing data to Amazon Redshift using Amazon S3 ([[https:~~/~~/docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html>>url:https://docs.aws.amazon.com/redshift/latest/dg/t_loading-tables-from-s3.html]]).
14 +* Merge behavior using staging tables (documentation).
15 +* Job dashboard (see Deploy -> Runtime Dashboard -> [Select Runtime] -> [Select Flow] -> Jobs).
16 +* Refreshing an AWS Materialized View one minute after the job completes successfully.
13 13
14 14 ==== 2. Keynotes & restrictions ====
15 15
16 -* The store component only provides the logic to create a delta from a full load.
17 -* The store component does not provide the retrieval of the full load, nor does it provide guidance on how to send the delta information to other parts of the solution.
18 -* This is expert functionality, as it requires considerable context on how databases work and how eMagiz works.
20 +* Before use, please rename the job 'system.message.job' and the step 'system.message.step1' to match your system and integration name. Make sure to update support.job-launch-request accordingly.
21 +* When using multiple pipeline flows on one runtime, an H2 'already in use' exception might occur. To prevent these errors, keep the name of the H2 database different for each flow on one runtime by renaming the 'dp.h2.message.database' deploy property in the "support.h2-database" component.
22 +* When you want to import the flow multiple times for one system, change the names of the following properties by replacing 'message' with the name of the message type of each integration:
23 +** dp.odata.message.cron (component: receive.cron)
24 +** dp.odata.message.url (component: system.message.job -> system.message.step1 -> Item reader)
25 +** dp.h2.message.database (component: support.h2-database)
26 +** dp.sftp.message.filename.prefix (component: transform.job-launch-request -> Expression)
19 19
20 20 ==== 3. License Information ====
21 21 Part of the standard license agreement as agreed upon when using the store for the first time.
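The new content's "Writing data to Amazon Redshift using Amazon S3" bullet follows the standard COPY-from-S3 loading path described in the linked AWS documentation. A minimal sketch of such a statement is shown below; the target table, S3 prefix, and IAM role are hypothetical placeholders, not the store item's actual generated SQL:

{{code language="sql"}}
-- Load staged files from S3 into a Redshift table.
-- Table name, bucket/prefix, and IAM role ARN are hypothetical.
COPY analytics.mendix_customer
FROM 's3://example-emagiz-staging/mendix/customer/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
FORMAT AS CSV
IGNOREHEADER 1
TIMEFORMAT 'auto';
{{/code}}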
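The "merge behavior using staging tables" bullet corresponds to the staging-table merge pattern from the AWS documentation: load into a temporary table, delete the target rows it replaces, then insert the new versions. A sketch under the same hypothetical names, assuming 'id' as the business key:

{{code language="sql"}}
-- Merge new data into the target via a staging table (hypothetical names).
BEGIN TRANSACTION;

CREATE TEMP TABLE mendix_customer_staging (LIKE analytics.mendix_customer);

COPY mendix_customer_staging
FROM 's3://example-emagiz-staging/mendix/customer/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
FORMAT AS CSV
IGNOREHEADER 1;

-- Remove rows that are about to be replaced, then insert the new versions.
DELETE FROM analytics.mendix_customer
USING mendix_customer_staging
WHERE analytics.mendix_customer.id = mendix_customer_staging.id;

INSERT INTO analytics.mendix_customer
SELECT * FROM mendix_customer_staging;

DROP TABLE mendix_customer_staging;

END TRANSACTION;
{{/code}}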
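The final "How to use" bullet refreshes an AWS Materialized View shortly after the job finishes. On the Redshift side this is a single statement; the view name here is again hypothetical:

{{code language="sql"}}
-- Re-run the materialized view's defining query against the freshly loaded data.
REFRESH MATERIALIZED VIEW analytics.mendix_customer_summary;
{{/code}}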