Last modified by Erik Bakker on 2024/02/21 21:51

From version 20.1
edited by Erik Bakker
on 2022/05/02 11:40
Change comment: There is no comment for this version
To version 17.2
edited by Erik Bakker
on 2022/05/02 11:08
Change comment: Update document after refactoring.

Summary

Details

Page properties
Title
... ... @@ -1,1 +1,1 @@
1 -Local File -> Remote File - Data Pipeline
1 +Delta Service - Data Pipeline
Content
... ... @@ -2,29 +2,17 @@
2 2  {{container layoutStyle="columns"}}
3 3  (((
4 4  == About ==
5 -Data pipeline configuration inside an entry connector to transfer local CSV files to CSV files on an SFTP server. This flow can also be used as a starting point for other data pipeline integrations.
5 +With the help of this transformation, you can easily wrap the body of your message in CDATA tags. This is sometimes needed to call an external web service.
6 6  
7 7  == Documentation ==
8 8  
9 9  ==== 1. How to use====
10 -With the help of this store item, you have a data pipeline configuration inside an entry connector to transfer local CSV files to CSV files on the SFTP server. This allows for the following:
11 -
12 -* Reading CSV data from a local file system.
13 -* Converting data to flat files (CSV).
14 -* Writing data to SFTP.
15 -* Job dashboard (See Deploy -> Runtime Dashboard -> [Select Runtime] -> [Select Flow] -> Jobs).
10 +With the help of this transformation, you can easily wrap the body of your message in CDATA tags. This is sometimes needed to call an external web service.
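The removed bullets above describe the pipeline's three data steps: reading CSV data from a local file system, converting it to flat files (CSV), and writing the result to SFTP. As a minimal sketch outside the eMagiz runtime (all names here are illustrative; the upload step assumes the third-party paramiko client and is not the runtime's own mechanism):

```python
import csv
import io

def rows_to_flat_file(rows, fieldnames):
    """Convert a list of dicts to flat-file CSV text (the 'convert' step)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def upload_to_sftp(text, remote_path, host, username, password):
    """Hypothetical 'write to SFTP' step; host and credentials are placeholders."""
    import paramiko  # assumed available; not part of the standard library
    transport = paramiko.Transport((host, 22))
    transport.connect(username=username, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    with sftp.open(remote_path, "w") as f:
        f.write(text)
    transport.close()

flat = rows_to_flat_file([{"id": "1", "name": "demo"}], ["id", "name"])
print(flat)  # prints: id,name  /  1,demo
```

In the actual store item these steps are handled by flow components, not hand-written code; the sketch only mirrors the data movement.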
16 16  
17 17  ==== 2. Keynotes & restrictions====
18 18  
19 -* Source data must be shared in Mendix using a 'Published OData service'. If applicable, set the association configuration in the Settings tab to 'As an associated object id'.
20 -* The Mendix application must be reachable by the eMagiz runtime.
21 -* Before use, please rename job 'system.message.job' and step 'system.message.step1' to match your system and integration name. Make sure to update support.job-launch-request accordingly.
22 -* When using multiple pipeline flows on one runtime, an H2 'already in use' exception might occur. To prevent these errors, keep the name of the H2 database different for each flow on one runtime by renaming the 'dp.h2.message.database' deploy property in the "support.h2-database" component.
23 -* When you want to import the flow multiple times for one system, you should change the names of the following properties by replacing 'message' with the name of the message type of each integration:
24 - ** dp.odata.message.cron (component: receive.cron)
25 - ** dp.odata.message.url (component: system.message.job -> system.message.step1 -> Item reader)
26 - ** dp.h2.message.database (component: support.h2-database)
27 - ** dp.sftp.message.filename.prefix (component: transform.job-launch-request -> Expression)
14 +* Note that this is a custom XSLT that is designed for a specific purpose.
15 +* Always try to understand the context before you implement this transformation.
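The CDATA wrapping that this transformation performs in XSLT can be illustrated outside the flow. A minimal Python sketch (the function name is hypothetical), including the standard split for any ']]>' sequence inside the body, which would otherwise terminate the CDATA section early:

```python
def wrap_in_cdata(body: str) -> str:
    """Wrap a message body in a CDATA section.

    A literal ']]>' inside the body is split across two CDATA sections,
    the usual escaping trick, so the output stays well-formed.
    """
    safe = body.replace("]]>", "]]]]><![CDATA[>")
    return "<![CDATA[" + safe + "]]>"

print(wrap_in_cdata("<note>hello</note>"))
# prints: <![CDATA[<note>hello</note>]]>
```

The store item itself ships this logic as an XSLT; the sketch only shows the resulting output shape so you can verify what the external web service will receive.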
28 28  
29 29  ==== 3. License Information====
30 30  Part of the standard license agreement as agreed upon when using the store for the first time.
... ... @@ -31,7 +31,7 @@
31 31  
32 32  ==== 4. Relevant eMagiz Academy Microlearnings====
33 33  
34 -[[https:~~/~~/emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index>>url:https://emagiz.github.io/docs/microlearning/intermediate-data-pipelines-index]]
22 +None
35 35  
36 36  //Please consult the privacy policy of eMagiz at the following link:// [[https:~~/~~/www.emagiz.com/privacy-policy/?>>url:https://www.emagiz.com/privacy-policy/?]])))
37 37