Changes for page Understanding data pipelines
Last modified by Erik Bakker on 2024/02/21 21:51
From version 15.1
edited by Erik Bakker
on 2022/08/28 14:57
Change comment:
There is no comment for this version
To version 16.1
edited by Eva Torken
on 2023/08/10 13:54
Change comment:
There is no comment for this version
Summary
Page properties (2 modified, 0 added, 0 removed)
Details
- Page properties
- Author
@@ -1,1 +1,1 @@
-XWiki.ebakker
+XWiki.etorken
- Content
@@ -51,27 +51,17 @@
 * Azure Event Hub
 * Database
 * Remote directory (FTP or SFTP) in flat-file (CSV) format
-
-== 4. Assignment ==
 
-Browse the public store of eMagiz to see which combinations of source and sink systems are used frequently and are therefore standardized.
-This assignment can be completed with the help of the (Academy) project that you have created/used in the previous assignment.
+== 4. Key takeaways ==
 
-== 5. Key takeaways ==
-
 * A data pipeline is useful when transferring large volumes of data without the need for transformation
 * Data pipelines are a standardized piece of software in eMagiz that can be implemented with ease
 * eMagiz offers a limited number of source and sink options when you use the data pipeline pattern.
 
-== 6. Suggested Additional Readings ==
+== 5. Suggested Additional Readings ==
 
 If you are interested in this topic and want more information on it please read the help text provided by eMagiz.
+)))
 
-== 7. Silent demonstration video ==
-
-This video demonstrates a working solution and how you can validate whether the refresh has worked in AWS Redshift.
-
-{{video attachment="intermediate-datapipelines-understanding-data-pipelines.mp4" reference="Main.Videos.Microlearning.WebHome"/}})))
-
 ((({{toc/}}))){{/container}}
 {{/container}}