Changes for page Understanding data pipelines
Last modified by Erik Bakker on 2024/02/21 21:51
From version 16.1
edited by Eva Torken
on 2023/08/10 13:54
Change comment:
There is no comment for this version
To version 15.1
edited by Erik Bakker
on 2022/08/28 14:57
Change comment:
There is no comment for this version
Summary
Page properties (2 modified, 0 added, 0 removed)
Details
- Page properties
- Author
@@ -1,1 +1,1 @@
1 -XWiki.etorken
1 +XWiki.ebakker
- Content
@@ -51,17 +51,27 @@
51 51 * Azure Event Hub
52 52 * Database
53 53 * Remote directory (FTP or SFTP) in flat-file (CSV) format
54 +
55 +== 4. Assignment ==
54 54
55 -== 4. Key takeaways ==
57 +Browse the public store of eMagiz to see which combinations of source and sink systems are used frequently and are therefore standardized.
58 +This assignment can be completed with the help of the (Academy) project that you have created/used in the previous assignment.
56 56
60 +== 5. Key takeaways ==
61 +
57 57 * A data pipeline is useful when transferring large volumes of data without the need for transformation
58 58 * Data pipelines are a standardized piece of software in eMagiz that can be implemented with ease
59 59 * eMagiz offers a limited number of source and sink options when you use the data pipeline pattern.
60 60
61 -== 5. Suggested Additional Readings ==
66 +== 6. Suggested Additional Readings ==
62 62
63 63 If you are interested in this topic and want more information on it please read the help text provided by eMagiz.
64 -)))
65 65
70 +== 7. Silent demonstration video ==
71 +
72 +This video demonstrates a working solution and how you can validate whether the refresh has worked in AWS Redshift.
73 +
74 +{{video attachment="intermediate-datapipelines-understanding-data-pipelines.mp4" reference="Main.Videos.Microlearning.WebHome"/}})))
75 +
66 66 ((({{toc/}}))){{/container}}
67 67 {{/container}}
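The section 7 added in this revision mentions validating whether the refresh has worked in AWS Redshift. As a minimal sketch of such a check, assuming a Redshift cluster reached over its PostgreSQL-compatible endpoint with psycopg2 and a hypothetical target table staging.orders with a loaded_at timestamp column (none of these names come from the eMagiz course material), the row count and latest load time can be queried directly:

{{code language="python"}}
# Minimal sketch: verify that a data pipeline refresh landed in AWS Redshift.
# The endpoint, database, table, and column names below are hypothetical placeholders.
import os
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.eu-west-1.redshift.amazonaws.com",  # hypothetical cluster endpoint
    port=5439,                                  # default Redshift port
    dbname="analytics",                         # hypothetical database
    user="pipeline_check",                      # hypothetical read-only user
    password=os.environ["REDSHIFT_PASSWORD"],   # supplied outside the script
)

with conn, conn.cursor() as cur:
    # A row count plus the most recent load timestamp gives a quick signal
    # that the refresh actually replaced or appended the expected data.
    cur.execute("SELECT COUNT(*), MAX(loaded_at) FROM staging.orders")
    row_count, last_loaded = cur.fetchone()
    print(f"rows={row_count}, last load={last_loaded}")

conn.close()
{{/code}}

If neither the row count nor the latest load timestamp changes after a pipeline run, the refresh most likely did not complete.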