Wiki source code of Data pipeline - Job dashboard


{{container layoutStyle="columns"}}
(((
{{error}}
Note that the functionality mentioned in this microlearning will become obsolete when migrating to the new generation runtime.
{{/error}}

In this microlearning, we will learn what the Job Dashboard is, what it tells you, and how it works.
With the help of a data pipeline, you can transfer large volumes of data between Mendix and an SFTP server for data warehousing and BI analytics purposes. The Job Dashboard gives you the ability to monitor whether your transfer process succeeded or failed.

Should you have any questions, please contact academy@emagiz.com.

== 1. Prerequisites ==

* Basic knowledge of the eMagiz platform
* Basic knowledge of the Mendix platform

== 2. Key concepts ==

This microlearning centers around the Job Dashboard: what it is, what it tells you, and how it works.

With data pipeline, we mean an integration pattern that can transfer large volumes of data between a specific set of source and sink systems.
With Job Dashboard, we mean an overview within eMagiz that shows management data (e.g., how many lines were processed and whether the process ran successfully) to the user.

== 3. Data pipeline - Job Dashboard ==

The Job Dashboard is an eMagiz dashboard that can be accessed via the Runtime Dashboard. On this level, you can select a specific flow and retrieve the management data of your data pipeline by clicking the Jobs button.

[[image:Main.Images.Microlearning.WebHome@intermediate-datapipelines-job-dashboard-for-data-pipeline--jobs-button-runtime-dashboard.png]]

The Job Dashboard shows you when a certain job was executed and what its result was; when you zoom in, it also tells you how much data was processed and in how many batches.

[[image:Main.Images.Microlearning.WebHome@intermediate-datapipelines-job-dashboard-for-data-pipeline--job-dashboard-view.png]]

=== 3.1 Explaining the inner workings of the Job Dashboard ===

By opening the Job Dashboard, you send a command to the flow you have selected in the Runtime Dashboard. This flow receives the command and thereby activates the job manager.

[[image:Main.Images.Microlearning.WebHome@intermediate-datapipelines-job-dashboard-for-data-pipeline--job-manager-and-other-support-objects.png]]

The job manager, in turn, looks in the H2 database linked to that specific flow for relevant information on the job executions of the data pipeline.
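
To make this concrete, the sketch below shows how such job metadata could be read from a local H2 database. It assumes the job information is stored in the standard Spring Batch schema (tables such as BATCH_JOB_INSTANCE and BATCH_JOB_EXECUTION) and uses a hypothetical JDBC URL and database name; it illustrates what the job manager conceptually does and is not the actual eMagiz implementation.

{{code language="java"}}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch: list job executions from a local H2 database, assuming the
// standard Spring Batch schema. The JDBC URL and database name are hypothetical.
public class JobExecutionReader {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:h2:./data/batch"; // e.g. the database configured via dp.h2.message.database

        String query =
                "SELECT ji.JOB_NAME, je.JOB_EXECUTION_ID, je.STATUS, je.START_TIME, je.END_TIME " +
                "FROM BATCH_JOB_EXECUTION je " +
                "JOIN BATCH_JOB_INSTANCE ji ON je.JOB_INSTANCE_ID = ji.JOB_INSTANCE_ID " +
                "ORDER BY je.START_TIME DESC";

        try (Connection conn = DriverManager.getConnection(url, "sa", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(query)) {
            while (rs.next()) {
                // Print one line per job execution: name, id, result, and timing.
                System.out.printf("%s (execution %d): %s, started %s, ended %s%n",
                        rs.getString("JOB_NAME"),
                        rs.getLong("JOB_EXECUTION_ID"),
                        rs.getString("STATUS"),
                        rs.getTimestamp("START_TIME"),
                        rs.getTimestamp("END_TIME"));
            }
        }
    }
}
{{/code}}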

This information is then shown to the user in a pop-up, where you can see all executions and drill down into the details of each execution.

[[image:Main.Images.Microlearning.WebHome@intermediate-datapipelines-job-dashboard-for-data-pipeline--job-dashboard-view.png]]

=== 3.2 Clean up Job Dashboard ===

In the preconfigured flow that you have imported from the store, there is a segment in the right-hand corner that ensures your Job Dashboard is cleaned up periodically. This functionality removes all jobs that ran more than one month ago. This way, you ensure that the management data presented to you via the Job Dashboard stays relevant and up to date.
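
As an illustration of what such a periodic cleanup boils down to, the sketch below removes job metadata older than one month from the same (assumed) Spring Batch schema in H2. The table names, the retention period, and the JDBC URL are assumptions based on the description above; the store flow implements this with flow components rather than custom code.

{{code language="java"}}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Minimal sketch: delete job metadata older than one month, assuming the
// standard Spring Batch schema in a local H2 database. Child tables are
// cleared first so that no foreign-key constraint is violated.
public class JobDashboardCleanup {

    private static final String OLD_EXECUTIONS =
            "SELECT JOB_EXECUTION_ID FROM BATCH_JOB_EXECUTION " +
            "WHERE CREATE_TIME < DATEADD('MONTH', -1, CURRENT_TIMESTAMP)";

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:h2:./data/batch", "sa", "");
             Statement stmt = conn.createStatement()) {
            stmt.executeUpdate("DELETE FROM BATCH_STEP_EXECUTION_CONTEXT WHERE STEP_EXECUTION_ID IN "
                    + "(SELECT STEP_EXECUTION_ID FROM BATCH_STEP_EXECUTION WHERE JOB_EXECUTION_ID IN (" + OLD_EXECUTIONS + "))");
            stmt.executeUpdate("DELETE FROM BATCH_STEP_EXECUTION WHERE JOB_EXECUTION_ID IN (" + OLD_EXECUTIONS + ")");
            stmt.executeUpdate("DELETE FROM BATCH_JOB_EXECUTION_CONTEXT WHERE JOB_EXECUTION_ID IN (" + OLD_EXECUTIONS + ")");
            stmt.executeUpdate("DELETE FROM BATCH_JOB_EXECUTION_PARAMS WHERE JOB_EXECUTION_ID IN (" + OLD_EXECUTIONS + ")");
            stmt.executeUpdate("DELETE FROM BATCH_JOB_EXECUTION WHERE CREATE_TIME < DATEADD('MONTH', -1, CURRENT_TIMESTAMP)");
            // Finally remove job instances that no longer have any execution left.
            stmt.executeUpdate("DELETE FROM BATCH_JOB_INSTANCE WHERE JOB_INSTANCE_ID NOT IN "
                    + "(SELECT JOB_INSTANCE_ID FROM BATCH_JOB_EXECUTION)");
        }
    }
}
{{/code}}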

In case you have an old data pipeline construction within your integration landscape, we suggest reading the following to see how you can build this functionality yourself:

[[Migration Path - Job Dashboard Cleanup>>../howto/migration-path-job-dashboard-cleanup.md]]

=== 3.3 Best practices ===

* Create a separate H2 database per data pipeline you build within your project. For example, if you have 10 data pipelines, you should create 10 separate H2 databases. As stated in the documentation of the store component:
** ##dp.h2.message.database##: database name for the local H2 database (e.g., "batch"). When using multiple pipeline flows on one container, consider renaming this property by replacing 'message' with the corresponding message type name in the 'support.h2-database' component (for example, ##dp.h2.order.database## for an 'order' message type).
* Clean up the H2 database periodically to keep its contents in check and prevent the Job Dashboard from malfunctioning.
* Use the Job Dashboard in conjunction with the Manage phase to monitor the performance of your data pipeline solutions and to set up alerting around them.

== 4. Key takeaways ==

* The Job Dashboard is an overview within eMagiz that shows management data (e.g., how many lines were processed and whether the process ran successfully) to the user.
* Clean up the H2 database periodically to keep its contents in check and prevent the Job Dashboard from malfunctioning.
* Use the Job Dashboard in conjunction with the Manage phase to monitor the performance of your data pipeline solutions and to set up alerting around them.
* You can import your data pipeline solution from the store.

== 5. Suggested Additional Readings ==

If you are interested in this topic and want more information, please read the help text provided by eMagiz.
)))

((({{toc/}}))){{/container}}