Hi Neelimaj,
Your use case description suggests that the sub-job executions would be better managed with ExecuteGraph components rather than the Subgraph component.
Running a job via the Subgraph component versus running it via ExecuteGraph (or ExecuteJobflow) is fundamentally different. A job executed as a subgraph, i.e. through the Subgraph component, starts with the first incoming record and finishes after the last processed record. It cannot be set up to run multiple times, regardless of how many records flow into it. You can think of it as a "wrapper" that visually reduces the number of components in a complex graph.

The ExecuteGraph component, on the other hand, executes the child job once for every input record. In other words, if 10 records flow into the ExecuteGraph component, the respective job is launched 10 times. Note, however, that the child job cannot pick up the data flowing on the edge unless it is explicitly mapped, so it should not be thought of as a direct continuation of the parent job's data flow.
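If it helps, here is a rough analogy in plain Python (this is not CloverDX code or API, just an illustration of the two execution models): a Subgraph behaves like an inline function applied once to the whole record stream, while ExecuteGraph behaves like launching a separate child job for each incoming record, with only mapped values passed to it.

# Illustrative analogy only -- not CloverDX API.
from typing import Dict, Iterable, Iterator

Record = Dict[str, str]

def subgraph(records: Iterable[Record]) -> Iterator[Record]:
    # A Subgraph is an inline part of the parent pipeline:
    # it starts with the first record and ends after the last one.
    for r in records:
        yield {**r, "processed": "true"}

def execute_graph(record: Record) -> None:
    # ExecuteGraph launches a whole child job per trigger record;
    # the record is only visible to the child if explicitly mapped.
    print(f"launching child job with parameters mapped from {record}")

incoming = [{"id": str(i)} for i in range(3)]

# Subgraph style: one run over the whole stream.
for out in subgraph(incoming):
    pass  # records continue through the parent graph

# ExecuteGraph style: one child job run per input record (3 runs here).
for rec in incoming:
    execute_graph(rec)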
So, to answer your question, I would recommend creating a jobflow (as an orchestration layer) that reads data from your DB tables and, using components such as Map, Filter, and Aggregate, shapes a workflow that ultimately executes the desired graph with the desired startup parameters through ExecuteGraph components.
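As a loose sketch of that orchestration idea, again in plain Python rather than CloverDX jobflow syntax (the table names, graph name, and parameter names below are invented for illustration): read rows from a control table, filter and shape them, then launch one graph run per surviving row with parameters derived from that row.

# Hypothetical orchestration sketch -- names and parameters are invented.
rows = [
    {"table": "customers", "active": True},
    {"table": "orders", "active": False},
    {"table": "invoices", "active": True},
]

def launch_graph(graph_name: str, params: dict) -> None:
    # Stand-in for an ExecuteGraph step with parameter mapping.
    print(f"running {graph_name} with {params}")

# Roughly what Filter/Map components would do, followed by
# one ExecuteGraph-style launch per remaining row.
for row in rows:
    if row["active"]:
        launch_graph("process_table.grf", {"TABLE_NAME": row["table"]})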
If you would like to dig deeper into this and have a sample graph provided, I would recommend logging a new ticket with the CloverDX Support team. We will be happy to elaborate on your specific use case via that channel.
Cheers,
-Vladi