Problem when parsing some workflows and with the function addControlDependency #97
Hi @GnimEd, Actually, this is the expected behavior and everything looks good. Although the DAX file explicitly shows the dependency between these tasks, in our code we remove redundant dependencies between tasks. In this case, the dependency between these tasks is not direct (it is transitively implied), so it can be removed: the actual dependency between these tasks is still resolved by the intermediate tasks on the remaining path. This means that the task still cannot start before its transitive predecessor has completed; only the redundant edge is gone.
I will be investigating the second issue you commented on above soon.
I have investigated the issue regarding the addControlDependency function. However, if an attempt to add a dependency would break the no-directed-cycles condition, a segmentation fault occurs.

Could you please confirm that this is the case for you? If it is not, could you please post a code snippet showing how you are trying to add the dependency that generates this segmentation fault error? Thanks
@henricasanova, I will try to fix the issue to properly catch and handle the exception mentioned above.
It would have been interesting to keep the redundant links, because apparently each link means that the parent job provides input files to the child job. If these extra links are removed and, in an algorithm, we want to know which parent an input file comes from, we are forced to search over all tasks rather than only over the parents. For example, although the DAX file explicitly shows the dependency between tasks ID00000 and ID00016, that edge is removed as redundant.

Regarding the problem with addControlDependency: we wanted to use it to prevent a job t_j from becoming ready until another job t_i finishes its execution during the simulation, t_i having priority, knowing that t_j and t_i are not linked in the initial workflow.
@GnimEd, we have discussed this and we agree that it would be good to provide such functionality (exposing the redundant dependencies). I will be working on that fix very soon. In the meantime, you can find which task generated a file by using the …

Regarding the issue of adding a control dependency, we could not reproduce your error. Could you please share a code snippet with us so we can try to reproduce your issue?
Hi @rafaelfsilva,

Thanks for the suggested way of resolving this with the …
I apologize for the previous message; I did not understand your question.

NB: There is no dependency between these two tasks (ID00010 and ID00011), so adding one would not break the no-directed-cycles condition.
Hi @rafaelfsilva
Hi @GnimEd, Thank you for your feedback. I am glad your algorithm is working fine now. I am still not able to reproduce the segmentation fault error. I am using this Montage_25.dax.zip DAX file. Could you please let me know if you have the same problem with this file? In the meantime, I will be adding support for exposing all dependencies in the DAX.
Hi @GnimEd, We have added support for exposing all dependencies in the DAX (bfadcfe). When using any of the functions below, you can set the … parameter to expose the redundant dependencies. Functions: …
Thank you!
I started a simulation with WRENCH on the Montage_25 workflow. I use the getTaskParents function (in the Workflow class) to display the parents of a task. We see that task ID00003 does not appear among the parents of task ID00019. However, looking closely at the file, task ID00003 should be among the parents of ID00019. Could you please look into the cause of this problem? (You will find the Montage_25 workflow here.)
On the other hand, we wanted to use the addControlDependency function in a static mapping algorithm, and this caused a segmentation fault error. Can you check whether this function works as it should? (You will find Montage_25 here.)