
Cannot calculate PK-Parameters of a population when one individual simulation failed #436

Closed
msevestre opened this issue Oct 13, 2020 · 13 comments
Labels: prio: high, type: bug (Something isn't working)
@msevestre (Member)

@Yuri05 @PavelBal
I believe this is a major issue and we should come up with a way to deal with it.

Scenario:

  • I run a simulation of 4000 individuals. The simulation run literally takes a couple of hours.
  • One simulation out of 4000 (for individual id #1521) crashes.
  • The exported CSV results do not have any entry for 1521.
  • Calculating the PK analyses for this population simulation crashes because of an index mismatch.

We had this discussion for the RE. I think we should fix this, as it can lead to a lot of frustration.
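For reference, a minimal sketch of the workflow that hits the problem. File names are placeholders and the exact call sequence may differ slightly between ospsuite versions:

```r
library(ospsuite)

# Load the model and the 4000-individual population (placeholder file names)
sim <- loadSimulation("model.pkml")
population <- loadPopulation("population_4000.csv")

# Run the population simulation (takes hours for 4000 individuals)
results <- runSimulation(sim, population = population)

# One individual (e.g. id 1521) failed, so the exported results
# only contain entries for 3999 individuals
exportResultsToCSV(results, "results.csv")

# This call crashes because the number of results does not match
# the number of individuals in the population
pkAnalyses <- calculatePKAnalyses(results)
```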

msevestre added the prio: high and type: bug labels on Oct 13, 2020
@PavelBal (Member)

I would be ok with just ignoring these values.

@msevestre (Member, Author)

Ignoring is fine. Right now it crashes.

@Yuri05 (Member) commented Oct 13, 2020

We already discussed this previously (Open-Systems-Pharmacology/OSPSuite.ReportingEngine#284) and proposed to define a failure_threshold (the maximum acceptable percentage of failed individuals per population).
If the number of failed simulations is below this threshold, just ignore the crashed simulations. If it is above, throw an exception and cancel the whole run.
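A minimal sketch of that idea in plain R; the function and argument names (checkFailedSimulations, failureThreshold, failedIds) are illustrative, not an existing API:

```r
# Hypothetical threshold check, assuming we already know which individual ids failed
checkFailedSimulations <- function(failedIds, nIndividuals, failureThreshold = 0.05) {
  failedFraction <- length(failedIds) / nIndividuals
  if (failedFraction > failureThreshold) {
    stop(sprintf(
      "%.1f%% of individuals failed (threshold %.1f%%), cancelling the run",
      100 * failedFraction, 100 * failureThreshold
    ))
  }
  if (length(failedIds) > 0) {
    warning(sprintf("Ignoring %d failed individual(s): %s",
                    length(failedIds), paste(failedIds, collapse = ", ")))
  }
  invisible(failedIds)
}
```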

@msevestre (Member, Author)

This is not what I am talking about, or at least it's not obvious to me that it is the same.
The population simulation runs fine: I get 3999 results for 4000 individuals. However, calculating the PK crashes because it expects 4000 results. We need to support incomplete results files when we calculate PK parameters.
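One way to see the mismatch from the exported results file, assuming the CSV has an IndividualId column and 0-based ids (adjust the column name to whatever the export actually uses):

```r
# Find which individuals are missing from the exported results
results <- read.csv("results.csv", check.names = FALSE)
expectedIds <- 0:3999                        # ids of the 4000 simulated individuals
presentIds <- unique(results$IndividualId)
missingIds <- setdiff(expectedIds, presentIds)
missingIds                                   # e.g. 1521
```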

@PavelBal (Member)

I agree with Michael on this. The threshold is a good idea, but it should be part of getResults or whatever we have there. If the results are already there, the PK parameters should be calculated.

@Yuri05 (Member) commented Oct 13, 2020

ok, I see

@msevestre (Member, Author)

and when it takes many hours to run... THAT SUCKS :)

@PavelBal (Member)

@msevestre This cannot be solved in R, right? I am having the same issue now... Do you have any experience with how to deal with it?

@msevestre (Member, Author)

What I did was identify the individual and swap it with another one. If you have multiple individuals like this, or if something is random... then it will be more complicated.

This is such a pain. I think we should fix this asap. I'll tackle it after the relexp redesign.

msevestre added this to the Version 10 milestone on Oct 23, 2020
msevestre self-assigned this on Oct 23, 2020
@PavelBal (Member)

“What I did was identify the individual and swap it with another one”

How did you do that? calculatePKAnalyses takes an object of SimulationResults, which then has a mismatch between the IDs and the indices of the individuals. How can you swap it there?

@msevestre (Member, Author)

I swapped at the population level: I identified the crashing individual, removed it from the population, and ran everything again.

@msevestre (Member, Author)

In hindsight, I would identify the failing individual, for example 354.
In the population, replace 354 with a copy of 353.
In the results, copy the results of 353 over to 354.
Then calculate the PK. If you have a big enough population, having two identical individuals will have no effect on the statistics etc.
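A hedged sketch of that manual workaround, working directly on the exported population and results CSV files. The column name IndividualId and the file names are assumptions about the export format:

```r
# In the population CSV, replace the failing individual 354 by a copy of 353
pop <- read.csv("population_4000.csv", check.names = FALSE)
donor <- pop[pop$IndividualId == 353, ]
donor$IndividualId <- 354
pop[pop$IndividualId == 354, ] <- donor
write.csv(pop, "population_4000_patched.csv", row.names = FALSE)

# In the results CSV, duplicate the rows of 353 as results for 354
# (the failed individual has no rows at all, so we append new ones)
res <- read.csv("results.csv", check.names = FALSE)
donorRes <- res[res$IndividualId == 353, ]
donorRes$IndividualId <- 354
res <- rbind(res, donorRes)
write.csv(res, "results_patched.csv", row.names = FALSE)

# Re-import the patched results (e.g. via importResultsFromCSV) and
# run calculatePKAnalyses() on them as usual
```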

@msevestre (Member, Author)

I think I have it implemented. The PK values will return NaN for an individual that was not calculated, thus ensuring that the arrays have a consistent length.
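With that fix in place, a quick post-run check for failed individuals could look like the sketch below. It assumes the PK analyses can be converted to a data frame with IndividualId and Value columns (e.g. via pkAnalysesToDataFrame(); the helper name and columns may differ between ospsuite versions):

```r
pk <- calculatePKAnalyses(results)
pkDf <- pkAnalysesToDataFrame(pk)  # assumed helper; older versions used a different name

# Individuals whose simulation failed now carry NaN values instead of
# breaking the whole PK calculation; is.na() is TRUE for NaN as well
failedIds <- unique(pkDf$IndividualId[is.na(pkDf$Value)])
failedIds
```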
