Make cache vector of same type as u0 #88

Merged · 1 commit merged into SciML:master on Nov 28, 2019
Conversation

@baggepinnen (Contributor)

No description provided.
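
A minimal sketch of the kind of change the title describes (hypothetical variable names; the actual diff in src/simple_regular_solve.jl may differ): allocate the internal cache from `u0` itself so it inherits `eltype(u0)` (e.g. ForwardDiff.Dual) instead of hard-coding Float64.

```julia
u0 = rand(3)

# Before: element type hard-coded to Float64, which silently drops
# Dual (and other) number types coming in through u0.
cache_old = zeros(Float64, length(u0))

# After: the cache inherits the element type of u0.
cache_new = similar(u0)   # or zero(u0) for a zero-initialized copy
```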

@ChrisRackauckas (Member) left a comment:
yup, no reason not to.

@ChrisRackauckas (Member)

Does this fix the autodiff? I assumed it would be something simple.

@baggepinnen (Contributor, Author)

It does not, due to:

```
MethodError: no method matching procf(::ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2}, ::Int64, ::ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2})
Closest candidates are:
  procf(::Any, ::Int64, !Matched::Float64) at /home/fredrikb/.julia/packages/PoissonRandom/Ohb1g/src/PoissonRandom.jl:119
ad_rand(::RandomNumbers.Xorshifts.Xoroshiro128Plus, ::ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2}) at PoissonRandom.jl:48
pois_rand(::RandomNumbers.Xorshifts.Xoroshiro128Plus, ::ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2}) at PoissonRandom.jl:145
_broadcast_getindex_evalf at broadcast.jl:630 [inlined]
_broadcast_getindex at broadcast.jl:603 [inlined]
getindex at broadcast.jl:563 [inlined]
macro expansion at broadcast.jl:909 [inlined]
macro expansion at simdloop.jl:77 [inlined]
copyto! at broadcast.jl:908 [inlined]
copyto! at broadcast.jl:863 [inlined]
materialize! at broadcast.jl:822 [inlined]
#solve#136(::Nothing, ::Float64, ::typeof(solve), ::JumpProblem{DiscreteProblem{Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2},1},Tuple{Float64,Float64},true,Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2},1},DiscreteFunction{true,DiffEqBase.var"#161#162",Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}},Direct,CallbackSet{Tuple{},Tuple{}},Nothing,Tuple{},RegularJump{typeof(rate),typeof(c),Array{Float64,2},Nothing},Nothing}, ::SimpleTauLeaping) at simple_regular_solve.jl:41
(::DiffEqBase.var"#kw##solve")(::NamedTuple{(:dt,),Tuple{Float64}}, ::typeof(solve), ::JumpProblem{DiscreteProblem{Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2},1},Tuple{Float64,Float64},true,Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2},1},DiscreteFunction{true,DiffEqBase.var"#161#162",Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}},Direct,CallbackSet{Tuple{},Tuple{}},Nothing,Tuple{},RegularJump{typeof(rate),typeof(c),Array{Float64,2},Nothing},Nothing}, ::SimpleTauLeaping) at none:0
forward_pass(::Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2},1}) at jumpdiff.jl:45
vector_mode_gradient(::typeof(forward_pass), ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2,Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2},1}}) at apiutils.jl:37
gradient(::Function, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2,Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2},1}}, ::Val{true}) at gradient.jl:17
gradient(::Function, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2,Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(forward_pass),Float64},Float64,2},1}}) at gradient.jl:15
gradient(::Function, ::Array{Float64,1}) at gradient.jl:15
(::var"#21#22")(::Array{Float64,1}) at jumpdiff.jl:55
top-level scope at jumpdiff.jl:58
```

@ChrisRackauckas (Member)

Oh yes, that would break it... I see. The difference between SDEs and this is that SDEs always generate the random numbers the same way (randn()), independently of any state values, and then multiply something by them, so you can treat those draws as constants. Here the distribution really does need the value of the state, since the rates depend on the states, and you can't just rescale by a factor after generating the random numbers, which makes it non-AD-differentiable.
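
A hedged illustration of that difference (hypothetical `g`, `rate`, and step functions, not the package's code): for an SDE the draw itself is state-independent and the state only enters through a smooth scaling, so ForwardDiff can differentiate through it; for tau-leaping the state-dependent rate is the parameter of the discrete Poisson sampler, so a Dual number reaches PoissonRandom and hits the MethodError in the trace above.

```julia
using ForwardDiff, PoissonRandom

dt = 0.1

# SDE-style noise: randn() does not depend on u, so u only enters
# through the smooth scaling g(u) * dW and ForwardDiff goes through.
g(u) = 2u
sde_step(u) = g(u) * (sqrt(dt) * randn())
ForwardDiff.derivative(sde_step, 1.0)    # works

# Tau-leaping: the state-dependent rate parameterizes the Poisson
# draw itself, so a Dual number is passed into pois_rand.
rate(u) = 2u
tau_step(u) = pois_rand(rate(u) * dt)
# ForwardDiff.derivative(tau_step, 1.0)  # MethodError inside PoissonRandom, as above
```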

@coveralls

Coverage Status

Coverage decreased (-0.09%) to 78.466% when pulling 072134d on baggepinnen:patch-1 into 53b396b on JuliaDiffEq:master.

@codecov bot commented Nov 28, 2019

Codecov Report

Merging #88 into master will decrease coverage by 0.08%.
The diff coverage is n/a.

Impacted file tree graph

```
@@            Coverage Diff             @@
##           master      #88      +/-   ##
==========================================
- Coverage   78.55%   78.46%   -0.09%     
==========================================
  Files          21       21              
  Lines         746      743       -3     
==========================================
- Hits          586      583       -3     
  Misses        160      160
```
| Impacted Files | Coverage Δ |
| --- | --- |
| src/simple_regular_solve.jl | 100% <ø> (ø) ⬆️ |
| src/coupling.jl | 73.43% <0%> (-1.19%) ⬇️ |

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 53b396b...072134d. Read the comment docs.

@ChrisRackauckas merged commit 8dd5183 into SciML:master on Nov 28, 2019
@ChrisRackauckas (Member)

Well, this is necessary anyway. Thanks!

@ChrisRackauckas (Member)

I'm not sure how you'd do particle sampling of a Poisson RNG. Something to think about, though. Make it generate N random outputs?

@baggepinnen deleted the patch-1 branch on November 28, 2019, 01:12
@baggepinnen (Contributor, Author)

Yeah, it would draw one sample from each Poisson distribution represented by the different particles.

@ChrisRackauckas (Member)

That might directly require a dispatch on the rand functions.

@baggepinnen (Contributor, Author)

Yep, Chad Scherrer and I have been doing some thinking about that in baggepinnen/MonteCarloMeasurements.jl#22. So far we have not converged on a systematic way of doing this.
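
A hypothetical sketch of the per-particle idea discussed above, assuming MonteCarloMeasurements' `Particles` type and its `particles` field of samples; this dispatch does not exist in either package and only illustrates "one sample from each Poisson distribution":

```julia
using PoissonRandom, MonteCarloMeasurements

# Hypothetical dispatch: for a Particles-valued rate, draw one sample
# from Poisson(λᵢ) for every particle λᵢ and collect the draws into a
# new Particles object.
particle_pois_rand(λ::Particles) = Particles([float(pois_rand(λi)) for λi in λ.particles])

λ = Particles(1.0 .+ rand(1000))   # a Particles-valued rate built from samples
n = particle_pois_rand(λ)          # one Poisson draw per particle
```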
